Book

Qualitative Data Analysis

TL;DR: Qualitative data is extremely varied in nature; it includes virtually any information that can be captured that is not numerical. The major types discussed are in-depth interviews, direct observation, and written documents.
Abstract: Qualitative data is extremely varied in nature. It includes virtually any information that can be captured that is not numerical in nature. Here are some of the major categories or types:

In-Depth Interviews: These include both individual (one-on-one) interviews and "group" interviews (including focus groups). The data can be recorded in a wide variety of ways, including stenography, audio recording, video recording, or written notes. In-depth interviews differ from direct observation primarily in the nature of the interaction: in interviews it is assumed that there is a questioner and one or more interviewees, and the purpose of the interview is to probe the ideas of the interviewees about the phenomenon of interest.

Direct Observation: Direct observation is meant very broadly here. It differs from interviewing in that the observer does not actively query the respondent. It can include everything from field research, where one lives in another context or culture for a period of time, to photographs that illustrate some aspect of the phenomenon. The data can be recorded in many of the same ways as interviews (stenography, audio, video) and through pictures, photos, or drawings (e.g., courtroom drawings of witnesses are a form of direct observation).

Written Documents: Usually this refers to existing documents (as opposed to transcripts of interviews conducted for the research). It can include newspapers, magazines, books, websites, memos, transcripts of conversations, annual reports, and so on. Written documents are usually analyzed with some form of content analysis.

Source: http://www.socialresearchmethods.net/kb/qualdata.php
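The content analysis mentioned for written documents is often operationalized, in its simplest quantitative form, as counting occurrences of predefined categories in a text. A minimal sketch of that idea follows; the coding scheme (category names and keywords) is purely illustrative, not taken from the source:

```python
import re
from collections import Counter

# Hypothetical coding scheme: category -> keywords that indicate it.
CATEGORIES = {
    "cost": {"price", "expensive", "afford"},
    "access": {"distance", "travel", "clinic"},
}

def code_document(text):
    """Count how many keyword hits each category receives in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for word in words:
        for category, keywords in CATEGORIES.items():
            if word in keywords:
                counts[category] += 1
    return counts

doc = ("The clinic is too expensive and the travel distance "
       "makes it hard to afford care.")
print(code_document(doc))  # hit counts per category
```

Real content analysis adds a codebook with definitions, human coding of ambiguous passages, and inter-coder reliability checks; the keyword count above is only the mechanical core.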
Citations
Journal ArticleDOI
TL;DR: Inductive content analysis is used in cases where there are no previous studies dealing with the phenomenon or when it is fragmented, and a deductive approach is useful if the general aim was to test a previous theory in a different situation or to compare categories at different time periods.
Abstract: Aim This paper is a description of inductive and deductive content analysis. Background Content analysis is a method that may be used with either qualitative or quantitative data and in an inductive or deductive way. Qualitative content analysis is commonly used in nursing studies but little has been published on the analysis process and many research books generally only provide a short description of this method. Discussion When using content analysis, the aim was to build a model to describe the phenomenon in a conceptual form. Both inductive and deductive analysis processes are represented as three main phases: preparation, organizing and reporting. The preparation phase is similar in both approaches. The concepts are derived from the data in inductive content analysis. Deductive content analysis is used when the structure of analysis is operationalized on the basis of previous knowledge. Conclusion Inductive content analysis is used in cases where there are no previous studies dealing with the phenomenon or when it is fragmented. A deductive approach is useful if the general aim was to test a previous theory in a different situation or to compare categories at different time periods.

14,963 citations

Journal ArticleDOI
TL;DR: The authors operationalize saturation and make evidence-based recommendations regarding nonprobabilistic sample sizes for interviews and found that saturation occurred within the first twelve interviews, although basic elements for metathemes were present as early as six interviews.
Abstract: Guidelines for determining nonprobabilistic sample sizes are virtually nonexistent. Purposive samples are the most commonly used form of nonprobabilistic sampling, and their size typically relies on the concept of “saturation,” or the point at which no new information or themes are observed in the data. Although the idea of saturation is helpful at the conceptual level, it provides little practical guidance for estimating sample sizes, prior to data collection, necessary for conducting quality research. Using data from a study involving sixty in-depth interviews with women in two West African countries, the authors systematically document the degree of data saturation and variability over the course of thematic analysis. They operationalize saturation and make evidence-based recommendations regarding nonprobabilistic sample sizes for interviews. Based on the data set, they found that saturation occurred within the first twelve interviews, although basic elements for metathemes were present as early as six interviews.

12,951 citations

Journal ArticleDOI
TL;DR: In this article, the authors develop one of perhaps multiple specifications of embeddedness, a concept that has been used to refer broadly to the contingent nature of economic action with respect to cognition, social structure, institutions, and culture.
Abstract: This chapter aims to develop one of perhaps multiple specifications of embeddedness, a concept that has been used to refer broadly to the contingent nature of economic action with respect to cognition, social structure, institutions, and culture. Research on embeddedness is an exciting area in sociology and economics because it advances understanding of how social structure affects economic life. The chapter addresses propositions about the operation and outcomes of interfirm networks that are guided implicitly by ceteris paribus assumptions. While economies of time due to embeddedness have obvious benefits for the individual firm, they also have important implications for allocative efficiency and the determination of prices. Under these conditions, social processes that increase integration combine with resource dependency problems to increase the vulnerability of networked organizations. The level of investment in an economy promotes positive changes in productivity, standards of living, mobility, and wealth generation.

9,137 citations

Journal ArticleDOI
TL;DR: Although the general inductive approach is not as strong as some other analytic strategies for theory or model development, it does provide a simple, straightforward approach for deriving findings in the context of focused evaluation questions.
Abstract: A general inductive approach for analysis of qualitative evaluation data is described. The purposes for using an inductive approach are to (a) condense raw textual data into a brief, summary format; (b) establish clear links between the evaluation or research objectives and the summary findings derived from the raw data; and (c) develop a framework of the underlying structure of expe- riences or processes that are evident in the raw data. The general inductive approach provides an easily used and systematic set of procedures for analyzing qualitative data that can produce reliable and valid findings. Although the general inductive approach is not as strong as some other analytic strategies for theory or model development, it does provide a simple, straightforward approach for deriving findings in the context of focused evaluation questions. Many evaluators are likely to find using a general inductive approach less complicated than using other approaches to qualitative data analysis.

8,199 citations

Journal ArticleDOI
08 Jan 2000 - BMJ
TL;DR: Qualitative research produces large amounts of textual data in the form of transcripts and observational fieldnotes, and the systematic and rigorous preparation and analysis of these data is time consuming and labour intensive.
Abstract: This is the second in a series of three articles. Contrary to popular perception, qualitative research can produce vast amounts of data. These may include verbatim notes or transcribed recordings of interviews or focus groups, jotted notes and more detailed “fieldnotes” of observational research, a diary or chronological account, and the researcher's reflective notes made during the research. These data are not necessarily small scale: transcribing a typical single interview takes several hours and can generate 20–40 pages of single spaced text. Transcripts and notes are the raw data of the research. They provide a descriptive record of the research, but they cannot provide explanations. The researcher has to make sense of the data by sifting and interpreting them.

Summary points:
- Qualitative research produces large amounts of textual data in the form of transcripts and observational fieldnotes
- The systematic and rigorous preparation and analysis of these data is time consuming and labour intensive
- Data analysis often takes place alongside data collection to allow questions to be refined and new avenues of inquiry to develop
- Textual data are typically explored inductively using content analysis to generate categories and explanations; software packages can help with analysis but should not be viewed as short cuts to rigorous and systematic analysis
- High quality analysis of qualitative data depends on the skill, vision, and integrity of the researcher; it should not be left to the novice

In much qualitative research the analytical process begins during data collection as the data already gathered are analysed and shape the ongoing data collection. This sequential analysis1 or interim analysis2 has the advantage of allowing the researcher to go back and refine questions, develop hypotheses, and pursue emerging avenues of inquiry in further depth. Crucially, it also enables the researcher to look for deviant or negative cases; that is, …

7,637 citations

Trending Questions (1)
What are the different types of data analysis interviews?

The different types of data analysis interviews include in-depth interviews and group interviews (focus groups).