Beyond initial impressions

Editor’s note: Charles H. Ptacek is president of Charles, Charles & Associates Inc., a Gold Canyon, Ariz., research firm.

In business-to-business market research, the collection of qualitative data stands in contrast to quantitative data collection procedures, which impose a predetermined framework upon participants. The final product is not a series of crosstabulations reported in a stark, dry format. Rather, it is, or should be, an active set of narrations, quotations and interpretations that come alive with feelings. In many instances, the chief value of the final product will be its presentation of what the respondents had to say, in their own words.

While it is probably fair to say that, by the best of current standards, the analysis of qualitative data is a mysterious, half-formulated art, the success or failure of qualitative research resides in the investigative and interpretive skills of the researcher and in his tools.

There are three basic analysis frameworks that can be used with B2B qualitative information. The most basic level of analysis is commonly used when time and cost limitations are severe: the researcher simply prepares a brief, impressionistic summary of the principal findings, relying mainly on his own memory. The second, and probably most commonly employed, qualitative analysis uses the traditional fact-sheet or matrix approach. Here, the analyst listens and relistens to recordings of the interviews, copying down significant segments and fitting the respondents’ reactions into a more general scheme derived from his understanding of the history and present status of the business problem.

When the analyst is imaginative, discerning and skillful, reports using this traditional analysis can be stimulating, creative and fascinating - as well as infuriating to anyone who is not prepared to take the analyst at his word. As noted by one authority, with this kind of analysis the respondents’ manifest reactions make roughly the same contribution to the final report as the patient’s free associations make to a psychoanalyst’s case report.

Morphological content analysis, a third method of qualitative analysis that can be employed with B2B surveys of knowledgeable persons, is designed to preserve the most significant interview material more or less intact while still allowing for imaginative interpretation. This method preserves the most relevant sections of each interview and presents the interview material in an organized, coherent framework. It provides the analyst with an opportunity to identify key relationships, interpret where inferences are needed and point out implications.

As a mode of observation, content analysis is essentially an operation of coding communications in terms of some conceptual framework. In morphological content analysis (MCA), as in other research methods, you must refine your conceptual framework and develop specific methods for observation in relation to that framework. Hence, coding in MCA involves the logic of conceptualization and operationalization. Morphological content analysis is neither fast nor cheap; however, it can offer a high return on investment when it comes to qualitative depth analysis.

Range of impressions

A qualitative analysis of in-depth B2B communications is usually complicated by the wealth and range of knowledgeable persons’ comments. Although other analyses such as the fact-sheet matrix approach rely on preconceptions, which tend to oversimplify and undervalue the analysis process, MCA captures the range of impressions and observations on each topic discussed and provides a framework for interpreting them in light of hypotheses generated by the MCA analytic technique. The MCA qualitative procedure can best be described as a psychological inquiry involving a number of in-depth tasks that are performed interactively (Figure 1).

The first step of an MCA involves an examination of the information collected from knowledgeable persons. All interviews should be recorded and transcribed for this analysis. To begin, both the recordings and the transcripts are reviewed a number of times. This analysis step is analogous to developing a data information file or codebook, and there are two general procedures that can be used to operationalize the areas of discussion during the coding process: deduction and induction.

Deduction represents reasoning from the general to the specific. In our situation, this means that classification is accomplished by defining general areas and then grouping the specifics accordingly. For example, the original research objectives can provide a preliminary list of areas of discussion concentration for classification purposes. Transcript segments are bracketed and coded by subject matter. The coding is simply the analyst’s notes, written in the margin of the transcript, on what the bracketed segment is about and where it should be classified. Bracketed segments of discussion concentration are commonly represented by abbreviations or tags and this process is often referred to as tagging. The transcripts are then organized into areas of investigative interest following the detailed coding process. With this approach, respondent data are distilled and described according to the information objectives and category codes that were established for the project.
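
A brief sketch of this deductive tagging logic appears below, written in Python with an entirely hypothetical codebook and invented transcript segments; it illustrates the coding idea only and does not describe any particular software package.

    # A minimal sketch of deductive tagging, assuming a hypothetical codebook
    # derived from the research objectives and invented transcript segments.
    from dataclasses import dataclass, field

    @dataclass
    class Segment:
        respondent: str
        text: str
        tags: list = field(default_factory=list)

    # Hypothetical codebook: tag -> keywords signaling an area of discussion.
    CODEBOOK = {
        "PRICE": ["price", "cost", "margin"],
        "INTEG": ["integrate", "interface", "compatible"],
        "AESTH": ["ensemble", "furniture", "looks"],
    }

    def tag_segment(segment: Segment) -> Segment:
        """Attach every codebook tag whose keywords appear in the segment."""
        lowered = segment.text.lower()
        for tag, keywords in CODEBOOK.items():
            if any(word in lowered for word in keywords):
                segment.tags.append(tag)
        return segment

    segments = [
        Segment("Dealer 3", "Customers want a display that integrates with every component."),
        Segment("Dealer 7", "The ensemble has to look like furniture, not equipment."),
    ]
    for seg in segments:
        print(seg.respondent, tag_segment(seg).tags)  # e.g. Dealer 3 ['INTEG']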

Categories emerge

In the inductive approach, the classification scheme is not imposed by the researcher; rather, the categories emerge from discovery analysis. Individual observations become the means by which an organization or typology evolves. This exploratory reduction procedure is analogous to cluster and factor analysis: specific observations are used to build toward general patterns. In most situations, the inductive approach is used when the research problem is related to exploratory goals or in the early stages of hypothesis generation.
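
To make the cluster-analysis analogy concrete, the sketch below uses the open-source scikit-learn library on a handful of invented discussion elements; the grouping it produces is illustrative only, since in practice the categories emerge from far richer interview material.

    # A minimal sketch of inductive category discovery, assuming scikit-learn
    # is available; the discussion elements are invented for illustration.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    elements = [
        "the display has to accept input from every game console",
        "dealers worry about the retail price point of a combined unit",
        "customers ask whether the monitor will work with their computer",
        "margins on peripheral equipment are already thin",
    ]

    # Represent each element by its weighted vocabulary, then cluster;
    # the clusters, not the researcher, suggest the category scheme.
    vectors = TfidfVectorizer(stop_words="english").fit_transform(elements)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

    for label, text in sorted(zip(labels, elements)):
        print(label, text)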

Most recently, the computer has enabled social scientists to automate the content analysis process. The analyst’s task consists largely of inputting the text; the output then serves as the basis for subsequent interpretation and synthesis. Compared with human-coded, interpretive modes of analysis, one of the most important advantages of computer-aided content analysis is that the rules for coding text are made explicit through the development of custom or standard dictionaries and, once formalized, the computer provides perfect coder reliability.
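
The sketch below illustrates why explicit coding rules yield perfect reliability: with a small, hypothetical custom dictionary, the same passage is coded identically on every run. The dictionary and the example sentence are invented for illustration.

    # A minimal sketch of dictionary-driven coding with a hypothetical
    # custom dictionary; because the rules are explicit, every run codes
    # the same text identically (perfect coder reliability).
    import re
    from collections import Counter

    DICTIONARY = {
        "UNMET_NEED": {"need", "wish", "lack", "missing"},
        "INTEGRATION": {"interface", "connect", "compatible", "integrate"},
    }

    def code_text(text: str) -> Counter:
        """Count how often each dictionary category occurs in the text."""
        counts = Counter()
        for token in re.findall(r"[a-z]+", text.lower()):
            for category, words in DICTIONARY.items():
                if token in words:
                    counts[category] += 1
        return counts

    print(code_text("Dealers need a display that can connect and interface with everything."))
    # Counter({'INTEGRATION': 2, 'UNMET_NEED': 1})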

Some readily available computer-assisted text and data analysis tools include QDA Miner, WordStat and Simstat. Simstat is a Windows-based, full-purpose statistical package not unlike SPSS for Windows. TextSmart by SPSS is software designed primarily for the analysis of open-ended survey responses; it uses cluster analysis as well as multidimensional scaling (MDS) techniques to automatically analyze key words and group text into categories. The Windows version of the TEXTPACK program, which was also originally designed for the analysis of open-ended survey responses, provides a multi-unit data file output that can be imported into statistical analysis software such as SPSS or SAS. TEXTPACK has been extended over the years to cope with many aspects of computer-aided text analysis.

Interpretive investigation

Following the analysis step, an interpretive investigation is performed on each area of discussion concentration. This step is marked “interpretation” and involves reviewing the discussion within each area of concentration, developing subcategories of responses, interpreting these responses, and finally, developing a hierarchical organization of information.

If one is using a computer algorithm to perform this stage of the content analysis, the investigation will proceed from a strict counting of words to a broader categorizing procedure. For example, a dictionary-driven WordStat analysis enables the researcher to quantify units of meaning by grouping words or phrases together, employing heat plots that use a color spectrum to show the relative correlation between words, phrases or categories. Hierarchical clusterings of these units of meaning can be displayed using dendrograms (tree diagrams).
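
As a rough illustration of this clustering step, the sketch below uses the open-source SciPy library to group hypothetical category-frequency profiles with Ward linkage and to read off the dendrogram’s leaf order; the numbers are invented, and the sketch is not a description of how WordStat itself works internally.

    # A minimal sketch of hierarchical clustering of units of meaning,
    # assuming SciPy is available; the frequency matrix is hypothetical.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram

    categories = ["price", "integration", "aesthetics", "storage"]
    # Rows = categories; columns = how often each appeared in four interviews.
    freq = np.array([
        [5, 2, 0, 1],
        [4, 3, 1, 1],
        [0, 1, 6, 4],
        [1, 0, 5, 5],
    ])

    # Ward linkage groups categories with similar frequency profiles; the
    # dendrogram (tree diagram) records the hierarchy of those groupings.
    tree = linkage(freq, method="ward")
    info = dendrogram(tree, labels=categories, no_plot=True)
    print(info["ivl"])  # leaf order after clustering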

The clustering of data is a method for inferring meaningful interpretations from the patterns that emerge. TextSmart has an easy-to-use Windows interface that allows for quick sorting of words into frequency and alphabetical lists and produces graphics such as bar charts and two-dimensional MDS plots for interpretation.

The synthesis step includes integrating this information and interpreting how it relates to various issues and hypotheses. The integration process involves the systematic combination of discussion elements and the generalization of research findings. New ideas and insightful interpretations result from new combinations and associations of respondent discussion elements. For example, understanding a network of interrelated unmet customer needs could provide the basis for new product concepts that address a generalized void in the B2B marketplace.

Clearly, this step can be considered highly subjective and value-added and hence depends largely upon the analyst’s ability to uncover creative interpretations of the information. With MCA, this is made possible by morphological synthesis - forming new relationships between areas of concentration derived from the juxtaposition of discussion elements.

Morphological synthesis operates on the structure created during the interpretation stage of the content analysis. Once that structure is created, forced-relationship techniques can be used to investigate provocative new patterns of interpretation. By generating different combinations and variations of the discussion elements, you create innovative new ideas and interpretations. The output derived from MCA is far more creative, objective and meaningful than can be derived from the cursory treatment of qualitative information that has become the industry standard.
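
A highly simplified sketch of this forced-relationship idea appears below: hypothetical discussion elements are arranged by area of concentration, and every cross-area combination is enumerated as a candidate juxtaposition. In practice the analyst, not the program, judges which combinations are provocative enough to pursue.

    # A minimal sketch of morphological (forced-relationship) synthesis,
    # using hypothetical discussion elements grouped by area of concentration.
    from itertools import product

    morphology = {
        "unmet need": ["one remote for all components", "hidden cabling"],
        "display role": ["central hub", "furniture piece"],
        "buyer": ["videophile", "casual household"],
    }

    # Enumerate every cross-area combination as a candidate interpretation.
    for combo in product(*morphology.values()):
        print(" + ".join(combo))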

Three general classes

Content analysis is especially appropriate for three general classes of research situations. The first two classes of research are common in political science, journalism and communications research, where quantification is the most distinctive feature of content analysis. What is implied by the quantification emphasis is that the communication data be amenable to statistical methods not only for a precise summary of findings but also for interpretation and inference.

In most instances, the quantification process is a matter of simply noting the presence or absence of a category within the collected data set. This lowest level of measurement can be performed on words, collections of words or even themes. The most straightforward example is the use of the previously mentioned dictionary-driven programs to do basic text analysis. The value of computer-enhanced applications of content analysis is that they fill a methodological gap between small-group discussions, which may give impressionistic understanding but no statistical interpretations, and large quantitative surveys, which often produce pages of numbers with no depth of meaning.
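
In miniature, this lowest level of measurement can be expressed as in the sketch below, which simply scores each category of a hypothetical dictionary as present (1) or absent (0) in a piece of text.

    # A minimal sketch of presence/absence coding with a hypothetical dictionary.
    DICTIONARY = {
        "UNMET_NEED": {"need", "lack", "missing"},
        "INTEGRATION": {"interface", "connect", "compatible"},
    }

    def presence(text: str) -> dict:
        """Score each category 1 if any of its words appears in the text, else 0."""
        tokens = set(text.lower().split())
        return {cat: int(bool(words & tokens)) for cat, words in DICTIONARY.items()}

    print(presence("we lack a display that every component can connect to"))
    # {'UNMET_NEED': 1, 'INTEGRATION': 1}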

The third class of research is more common in personal research applications, where some form of content analysis is necessary because the respondent’s own language and mode of expression are crucial to the investigation. Such is the case in B2B marketing research, where individual depth interviews and professional group interviews are employed as the data collection methodology. With regard to morphological content analysis, the emphasis is on creative analysis, synthesis and interpretation of respondent discussions - who says what, to whom, how and why, and with what effect? As a mode of observation, MCA requires a considered handling of “why” to achieve an in-depth understanding and interpretation of professional participant attitudes and behavior.

Sample application

To explore some of the concepts outlined above, let’s look at a sample application of MCA involving the concept of home entertainment systems of the future.

The emergence of cable and satellite TV and the penetration of peripheral electronic equipment such as VCRs, DVD/Rs, video-game consoles and personal computers represent technological developments external to the TV monitor. Since so many peripheral electronic products interface with the TV monitor or panel display, it is possible that the video display will become the central component of the home entertainment system of the future. For this to happen, the display must meet the requirements set forth by the peripheral-equipment manufacturers and ultimately meet the wants and needs of consumers adopting the peripheral equipment.

The purpose of this exploratory research assignment was to investigate the viability of several new TV product enhancements and to document expectations for them. In total, eight focus groups were held with dealers of peripheral electronic equipment. Each session lasted approximately two hours and involved nine to 11 participants. All sessions were recorded and transcripts were prepared for each group session. Over 500 pages of transcripts were generated, and more than 1,000 specific responses or discussion elements were coded, edited, extracted from these transcripts and organized into subcategories for the morphological content analysis.

Unmet needs

The chief areas of discussion concentration revealed a network of interrelated unmet needs surrounding a generalized void for integrated video entertainment systems. Figure 2 summarizes the product requirements presented by the professional electronic-equipment dealers, the associated market-driven enhancements consistent with the entertainment-system void, and the planned peripheral enhancements.

Results from this content analysis demonstrated to the client that the planned peripheral television enhancements did not address the network of unmet needs associated with peripheral equipment requirements. Likewise, simply adding a nice-looking ensemble to the product offering or enhancing the sound quality and other componentry would fall short of meeting these requirements. For example, ensembles may be an important element of the entertainment system because they enhance aesthetic appeal while solving practical space and storage problems, but by no means do they satisfy all of the wants and needs.

To address only a portion of the problems that make up the network of unmet needs would not fill the void revealed by the content analysis and, hence, could result in a new-product failure. Moreover, although the monitor or panel display may become the controlling focal component of the home entertainment system of the future, its status as a legitimate piece of furniture is likely to decline over the next 10 years. Based on the results of this study and the morphological content analysis, the client revised its new product concept to better meet the requirements of the evolving marketplace.

More profound

The use of the word “depth” implies seeking information that is more profound than is usually accessible by traditional research methods. MCA is designed to appraise all of the facts and boundary conditions needed for an in-depth deduction or induction of relevant interpretations. Because of its suggestive power, the process enables a researcher to forge new relationships between component facts by systematically combining these elements into new ideas and innovative interpretations. More importantly, it ensures a fruitful type of profound thinking and exploration of all the possible practical applications of the results.

MCA is probably more appropriate for diagnostic interviewing such as individual B2B depth interviews, although it can be used with some of the more elaborate group depth designs as demonstrated. We have found that group depth interviews employing MCA are particularly useful in the development phases of a research program.

Perhaps the major function of MCA in these qualitative B2B research designs is to generate creative and fruitful hypotheses. With regard to individual depth interviews, we have employed MCA in many different technical product assignments and have never been disappointed.

Morphological content analysis is no panacea, but it meets, and usually exceeds, the analysis and interpretation objectives associated with professional in-depth B2B diagnostic interviewing. This form of discovery analysis is critical to understanding and documenting the essence and meaning of communication in its natural state - the presentation of what participants had to say, in their own words, and what it means to the client’s situation.