Mixed methods research combines qualitative and quantitative data to provide a more complete understanding of complex research problems. This guide covers: 1) when to use mixed methods, 2) six core research designs (convergent, explanatory, exploratory, embedded, transformative, multiphase), 3) integration strategies using joint displays, 4) APA JARS-Mixed reporting standards, 5) quality assessment criteria, and 6) common mistakes to avoid. For graduate students conducting rigorous research, mixed methods offers comprehensive insights but requires careful planning and genuine integration.
As a graduate student designing your research study, you’ve likely encountered the classic dilemma: should you use qualitative or quantitative methods? The answer might be both.
Mixed methods research (MMR) has evolved from a novel approach to a mainstream methodology across disciplines. According to Creswell and Plano Clark (2017), mixed methods is now widely accepted in education, health sciences, social sciences, business, nursing, and psychology. The fundamental premise rejects the “paradigm wars” of qualitative versus quantitative, instead adopting a pragmatic approach where the research question dictates the methods, not philosophical allegiance.
However, mixed methods is not simply “sprinkling” open-ended questions into a survey. It requires deliberate design, rigorous execution of both strands, and genuine integration. The hardest but most crucial step is integration—making connections between qualitative and quantitative data to produce meta-inferences that neither method alone could provide.
This comprehensive guide synthesizes current best practices from leading methodologists (Creswell & Plano Clark, Tashakkori & Teddlie, Fetters & Curry) and official standards (APA JARS-Mixed) to help you understand, design, implement, and report high-quality mixed methods research.
Mixed methods research is both a method and methodology that involves:
“Collecting, analyzing, and mixing both quantitative and qualitative data in a single study or series of studies to provide a more comprehensive understanding of research problems than either approach alone.”
Key Components:
– A quantitative strand (numerical data and statistical analysis)
– A qualitative strand (textual data and thematic interpretation)
– Deliberate integration of the two strands
– A rationale for why combining them answers the research question better than either approach alone
While researchers combined methods for decades, modern mixed methods emerged as a distinct, systematic approach in the late 1980s and early 1990s. Pioneers like John W. Creswell, Vicki L. Plano Clark, Abbas Tashakkori, and Charles Teddlie developed detailed typologies and frameworks that transformed MMR from ad hoc combinations to sophisticated, fully integrated approaches.
The field has evolved from simple combination to contemporary standards emphasizing genuine integration through tools like joint displays (Fetters & Curry, 2013). The American Psychological Association’s JARS-Mixed guidelines (2020) now provide specific reporting standards, cementing mixed methods as a mature, respected methodology.
Mixed methods is particularly valuable when your research question requires BOTH numerical patterns AND rich, contextual understanding. Specifically:
Decision Flow:
Do you need BOTH numerical patterns AND rich, contextual understanding?
├─ Yes → Mixed Methods appropriate
└─ No → Use single method (QUAL or QUAN)
Mixed methods is now widely accepted across diverse fields:
If you’re in any of these fields, learning mixed methods will enhance your research toolkit significantly.
Advantages:
– Combines the breadth of quantitative patterns with the depth of qualitative context
– Allows triangulation and validation across data types
– Offsets the weaknesses of each method with the strengths of the other
Limitations:
– Requires substantially more time and resources than a single-method study
– Demands competence in both methodological traditions
– Integration is difficult and is often done superficially
Choosing the right design is crucial. Here are the six core designs you should know, with examples to help you select appropriately.
Purpose: Collect and analyze QUAL and QUAN data simultaneously, then merge findings to compare/validate results.
Timing: Concurrent (QUAL ↔ QUAN)
Process:
1. Collect quantitative and qualitative data during the same phase
2. Analyze each strand separately, on its own terms
3. Merge the two sets of results for comparison
4. Interpret convergence and divergence, typically in a joint display
When to Use: When you want to compare findings, validate results, or provide comprehensive description.
Example:
– Quantitative: Survey 200 patients on treatment satisfaction (numerical scores)
– Qualitative: Interview 15 patients about their experiences (themes)
– Integration: Compare themes with survey results in joint display to see if qualitative explanations align with quantitative patterns
Purpose: QUAN data collected first, QUAL data collected second to explain initial quantitative results.
Timing: Sequential (QUAN → QUAL)
Process:
1. Collect and analyze quantitative data
2. Identify results that need explanation (unexpected patterns, outliers, subgroup differences)
3. Design and conduct a qualitative follow-up with selected participants
4. Integrate by using qualitative findings to explain the quantitative results
When to Use: When survey results need explanation, outliers need exploration, or you want to drill down into unexpected patterns.
Example:
– Quantitative: A survey reveals unexpectedly low satisfaction among one subgroup
– Qualitative: Follow-up interviews with members of that subgroup explore why
– Integration: Interview themes explain the statistical pattern
Purpose: QUAL data collected first to explore phenomenon, then QUAN data to test/measure those findings.
Timing: Sequential (QUAL → QUAN)
Process:
1. Collect and analyze qualitative data to explore the phenomenon
2. Use the resulting themes to build a quantitative instrument or hypotheses
3. Test the instrument or hypotheses with a larger sample
4. Integrate by assessing how far the qualitative findings generalize
When to Use: When phenomena are not well-understood, measurement tools don’t exist, or you need to develop instruments grounded in lived experience.
Example:
– Qualitative: Focus groups identify the dimensions of a poorly understood experience
– Quantitative: Those dimensions become items on a survey administered to a large sample
– Integration: Survey results show how widely the qualitatively derived dimensions apply
Purpose: One data type plays a secondary, supporting role within a design driven primarily by the other method.
Timing: Can be concurrent or sequential
Process:
When to Use: When one method can’t answer all questions, but one method is dominant. Good for students with limited time/resources.
Example:
– Primary (QUAN): An experiment tests an intervention’s effect on outcomes
– Secondary (QUAL): A small set of interviews explores how participants experienced the intervention
– Integration: Qualitative insights contextualize the experimental results
Purpose: Integrates social justice/action agenda throughout mixed methods study.
Key Features:
– An explicit theoretical lens (e.g., feminist, critical race, disability theory)
– Participatory involvement of the community being studied
– Findings linked to concrete action or advocacy
When to Use: When your research aims to address inequities, empower communities, or drive social change.
Purpose: Multiple phases where each phase builds on previous ones (often used in program development/evaluation).
Structure: QUAL → QUAN → QUAL → QUAN (iterative cycles)
Example:
– Phase 1 (QUAL): Needs-assessment interviews inform program design
– Phase 2 (QUAN): A pilot survey measures initial outcomes
– Phase 3 (QUAL): Interviews explain pilot results and guide refinement
– Phase 4 (QUAN): Full-scale evaluation of the refined program
When to Use: Complex, longitudinal projects involving program development, implementation, and evaluation.
What is your primary purpose?
├─ EXPLORATORY (understand phenomenon)
│ └─ Exploratory Sequential (QUAL → QUAN)
├─ EXPLANATORY (explain results)
│ └─ Explanatory Sequential (QUAN → QUAL)
├─ COMPARATIVE/TRIANGULATION
│ └─ Convergent Parallel (QUAL + QUAN simultaneous)
└─ COMPLEMENTARY WITH HIERARCHY
└─ Embedded (one primary, one secondary)
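The flowchart above can also be read as a simple lookup. The sketch below is purely illustrative Python — the function name and labels are ours, not from any library — mapping the purpose categories from this guide to the four core designs:

```python
# Hypothetical helper: the design-selection flowchart expressed as a lookup.

def choose_design(purpose: str) -> str:
    """Map a primary research purpose to a core mixed methods design."""
    designs = {
        "exploratory": "Exploratory Sequential (QUAL -> QUAN)",
        "explanatory": "Explanatory Sequential (QUAN -> QUAL)",
        "comparative": "Convergent Parallel (QUAL + QUAN simultaneous)",
        "hierarchical": "Embedded (one primary, one secondary)",
    }
    # No match: revisit whether mixed methods is needed at all.
    return designs.get(purpose.lower(), "Reconsider whether mixed methods is needed")

print(choose_design("explanatory"))
```

The point is not the code itself but the discipline it encodes: name your primary purpose first, and let the design follow from it.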
For students starting out, explanatory sequential or embedded designs are often most manageable, as they allow you to build on existing strengths while learning the other method.
Integration is the deliberate connecting of qualitative and quantitative components. It’s the defining feature of true mixed methods research. Without genuine integration, you merely have two separate studies reported together.
Fetters, Curry, and Creswell (2013) propose integration at three levels:
What: Choosing designs that inherently integrate methods (convergent, sequential, embedded)
How: During study planning phase, select a design that creates integration points by design
This occurs during data collection and analysis through four approaches (Fetters et al., 2013):
– Connecting: linking the datasets through sampling (e.g., interviewing a subset of survey respondents)
– Building: using results from one strand to shape data collection in the other
– Merging: bringing the two datasets together for combined analysis
– Embedding: linking data collection and analysis at multiple points within a larger design
Definition: Tables or figures that display qualitative and quantitative results together to facilitate comparison and meta-inference.
Recommended Structure:
| Theme/Finding | Quantitative Results | Qualitative Insights | Integration Interpretation |
|---|---|---|---|
| Student motivation | 72% reported high motivation (M=4.2/5, SD=0.8) | Themes: autonomy, relevance, instructor feedback | High quantitative motivation is explained by qualitative themes of autonomy, relevance, and instructor feedback |
| Program effectiveness | Post-test scores: +15% improvement (p<.01) | Participants described “turning point moments” of understanding | Quantitative gains align with qualitative accounts of specific learning experiences |
Best Practices:
– Make the integration column an explicit interpretation, not a restatement of either strand
– Report divergence as openly as convergence
– Keep each cell concise and point readers to the full results for detail
Source: Fetters et al. (2013) Achieving Integration in Mixed Methods Designs—Principles and Practices (7460+ citations)
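A joint display like the one above can also be generated from structured findings, which keeps the quantitative, qualitative, and integration columns in lockstep. The sketch below is a minimal plain-Python illustration with invented example values; the column names follow the recommended structure:

```python
# Invented example findings; a real study would substitute its own results.
rows = [
    {
        "Theme/Finding": "Student motivation",
        "Quantitative Results": "72% high motivation (M=4.2/5, SD=0.8)",
        "Qualitative Insights": "Themes: autonomy, relevance, feedback",
        "Integration Interpretation": "Themes explain the high motivation scores",
    },
    {
        "Theme/Finding": "Program effectiveness",
        "Quantitative Results": "Post-test +15% (p<.01)",
        "Qualitative Insights": "'Turning point moments' of understanding",
        "Integration Interpretation": "Gains align with reported learning experiences",
    },
]

# Render the rows as a markdown table (the joint display).
headers = list(rows[0])
lines = ["| " + " | ".join(headers) + " |",
         "|" + "---|" * len(headers)]
for row in rows:
    lines.append("| " + " | ".join(row[h] for h in headers) + " |")
print("\n".join(lines))
```

Filling in rows like these forces you to write an integration interpretation for every theme, which is exactly the step superficial mixed methods studies skip.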
Key Decision: Same participants vs. different participants?
Quantitative Strand: Typically larger samples, ideally probability-based, sized for statistical power and generalizability
Qualitative Strand: Typically smaller, purposive samples selected for depth and information richness
Integration Challenge: Aligning samples that serve different purposes. Solution: For sequential designs, use the quantitative phase to identify participants for the qualitative phase; for concurrent designs, parallel samples are acceptable as long as they’re drawn from the same population.
Quantitative Methods: Surveys (questionnaires, standardized scales), experiments/quasi-experiments, secondary data analysis, structured observations, tests/measurements
Qualitative Methods: Individual interviews (semi-structured, unstructured), focus groups, observations (participant/non-participant), document analysis, case studies, field notes
Integration Tip: Coordinate timing—collect both simultaneously for convergent designs, or sequence deliberately for explanatory/exploratory designs.
Quantitative Analysis: Descriptive statistics (means, frequencies, distributions), inferential statistics (t-tests, ANOVA, regression, correlations), use appropriate software (SPSS, R, Stata)
Qualitative Analysis: Thematic analysis (most common for beginners), content analysis, grounded theory coding, discourse analysis, use qualitative software (NVivo, ATLAS.ti, Dedoose)
Principle: Each strand must stand on its own methodological rigor before integration.
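As a minimal illustration of analyzing each strand on its own terms before integrating, the sketch below uses invented data — Likert satisfaction scores for the quantitative strand and interview codes for the qualitative strand (both hypothetical):

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical data; a real study would load its own datasets.
satisfaction_scores = [4, 5, 3, 4, 5, 4, 2, 5, 4, 4]    # quantitative strand (1-5 Likert)
interview_codes = ["autonomy", "relevance", "feedback",
                   "autonomy", "relevance", "autonomy"]   # qualitative strand

# Quantitative: descriptive statistics, computed before any integration.
print(f"M = {mean(satisfaction_scores):.2f}, SD = {stdev(satisfaction_scores):.2f}")

# Qualitative: code frequencies as one starting point for thematic analysis.
for code, count in Counter(interview_codes).most_common():
    print(f"{code}: {count}")
```

Only once each strand has been analyzed and checked for rigor on its own would the two outputs meet, for example in a joint display.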
Comparing Results: Look for:
– Convergence: both strands point to the same conclusion
– Complementarity: each strand illuminates a different facet of the phenomenon
– Expansion: one strand extends the scope or depth of the other
– Discordance: the strands disagree
Addressing Discordance: When QUAL and QUAN results disagree:
1. Re-examine data quality for each strand
2. Consider contextual factors
3. Explore theoretical explanations
4. Present as valuable insight—divergence often reveals the complexity of the phenomenon
Mixed methods quality assessment must address both strands AND their integration.
Traditional validity and reliability criteria are insufficient for MMR. Onwuegbuzie and Johnson (2006) therefore propose legitimation, a continuous quality-assurance process for evaluating the trustworthiness of inferences drawn from both strands and from their integration.
The American Psychological Association’s Journal Article Reporting Standards for mixed methods (2020) are essential reading. Key requirements:
Method Section: Methodological purpose statement, rationale for mixed methods, integration description (when/how/priority), procedures (sequence diagram if sequential)
Results Section: Complete quantitative results, complete qualitative results, integration results with joint displays (required)
Discussion Section: Meta-inferences (synthesized conclusions), implications emerging from integration
Download the complete JARS-Mixed checklist: apastyle.apa.org/jars/mixed-methods
Problem: “I used mixed methods because it’s trendy” or vague reasoning.
Solution: Explicitly state how the research question requires both QUAL and QUAN: “This question required both [quantitative pattern identification] and [qualitative understanding of mechanisms] because…”
Problem: Using convergent when sequential would better address the question.
Solution: Use the decision flowchart in this guide. Ask: “Does one phase inform the other?” → Sequential; “Do I need to compare/validate at same time?” → Concurrent; “Is one method clearly secondary?” → Embedded.
Problem: Running out of time, rushed analysis, incomplete integration.
Solution: Take your time estimate for a comparable single-method study and plan 1.5-2x longer, especially for theses and dissertations. Consider an embedded design if severely time-constrained.
Problem: Collect both datasets but don’t know how to connect them.
Solution: Before data collection, answer these questions:
– When will integration occur (design, data collection, analysis, interpretation)?
– How will datasets be connected (joint displays, transformation, narrative)?
– What is integration priority (equal, QUAL-dominant, QUAN-dominant)?
– Who will perform integration (single researcher or team)?
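One lightweight way to force answers to these questions before data collection is to write the plan down as a structured record. The sketch below is purely illustrative — the class and field names are ours, not part of any methodology toolkit:

```python
from dataclasses import dataclass

@dataclass
class IntegrationPlan:
    when: str      # design, data collection, analysis, and/or interpretation
    how: str       # joint displays, data transformation, or narrative weaving
    priority: str  # equal, QUAL-dominant, or QUAN-dominant
    who: str       # single researcher or team

# A filled-in plan makes gaps visible before any data are collected.
plan = IntegrationPlan(
    when="analysis and interpretation",
    how="joint displays",
    priority="equal",
    who="two-person team",
)
print(plan)
```

If you cannot fill in all four fields, you are not ready to collect data yet.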
Problem: “Sprinkling” open-ended questions in survey and calling it mixed methods.
Solution: Ensure: distinct QUAL and QUAN strands, genuine mixing with clear integration point(s), joint displays or explicit comparison in results section.
Too Early: Integrating before separate analysis is complete
Too Late: Integration only mentioned in discussion, not results
Solution: Integrate at multiple stages: design phase, data collection phase, analysis phase with joint displays, interpretation with meta-inferences
Problem: Forcing convergence when QUAL and QUAN disagree.
Solution: Discordance is valuable data, not a problem. Explore why the strands differ (sampling bias? measurement? context?) and present divergence as insight into the phenomenon’s complexity. When qualitative and quantitative findings diverge, don’t force agreement; investigate what the divergence means.
Problem: Failing to report the philosophical foundation, integration plan, joint displays, or researcher positionality/reflexivity.
Solution: Use APA JARS-Mixed checklist as pre-submission checklist; never omit joint displays.
Abstract: Include “mixed methods” and design type (e.g., “convergent parallel mixed methods design”)
Introduction: Problem statement → Research questions; rationale for mixed methods; brief mention of design
Methods: Overall design; philosophical foundation; participants/sampling with sample alignment justification; data collection procedures (separate but linked); data analysis (separate for each, then integration); integration plan (when/where/how)
Results: Quantitative results; qualitative results; Integration section: Joint display(s) + narrative interpretation
Discussion: Summary of key findings; meta-inferences (integrated conclusions); implications (theoretical/practical); limitations (each strand + integration limitations); future research
Typical Question: “How do teaching methods affect student outcomes AND how do students experience these methods?”
Common Design: Explanatory sequential, convergent
Example: Flipped classroom study—quantitative test scores + qualitative interviews on learning experiences; joint display connects performance differences to engagement themes
Typical Question: “What is intervention effectiveness AND how do patients experience it?”
Common Design: Embedded RCT with qualitative process evaluation, explanatory, convergent
Example: Diabetes self-management—HbA1c levels + patient interviews; integration contextualizes clinical outcomes with lived experience
Typical Question: “How do social policies affect behaviors AND how do people make decisions?”
Designs: All designs used; exploratory sequential for understudied populations; convergent for comprehensive community studies
Example: Unemployment impacts—survey statistics + life history interviews; integration explains statistical patterns through narratives
Typical Question: “What are market trends AND how do customers feel about them?”
Common Design: Explanatory sequential, convergent, multiphase
Example: New product adoption—sales figures + focus groups; integration identifies barriers to adoption
Mixed methods research represents a powerful, sophisticated approach to complex research questions that cannot be adequately addressed by qualitative or quantitative methods alone. Success requires:
– A research question that genuinely demands both data types
– A design chosen to fit that question
– Rigorous execution of each strand on its own terms
– Deliberate, planned integration with joint displays
Recommended Resources:
– Creswell, J. W., & Plano Clark, V. L. (2017). Designing and Conducting Mixed Methods Research
– Fetters, M. D., Curry, L. A., & Creswell, J. W. (2013). Achieving Integration in Mixed Methods Designs—Principles and Practices
– APA JARS-Mixed reporting standards: apastyle.apa.org/jars/mixed-methods
Designing and executing a rigorous mixed methods study is complex. Our team includes methodologists with PhDs in research methodology who can help you design, execute, and report your study.
Get expert help with your mixed methods research project. We’ll match you with a methodologist experienced in your discipline.
Mixed methods research offers the most comprehensive approach to complex research questions by combining the breadth of quantitative data with the depth of qualitative insights. Success, however, requires deliberate design, rigorous execution of both strands, and genuine integration.
The most common student error is superficial integration—collecting both data types but failing to genuinely connect them. Avoid this by planning integration deliberately before data collection, using joint displays systematically, and addressing integration throughout the manuscript (methods, results, discussion).
Your immediate next steps:
1. Confirm your research question truly requires both data types
2. Choose a design using the decision flowchart in this guide
3. Draft an integration plan before collecting any data
4. Keep the APA JARS-Mixed checklist at hand while writing
Mixed methods is a powerful methodology that, when done correctly, produces research with rich insights and practical impact. While it demands more upfront planning and effort than single-method studies, the payoff in comprehensive understanding is substantial. Good luck with your research!