Chapter 2 - RESEARCH DESIGN & BASELINE DATA COLLECTION
This chapter outlines the research design and describes the approach for gathering baseline data, building on the objectives and research questions introduced in Chapter 1. It details the methodological framework, sampling strategy, data collection methods, and measurement indicators that will be used to assess Work for Impact’s social impact.
To ensure a comprehensive evaluation of Work for Impact’s achievements, the research design must capture both broad trends and individual experiences. A mixed methods approach provides this dual perspective.
As explained in John W. Creswell’s “Research Design: Qualitative, Quantitative, and Mixed Methods Approaches”, a mixed methods design combines the numerical strength of quantitative data with the depth of qualitative insights, making it especially well‑suited for evaluating Work for Impact’s social impact. Quantitative measures (e.g., surveys capturing contractor satisfaction, wage changes, and diversity metrics) provide clear, comparable benchmarks, while qualitative methods (e.g., in‑depth interviews and focus groups) uncover the personal experiences and contextual factors that explain how and why those changes occur. This “best of both worlds” approach aligns with the pragmatic goal of producing actionable evidence and the transformative commitment to elevating the voices of marginalized participants.
Among the various mixed methods typologies, the convergent parallel design is particularly appropriate. In this model, quantitative and qualitative data are collected simultaneously but analyzed separately, then merged to produce a comprehensive interpretation. By running surveys and interviews in parallel, the researcher can validate findings across methods - for example, confirming that a rise in reported empowerment scores aligns with contractors’ own stories of increased confidence and opportunity. This convergence not only strengthens the credibility of the conclusions but also ensures that the study’s outcomes reflect both the measurable impact and the lived realities of Work for Impact’s stakeholders.
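To make the merging step concrete, the sketch below shows one way the two strands could be brought together in a simple joint display once they have been analyzed separately. The constructs, scores, themes, and the convergence flag are hypothetical placeholders for illustration, not study data.

```python
# Minimal sketch of a mixed-methods "joint display" that merges the quantitative
# and qualitative strands after separate analysis. All construct names, scores,
# and themes below are illustrative placeholders, not actual study results.
import pandas as pd

# Quantitative strand: mean Likert scores per construct (hypothetical values).
quant = pd.DataFrame({
    "construct": ["empowerment", "fair_pay", "job_stability"],
    "mean_score": [4.2, 3.8, 4.0],  # 1-5 Likert scale
})

# Qualitative strand: dominant interview themes per construct (hypothetical).
qual = pd.DataFrame({
    "construct": ["empowerment", "fair_pay", "job_stability"],
    "dominant_theme": [
        "increased confidence and opportunity",
        "transparent, negotiated rates",
        "longer contracts reduce income anxiety",
    ],
})

# Merge the two strands into a single joint display.
joint_display = quant.merge(qual, on="construct")
# Placeholder convergence flag: a high mean score is treated as consistent
# with a positive qualitative theme.
joint_display["converges"] = joint_display["mean_score"] >= 3.5
print(joint_display)
```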
Work for Impact’s core work centers on global talent recruitment and team integration - leveraging its digital platform to create partnerships with like-minded organizations, align with legal and regulatory frameworks, and engage marginalized communities to promote fair, transparent labor practices. Complementing this, the company’s Pathways Program delivers education and one‑on‑one mentoring to unlock the potential of young talent with limited access to services.
Building on the conceptual framework above, this section operationalizes the main constructs by providing clear definitions and specifying the corresponding indicators as well as their data types. These metrics guide the data collection and analysis to evaluate Work for Impact’s social impact and validate its Theory of Change.
A) Ethical Outsourcing
Definition: Through its digital platform, Work for Impact matches purpose‑driven organizations with a highly vetted global talent pool - particularly young professionals in underserved regions - ensuring equitable remuneration, secure contractual engagements, and strict adherence to ethical labor standards.
B) Educational Programs & Mentoring (Pathways)
Definition: Designing and delivering technology-focused training, micro-degree certifications, and personalized mentoring to equip young talent from underserved communities with essential skills, confidence, and career readiness for meaningful employment.
Outputs (who was reached and what was delivered)
A1) Access to equitable, ethical remote employment
Definition: The extent to which young talent from underserved or marginalized regions can secure remote work assignments - via the Work for Impact platform - that adhere to fair‑wage, secure‑contract, and transparent labor standards.
Indicators:
B1) Educational Programs & Mentoring (Pathways)
Definition: Designing and delivering training, micro‑degree certifications, and personalized mentoring to enhance professional skills, self‑confidence, and career readiness among talent from underserved communities.
Indicators:
Outcomes (what change is experienced as a result of the outputs)
Outputs → Outcomes, Overview
A2) Ethical Outsourcing - Outcomes
B2) Educational Programs & Mentoring - Outcomes
This section defines the sampling approach for each data collection stream - surveys and interviews - across the two core activities and stakeholder groups. The goal is to ensure coverage, diversity, and sufficient depth to address the research questions.
To capture a full spectrum of stakeholder experiences, two tailored, census-style surveys will be administered: one to all active individual contractors on the WFI platform (N=322 as of April 2025) and one to all individuals who completed an educational program in the past 24 months (25 participants). In addition, a third survey will capture the experiences of participants in the ongoing cohort of the Pathways Program (10 participants, women only).
Each survey is customized to reflect its specific touchpoints - global talent recruitment and integration, or training and mentoring - and will be distributed via email with assurances of confidentiality and an estimated 10-minute completion time. A reminder notice will be sent during the two-week response period, and response rates will be monitored by region and program cohort to ensure balanced representation.
The surveys collect age, gender, and region to track participant diversity and identify any response gaps.
The survey questions and links can be found in Appendix A. All surveys are scheduled for distribution on May 8, 2025.
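As a practical illustration of how response rates could be tracked during the two-week fielding window, the sketch below compares a hypothetical sampling frame against a hypothetical Google Forms response export. The file names and column names are assumptions for illustration, not part of the actual survey setup.

```python
# Minimal sketch of monitoring survey response rates by region and cohort.
# File names and column names are assumptions for illustration only.
import pandas as pd

# Hypothetical sampling frame: one row per invited participant.
frame = pd.read_csv("survey_frame.csv")    # columns: participant_id, region, cohort
# Hypothetical response log exported from Google Forms.
responses = pd.read_csv("responses.csv")   # columns: participant_id, submitted_at

# Flag who has completed the survey so far.
frame["responded"] = frame["participant_id"].isin(responses["participant_id"])

# Response rate per region and cohort, to spot under-represented groups early.
rates = (
    frame.groupby(["region", "cohort"])["responded"]
    .agg(invited="count", completed="sum")
    .assign(response_rate=lambda d: (d["completed"] / d["invited"]).round(2))
)
print(rates.sort_values("response_rate"))
```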
Interview slots were allocated in direct proportion to each region’s share of open contracts on the WFI platform, ensuring that regions with higher volumes of active engagements - such as Asia Pacific and Latin America - received more interview opportunities while maintaining representation across all six defined regions (see the allocation sketch below).
Figure: Contractors per region
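The sketch below illustrates how such a proportional allocation could be computed with largest-remainder rounding. The per-region contract counts are hypothetical placeholders (only the overall total of 322 active contractors comes from the platform figure cited above), and the 15-slot total mirrors the approximate interview target discussed later in this chapter.

```python
# Minimal sketch of allocating interview slots in proportion to each region's
# share of open contracts, using largest-remainder rounding. The per-region
# counts are illustrative placeholders; only the total of 322 matches the
# platform figure cited in this chapter.
from math import floor

open_contracts = {
    "Asia Pacific": 120, "Latin America": 80, "Africa": 50,
    "Europe": 30, "Middle East": 25, "North America": 17,
}
total_slots = 15
total_contracts = sum(open_contracts.values())

# Exact proportional share per region, floored to whole slots.
exact = {region: total_slots * n / total_contracts for region, n in open_contracts.items()}
allocation = {region: floor(share) for region, share in exact.items()}

# Hand out the remaining slots to the regions with the largest fractional
# remainders, which with these counts also keeps every region represented.
remaining = total_slots - sum(allocation.values())
for region in sorted(exact, key=lambda r: exact[r] - floor(exact[r]), reverse=True)[:remaining]:
    allocation[region] += 1

print(allocation)
# {'Asia Pacific': 6, 'Latin America': 4, 'Africa': 2, 'Europe': 1,
#  'Middle East': 1, 'North America': 1}
```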
To enrich the quantitative findings from the survey, the researcher will adopt a two-fold qualitative strategy with the following samples:
Case Study Review:
A selection of 3–5 existing Pathways case studies - already compiled by Work for Impact - will be examined to identify common themes, success factors, and illustrative participant stories.
Targeted Alumni Interviews:
In consultation with WFI staff, 1–3 program graduates will be purposively selected to represent diverse experiences (e.g., different regions, varying levels of reported skill gains).
Each semi-structured interview (45–60 minutes) will be conducted via video call, with detailed note-taking to capture key insights on job readiness, confidence, fair pay negotiation, and community impact.
By combining rich, pre-documented narratives with fresh, focused conversations, this approach ensures both a broad and a deep understanding of how the educational programs translate into real-world outcomes.
A target of 50+ completed surveys provides sufficient data points to calculate reliable descriptive statistics (means, distributions, cross-tabulations by region or demographic group), while approximately 15 in-depth interviews are expected to achieve thematic saturation. Together, this mixed-methods sample size strikes an effective balance between breadth (quantitative representativeness) and depth (qualitative nuance), supporting credible, actionable findings.
This section details the procedures and sources used to gather the quantitative and qualitative data required to measure outputs and outcomes for both Ethical Outsourcing Activities and Educational Programs & Mentoring.
Quantitative data refers to numerical information that can be measured and analyzed statistically; in this study, these include indicators such as counts of job placements, average contract lengths, and Likert-scale survey responses. Qualitative data captures rich, descriptive insights - such as interview transcripts or open-ended survey comments - that reveal the “why” and “how” behind those numbers.
By combining both in a mixed methods approach, one gains the breadth of clear, comparable metrics (quantitative) and the depth of personal experiences and contextual understanding (qualitative). This dual strategy ensures that the evaluation of Work for Impact is both rigorously evidence-based and deeply grounded in stakeholders’ lived realities.
The researcher will gather quantitative indicators from two sources:
Exports from Work for Impact’s systems will provide structural measures such as the number of partner companies and job listings, placement counts by region and gender, and contract details (length, wage rates, template usage).
A structured questionnaire hosted on a secure platform (Google Forms) will collect self-reported and perception-based output indicators: demographic and engagement filters (region, gender, engagement type) and perception items (Likert scale, 1 = Strongly Disagree to 5 = Strongly Agree) on accessibility, fairness, transparency, empowerment, autonomy, and quality of life.
The survey is designed for clarity and brevity, targeting a completion time of 10–12 minutes.
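For illustration, the sketch below shows how a Google Forms export could be coded onto the 1–5 Likert scale and combined into simple composite scores per construct. The column names and the item-to-construct grouping are assumptions, since the actual questionnaire items are those listed in Appendix A.

```python
# Minimal sketch of coding Likert responses and building per-construct
# composite scores. The export file, column names, and item groupings are
# hypothetical placeholders for the real questionnaire in Appendix A.
import pandas as pd

survey = pd.read_csv("contractor_survey_export.csv")  # hypothetical Google Forms export

# Map Likert labels to numeric codes (1 = Strongly Disagree ... 5 = Strongly Agree).
likert = {
    "Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
    "Agree": 4, "Strongly Agree": 5,
}

# Hypothetical grouping of perception items into the framework's constructs.
constructs = {
    "fairness": ["fair_pay_item", "transparent_terms_item"],
    "empowerment": ["confidence_item", "autonomy_item"],
}

for construct, items in constructs.items():
    coded = survey[items].replace(likert)               # labels -> 1-5 codes
    survey[f"{construct}_score"] = coded.mean(axis=1)   # simple mean composite

print(survey[["region", "fairness_score", "empowerment_score"]].head())
```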
The researcher will use semi-structured interview guides customized for each stakeholder group (contractors, clients, internal staff) to explore thematic areas such as ethical outsourcing experiences, contract clarity, CSR impacts, program quality, skill gains, and community effects. Each guide will include open-ended prompts with follow-up questions (e.g., “Can you describe how your contract terms influenced your sense of security?”). Interviews will last 45–60 minutes, conducted via video call, with the researcher taking detailed notes to capture key themes and representative comments.
After analyzing interview and survey findings, the researcher will facilitate one virtual workshop to validate and, if necessary, refine the Theory of Change based on stakeholder feedback.
The quantitative data will be summarized using basic descriptives - means and percentages - for each output and outcome indicator defined in the framework. The researcher will compare key metrics (e.g., empowerment, job stability, quality of life) across regions to assess and validate the impact.
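A minimal sketch of this descriptive comparison is shown below, assuming a coded score file and hypothetical column names; the actual indicator set follows the framework defined earlier in this chapter.

```python
# Minimal sketch of comparing key outcome indicators across regions using
# basic descriptives. The input file and column names are assumptions.
import pandas as pd

scores = pd.read_csv("coded_survey_scores.csv")  # assumed columns: region plus
indicator_cols = [                               # one 1-5 composite per indicator
    "empowerment_score",
    "job_stability_score",
    "quality_of_life_score",
]

# Mean composite score per indicator and region.
regional_means = scores.groupby("region")[indicator_cols].mean().round(2)

# Percentage of respondents scoring 4 or higher ("agree" territory) per region.
agreement_pct = (
    scores[indicator_cols].ge(4)
    .assign(region=scores["region"])
    .groupby("region").mean()
    .mul(100).round(1)
)

print(regional_means)
print(agreement_pct)
```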
Qualitative interview and workshop notes will undergo thematic analysis. The researcher will code for themes aligned to the Theory of Change (e.g., empowerment, skill growth, community impact), validate these themes through a brief peer review with WFI staff, and link them back to the corresponding indicators.
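As an illustration of how the coded material could be linked back to the framework, the sketch below tallies coded interview segments against a hypothetical set of Theory of Change themes; the file, columns, and theme labels are placeholders for the researcher’s actual coding scheme.

```python
# Minimal sketch of tallying coded interview segments against Theory of Change
# themes. The coding file, its columns, and the theme labels are hypothetical.
import pandas as pd

# One row per coded segment: which interview it came from and the assigned theme.
codes = pd.read_csv("interview_codes.csv")  # columns: interview_id, stakeholder_group, theme

toc_themes = ["empowerment", "skill_growth", "community_impact", "fair_pay"]

# Keep only codes that map onto the Theory of Change themes.
mapped = codes[codes["theme"].isin(toc_themes)]

# Count how many distinct interviews mention each theme, by stakeholder group.
theme_matrix = (
    mapped.drop_duplicates(["interview_id", "theme"])
    .pivot_table(index="theme", columns="stakeholder_group",
                 values="interview_id", aggfunc="count", fill_value=0)
)
print(theme_matrix)
```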
Chapter 2 detailed the convergent-parallel mixed-methods design and baseline data-collection strategy. It defined key outputs and outcomes, outlined quantitative and qualitative indicators, described census-style surveys of contractors (N = 322) and Pathways alumni, and explained semi-structured interviews with contractors and clients. Procedures for data export, survey distribution, and interview logistics were specified, along with a streamlined plan for descriptive statistical and thematic analyses.
Complete Data Collection: Finalize surveys, confirm interview transcripts, and pull the final data exports from WFI’s platform statistics.
Conduct Analysis: Summarize survey data with basic statistics (averages, percentages) and review interview notes to identify recurring insights tied to each outcome indicator.
Validate Theory of Change: Prepare and facilitate a stakeholder workshop to review preliminary results and adjust the ToC model.
Draft Findings: Prepare Chapter 3, mapping evidence from outputs to outcomes, and circulate the draft for internal feedback.
Finalize Report: Integrate revisions and hand off to design for publication.