Pediatric mental health in the U.S., including Michigan’s Upper Peninsula, is at a crisis point, with nearly one in five youths facing mental health challenges and 40% of high schoolers reporting persistent sadness or hopelessness. Schools are overwhelmed by surging demand for counseling, yet many lack the staff to meet these needs, leaving too many children without timely support.
Michigan Tech's BDS Lab, under the guidance of Dr. Hembroff, is creating a direct response to this urgent need, harnessing AI-driven, personalized wellness check-ins and real-time resource matching to deliver just-in-time support. Our aim is to empower students and families while equipping schools and community partners with actionable insights to make informed decisions.
By embedding preventive, evidence-based interventions into students’ daily lives and ensuring strict privacy compliance, we are working with K-12 schools, families, community-based organizations (CBOs), and others to develop a compassionate, secure, and community-centered bridge to better mental health, wellness, academic success, and brighter futures for all.
This is an 18-month project, beginning October 1, 2025 and ending March 31, 2027.
• Months 1–3: Stakeholder interviews; user stories; privacy/security specs; Co-Design Workshop #1 (end of M3: wireframes, workflows).
• Months 4–6: Continued development of the mHealth and web apps (front end) supporting students, families, and mental health staff; FHIR backend; AI engine; embed offline + voice; monthly sprint reviews.
• Months 7–8: Pilot #1 in three schools; staff/student/family onboarding; usability testing with ~25–50 students/families & 10–15 staff.
• Months 9–12: Refine risk algorithms; streamline referrals; expand resource library.
• Months 13–15: Pilot #2 (updated app); usability testing with ~75–100 students/families & 20–25 staff; Co-Design Workshop #2.
• Months 16–18: Full evaluation; partner debriefs; documentation; scale-up plan.
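To make the Months 4–6 FHIR work concrete, a daily wellness check-in could be stored as a FHIR R4 Observation resource. The sketch below is illustrative only: the coding system, code, and student reference are hypothetical placeholders, not the project's actual identifiers.

```python
# Minimal sketch: a daily wellness check-in as a FHIR R4 Observation.
# The "example.org" coding system and student ID are hypothetical.
import json
from datetime import date

def checkin_observation(student_ref: str, mood_score: int) -> dict:
    """Build a FHIR R4 Observation dict for a 1-5 mood check-in."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/observation-category",
                "code": "survey",
            }]
        }],
        "code": {
            # Hypothetical local code for the daily mood item.
            "coding": [{"system": "http://example.org/checkin", "code": "daily-mood"}],
            "text": "Daily wellness check-in: mood (1-5)",
        },
        "subject": {"reference": student_ref},
        "effectiveDateTime": date.today().isoformat(),
        "valueInteger": mood_score,
    }

obs = checkin_observation("Patient/student-123", 4)
print(json.dumps(obs, indent=2))
```

Storing check-ins in a standard resource shape is what would let the backend exchange data with school or clinical partner systems later without reformatting.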
• Standing advisory bodies: Student Advisory Group (n=12), Parent/Caregiver Advisory Group (n=7), CBO & School/Health Partner Advisory Group (n=7).
• Workshops: end of Month 3 and Month 15 to assess live demos, journey mapping, accessibility, and feedback.
• Ongoing cadence: advisory meets Months 4, 8, 12, 16; transparent updates via project website; train-the-trainer to sustain and scale.
• Michigan Tech / BDS Lab (Lead): PM, research & evaluation, software dev, FHIR infrastructure, security/privacy oversight, training, dev/testing coordination.
• Pilot Schools (districts in Houghton & Keweenaw Counties): deployment; usage by students, mental health support staff, teachers, and other staff; referral workflows; support for usability testing.
• UPHCS: convening partners; clinical/technical guidance; validation; scale-up planning.
• CBOs: service catalogs; resource metadata; barrier analysis; validate referral metrics; co-author planning reports.
• Regional philanthropy (e.g., SHF, Copper Shores): engage in workshops/reviews to guide scale-up & sustainability.
• Usability & Effort (CES, 1–7)
Instrument: Customer Effort Score collected after key workflows (daily check-in, resource lookup, referral).
Targets: median CES ≥ 6.0 in Pilot #1; maintain or improve in Pilot #2; cut by school/grade and trended monthly.
• Likelihood to Recommend (NPS, 0–10)
Instrument: Net Promoter Score using the standard 0–10 item.
Targets: NPS ≥ 10 in Pilot #1; ≥ 15 by Pilot #2; report detractor/passive/promoter mix with “why” comments.
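The CES and NPS targets above can be computed from pilot survey responses in a straightforward way; the sketch below shows the standard formulas (the sample scores are invented for illustration).

```python
# Illustrative computation of the CES and NPS metrics described above.
from statistics import median

def ces_median(scores):
    """Median Customer Effort Score from 1-7 responses."""
    return median(scores)

def nps(scores):
    """Net Promoter Score from 0-10 responses:
    % promoters (9-10) minus % detractors (0-6), as an integer."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

ces = [6, 7, 5, 6, 6]            # post-workflow effort ratings (invented)
recommend = [9, 10, 7, 8, 6, 9]  # likelihood-to-recommend ratings (invented)
print(ces_median(ces))  # 6 -> meets the >= 6.0 Pilot #1 target
print(nps(recommend))   # 3 promoters, 1 detractor, n=6 -> 33
```

Reporting the detractor/passive/promoter mix alongside the single NPS number, as planned, guards against the score masking a polarized distribution.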
• Acceptance & Use
Target: ≥ 70% “Yes” by end of Pilot #2 (tracked by school/grade).
• Qualitative UX & Fit (semi-structured interviews)
Method: paired interviews with the four-eyes principle (interviewer + observer).
Analysis: inductive thematic analysis (open → axial → selective coding) to surface usability themes, barriers, and equity concerns; backlog ranked by severity/frequency.
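The severity/frequency backlog ranking above can be operationalized simply; the sketch below is illustrative only, with invented issue names and weights.

```python
# Sketch of ranking the UX backlog by severity x frequency, highest first.
# Issues, severities (1-5), and frequencies are invented examples.
issues = [
    {"issue": "confusing referral wording", "severity": 3, "frequency": 8},
    {"issue": "check-in button hard to find", "severity": 2, "frequency": 10},
    {"issue": "crash on resource lookup", "severity": 5, "frequency": 2},
]
ranked = sorted(issues, key=lambda i: i["severity"] * i["frequency"], reverse=True)
print([i["issue"] for i in ranked])
```

A simple multiplicative score like this is a starting point; in practice a critical-severity floor (e.g., any severity-5 issue jumps the queue) may be layered on top, consistent with the "100% of critical issues before Month 18" target.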
• Feature Utility & Gaps
Track “most liked” (e.g., Resources, dashboard) and “missing” features (e.g., customization, journaling, affirmations); run a UI consistency audit (terminology/navigation) and report time-to-fix and regression checks. Target: close ≥ 80% priority UX issues before Pilot #2; 100% of critical issues before Month 18.
• Accessibility & Equity (device access)
Survey smartphone/smartwatch access per school; ensure core features work without a wearable. Targets: ≥ 95% of students can use all core features without a smartwatch; keep watch-dependent features optional.
• Privacy & Data-Sharing Comfort
Measure comfort with school access to mental-health data and preferences for selective sharing; incorporate dynamic consent UX. Targets: ≥ 80% report understanding of how data are used; majority report adequate control via consent settings.
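The selective-sharing behavior described above amounts to a default-deny gate on each data category; the sketch below is a hypothetical illustration (category names and the consent structure are assumptions, not the app's actual design).

```python
# Hypothetical sketch: a student's consent settings gate which data
# categories school staff can view; anything not opted in stays private.
def visible_to_school(consent: dict, category: str) -> bool:
    """Return True only when the student has opted that category in."""
    return consent.get(category, False)  # default deny

consent = {"checkins": True, "referrals": True, "journal": False}
print(visible_to_school(consent, "checkins"))  # True: opted in
print(visible_to_school(consent, "journal"))   # False: journaling stays private
print(visible_to_school({}, "referrals"))      # False: unset defaults to private
```

Defaulting unset categories to private keeps the dynamic-consent UX honest: sharing only ever expands through an explicit student choice.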
• Reporting cadence
Quarterly dashboards summarizing CES, NPS, acceptance, key themes, accessibility, and privacy metrics; Month-9 mid-project report; Month-18 final report with scale-up recommendations.