Pilot Mental Health

Understanding Pilots’ Perceptions of AI-Mediated Mental Health Support

Commercial pilots face significant barriers to seeking mental health support due to regulatory risk, stigma, and career implications. This mixed-methods study explored how aviation professionals (pilots and air traffic controllers) evaluate trust, privacy, and risk in AI-mediated mental health tools, informing design principles for trustworthy AI in safety-critical environments.

Project/Client

Southwest Airlines | Delta Airlines | International Federation of Air Traffic Controllers’ Associations (IFATCA) | University of Washington | University of Melbourne

My Role & Team

UX Researcher & Project Lead of a 17-person interdisciplinary research team including UX researchers, human factors researchers, aviation experts, and graduate researchers.

Methods

Mixed-Methods Research Design | Semi-Structured Interviews | Large-Scale Survey Design (6,000+ responses) | Qualitative Thematic Analysis | Quantitative Data Analysis | Statistical Testing (Regression, ANOVA, t-tests) | Data Visualization

Tools Used

Quantitative: Qualtrics, SPSS, Tableau
Qualitative: NVivo, Miro, Figma
AI Tools: ChatGPT, Claude

The Impact

6000+

global survey responses analyzed

13

pilot interviews conducted

2

publications, incl. ACM CHI 2026

A Project That Defined Trust in AI-Mediated Mental Health Support in Aviation

Mixed-methods research with commercial pilots helped us identify critical trust and disclosure patterns, shaping how aviation professionals approach mental health support. The findings revealed significant barriers to seeking help and the need for aviation-specific AI-enabled support systems designed around privacy, safety, and professional risk. The work informed design principles for trustworthy AI in safety-critical contexts and was published at CHI 2026.

"Pilots need somewhere they can talk things through without it immediately becoming a reportable issue."

Commercial Airline Pilot

Research accepted to ACM CHI 2026, a leading conference in Human-Computer Interaction.



The Spark

A Pattern Too Clear to Ignore

While conducting a literature review in a Directed Research Group at the University of Washington, I began examining mental health in aviation. Study after study pointed to the same pattern: aviation professionals experience significant psychological strain, yet many avoid seeking support due to professional and regulatory consequences.

The deeper I read, the clearer the gap became. The industry often attributes the problem to stigma, but the research suggested something more complex, revealing concerns around trust, disclosure, and career risk.

Why do aviation professionals avoid seeking mental health support when they need it most?

I led a mixed-methods study analyzing 6,000+ survey responses from pilots and air traffic controllers worldwide, revealing patterns of healthcare avoidance, withheld disclosure, and reliance on informal advice. To understand the reasons behind these behaviors, I conducted semi-structured interviews with commercial pilots to explore how trust, regulation, and professional culture shape mental health disclosure.

“Concerns about certification and career consequences often discourage aviation professionals from seeking mental health support.”

Chawla et al., CHI 2016


How do radiologists make decisions when the signs aren’t obvious?

When someone close to me was diagnosed with breast cancer, I saw how much uncertainty surrounds the diagnostic process. It made me curious about how design and machine learning could bring more clarity and confidence to real-world diagnosis. That experience stayed with me, and I got the chance to pick the project back up with the University of Washington to explore those possibilities further.

"Misdiagnosis and overdiagnosis remain key challenges in breast cancer imaging, where conventional mammography may fail to detect lesions."

Thomassin-Naggara et al. (2024)


Breakups can feel like grief – sudden, isolating, and overwhelming.

And yet, the digital tools available often emphasize little more than ‘just move on.’ We asked: what would it look like to treat heartbreak not just as pain to ignore, but as an experience to grow from?

"85% of US adults report experiencing a romantic breakup, with 1/3 of those individuals experiencing clinically significant depressive symptoms"

Verhallen et al. (2019)

The Challenges

A SYSTEM DESIGNED AROUND DISCLOSURE RISK

Mental health support in aviation exists within a regulatory environment where disclosure can carry professional consequences. While support resources are available, concerns about certification, career impact, and privacy often shape whether aviation professionals feel safe seeking help.

What We Heard

  • “If something goes on your medical record, it can follow you for the rest of your career.”
  • “Most pilots talk to other pilots and not even bother to even talk to a doctor.”
  • Aviation professionals frequently described weighing mental health support against potential certification consequences.

How might we enable AI-mediated mental health support to feel safe and trustworthy for aviation professionals?


How might we create tools that make cancer diagnostic data more interpretable, transparent, and actionable for radiologists?


How might we create a digital experience that adapts to the psychological realities of breakup recovery, including attachment styles, identity loss, and emotional dysregulation, while remaining clinically grounded and deeply human?

The Research Process

Secondary Research

Our purpose was to develop a foundational understanding of pilot mental health, reporting barriers, and emerging AI-supported interventions in aviation.

What We Asked
  • What are the major mental health challenges faced by pilots?
  • How do regulatory reporting systems influence disclosure and help-seeking?
  • What role could AI-supported tools play in providing mental health support in aviation?
What We Found
  • Pilots face significant stigma and career risk concerns when reporting mental health issues within existing certification systems.
  • Mental health support often exists through peer support programs and unions, operating alongside formal medical systems.
  • Emerging research suggests AI-mediated support tools may offer private, accessible pathways for reflection and early intervention.

Findings informed a peer-reviewed publication at the AHFE Conference.

Stakeholder Interviews

To understand how mental health support operates within aviation, I conducted stakeholder interviews with aviation professionals and domain experts. These conversations helped map the regulatory environment, available support resources, and how pilots navigate reporting requirements.

Stakeholders noted that Aviation Medical Examiners (AMEs) have broad authority to question and evaluate pilots during medical certification, which can make pilots reluctant to seek advice or disclose emerging mental health concerns.

How does the aviation mental health system actually work?

Mental Health Disclosure Is Tied to Medical Certification
Disclosures Oversimplified Through Standardized Forms
Pilots Avoid AME Advice Due to Certification Risk

Mixed-Methods Surveys

To understand broader perceptions of mental health in aviation, we conducted a large-scale mixed-methods survey combining Likert-scale questions with open-ended responses. The survey was distributed across aviation communities and public channels, resulting in 6,000+ responses that provided insight into how people perceive mental health support in safety-critical professions.

98%

view mental health as a concern within the industry

76%

don't trust regulators' mental health policies (FAA, EASA, etc.)

63%

wanted to seek support but felt they couldn't

13%

actually use peer-support and union-based resources
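Headline figures like these typically come from collapsing Likert-scale responses into simple proportions. A minimal sketch, with invented item wording and toy data (the real survey instrument is not reproduced here):

```python
# Hypothetical sketch: deriving headline percentages by collapsing 5-point
# Likert responses. Item wording and data are invented for illustration.

def pct_agree(responses, threshold=4):
    """Share (%) of respondents answering at or above `threshold`."""
    hits = sum(1 for r in responses if r >= threshold)
    return round(100 * hits / len(responses))

# Toy responses to one item, e.g. "Mental health is a concern in the industry"
concern = [5, 5, 4, 5, 4, 4, 5, 3, 5, 4]
print(f"{pct_agree(concern)}% agree")  # 9 of the 10 toy responses are >= 4
```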

Semi-Structured Interviews

To better understand how pilots make mental health decisions within aviation systems, we conducted 13 semi-structured interviews with pilots across different experience levels. These conversations helped uncover the motivations and concerns shaping decisions around care-seeking, disclosure, and trust.

8 key themes.
Trust, risk, and the role of AI in pilot mental health.

What Pilots Told Us

We asked pilots to respond to vignette-based scenarios designed to explore how they navigate mental health challenges in aviation and whether AI-mediated support could realistically help in those situations.

  • Pilots weigh certification risk before every support decision, often choosing silence over uncertainty.
  • AI is most acceptable as a private “thinking space” for sense-making before speaking to another person.
  • Trust depends less on the tool itself and more on who controls the data behind it.

Secondary Research

We started by looking into existing research to understand why diagnosing breast cancer is often so complex. Radiologists interpret features like shape, margin, and density differently, and even small changes can lead to different outcomes. This is especially true in borderline cases or when images aren’t clear. These insights helped us focus on where and why the problem exists and to design tools that support clinical judgment.

Human error factor
  • False negatives in mammography range from 12–30%, depending on case complexity and image quality
  • Up to 50% of cancers in dense breasts may go undetected without additional imaging
  • Benign findings can be flagged as dangerous, leading to unnecessary biopsies and anxiety
Gaps in existing tools
  • AI models often operate as black boxes: most don’t explain why a case is flagged, limiting clinical trust
  • Generic outputs like “malignant: 84%” may lack clinical value and don’t answer “why?”
  • Tools often miss real-world workflows – a gap in human-centered design
Opportunities for innovation
  • Tools to show "why" a tumor is flagged by surfacing the specific features influencing the decision
  • Radiologists want tools that support clinical judgment, not to automate the process
  • Outputs should adapt to case-specific contexts like dense breast tissue or borderline features

Machine Learning Model

The machine learning model was built early on, during my time at the University of Nottingham, as a way to explore whether tumor characteristics could predict malignancy and whether those patterns aligned with how radiologists make decisions. At that stage, I didn’t know this would evolve into a design project, but training the model helped uncover which features were most influential. Those features later became critical input for designing an interface that could surface meaningful, case-specific insights and support clinical reasoning.
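The write-up doesn’t name the dataset or model, so the sketch below uses scikit-learn’s built-in Wisconsin Diagnostic Breast Cancer dataset (which happens to include the mean/worst feature variants discussed later) and a logistic regression as stand-ins, showing one way influential features can be surfaced:

```python
# Sketch only: dataset and model are stand-ins, not the project's actual setup.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = load_breast_cancer()
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(data.data, data.target)

# On standardized inputs, coefficient magnitude approximates feature influence.
coefs = model.named_steps["logisticregression"].coef_[0]
ranked = sorted(zip(data.feature_names, coefs), key=lambda t: -abs(t[1]))
for name, weight in ranked[:5]:
    print(f"{name:25s} {weight:+.2f}")
```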

quantitative data exploration

After deciding to turn this into a design project, I revisited the model through deeper quantitative analysis to unpack how its predictions worked in detail. I wanted to explore which features to emphasize, how uncertainty showed up in the data, and where edge cases might cause confusion. These visualizations helped shape the design direction, especially around what to prioritize, how to handle ambiguity, and how to build trust through clarity and transparency.
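One way “edge cases” like these can be surfaced is by flagging predictions whose probability falls near the decision boundary. A hypothetical sketch; the 0.40–0.60 band and the probabilities below are assumptions, not values from the actual model:

```python
# Hypothetical sketch: triaging predictions by distance from the decision
# boundary. Thresholds and probabilities are invented for illustration.

def triage(prob, low=0.40, high=0.60):
    """Map a predicted malignancy probability to a review category."""
    if low <= prob <= high:
        return "borderline - review"
    return "malignant" if prob > high else "benign"

for p in [0.03, 0.47, 0.58, 0.91]:
    print(f"p={p:.2f} -> {triage(p)}")
```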

Secondary & Market Research

I began with secondary and market research to understand how common breakups are, how deeply they affect people, and whether there were any patterns in how we experience or cope with them. This helped identify gaps in how breakups are discussed and in the tools currently available for recovery.

Breakups Mirror Grief & Identity Loss
  • Breakups can trigger emotional distress similar to grief and trauma
  • Loss of relationship disrupts self-concept and emotional stability
  • Rumination is especially strong when breakups are unexpected or lack closure
Recovery Is Deeply Personal
  • Anxiously attached individuals experience higher emotional distress and are more likely to ruminate or seek reconnection
  • Personality traits like introversion, extroversion, and ambiversion affect how individuals seek support or process emotion
  • Reflective processes support emotional recovery and long-term growth
The Market Is Large — and Underserved
  • 75 million US adults will go through a breakup at least once in their lifetime
  • The mental health app market is approximately $3.2B today and is projected to grow to $6.5B by 2033
  • Breakup recovery is a hidden demand in the mental health market that remains largely unaddressed by existing digital solutions.

Competitive Analysis

I conducted a competitive analysis to understand how existing breakup and emotional recovery tools approach the problem and where gaps exist. Many apps relied on generic, self-guided content that does not adapt to users’ emotional needs. Evaluating usability, tone, and engagement patterns helped identify opportunities for more personalized and emotionally supportive experiences.

There Is a Lack of Breakup-Specific Content
  • Most apps focus on general wellness, not romantic breakups
  • When breakups are mentioned, content is often static or superficial (not personable)
There Is Limited Personalization
  • Few apps adapt content based on user needs, context, or progress
  • Personalization, when present, is often surface-level (e.g. goal or mood selection)
It's Primarily a Self-Guided Experience
  • Support tends to be indirect, through content tone or structure and not through interaction
  • Few offer a sense of being emotionally accompanied or guided over time with real-time growth progress

Surveys

After the competitive analysis and literature review, I designed a survey to understand what people actually experience after a breakup at a larger scale. This helped validate early patterns, capture insights from a diverse group of participants, and identify key segments to inform future design decisions.

What we asked:
  • What was the most emotionally challenging part of your breakup experience?
  • What types of support (if any) did you seek out after your breakup?
  • What kind of tool or resource do you wish you had during your healing process?

The findings validated a real, unmet need for breakup support that feels personal, responsive, and human – shaping the foundation of our product vision.

What we found:
  • Many felt overwhelmed and emotionally alone during the healing process
  • Generic advice and self-help content felt impersonal or unhelpful
  • Respondents wanted structured guidance that adapts to their emotional state
  • Validation and relatability were more important than clinical tone

Interviews

After the survey, we conducted 12 follow-up interviews with participants who had recently experienced a breakup. These conversations helped uncover deeper emotional needs and why existing tools often fall short in supporting people through breakup recovery.

8 key findings. Raw, emotional, high impact.

They revealed why breakups disrupt everyday life and emotional stability, and the need for compassion, community, and support. These insights shaped Repose into a focused recovery tool designed to offer structure, emotional guidance, and self-directed healing.

Impactful Themes That Emerged
Support Often Feels One-Dimensional
Healing Is Nonlinear and Unpredictable
Feeling Seen Matters More Than Advice

Insights & Triangulation

Triangulated Research Matrix

Insights were validated across secondary research, large-scale surveys, and semi-structured interviews. Triangulating findings helped identify consistent patterns shaping how pilots evaluate mental health concerns and navigate support decisions.

3 Methods. 5 Key Insights.
Cross-Validated Findings.

Behavioral Decision Model

Interviews revealed that pilots rarely move linearly through support decisions. Instead, certification risk, system distrust, and uncertainty about support pathways create loops that delay help-seeking.

“It’s not that pilots don’t want help. It’s that the system makes you think twice about asking.”

Triangulated Research-to-Design Matrix

Before designing the interface, I needed to understand how diagnostic decisions break down – in models, in data, and in clinical workflows. I triangulated three methods to create a research-to-design matrix that helped validate patterns across sources and identify high-confidence insights. The matrix surfaced ten core findings that revealed where errors happen, what users actually need, and how AI predictions can be made more interpretable. These insights became the foundation for every UI decision that followed.

3 methods; 10 findings; 4 that deeply shaped the interface.

By combining model behavior, pattern analysis, and literature on breast cancer diagnostic processes, I mapped out the most impactful pain points: unreliable feature weighting, lack of transparency, edge cases, and cognitive overload. Each design decision below directly addresses these breakdowns with targeted interface responses.

Enhanced Diagnostic Precision with Mean & Worst Metrics

Radiologists don’t just look at a tumor’s ‘average’ size; they also zero in on its single most abnormal region. Missing that one extreme spot can lead to under-diagnosis.

"Existing breast imaging studies reported the entropy, mean, minimum, and maximum as important features."

Lee et al. (2020)

Mean
  • Calculated by sampling a feature (e.g. radius, area, texture) at dozens or hundreds of points on the same tumor and taking the average. Reflects the lesion’s typical size, shape, or heterogeneity.
  • Without the mean, radiologists lose important context: every lesion has a baseline appearance, and the model uses that baseline to assess whether the lesion is benign or malignant.
Worst
  • From the very same measurements, radiologists select the largest (or near-largest) value. Highlights the single most abnormal “hot spot” that may warrant targeted biopsy.
  • Worst metric measurements prevent dangerous outliers from hiding in the average, ensuring that even small but aggressive regions of the tumor are flagged for further clinical attention.
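The mean/worst distinction can be made concrete with a toy calculation (all values invented): a single aggressive “hot spot” barely moves the mean but dominates the worst value.

```python
# Toy illustration of mean vs. worst feature variants (values invented):
# radius measured at several points on one tumor contour, with one hot spot.
radius_samples = [14.1, 13.8, 14.3, 14.0, 13.9, 21.7, 14.2, 13.7]

mean_radius = sum(radius_samples) / len(radius_samples)
worst_radius = max(radius_samples)  # the single most abnormal measurement

print(f"mean radius:  {mean_radius:.2f}")   # hot spot barely shifts the mean
print(f"worst radius: {worst_radius:.2f}")  # hot spot fully visible
```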

Mapping the Emotional Journey of Breakup Recovery

Breakup recovery is rarely linear. We synthesized insights from research, surveys, and interviews to map how emotions and support needs shift over time, helping identify moments when people feel most overwhelmed and what support Repose should provide at each stage.

Jobs-to-Be-Done Framework

Many people described feeling stuck after a breakup, seeking clarity, reassurance, and a sense of progress. Using the Jobs-to-be-Done framework helped translate these needs into clear goals that could guide product decisions, shaping Repose’s value proposition and feature priorities.

Feeling Seen
“When I feel like I wasn’t enough, I want to hear from others who’ve been through it, so I don’t feel broken and alone.”
Stop Rumination
“When I keep replaying the breakup, I want to find clarity to make sense of what happened, so I can stop fixating and spiraling.”
Regain Emotional Safety
“When I’m overwhelmed and panicked, I want to feel emotional relief by grounding myself, so I can get through the day.”

MVP Strategy

Design Implications & Iterations

I co-built the interface using Streamlit and iterated directly in code (using vibe-coding), guided by user needs and model behavior. Streamlit allowed me to maintain full control over the model logic while rapidly prototyping interfaces that stayed true to the algorithm’s outputs. Unlike visual design tools, Streamlit let me directly connect model predictions with interface elements, making it easier to test ideas in real time, adjust how probabilities were framed, surface uncertainty, and experiment with interactive features like sliders, graphs, and confidence estimates.

V1 - Basic Inputs, No Guidance

  • The first version was a straightforward input form where users manually entered four tumor metrics to generate a prediction. While functional, it offered no interpretive support, making the experience feel opaque and limiting users’ ability to trust or make sense of the output.

V2 - Sliders and Contextual Info

  • The second version introduced interactive sliders, population averages, and brief metric descriptions to improve usability and reduce friction. This helped users understand what they were adjusting, but the model’s reasoning was still unclear and users couldn’t easily connect inputs to outcomes.

V3 - Transparent and Decision-Supportive

  • The final version focused on interpretability and trust. It added confidence labels, similar-case comparisons, and a feature-level visualization showing how each metric influenced the result. These changes transformed the tool into a decision-support interface aligned with radiologists’ needs and mental models of breast cancer diagnostics.
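The V3 ideas of confidence labels and similar-case comparisons can be sketched outside the Streamlit UI; this is an illustrative approximation (all names, thresholds, and case data invented), not the shipped logic:

```python
# Sketch of V3's decision-support pieces: a confidence label derived from the
# predicted probability, and a nearest-case lookup for "similar prior cases".
import math

def confidence_label(prob):
    margin = abs(prob - 0.5)  # distance from the decision boundary
    if margin > 0.4:
        return "high confidence"
    if margin > 0.2:
        return "moderate confidence"
    return "low confidence - consider additional imaging"

def similar_cases(query, cases, k=2):
    """Return the k prior cases closest to `query` in feature space."""
    return sorted(cases, key=lambda c: math.dist(c["features"], query))[:k]

prior = [
    {"id": "A12", "features": (14.0, 0.08), "label": "benign"},
    {"id": "B07", "features": (21.5, 0.21), "label": "malignant"},
    {"id": "C33", "features": (13.6, 0.07), "label": "benign"},
]
print(confidence_label(0.93))
for c in similar_cases((13.9, 0.08), prior):
    print(c["id"], c["label"])
```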


Business Model Canvas & Customer Segmentation

We created a Business Model Canvas and mapped key customer segments to define Repose’s value, target audiences, and potential paths to reach and support them. This helped ensure the MVP aligned with both user needs and business viability.

Design Principles

Design Directions for AI-Mediated Mental Health Support

Interview insights informed design principles emphasizing privacy, low-risk engagement, and reflection before disclosure, which were shared with stakeholders developing AI mental health tools for aviation.

MVP Designs & UI

Repose: Overview

To ensure Repose was viable and grounded in real user needs, we created a Business Model Canvas and mapped out our customer segments early on. This helped us define our core value, identify key audiences, and clarify how we would reach and support them. These insights gave us the confidence to move forward with a focused MVP, ensuring our solution remained aligned with both user needs and business goals.

Repose: Onboarding

A step-by-step welcome flow that gathers your companion, attachment style, and personality to tailor every lesson and reminder to your needs.

What It Does

Greets users with a warm introduction, highlights core benefits, and invites them to begin their healing journey

Design Decisions

  • Kept to a minimum so newcomers aren’t overwhelmed when starting their healing journey
  • Single, centrally placed “Get Started” CTA for clarity
  • Soft purple palette to feel soothing and hopeful

What It Does

Prompts users to choose a plant or animal companion, immediately tailoring the journey and building an emotional bond

Design Decisions

  • Tapping cards is more playful than dropdowns for quick selection
  • Emoji-style art to build an emotional bond quickly
  • Grid of 8, so choice feels substantial without scrolling

What It Does

Asks about your typical relationship pattern to customize lessons and exercises around attachment needs

Design Decisions

  • Cards instead of radio bullets to feel more tactile
  • Short labels and micro-copy to reduce cognitive load
  • “Not sure” option so no one feels forced to answer

What It Does

Captures personality/social preference (introvert, extrovert, ambivert) to adjust how and when reminders and content are delivered

Design Decisions

  • Three-option card layout matches mental model of introvert/extrovert/ambivert
  • Highlight on tap to reinforce selection

Repose: Core Features & Navigation

A bottom tab bar granting instant access to your daily tools, guided micro-lessons, community discussions, and profile settings for a seamless healing experience.

What It Does

Central dashboard for daily coping tools all in one glance – breathing, affirmations, journaling, and habit tracking

Design Decisions

  • Bottom-nav icon labelled “Home” for instant recognition
  • Four main action cards for rapid access to daily tools
  • Buddy avatar at top to remind users of their companionship

What It Does

Bite-sized, personalized audio lessons organized by topic and healing stage, with clear duration and play controls

Design Decisions

  • Horizontal pill navigation lets users switch topics without leaving the screen. Topics are ordered based on onboarding selections to deliver personalized content.
  • Play buttons and duration so users know what they’re signing up for

What It Does

Anonymous forum where users can browse or join discussions on common breakup challenges, share stories, and find peer support

Design Decisions

  • List-style topics to feel familiar (like forum threads)
  • “Join conversation” CTAs prompt active engagement
  • Anonymity note at bottom to reassure privacy

What It Does

User settings hub: swap your healing buddy, set daily check-in reminders, and review personal data to keep the experience tailored to you

Design Decisions

  • Swap-out buddy at top to reinforce personalization
  • Toggle & time picker for reminders so it’s clear and easy to adjust
  • Export data CTA to easily export journal entry data from the app

Repose: Grounding Hub

A central dashboard of mindfulness exercises – breathing, affirmations, habit tracking, and journaling – designed to help you stay present and build healthy routines.

What It Does

A guided, timed breathing exercise with a simple countdown to help users calm their nervous system and reduce anxiety.

Design Decisions

  • Countdown in large type to focus attention
  • Single “Start” button so there’s no ambiguity
  • Card layout matches other Hub tools for consistency

What It Does

Series of positive, self-compassion statements presented on screen or via voice to counter negative thoughts and boost mindset.

Design Decisions

  • Microcopy in quotation marks to signal “self talk”
  • Secondary “Play voice” CTA so users can choose reading vs listening

What It Does

Daily checklist for small, customizable actions (e.g. drink water, take a walk) that encourages building consistency through streaks.

Design Decisions

  • Inline “Add new habit” button at bottom so building a list feels natural
  • “X” icons on each item for quick removal, followed by a confirmation dialog
  • Streak language (“Start your streak!”) to gamify

What It Does

Dual-mode journaling (free-form mood journal or prompt-based session) for users to reflect on feelings, track patterns, and gain insight.

Design Decisions

  • Two tabs (New Entry / History) to separate creation from reflection
  • Supported both mood-based free journaling and guided-prompt journaling so users can choose their preferred reflection style

Next Steps & Reflections

Reflections: Understanding Pilot Perspectives

Speaking directly with pilots was the most rewarding part of this research. As someone who once dreamed of becoming a pilot, it was incredibly meaningful to explore the realities of aviation from their perspective and understand the challenges they face around mental health support.

Next Steps: Designing Supportive AI Tools

Moving forward, I will continue collaborating with Involo (CRMSON) to explore gamification best practices, focusing on what makes tools feel supportive and engaging for pilots. I also plan to expand the work through data visualizations and further analysis to better communicate insights and inform future AI-mediated support systems.



Other Projects

Psychology Driven Strategy for a Breakup Healing App

Qualitative Research Methods | Customer Segmentation & JTBD Mapping | Research-Strategy Translation

Designing A Breast Cancer Diagnostic Tool for Radiologists

Quantitative Methods | Machine Learning & Data Visualization | AI-Based Prototyping