Research Strategy & Process

Introduction

As a senior‐level UX researcher, I approach every project with a combination of curiosity, rigor and pragmatism. My goal is to cultivate a deep understanding of people and systems so I can help cross‑functional teams build products and services that are both effective and compliant. Having worked across regulated healthcare, justice and enterprise platforms, I have refined a process that balances discovery, analysis and strategic planning.

Philosophy

  • Human‑centered and evidence‑based. Decisions should be grounded in real user needs, behaviors and data rather than assumptions or trends.

  • Process‑aware. In complex environments, understanding the current operational workflow is essential. I map processes end‑to‑end to reveal bottlenecks, variation and policy constraints.

  • Strategic and actionable. Research must translate into clear, prioritized recommendations that guide design and roadmap decisions.

  • Collaborative and transparent. Success comes from partnering with product owners, designers, developers and subject‑matter experts, and from maintaining traceability between requirements and research insights.

End‑to‑End Research Process

1. Align & Plan

  • Define the core problem, business goals, success metrics and research questions with stakeholders.

  • Map existing knowledge and identify assumptions and gaps.

  • Choose appropriate methods (qualitative, quantitative or mixed) and create a timeline that fits the product development cycle.

  • Ensure compliance considerations and privacy requirements are built into the plan.

2. Discover & Immerse

  • Conduct generative research—interviews, contextual inquiries, diary studies and surveys—to understand user jobs, pain points and motivations.

  • For regulated systems, facilitate process mapping sessions with subject‑matter experts to document current‑state workflows and variations. This step surfaces bottlenecks, inconsistencies and work‑arounds.

  • Leverage remote tools to reach distributed users and stakeholders, combining moderated and unmoderated methods.

3. Analyze & Synthesize

  • Transcribe and code qualitative data to identify patterns and themes across roles and contexts.

  • Combine quantitative data (usage analytics, performance metrics) with qualitative insights to validate findings.

  • Use affinity mapping, journey maps and service blueprints to visualize the user experience and operational process.

  • Where appropriate, employ AI‑assisted synthesis tools (e.g., Marvin, ChatGPT) to accelerate pattern recognition while ensuring human oversight.
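
To make the last point concrete, here is a minimal, tool‑agnostic sketch of the idea: clustering interview excerpts to surface candidate themes that a human researcher then names, merges or discards. The excerpts, cluster count and TF‑IDF/k‑means pipeline are illustrative stand‑ins for the grouping a dedicated tool such as Marvin or an LLM would provide.

```python
# Minimal sketch: surface candidate themes by clustering interview excerpts.
# TF-IDF + k-means stands in for the AI-assisted grouping a dedicated tool
# would provide; every cluster is still reviewed and named by a human coder.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical excerpts pulled from interview transcripts (illustrative only).
excerpts = [
    "I couldn't find the provider search from the member home page.",
    "The enrollment form asked for the same information twice.",
    "Search results didn't tell me whether the provider was in network.",
    "I gave up on enrollment because the status page never updated.",
    "Filters reset every time I went back to the results list.",
    "I had to re-enter my dependents' details on a later step.",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(excerpts)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Print each cluster so a researcher can name, merge, or discard the theme.
for cluster in sorted(set(labels)):
    print(f"Candidate theme {cluster}:")
    for excerpt, label in zip(excerpts, labels):
        if label == cluster:
            print("  -", excerpt)
```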

4. Recommend & Inspire

  • Translate findings into personas, archetypes and process maps that highlight key needs and variation.

  • Develop clear problem statements and opportunity areas tied to impact and feasibility.

  • Create prioritized recommendations—such as workflow optimizations, UI design changes, or new features—that guide the product roadmap.

  • Facilitate co‑creation workshops with cross‑functional teams to explore future‑state concepts.

5. Validate & Iterate

  • Conduct usability testing on prototypes and process flows to validate design hypotheses. Use both remote and in‑person methods.

  • Iterate on designs based on user feedback and stakeholder input, balancing usability with regulatory requirements.

  • Monitor success metrics and gather post‑launch data to assess impact and identify new research questions.
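
As a concrete example of that kind of post‑launch check, the sketch below compares pre‑ and post‑launch task completion rates with a chi‑square test on the 2×2 contingency table. The counts are hypothetical, and this particular test is one reasonable choice among several for separating real movement from noise.

```python
# Minimal sketch: did task completion improve after launch?
# Counts are hypothetical; a chi-square test on the 2x2 contingency table is
# one reasonable way to check whether the difference is signal or noise.
from scipy.stats import chi2_contingency

pre_launch = {"completed": 118, "abandoned": 82}    # usability benchmark
post_launch = {"completed": 162, "abandoned": 58}   # post-launch analytics

table = [
    [pre_launch["completed"], pre_launch["abandoned"]],
    [post_launch["completed"], post_launch["abandoned"]],
]
chi2, p_value, _, _ = chi2_contingency(table)

pre_rate = pre_launch["completed"] / sum(pre_launch.values())
post_rate = post_launch["completed"] / sum(post_launch.values())
print(f"Completion rate: {pre_rate:.0%} -> {post_rate:.0%} (p = {p_value:.3f})")
```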

6. Document & Communicate

  • Maintain organized documentation: research plans, interview guides, process maps, affinity diagrams, personas and recommendations.

  • Present insights and strategy to executives, product managers and engineers in a concise, compelling narrative.

  • Provide traceability from research findings to design decisions and system requirements. This ensures compliance and alignment across teams.

Example Impact: UCare Online Member Account & Provider Search

  • Current state mapping. At UCare, I led workshops with business and technical SMEs to document the end‑to‑end member account enrollment process. By mapping variations across markets, I identified bottlenecks and operational gaps.

  • Discovery & synthesis. Through interviews and behavioral analytics, I uncovered pain points in navigation and data presentation. I combined qualitative themes with metrics to identify where members struggled to complete tasks.

  • Strategic recommendations. I delivered a prioritized roadmap of improvements that included workflow optimizations, simplified information architecture and new self‑service capabilities. These recommendations informed the redesign, which increased engagement by 32%.

Leadership & Collaboration

  • Mentorship and scaling research. I coach junior researchers on methodology selection, synthesis techniques and stakeholder management.

  • Research operations. I build templates, repositories and process guides to ensure consistency and efficiency across projects.

  • Cross‑functional partnerships. I work closely with legal, compliance and engineering to ensure that research respects privacy regulations and that recommendations are feasible and scalable.

Visualizing the Research Process

To make my research approach tangible, I created a high‑level flowchart summarizing how I move from initial alignment through discovery, synthesis, recommendations and validation.

Tools & Methods

  • Generative: contextual inquiries, ethnography, stakeholder interviews, surveys, diary studies.

  • Evaluative: usability testing (moderated/unmoderated), heuristic evaluation, cognitive walkthroughs, prototype A/B testing.

  • Process mapping: journey mapping, service blueprints, current‑state/future‑state workflow diagrams.

  • Quantitative & Qualitative: I employ a mixed‑methods approach, balancing quantitative methods and their statistical analysis (surveys, usage analytics, A/B testing and Likert‑scale measurements) with deep qualitative inquiry. This combination enables me to triangulate findings, validate hypotheses at scale, and explore the “why” behind behaviors and attitudes.

  • Feature prioritization (Kano analysis): When evaluating potential features or enhancements, I use Kano analysis to categorize requirements into “must‑have,” “performance,” “attractive” and “indifferent” classes. This method helps identify which features will delight users, which are baseline expectations and how feature decisions affect overall satisfaction and roadmap prioritization; a small categorization sketch follows this list.

  • Data analysis: thematic analysis, affinity mapping, statistical analysis, AI‑assisted synthesis.

  • Communication: Figma, FigJam, Miro, Dovetail, Confluence, Tableau.

  • Research & AI tools: Marvin, AI‑assisted research and synthesis tools (ChatGPT, Gemini), Optimal Workshop, Qualtrics, Google Analytics, Power BI. These platforms enable advanced qualitative/quantitative data collection and analytics and AI‑driven insight generation, complementing the prototyping and workshop tools listed above.
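
To make the Kano step concrete, here is a minimal sketch of the standard Kano evaluation table: each respondent answers a “functional” question (how they feel if the feature is present) and a “dysfunctional” question (how they feel if it is absent), and the answer pair maps to a category. The feature and responses below are hypothetical, and the full table also yields “reverse” and “questionable” classifications beyond the four classes named above.

```python
# Minimal sketch of Kano categorization. Each respondent answers a functional
# question (feature present) and a dysfunctional question (feature absent) on
# a five-point scale; the answer pair indexes into the standard Kano
# evaluation table. Responses below are hypothetical.
from collections import Counter

# Rows = functional answer, columns = dysfunctional answer.
# M = must-have, P = performance, A = attractive, I = indifferent,
# R = reverse, Q = questionable.
EVALUATION_TABLE = {
    "like":     {"like": "Q", "expect": "A", "neutral": "A", "tolerate": "A", "dislike": "P"},
    "expect":   {"like": "R", "expect": "I", "neutral": "I", "tolerate": "I", "dislike": "M"},
    "neutral":  {"like": "R", "expect": "I", "neutral": "I", "tolerate": "I", "dislike": "M"},
    "tolerate": {"like": "R", "expect": "I", "neutral": "I", "tolerate": "I", "dislike": "M"},
    "dislike":  {"like": "R", "expect": "R", "neutral": "R", "tolerate": "R", "dislike": "Q"},
}

def categorize(functional: str, dysfunctional: str) -> str:
    """Return the Kano category for one respondent's answer pair."""
    return EVALUATION_TABLE[functional][dysfunctional]

# Hypothetical responses for one proposed feature (e.g., "save a provider search").
responses = [("like", "dislike"), ("like", "tolerate"), ("expect", "dislike"),
             ("like", "dislike"), ("neutral", "neutral")]

counts = Counter(categorize(f, d) for f, d in responses)
print(counts.most_common())  # the modal category drives roadmap prioritization
```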

Case Study Highlights

Criminal Justice Records Management System (RMS) Redesign

Challenge: The existing RMS platform, used across multiple departments, suffered from inconsistent workflows, role‑specific variations and inefficiencies that made it difficult for officers and administrators to complete daily tasks.

Research approach:

  • Facilitated contextual inquiry sessions, observation ride‑alongs and role‑based interviews with more than 20 end‑users and administrators.

  • Conducted process‑mapping workshops to document current‑state workflows, capturing variations by role and department.

  • Synthesized findings into journey maps and service blueprints to highlight friction points and policy constraints.

Impact:

  • Identified critical workflow bottlenecks and recommended streamlined processes that reduced duplicate data entry and decreased report‑completion time.

  • Developed a prioritized roadmap with usability improvements, revised permissions logic and enhanced mobile workflows.

  • Led cross‑functional design workshops to co‑create future‑state concepts, resulting in a unified user experience across roles.

Provider Search Portal Enhancement

Challenge: Members and providers struggled to locate accurate provider information due to complex navigation and out‑of‑date data presentation in the existing search portal.

Research approach:

  • Conducted user interviews, task analyses and heuristic evaluations to identify pain points in the search and filtering experience.

  • Analyzed usage analytics to pinpoint drop‑off points and search abandonment rates (a small sketch of that calculation follows this list).

  • Collaborated with data‑management teams to understand underlying data structures and opportunities for streamlining provider information.
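
The abandonment calculation referenced above can be sketched as follows, assuming a hypothetical event log with one row per search event; the field names and event labels are illustrative, not the portal's actual schema.

```python
# Minimal sketch: estimate search abandonment from a hypothetical event log.
# Each row is one event in a search session; names are illustrative.
import pandas as pd

events = pd.DataFrame([
    {"session_id": "s1", "event": "search_submitted"},
    {"session_id": "s1", "event": "results_viewed"},
    {"session_id": "s1", "event": "provider_clicked"},
    {"session_id": "s2", "event": "search_submitted"},
    {"session_id": "s2", "event": "results_viewed"},
    {"session_id": "s3", "event": "search_submitted"},
])

# Collapse each session to the set of events it contains.
sessions = events.groupby("session_id")["event"].agg(set)
searched = sessions[sessions.apply(lambda e: "search_submitted" in e)]
abandoned = searched.apply(lambda e: "provider_clicked" not in e)

print(f"Search sessions: {len(searched)}")
print(f"Abandonment rate: {abandoned.mean():.0%}")
# A follow-up step would group abandoned sessions by their last event
# to locate the likeliest drop-off points.
```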

Impact:

  • Recommended a simplified search interface with contextual filters, improved information hierarchy and clearer status indicators.

  • Partnered with engineering to establish data‑quality checks that increased provider accuracy and reduced support calls.

  • The enhancements led to higher task completion rates and improved satisfaction scores among both members and providers.

Closing Statement

My research strategy combines rigorous inquiry with practical guidance. By documenting current processes, discovering user needs, synthesizing data and translating insights into action, I help organizations deliver experiences that meet business objectives while delighting users. This end‑to‑end approach—and my ability to lead, mentor and collaborate—demonstrates readiness for senior UX research roles.