Infocare Healthcare · Mobile App

Closing the gap between appointments.

Patients drift from their care plans when there's nothing connecting them between clinic visits. SoteriaMe was designed to close that gap: building the habits and trust that keep patients engaged with their treatment when no one is watching.

Role: Product Designer
Platform: iOS & Android (Mobile)
Year: 2022
Domain: Digital Health
SoteriaMe: patient dashboard, symptom history, and messaging screens

At a glance

The one-minute version

The problem
Patients were disengaging from their care plans between clinic visits. Clinicians were spending disproportionate time on administrative follow-up: chasing missed appointments, resending instructions, manually prompting medication adherence.
What I owned
End-to-end design lead: research, information architecture, wireframing, and final UI. Collaborated directly with a PM, clinical advisors, and engineering.
Result
Piloted across clinics in the United States. Patients completed core tasks (appointment check, medication reminders, symptom logging) without instruction. Clinicians reported improved visibility into patient engagement between visits.

The problem

The gap between appointments

Infocare's desktop platform handled scheduling, records, and clinical workflows well. What it couldn't address was what happened to patients after they left the building. Without a connection to their care plan between visits, patients missed medications, forgot instructions, and drifted. Clinicians absorbed the cost as administrative overhead.

1 in 3
Patients forgot their next appointment date within a week of their visit
~40%
Of clinician admin time spent on follow-up that could be automated (internal estimate at project start)
0
Existing in-product tools to support patients between appointments before this project

Infocare had tried to solve this with email. It hadn't worked. Patients in chronic care in particular have highly variable digital literacy, and a generic email from a clinic name they half-recognised wasn't moving the needle. The brief was to design something that felt personal, trustworthy, and genuinely easy to use.

Mary O'Connor, 46
Patient managing a chronic condition
Background
Attends regular clinic visits, owns a smartphone but has low digital confidence. Relies heavily on written notes and phone calls.
Goals
Stay on top of appointments and medication, feel supported between visits, access information in plain language.
Pain points
Often forgets instructions once she leaves the clinic, feels overwhelmed by too much information at once, finds existing apps too complex.
Design needs
Clear reminders, step-by-step guidance, and a simple way to log and share symptoms.
Dr. Javier Morales, 38
Clinician at a busy urban hospital
Background
Tech-savvy but time-poor. Manages dozens of patients daily and wants tools that integrate smoothly into existing workflows.
Goals
Spend less time on repetitive admin, quickly see which patients are engaged, and focus on meaningful care.
Pain points
Too much manual admin, limited visibility into patient progress, and fragmented communication tools that create extra work.
Design needs
Automated reminders, at-a-glance engagement data, and a secure, lightweight messaging channel.

Research & discovery

Patient access was gated. Clinicians filled the gap.

GDPR and clinical governance frameworks limited direct patient access from the start. The approach relied on triangulation: eight interviews with consented patients through Infocare's clinical partners, four clinician workshops across three clinical sites, desk research and competitive analysis, and prototype testing with internal clinical advisors.

Using clinicians as the primary research lens filled a gap that direct patient access couldn't. Clinicians carry daily working knowledge of what patients forget to say in appointments, where they disengage, and what they misunderstand. That knowledge fed directly into the information architecture.

What the research established

Appointments and instructions were the primary memory failure point
Patients forgot both what they were told and when they needed to return. There was no system reinforcing the information once the appointment ended.
Clinician follow-up was largely manual and reactive
Reminders, medication check-ins, and progress nudges were all initiated by clinic staff. Automating those prompts in-product was a direct response to a documented operational cost.
Trust and data safety were patient-side blockers
In the interviews we ran, patients consistently raised data safety before usability. "Who can see this?" and "Where does it go?" came up before any question about features. Answering those questions through the interface, at the point where patients entered data, became a core design requirement.
Research artefact 01

Affinity map

Research inputs: clinician workshops, patient interviews, desk research, competitive analysis
Patient Needs
Appointment reminders in plain language
Medication nudges at the right time
Simple symptom logging
Single unified view of their care plan
Support for low digital confidence
Open questions
What's the right notification frequency?
How do we serve older, less confident users?
How much simplicity is too simple?
Clinician Needs
Automated follow-up prompts
At-a-glance engagement indicators
Integration with existing EHR workflows
Signal layer, not a data feed
Lightweight, no added reading burden
Open questions
What signals matter vs. create noise?
How do clinicians want to consume patient data?
Whose job is it to act on the signals?
Trust & Compliance
Inline data visibility labels
Privacy-first onboarding
Trustworthy, official appearance
GDPR & clinical governance compliance
Structural reassurance, not policy links
Open questions
Where does data live? Who has access?
How do we prove trustworthiness through design?
What are our obligations for this data type?
Healthcare Context
Mobile-first for patient access
Scalable across different clinic types
Barrier: low digital adoption in target demographic
Existing tools: phone calls and paper notes
App must not compete with clinical judgment
Open questions
Can one app serve very different clinic types?
What replaces the phone call for urgent needs?
How do we handle compliance edge cases?
Dot-voted priorities
Priority 1 (4 votes): Patients forget appointments; the system needs to remember for them
Priority 2 (3 votes): Engagement drops between visits; the gap is the product problem
Priority 3 (3 votes): Privacy concerns are high; trust must be structural, not cosmetic
Priority 4 (2 votes): Clinicians lack visibility; they need signals, not a second inbox
Research artefact 02

Journey map

Emotional arc across the journey (positive → stressed): 01 Discovery, overwhelmed/anxious · 02 Registration, hopeful but unsure · 03 First use, overloaded · 04 Ongoing use, anxious/uncertain
01
Discovery
Patient actions
Searches for app or receives clinic referral
Reads app store listing and reviews
Hesitates before downloading
Key design need
Trust before commitment. Patients need reassurance this is official and safe before they'll download it.
02
Registration
Patient actions
Creates account with clinic code
Reads consent screen
Completes basic profile
Key design need
Plain-language explanation of what data is shared, when, and with whom, before they're asked to provide it.
03
First use
Patient actions
Lands on dashboard for the first time
Looks for their next appointment
Sets first medication reminder
Key design need
One obvious primary action: not a dashboard requiring exploration, but a single clear answer to "what do I do now?"
04
Ongoing use
Patient actions
Logs symptoms between visits
Checks reminders and messages
Wonders who can see what they've entered
Key design need
Persistent, visible context ("Shared with your care team" or "Only you can see this") at the point of entry, not buried in settings.

What this shaped

The arc had a clear message: anxiety doesn't go away; it just changes form. Three design principles came directly from this.

Discovery: Earn trust before asking for data. Every onboarding decision was weighed against whether it built or eroded trust before the patient had committed to the app.
First use: One thing at a time. The overload peak killed dashboard complexity. The design answer was a single obvious primary action, not a screen that required orientation.
Ongoing use: Make data visibility persistent and contextual. The ongoing anxiety came down to one question: who could see what. Every data entry point needed a visible answer to that question.

Design approach

Designed for patients. Useful to clinicians.

The dual constraint shaped every decision: simple enough for a patient with high cognitive load and variable digital literacy; signal-rich enough for a clinician with no time to read a second inbox. Every design problem was framed the same way: who needs this, when do they need it, and what happens to the other person if we get it wrong?

Early concept sketches: screen inventory and information architecture

Paper sketches mapping the initial screen inventory: six core views before any wireframing. Home, Ongoing Treatment, Profile, Thread, Messages, Med List. Six views also meant six navigation items, which immediately raised the question of whether that cognitive overhead was appropriate for a user who might open the app for one purpose and leave in seconds.

01
Information architecture
Answer the one question patients have when they open the app

Early wireframes tried to surface everything at once: medications, appointments, symptom history, messages, wellness tips. Clinical advisors confirmed what the research implied: patients often arrive at a tool with cognitive load already high. A dashboard that required scanning before acting was going to be abandoned.

Before: generic healthcare dashboard with appointments list, profile grid, and health stats
After: personalised greeting, next appointment card, and task-list hierarchy
02
Trust and data transparency
Designing trust into every data entry point

The conventional fix for data anxiety is a consent screen at onboarding, which treats trust as a legal requirement and puts all the weight on a moment when patients are already overwhelmed.

We added persistent, plain-language visibility labels at every point where patients entered data: "Shared with your care team" or "Only you can see this." Those small typographic decisions changed patient behaviour more than any structural design change in the product.
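
As a rough sketch of how that pattern can be enforced in code (illustrative only; the component and prop names here are hypothetical, not SoteriaMe's actual implementation), the visibility label travels with the input itself, so no data entry point can render without declaring its audience:

```tsx
import React from "react";

// Every piece of patient-entered data declares an explicit audience.
type Visibility = "care-team" | "private";

const VISIBILITY_COPY: Record<Visibility, string> = {
  "care-team": "Shared with your care team",
  private: "Only you can see this",
};

// Hypothetical wrapper: visibility is a required prop, so the
// plain-language trust label is structural, not an optional add-on.
function LabelledEntryField(props: {
  label: string;
  visibility: Visibility;
  children: React.ReactNode;
}) {
  return (
    <div>
      <label>{props.label}</label>
      {props.children}
      <p className="visibility-label">{VISIBILITY_COPY[props.visibility]}</p>
    </div>
  );
}

// Usage: the symptom log declares who sees the entry at the point of entry.
export function SymptomNoteField() {
  return (
    <LabelledEntryField label="How are you feeling today?" visibility="care-team">
      <textarea rows={3} />
    </LabelledEntryField>
  );
}
```

Making visibility a required property is the point: a screen that forgets the label fails to build rather than shipping without it.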

03
Accessibility
Accessibility for the actual patient population

The patients using SoteriaMe were managing chronic conditions, often older and carrying multiple diagnoses. Reduced visual acuity, variable digital confidence, and elevated cognitive load were built into the design assumptions from day one; the team treated WCAG AA as a floor and designed beyond it.

The symptom severity scale originally used colour alone: red, amber, green, clean and immediately legible to anyone with normal colour vision. Redesigning it to pair colour with an icon and a text label meant abandoning the cleaner version, but it was the only design that worked for patients with colour vision deficiencies, a significant portion of the target demographic. A more visually minimal scale would have failed them silently, and we would never have seen it in testing.
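
A minimal sketch of the redundant-encoding idea (the names, hex values, and glyphs below are placeholders, not the shipped palette): each severity level carries colour, icon, and text together, so no single perceptual channel is load-bearing:

```tsx
import React from "react";

type Severity = "mild" | "moderate" | "severe";

// Each level encodes severity three ways at once; if one channel is
// unavailable (e.g. colour perception), the other two still carry it.
const SEVERITY_SCALE: Record<
  Severity,
  { color: string; icon: string; label: string }
> = {
  mild: { color: "#2e7d32", icon: "○", label: "Mild" },
  moderate: { color: "#f9a825", icon: "◑", label: "Moderate" },
  severe: { color: "#c62828", icon: "●", label: "Severe" },
};

export function SeverityBadge({ level }: { level: Severity }) {
  const { color, icon, label } = SEVERITY_SCALE[level];
  return (
    <span style={{ color }}>
      <span aria-hidden="true">{icon}</span> {label}
    </span>
  );
}
```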

Final design: patient-facing screens (dashboard, symptom history, messaging, and profile)

These four screens cover the three core patient jobs (checking the next appointment, managing reminders, logging symptoms) plus the profile screen where the data transparency controls live.

Detail flows: reminder configuration and symptom logging (eight screens)

Testing was built to validate three hypotheses: that the IA hierarchy reduced time-to-first-action, that inline trust labels changed how patients expressed data anxiety, and that the accessibility baseline held under real-use conditions. Two of those three had been rebuilt at least once before testing began; the goal was to find what was still wrong.

Testing & validation

Testing and what it changed

Testing ran across two structured rounds before the pilot. The objective was to find failure points early enough to fix them. Both rounds surfaced real problems and changed the product.

Round 1 — Clinical advisor review

4 clinical advisors (GP, specialist nurse, clinical informatics lead, patient experience) walked through the full prototype against scripted scenarios: onboarding, dashboard first use, reminder setup, symptom logging, and messaging.

Their brief was to flag anything clinically inaccurate, structurally confusing, or likely to cause a patient harm by omission. They produced a prioritised issue list. Two issues were marked blockers and rebuilt before Round 2.

Round 2 — Patient testing

5 participants, ages 34–67, all managing chronic conditions, recruited through Infocare's clinical partners. Varied digital literacy, with two describing themselves as "not very good with phones." Sessions were 30 minutes each, think-aloud protocol, no instruction given on task completion.

Three tasks, no prompting: (1) locate your next appointment, (2) set a reminder for your evening medication, (3) log how you're feeling today, then find out who can see that entry.

What the numbers showed

Task 01 — Appointment
5/5
Completed, no prompting (held in Round 2)

11 seconds avg from dashboard landing, including both participants with low digital confidence.

Task 02 — Reminder setup
3 → 5
Failures → completions (rebuilt between rounds)

3/5 couldn't locate reminder setup or mis-navigated it in Round 1. After elevating it to a primary action, all 5 completed without prompting.

Task 03 — Data visibility
4 → 0
Unprompted trust pauses (resolved by design)

4/5 paused before submitting to ask who could see their entry. After inline labels were placed at point of entry ("Shared with your care team" / "Only you can see this"), zero participants hesitated in Round 2.

Outcome

A working pilot. An honest account of what we'd measure differently.

SoteriaMe was piloted through Infocare's clinical partners across a small number of US clinics. The pilot ran without the instrumentation to measure engagement at scale, so what we have is qualitative.

"It's easier to remember what the doctor said when it's all written down here."
Patient, usability testing
"I can see when patients are actually engaging, not just waiting until the next visit."
Clinician, usability testing
Reminders, symptom tracking, and messaging: active reminders list, symptom history with trend chart, and secure patient-clinician messaging

Both quotes are describing the same thing: the app created a record patients could reference between appointments, and gave clinicians visibility into what patients were doing between visits. That was what it was built to do.

Beyond the pilot, SoteriaMe served a second purpose Infocare cared about equally: demonstrating that they could build a credible patient-facing digital product. Their existing reputation was built entirely on desktop clinical infrastructure. A working mobile health app opened new commercial ground, and was used directly in conversations around NHS and private healthcare contracts.

What I'd measure differently

If I were running this project again with a proper measurement framework, I'd have established a baseline appointment-miss rate before launch, tracked medication log completion rates in-app, and run a cohort comparison between engaged and disengaged users against downstream clinical outcomes. None of that was feasible within the project's scope. Worth naming anyway.

Both user groups were placing trust in the product: patients that their data was safe, clinicians that what they saw was accurate. That's a harder brief than it sounds, and it shaped every decision from the IA to the inline data labels.