The third user


Design briefs have failed to consider another user apart from clinicians, patients, manufacturers…

The third user: the caregiver

Medical device design has two named users: the clinician who prescribes, and the patient who uses. In home cardiac monitoring, there is a third person in the room. They charge the device, respond to alerts, relay data to the care team, and make real-time decisions about what needs attention and what can wait until morning. They are functionally a user. They are structurally invisible.

Hospital-at-home models are accelerating, and post-discharge monitoring is extending further into domestic environments. The population aging into chronic cardiac conditions is doing so in households where a family member absorbs the operational load, unpaid, untrained, and unacknowledged by the device in their hands. The third user is already there, in significant numbers, running clinical-grade hardware with no clinical preparation and no interface designed for their context.

This is not a gap that better instructions will close. It is a design process failure, and it starts at the brief.

The brief omits them first
When a medical device design team writes an intended use statement, they are also writing an exclusion. The user population defined in that document determines who participates in formative research, who appears in summative usability testing, and ultimately what contexts of use the interface is designed to support. For home cardiac monitoring, that population has two entries: the clinical user who prescribes and interprets, and the patient user who wears and self-reports. The caregiver who operates the device in the space between them does not have a regulatory category. So they are not included: the system that produces the brief has not created a slot for them.

The consequence shows up in the numbers. Across digital health usability studies, informal caregivers participated in testing in only 37% of studies that identified them as a target user group, the lowest representation of any user group examined, including patients and clinicians. That statistic is not a resourcing problem. It reflects how the user model was drawn upstream. You cannot run usability testing for a user you have not named, and you cannot name a user whose context the brief was not built to capture.

This is the upstream failure. Everything that follows at the interface level is downstream consequence of it.

What the interface asks them to do

The interface problems are worth naming specifically, because they are not random. They cluster around the same root: the device is designed for a user who already knows what the information means. A cardiac patch fires an alert at 2am. The caregiver reads it. The alert surface tells them a threshold was crossed; it does not tell them which threshold, what the clinical significance of that value is, or what, if anything, they should do. That ambiguity is not a content gap. It is a design decision, made early, that an alert needs to communicate urgency to a clinical system, not decision support to a non-clinical operator. The caregiver is left to decide, alone, whether to wake the patient, call the care team, or wait. That is clinical judgement being made with an interface that was never designed to support it.

Onboarding compounds the problem. When caregiver onboarding exists at all, it is typically patient onboarding with minor substitutions. It does not account for the conditions under which a caregiver actually takes over device management: time-compressed, often mid-discharge, coordinating across family members with different levels of involvement and different thresholds for alarm. A 2023 usability study of connected home monitoring devices found that informal caregivers are rarely considered in the medical device design process, despite occupying a central operational role. The onboarding failure is not about complexity. It is about mismatch, between the stable, motivated, single user the interface assumed and the person who is actually holding the device.

Data relay surfaces the same structural gap from a different angle. The care team needs the week's readings. The caregiver tries to retrieve them. The interface, designed for either patient self-review or clinical portal access, offers no clear path for a non-patient, non-clinical user to extract and share the relevant data. Again, this is not a missing feature. It is the direct consequence of a design process that modeled two users and considered the task set complete.

Why the brief does not change

The process failure has a structure, and it is worth naming because it explains why these same problems recur across product generations.

Medical device design inherits its user model from the regulatory pathway. The intended use statement shapes the human factors engineering plan, which shapes who is recruited for research. Caregivers do not map cleanly onto FDA user categories for cardiac monitoring because they are not the patient, not the clinician, and not clearly defined as a secondary user in most device submissions. So the human factors engineer works with the users the framework recognizes, and the caregiver remains outside the brief.

Market dynamics reinforce this. The purchasing decision for a home cardiac monitoring system is made by a health system or a prescribing clinician, not by the family member who will operate it. The device is optimized for the user whose satisfaction influences procurement. The caregiver's experience is downstream of that decision, largely invisible to it and not tracked in any outcome measure the manufacturer is accountable to. There is no feedback loop connecting caregiver interface failures back to the design team. The brief does not change because nothing in the current system creates pressure for it to change.

What changes if the brief names them

The constructive argument is also the simpler one: the methodology to fix this already exists. Human factors engineering, formative usability research, and contextual inquiry are standard tools in medtech. They are capable of surfacing caregiver needs with precision: the specific alert-interpretation tasks, the onboarding conditions, the data relay workflows, the failure modes that matter. The tools are not the gap; the user definition is.

Research on participatory design with family caregivers has found that engaging them from problem definition, not bringing them in at the testing stage but having them present when the brief is written, meaningfully changes what the design addresses. That upstream shift is modest in cost relative to the consequence of a caregiver misreading a cardiac alert because no one designed for the moment they would be reading it.

What a brief that names the caregiver actually requires is not a separate product or a simplified interface. It requires a named user profile: the caregiver who took over device management mid-discharge, under time pressure, without clinical background, often managing multiple family members with different levels of involvement. Specific tasks: alert triage, device maintenance, data relay. Specific failure modes: what does the caregiver do when the alert fires and no guidance is available? What does correct caregiver behavior look like, and has anyone in the design process defined it?

The sensor in a modern cardiac wearable is often excellent. The data it produces is clinically valuable. The gap is at the layer where that data meets a person who was never in the room when the interface was designed. Closing that gap does not require a new device. It requires a different question at the start of the process: who is actually going to run this, in what conditions, with what knowledge, and what does the interface need to do for that person?

The device cannot answer a question the brief never asked. Who is actually going to run this, at what hour, under what conditions, with what knowledge? Until that question is asked upstream, at the point where the user model is being written, not at the point where the interface is being tested, the third user will keep inheriting tools designed for someone else. Human factors engineering built the rigor that made cardiac devices safe for clinical environments. The same rigor, applied earlier to a broader user definition, is what makes them usable in domestic ones. That shift does not require a new device; it just requires a different brief.

