ancestra

Multi-generational intelligence for modern health

mentor

Nathan Shedroff

type

Speculative Design

AI Interaction

Healthcare

Conversational UX

team

Manasvi Shah + 3 Designers

(Oct - Nov 2025)

my contribution

UX/UI Design

Visual System

UX Research

System Design

overview

Healthcare has a blind spot it doesn't talk about.


It checks your numbers. It stores your events. It tracks your steps. But it never asks the one question that might actually explain why you feel the way you do: How did your ancestors live?


Ancestra is a conversational AI agent that connects your present-day health patterns to the experiences your ancestors encoded into your biology. My contributions spanned the full design stack: UX research to frame the problem space, interaction design to architect the conversational system, and UI and visual system design to bring the product to life.

the problem

Modern health tools have fragmented us into data points. Wearables track your present. Medical records store your events. Symptom apps track your moments. DNA tests analyze your genes.


But none of them connect. And none of them ask what happened before you were born.


This fragmentation is not just inconvenient. It is a diagnostic failure. The science of epigenetics has established that ancestors' experiences can alter how genes express themselves across two to three generations. Children of Holocaust survivors show measurably altered stress responses. Grandchildren of Dutch Hunger Winter survivors carry documented metabolic differences despite never experiencing famine. (Heijmans et al., 2008, PNAS)


The research exists. The data exists. The need exists. What's missing is the interface.

We don't collect more data. We make sense of what you already have.

foundations

Before designing a single screen, we grounded the work in two things: the science and the stakeholder ecosystem.


The science: DNA is fixed (your cookbook). Epigenetics is the set of bookmarks on that cookbook: which genes are active, which are quiet, shaped by what your ancestors lived through. The critical distinction is that these bookmarks can change. This became the ethical spine of the product. Ancestra gives users agency, not fatalism. Inherited, not inevitable.


The stakeholder tensions: Mapping 12 stakeholders across three rings of proximity surfaced the conflicts that shaped our hardest decisions. Insurance discrimination risk made user-controlled data sharing a structural constraint, not a feature. Third-party disclosure shaped how the AI handles conversations about family members who never consented to be in them. Every tension produced a sharper, more precise design.

design challenge

Making AI feel like a person, not a product.


The hardest problem on Ancestra was not the information architecture. It was this: how do you design an AI that handles some of the most emotionally charged information a person can receive and makes them feel understood rather than processed?

The AI must advocate for the user, not manage them.

This drove four concrete language rules that governed every conversation flow.


Soft qualifiers over deterministic language - The AI never says "you will" or "this means you have." It uses "may," "might," or "could be connected to," because epigenetics describes tendencies, not certainties.


Consent before sensitive territory - Before entering topics involving trauma or difficult family history, the agent asks: "Before we continue, is it okay if we explore this?" The pause is not friction. It is respect.


One question at a time - A single clear question per message creates a conversational rhythm that feels like talking to a person, not filling in a form.


Always explain uncertainty responsibly - When the science is uncertain, the AI says so, then explains what is known. Credibility comes from honesty about limits, not from performing confidence.
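As a purely illustrative sketch (not a project deliverable), the first three rules are concrete enough to express as a lint pass over a draft agent message. The function name, patterns, and topic list below are all hypothetical; the fourth rule, explaining uncertainty responsibly, is editorial judgment and is not mechanically checkable here.

```python
import re

# Illustrative patterns only; the real product's guardrails are not specified.
DETERMINISTIC = re.compile(r"\b(you will|this means you have)\b", re.IGNORECASE)
SENSITIVE_TOPICS = {"trauma", "famine", "cancer"}  # hypothetical list

def lint_message(text: str, topic: str, consent_given: bool) -> list[str]:
    """Return rule violations for a draft agent message."""
    violations = []
    # Rule 1: soft qualifiers over deterministic language
    if DETERMINISTIC.search(text):
        violations.append("deterministic language: prefer 'may' or 'might'")
    # Rule 2: consent before sensitive territory
    if topic in SENSITIVE_TOPICS and not consent_given:
        violations.append("ask for consent before entering this topic")
    # Rule 3: one question at a time
    if text.count("?") > 1:
        violations.append("more than one question in a single message")
    return violations
```

A message like "You will develop this condition" on a sensitive topic without consent would fail two checks; "You may be predisposed. Is it okay if we explore this?" with consent passes all three.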

the conversation

Two users. Two emotional states. One design principle.


The language rules only matter if they hold up under pressure. Two scenarios tested them from opposite ends of the emotional spectrum.


Sakura, 27, Los Angeles

Japanese heritage. Both parents' families from Hiroshima. Struggling with thyroid symptoms, fatigue, and weight gain. Frustrated by years of tests that came back "normal" while still feeling unwell.


When the conversation surfaced a possible connection between her grandmother's survival of the Hiroshima bombing and her metabolic patterns, the critical design moment was the reframe: shifting the narrative from "my body is broken" to "my body is doing what it learned to do."

Max, 21, New York

Mother just diagnosed with breast cancer. Family history includes ovarian and breast cancer on the maternal side. Came in terrified, wanting to understand risk for his 18-year-old sister, Riley, who was not part of the conversation.


The most important design moment was the simplest: giving Max a word-for-word script he could use with a doctor that day.

Different users, different emotional states, the same two principles. Reframe the narrative. Then give them a next step.

the product

Five verbs. One system.

Listens

To your entire family history, not just current symptoms

Integrates

All your health data in one place, connected

Identifies

Root causes, not just symptoms

Explains

Everything in plain language, with sources

Acts

Books appointments, navigates insurance, removes friction

Onboarding

Sequenced to earn trust before asking for data. Identity first, intent before history, origin before conditions, explicit privacy commitment before any data access, and a clear medical scope disclaimer that is prominent rather than buried.


The minimum viable input is self-reported family history. Free, private, and already sufficient for meaningful insights. DNA is optional by design. Requiring genetic testing would price out the users who need this most and signal that the product is about data collection rather than understanding.
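The sequencing constraints above can be summarized as an ordered flow with one hard gate: no data access before the explicit privacy commitment. The step names below are illustrative stand-ins, not the product's actual screens.

```python
# Hypothetical sketch of the onboarding order described above.
ONBOARDING_STEPS = [
    "identity",             # identity first
    "intent",               # intent before history
    "origin",               # origin before conditions
    "privacy_commitment",   # explicit, before any data access
    "medical_disclaimer",   # prominent rather than buried
    "family_history",       # minimum viable input: self-reported, free
    "dna_upload_optional",  # DNA is optional by design
]

def can_request_data(completed_steps: set[str]) -> bool:
    """Any data collection is gated on the explicit privacy commitment."""
    return "privacy_commitment" in completed_steps
```

The gate makes the ethical constraint structural: the flow cannot reach a data-collection step without the privacy commitment having been completed first.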

reflections

The ethical constraints on this project felt like walls at first. Privacy architecture. Medical scope limits. Third-party disclosure. Insurance discrimination risk.


By the end, they were the most generative inputs we had.


Ethics sharpens design - Every time a stakeholder tension forced a redesign, the result was sharper. The privacy architecture is stronger because we took the insurance risk seriously. The conversation tone is more precise because we took the risk of medical overreach seriously.


Conversation is an interface - It requires as much craft as any visual design. Every word carries weight. The difference between "you will develop" and "you may be predisposed to" is the difference between a product that panics users and one that empowers them.


Insight without action is useless - Knowing you might have a hereditary risk means nothing if you do not know what to do next. The metric that matters is not time in app. It is the rate at which users take a meaningful health action after using Ancestra.

Ethical constraints are not friction. They are design requirements that nobody wrote down yet.

Ancestra was a classroom project designed to real-world specifications. All personas, scenarios, and conversation flows were created as design artifacts. The epigenetic research cited is real and published.

Manasvi Shah © 2026