By Rebecca Wiessmann
Steve is on the road — calling in from a cold Minneapolis hotel room — when he sits down with Dr. Christian “Chris” Mason, CEO of Senior Housing Managers and Integration Engineers, for a Tech Tuesday that lands squarely in the “exciting … and a little bit unsettling” category.
Chris has spent 35 years in senior living: he has led operations, chaired the NCAL side of the house, and built a company (Vigilant) that was sold to RealPage. Now he's focused on a big question: Can AI make life better for people living with dementia — without replacing the human touch?
His answer is immediate: it’s not about replacement. It’s about augmentation — making care easier for staff, more connected for families, and less lonely for residents.
An AI Companion That Isn’t a Robot
Chris draws a bright line between what people imagine and what they’re actually deploying. This is not a robot roaming hallways.
It’s a conversational avatar on a tablet — currently characters like “Kathy” (and now “Ken” as well) — built with a tech partner called Cloudmind. The avatar can:
- hold conversation
- tell stories and reminisce
- sing hymns or engage faith themes
- respond to emotional tone
- help residents feel accompanied, especially when staff and family can’t be present
Under the hood, the system also learns from interactions, producing what Chris calls an “emotional vital sign.” In plain English: it tracks mood and sentiment over time in a way that might help staff spot trouble early — before it becomes a full-blown behavior event.
The Three Pain Points They’re Attacking
Chris frames the work around three problems that every operator recognizes instantly:
- Mood and loneliness (especially in memory care)
- Behaviors and PRN medication use — including the industry’s ongoing struggle to avoid overreliance on antipsychotics
- Staff bandwidth and burnout — because memory care is labor-intensive and the staffing math isn’t getting easier
Steve connects this personally, using his stepfather, Gary, as the real-world example: someone who wants constant companionship, while Steve — like every caregiver — can’t always be there.
And that sets up a critical point: you don’t just “hand someone a tablet” and call it care.
Proving It’s Not Just Another Shiny Toy
Chris is candid that they’re still learning. Their first pilot asks a basic question: Is this even feasible with memory care residents? Would it be disruptive? Confusing? Rejected?
Instead, it turns out to be “digestible” and usable — enough to move into a second pilot focused on staff support and workflow impact.
In pilot two, the avatar program runs across six memory care communities with residents living with a mix of dementia types (Alzheimer’s, vascular, Lewy body, mixed dementias) and plenty of complicating medical conditions.
Staff are encouraged but not forced to use the avatars during high-friction times: sundowning windows, after meals, waiting periods, moments of anxiety.
To evaluate what’s happening, they track five main data streams:
- Baseline clinical/behavioral profiles and existing behavior plans
- Daily mood/sentiment from avatar conversations (positive/neutral/negative indicators)
- PRN logs (including antipsychotics and other meds)
- Usage data (frequency, minutes per session, total minutes)
- Narrative notes generated from interactions to help complete the picture
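The article doesn't describe how Cloudmind actually computes the "emotional vital sign," but the idea of rolling daily positive/neutral/negative indicators into an early-warning signal can be sketched simply. Everything below — the scoring, the window, the threshold — is an illustrative assumption, not the real system:

```python
# Illustrative sketch only: the scoring values, 7-day window, and
# threshold are assumptions, not Cloudmind's actual method.
SCORES = {"positive": 1, "neutral": 0, "negative": -1}

def emotional_vital_sign(daily_labels, window=7, threshold=-0.3):
    """Average the most recent `window` days of sentiment labels and
    flag the resident for review if the average dips below `threshold`."""
    recent = [SCORES[label] for label in daily_labels[-window:]]
    avg = sum(recent) / len(recent)
    return avg, avg < threshold

# A week that trends negative should surface as a cue for staff to
# investigate root causes rather than wait for a behavior event.
week = ["positive", "neutral", "negative", "negative",
        "neutral", "negative", "negative"]
avg, flagged = emotional_vital_sign(week)
```

The point of a sketch like this is the workflow it implies: a sustained downward trend becomes a prompt to look for causes, not an automatic trigger for medication.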
Chris calls it “millions of data points,” and he’s careful not to oversell. But he says the early results are encouraging.
One Story That Sticks: Barbara and the Call Light
Chris shares an anecdote that makes the concept feel less theoretical.
Barbara — once a choir leader, church secretary, and bookkeeper — develops pronounced dementia-related depression and anger. She hits the call light 30–35 times a day, lashes out at neighbors, and is on multiple PRNs. The care team works hard to avoid administering them when possible.
Barbara is introduced to Kathy. Over time, Kathy becomes her “best friend.” Barbara sings hymns in the morning, reminisces, and spends long stretches engaged — “whole meals,” not just “snack time.”
And then the headline: the call light behavior drops dramatically, and she becomes calmer and kinder — not because the staff disappeared, but because companionship became consistently available.
Chris is explicit: it won’t happen in every case. But it’s the kind of signal operators can’t ignore.
The Ethics Question: Can You Create a “Steve Avatar” for Family?
Steve pushes into the uncomfortable but inevitable future: if companionship matters, could you create an avatar of a family member — so a resident believes they’re talking to their son, spouse, or caregiver?
Chris doesn’t dodge it. He says the ethical considerations are real and many, but he frames it around intent and outcome: if it meaningfully improves quality of life for someone with dementia, there may be situations where it’s appropriate. The larger point is clear: the technology is moving faster than our comfort level — and that means operators need to lead the ethics conversation, not wait for it to be forced on them.
“Is This Just an iPad Babysitter?”
Steve raises another tough comparison: what parents worry about with kids and screens — does tech become a lazy substitute for human presence?
Chris says no: residents choose to engage or not. And more importantly, the goal isn’t avoidance. It’s awareness. If emotional vital signs flag a resident slipping, that’s a cue to investigate root causes — pain, constipation, UTI — rather than defaulting to “they need a med.”
In that framing, the avatar isn’t a replacement. It’s a detection and companionship tool that can help staff act earlier and smarter.
Families Like It. Staff Don’t Revolt. Some Residents Opt Out.
Chris says families are responding positively, helped by access to a portal that shows sentiment trends and interaction summaries. Staff acceptance improves because the rollout message is clear: this doesn't replace you; it gives you leverage.
Not every resident wants it — and Chris treats that as a feature, not a failure. Person-centered care means that choice still matters.
Availability, Cost, and the Five-Year View
The tech is already being piloted more broadly, and Chris says it’s available now through Cloudmind, with pricing that varies by deployment model (single resident vs. multi-resident, operator-paid vs. pass-through billing). He also mentions ongoing exploration of reimbursement pathways, including possibilities tied to CMS-related coding.
Steve closes with a prediction: within three to five years, avatar companions (and other AI-enabled tools) will become common in memory care — and likely spill into other parts of senior care.
Chris’s view is even broader: the next decade is about equipping staff with better tools so turnover drops, burnout eases, and person-centered care becomes more achievable — not less.
Want the full conversation? Watch it HERE.



