Towards Accessibility for Smart Assistants

I am taking a course in Ubiquitous Computing for Assistive Technology. This is my project proposal for that course (and a way for me to get started on blogging).

Alice: Smart Alexa

Personal assistants (e.g., Alexa, Siri, Google Assistant) are now shipping in millions of devices. From ordering an Uber to telling people what to wear for the day, these speech-recognition machines can be helpful associates for almost everyone. However, this glittering new technology cannot be used by twenty million North Americans with speech impairments. It is practically unusable for people with cognitive conditions like Alzheimer's disease (and other types of dementia) and autism, given how difficult it is for these users to maintain a topic of conversation, and how the assistants fail to recover gracefully when either party deviates during a talk.

So, we propose a smart, pervasive, speech-based assistant (Alice) with the ability to interact with people with cognitive impairments, in particular people with dementia. Like the Amazon Echo, it would sit somewhere in the house and help the target user complete daily activities while engaging them in conversation. Beyond that, Alice would also remind users to take the right medication (and make sure that they do), push them to go out of the house, and encourage social interaction – all of which have been shown to delay the onset of dementia in elderly patients. After every conversation, Alice would run linguistic analysis to come up with a cognitive score that we would track over time, letting the caregiver know of any red flags that they can address.

For purposes of prototyping, we would start this project by running Alice on top of Alexa/Google Assistant. We would target specific daily tasks – like making coffee, warming up food, or reading a book – as our entry points into “speech acts” where Alice would start conversing with the target user. The goal would be to ensure the completion of the task at hand while engaging with the user, doing so solely through speech. Alice would be capable of detecting when the user has missed a step or has drifted off-topic, and would gracefully handle any trouble-indicating behaviors. We would aggregate all these conversations, run them through standard linguistic tests for cognitive/dementia assessment, and provide the results to an assigned caregiver through a web interface.
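To give a feel for what that linguistic analysis could look like, here is a minimal sketch in Python. The feature names and thresholds are my own illustrative assumptions – simple lexical proxies (type-token ratio, mean utterance length, filler-word rate) that appear in the dementia-assessment literature – not a validated clinical instrument:

```python
import re

def lexical_features(transcript: str) -> dict:
    """Compute simple lexical proxies from one conversation transcript.

    These are illustrative stand-ins for the standard linguistic tests
    mentioned above: lexical diversity, utterance length, and filler rate.
    """
    # Tokenize into lowercase words (apostrophes kept for contractions).
    words = re.findall(r"[a-z']+", transcript.lower())
    # Split into utterances on sentence-ending punctuation.
    utterances = [u for u in re.split(r"[.?!]", transcript) if u.strip()]
    fillers = {"um", "uh", "er", "hmm"}
    n = len(words)
    return {
        # Vocabulary richness: unique words over total words.
        "type_token_ratio": len(set(words)) / n if n else 0.0,
        # Average words per utterance.
        "mean_utterance_len": n / len(utterances) if utterances else 0.0,
        # Fraction of words that are hesitation fillers.
        "filler_rate": sum(w in fillers for w in words) / n if n else 0.0,
    }

feats = lexical_features("I made the coffee. Um, I think I made the coffee.")
```

Tracked over weeks, a declining type-token ratio or a rising filler rate for the same routine tasks would be the kind of red flag surfaced to the caregiver.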

Most of the research in the area of pervasive tracking requires professional setup and involves many moving parts. By building a personal assistant, we want to do this through speech alone, while providing patients with a companion that would learn and adapt. Developing it as an add-on for Google Assistant/Alexa would mean most users will not need an extra device, as they could just use their phones, computers, or any smart home devices. While we do not want to replace a human caregiver, we want to make their jobs much easier – by taking over assistance with the patient's mundane daily tasks. The ability to run offline analysis also provides ongoing insight into the cognitive health of the patient – so they do not have to wait for their annual medical check-up to know something is not right.
