Stroke is a leading cause of long-term physical disability in the United States. Nearly 800,000 people suffer a stroke each year, many of them left with serious, lasting impairments. The recovery process is expensive and difficult, and insurance usually covers only the first few weeks of therapy, leaving patients without a support system.
Wabbit provides affordable, gamified physical therapy for stroke patients. 
By combining augmented reality and predictive machine learning, Wabbit creates customized exercise plans for stroke patients. In-game actions (such as pulling vegetables or watering plants) correspond to physical therapy exercises, making the recovery process more engaging and motivating. Patients can now access affordable physical therapy in the palm of their hand.
Based on patient data and machine learning algorithms, patients receive daily tasks to complete in the game. As users progress, the application adjusts activities accordingly.
Interactions on the application are designed to be one-handed for flexibility and ease of use. Users can navigate to a physical therapy task of their choice.
Once a task is selected, the patient follows the interactive on-screen instructions for the exercise. An assistive animation is overlaid on top of the game action to help them practice proper exercise form.
A task report, along with a progress report, is given to the patient at the end of the game. Because many patients lack a support system during the recovery phase, these reports serve as a source of motivation and a way for the patient to reflect on the progress they have made.
For this project, we were asked to create a mobile service for the Comprehensive Stroke Center of Allegheny General Hospital. Allegheny General Hospital was one of the first medical centers in the country, and the first in western Pennsylvania, to be designated an Advanced Comprehensive Stroke Center. The mobile service had to include two technologies from the following list: augmented reality, computer vision, predictive and real-time analytics, and conversational user interfaces and chatbots.
We began by mapping out the different stakeholders of the Comprehensive Stroke Center, trying to encompass every party that a stroke patient and the stroke center would influence. Because many parties in the healthcare system are involved in a patient's recovery, we also did additional research to better understand how the healthcare system and emergency response teams function.
To figure out the problem space we wanted to work in, we conducted online research about stroke, stroke response, and stroke recovery.
In addition to conducting research about the subject, we talked to a doctor from Allegheny General Hospital to gain more insight. Interviewing a doctor, a key stakeholder in our area of research, was extremely useful in helping us decide where to focus.
A few points that we took away from our research and our interviews:
        Time is crucial in stroke response.
        75% of stroke survivors report lasting effects.
        Insurance companies only pay for a few weeks of rehabilitation.
        After a patient is sent home, there is generally no follow-up. 
        Often, stroke survivors have to seek their own recovery networks or have none at all. 
After compiling our research, we explored several directions: music therapy, using AR to help with rehabilitation tasks, creating support networks for stroke patients, and AR projection for reading comprehension.
In our initial concept sketches, we focused on shaping and narrowing down the idea of gamified stroke recovery. One of the design choices we had to make was how the user would interact with the AR objects. While a VR headset with a landscape phone orientation might make it easier for a user to interact with the environment, these headsets are often neither cheap nor easy to use. Consequently, we designed our app as a one-handed experience so that stroke survivors can interact with AR objects directly on their phones.
Taking the feedback we received in class, we began creating digital screens. We each tried a variety of tactics and design principles, then analyzed and combined the strongest approaches into a unified system before jumping into detailed prototyping. To keep the scope manageable, we decided to focus on one interaction in the app: we prototyped the daily start screen, one interaction for completing a task, and the daily ending screens.
One of the biggest challenges of this project was prototyping the AR interactions. We first tried to prototype the application in Unity using the ARKit plug-in. While this produced the most realistic interactions, it was very difficult to add occlusion and make the hand interactions look real. Consequently, we used Cinema 4D, After Effects, and Photoshop to create the AR interactions. We also created a hand-animation microinteraction to help users understand how to complete a task.
If we get a chance to take Wabbit forward, the first thing we would like to figure out is how the technical limitations of augmented reality, such as depth perception and drifting, would affect the gameplay and the experience for a patient. We would also like to work with professionals, such as physical therapists and doctors, to test and further develop Wabbit into an effective option for stroke patients.
