INTRODUCTION
Lifelike is an iOS application designed to enhance textbook learning by serving as a spatial aid through augmented reality. It is a platform that equips any textbook with AR learning tools. Textbooks are full of pictures and diagrams meant to aid students' learning, yet students often have difficulty making sense of what they see. Lifelike aims to bring these pictures and diagrams into 3D space via AR.
HOW IT WORKS
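At its core, the app watches the camera feed for figures it knows, then anchors a 3D model on top of the recognized figure. Below is a minimal sketch of that flow using ARKit's image tracking; the "TextbookFigures" resource group and "heart.usdz" model are hypothetical placeholders rather than the app's actual assets.

```swift
import UIKit
import ARKit
import SceneKit

class FigureViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    var modelNode: SCNNode?  // the model currently anchored to a recognized figure

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track reference scans of textbook figures bundled in the asset
        // catalog (hypothetical "TextbookFigures" AR resource group).
        let configuration = ARImageTrackingConfiguration()
        if let figures = ARReferenceImage.referenceImages(inGroupNamed: "TextbookFigures",
                                                          bundle: nil) {
            configuration.trackingImages = figures
            configuration.maximumNumberOfTrackedImages = 1
        }
        sceneView.session.run(configuration)
    }

    // ARKit calls this when one of the reference images is recognized.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor,
              let scene = SCNScene(named: "heart.usdz"),  // placeholder model
              let model = scene.rootNode.childNodes.first else { return }
        // Size the model roughly to the printed figure (assumes a model
        // authored at about 1 unit across).
        let width = Float(imageAnchor.referenceImage.physicalSize.width)
        model.scale = SCNVector3(width, width, width)
        node.addChildNode(model)
        modelNode = model
    }
}
```

Tying the model's scale to the reference image's physical size keeps the rendering grounded on the page instead of floating at an arbitrary size.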
DEMO
To see the code for the API and the app, please check out this link.
IDEATION
For this project, my partner and I wanted to create an iOS application in the education space. After discussing and researching, we settled on the initial concept for Lifelike: an app that serves as a visual-spatial aid, through augmented reality, for special education learners. While reading a textbook, a student points their mobile device's camera at the page; the application recognizes key images and renders 3D models to accompany those visual aids. The mobile device lets the user easily interact with and observe the model to better their understanding of the concept.
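As one illustration of the "interact and observe" side, the sketch below wires pinch-to-scale and pan-to-rotate gestures onto the anchored model. It builds on the hypothetical FigureViewController from the "How it works" sketch; Lifelike's actual interaction code may differ.

```swift
import UIKit
import SceneKit

// Builds on the hypothetical FigureViewController sketched under
// "How it works". Call addGestures() once, e.g. from viewDidLoad().
extension FigureViewController {
    func addGestures() {
        sceneView.addGestureRecognizer(
            UIPinchGestureRecognizer(target: self, action: #selector(didPinch(_:))))
        sceneView.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(didPan(_:))))
    }

    // Pinch to grow or shrink the model.
    @objc func didPinch(_ gesture: UIPinchGestureRecognizer) {
        guard let model = modelNode else { return }
        let s = Float(gesture.scale)
        model.scale = SCNVector3(model.scale.x * s, model.scale.y * s, model.scale.z * s)
        gesture.scale = 1  // consume the delta so scaling stays incremental
    }

    // Drag horizontally to spin the model around its vertical axis.
    @objc func didPan(_ gesture: UIPanGestureRecognizer) {
        guard let model = modelNode else { return }
        let dx = Float(gesture.translation(in: sceneView).x)
        model.eulerAngles.y += dx * .pi / 180  // ~1 degree per point dragged
        gesture.setTranslation(.zero, in: sceneView)
    }
}
```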
To validate the concept we came up with, we held discussions and interviews with educators, students, and other potential stakeholders. A few key points came up during those conversations:

1. The students most likely to benefit from this app (special needs students) are also the least likely to use it.
2. Students will not go out of their way to use such features; gamification and other forms of extrinsic motivation will probably be needed for students to reap the benefits of this technology.
3. Lifelike's concept only applies to certain subjects (e.g., biology, history, physics).
4. Lifelike would be better off targeting higher-education students, since the students who would make use of the technology are usually high achievers.
Taking this feedback, we restructured our approach and conducted user testing to home in on the details of our concept.
USER TESTING
We had a few goals for user testing. First, we wanted more feedback on the concept itself: would the application be useful, and would it add value to the user's educational experience? Second, we hoped to gather feedback on the application's interactions, both the overall user flow and individual interactions. Lastly, we wanted users' opinions on image recognition, since it was one of the features that risked being extraneous to the experience.
We tested with college students at CMU from a range of backgrounds where the application might be used, from ECE to Mechanical Engineering to Physics, and had them walk through high-fidelity prototypes (see the 'Prototyping' section).
Overall, feedback on Lifelike's concept was positive. Users thought the application had educational value and could be very helpful for understanding and breaking down concepts. They found the AR especially valuable because it offers 3D interaction and additional information without the user having to Google anything.
However, many of them noted that most of their classes do not require textbook reading. Granted, some of the users we talked to had not recently taken classes where this would be used (e.g., the sciences or 3D math). Using this feedback, we conducted further research and narrowed down more specific use cases for our app.
As we coded the application, we also ran additional user tests to verify and refine the interactions.
PROTOTYPING
Before coding the application, we prototyped the AR interactions using a technique suggested by Apple at WWDC. Take a look at an example prototype that we made.
