TPL 7

Grant Travis Ng
3 min read · Nov 2, 2020

Thesis Presentation here

Multi-sensory specifications

Last Friday morning I took pictures of the multi-sensory room in the School of Dentistry.

I spent about two hours going through the floor: taking photos, AR-scanning the rooms, and recording 360 videos. I used the GoPro Fusion camera, which has two lenses; the GoPro application stitches the images from both lenses together.

GoPro Fusion 360 camera

When I met up with Dr. Kosinski, he showed me the multi-sensory room and the type of tech the room has. The room looks very calming and therapeutic with the colored lighting. There is also a computer that controls the projector. The only things that are currently functional in the room are changing the color of the lights on the side of the room and changing the colors of the hanging clouds. The computer requires an admin login.

Multi-sensory room with therapeutic chair and bubble tube.
The clouds in the multi-sensory room and the control panel near the door.

In the multi-sensory room, there is a Dell computer installed in the ceiling. I downloaded my project onto it, but I still need to install the matching version of Unreal Engine. I looked at the specs, and I think the computer can handle the engine; the graphics will be different, but the functionality should be the same. In the coming weeks I plan to create a 360 video of one room and see whether I can recreate a 3D version of the wheelchair-accessible dental chair.

Dr. Kosinski showed me how the chair works and how it moves. After seeing all the different types of chairs on the floor, I began to wonder about the implications of using my application. Demonstrating how the chairs function would be an interesting problem to focus on for people who are anxious about coming into the dentistry office.

Wheelchair-accessible dental operating chair.

My next steps are to use the 360 videos I took and incorporate my project into the multi-sensory room. I will download Unreal and start testing UDP messaging in the room with my iPhone and a VPN. I will also delve into the “abstract art” for the type of character I would like to create for this experience. I am still interested in doing a performance of my own in some way, so I will look into using facial tracking or motion capture.
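Before wiring anything into Unreal, a quick way to sanity-check that UDP datagrams actually cross the room’s network (and the VPN) is a minimal send/receive probe. This is only a sketch under assumed values — the port number and addresses below are placeholders, not the actual messaging setup the project will use:

```python
import socket

# Placeholder values for illustration; swap in the Dell PC's actual IP
# and whichever port the Unreal side is configured to listen on.
PORT = 9000
MESSAGE = b"hello from the iPhone side"

def start_listener(port=PORT):
    """Bind a UDP socket on the receiving machine (e.g., the room's PC)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    return sock

def send_probe(target_ip, port=PORT):
    """Send a single UDP datagram to the target to test connectivity."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(MESSAGE, (target_ip, port))
    sock.close()

# Usage: run start_listener() on the receiving machine, then call
# send_probe("<that machine's IP>") from the sender over the VPN.
```

If the listener never receives the datagram over the VPN, that points at routing or firewall rules rather than anything on the Unreal side.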

Thesis outline here

Overall structure of the thesis

AR/VR Technology

  • The AR camera on iOS
  • AR apps
  • Unreal
  • Facial Tracking

VRT applications

  • PTSD
  • OT
  • Therapy/VR Spas
  • Snoezelen
  • AR applications
  • ABA therapy
  • Anxiety

Autism/Anxiety

  • Behavior disabilities
  • Working as an ABA therapist
  • My relationship with the Autism demographic
  • Transition to focus on anxiety and a broader demographic
  • Multi-sensory room
  • My thesis statement

Method

  • How I made my application
  • Process logs and iterations
  • What works/didn’t work
  • Work at the RLab and with Todd

Dr. Kosinski

  • Background of Multi-sensory room
  • Setting up the Multi-sensory room
  • Observations
  • User experience
  • Feedback from advisors and contacts
  • Findings

Conclusion

  • Impact on community and field
  • What future iterations should focus on
  • Future of the tech
