TPL5

Grant Travis Ng
3 min read · Oct 18, 2020

Outline for Thesis:

My thesis Google Doc is here

What problem or context is your work responding to?
How does your work relate to existing work other people have been working on?

The main problem is applying ABA (Applied Behavior Analysis) treatment remotely. ABA therapists are likely having a hard time delivering treatment during COVID. My work mainly targets aspects of therapy that could be enhanced or made easier with facial tracking technology. Facial tracking applications have been used to detect emotions and to help diagnose children with autism, but there has not been much work on applying them to ABA treatment.

Using the application in a real world setting:

One related area is Virtual Reality Therapy (VRT). VRT is currently used to treat various psychological disorders such as PTSD and depression, and is applied in virtual rehabilitation, occupational therapy, and more (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3536992/). Recently there has been research into using the technology for children with ASD. Research has shown that VRT can improve social cognition, communication, and emotion recognition.

Working with Amy Hurst, Regine Gilbert and Ron Kosinski, I have decided to tackle the problem of de-escalating problem behavior in children with ASD or emotional disorders when they enter new environments.

The current problem: after talking with Dr. Kosinski about his multi-sensory room, he expressed that some children have difficulty transitioning into the dentist's room. The children are unfamiliar with dental checkups and exhibit problem behaviors such as screaming, non-compliance, and aggression. My app intends to de-escalate these behaviors and provide an easier transition into the dentist's room. Hopefully the children will find the app interesting enough to want to calm down and relax in the virtual world.

So, what are you actually making?

A 3D avatar application that can be used as a tool by ABA therapists or speech pathologists.

For the dentist project, I am making an application where the dentist or dental assistant can remotely puppet a character that the child can see, so the dentist or assistant is seen as a friendly 3D character instead of a "scary doctor." The children will engage with a live facial tracking application.

How did you make it?

  • Unreal Engine
  • Live Link / iOS ARKit
  • MakeHuman and Reallusion
  • Maya/Blender
  • Multiple VPN options (ZeroTier/NeoRouter/SoftEther)
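As a sketch of how the VPN piece fits together, assuming ZeroTier is the option in use: the desktop running Unreal and the iPhone running the Live Link Face app join the same ZeroTier network, and the phone streams face tracking data to the desktop's ZeroTier IP on the port the Live Link plugin listens on (11111 by default for Live Link Face). The network ID below is a placeholder, not a real network. The desktop-side setup might look roughly like:

```shell
# Join the ZeroTier network shared with the iPhone
# (d5e04297a16fa690 is a placeholder network ID, not a real one).
sudo zerotier-cli join d5e04297a16fa690

# Confirm the network status and note the assigned ZeroTier IP.
# That IP (port 11111) is what gets entered as the Live Link
# target in the Live Link Face app on the phone.
zerotier-cli listnetworks
```

On the iOS side, ZeroTier runs as an app rather than a CLI, so the phone joins the same network through the app's interface.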

Does it work? How did you change and tweak it until it worked?

There is a rough prototype with facial tracking and motion capture working between an Unreal project and an iOS app.

I plan on having the staff of Dr. Kosinski's multi-sensory room be the testing ground for the application. Right now the prototype supports remote face tracking through a VPN. The kids' side of the experience, the virtual world itself, is not ready yet.

First, I want to see if the dentist can use the app, project it onto a wall, and test it out. I plan to create a calming environment in Unreal and play around with some effects. I still want to incorporate the Mandelbulb and use fractals as an aesthetic.

What does your project mean for other people?

Mainly, it could serve as a tool for remote therapists and health care providers to deliver virtual therapy to individuals who need it.

Next steps:

  • Set up a demo with Ron for Friday. Once his team tests the application, I'll start building the space and aesthetic of the world.
  • Look into the tutorial https://www.youtube.com/watch?v=Psr_Duf0cN0&ab_channel=ArtHiteca
  • Look into Unreal particle effects and shaders
  • Meet with Amy this week
  • Follow up with Regine and contacts in California
