Virtual Reality (VR) technology was once used primarily by video gamers to play 3-D games. Not anymore. The technology has advanced so much over the years that it now serves medical, sports, training, industrial, and many other applications. And it has significant business potential.
So what is VR?
Manufacturer STMicroelectronics, which has spent a great deal of time researching the subject, defines VR and related technologies as follows:
- Virtual Reality (VR) – An immersive experience with a fully occluded view, with content that is rendered (i.e., CGI), captured video and/or images, or combinations thereof.
- Augmented Reality (AR) – A fully transparent headset or glasses through which the viewer will see the real world with superimposed information in the user’s field-of-view. The content is typically overlaid text, images, or graphics.
- Mixed or Merged Reality (MR) – Definitions differ here, but a commonly accepted one is a device that combines a camera with a VR headset to deliver a live video/image feed to the display. The “mixing” or “merging” is the integration of the real-world images from the camera with the rendered or captured video/images.
- eXtended Reality (XR) – A fully transparent device/headset that delivers realistic, 3D, holographic-like content in the user’s visual field-of-view. Here virtual and physical objects are indistinguishable.
A VR hardware platform is a head-mounted display (HMD) or goggles with embedded electronics and VR software that interfaces with video, audio, and sensors to create an environment that a human being can interact with.
As shown in Figure 1, the VR hardware platform includes a high-performance microcontroller (MCU), memory, audio, and optic electronics integrated into a single chip or chipset. Because the platform is battery-powered, designers must make it energy-efficient yet high-performance. In real-life applications, latency has to be kept below four milliseconds; otherwise, the displayed video cannot keep up with the real environment as the user’s head turns, and the user experiences motion sickness.
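To make the four-millisecond constraint concrete, here is a minimal sketch of a motion-to-photon latency budget check. The stage names and per-stage timings are illustrative assumptions, not figures from any real headset:

```python
# Hypothetical motion-to-photon latency budget for a VR headset pipeline.
# Stage names and timings below are illustrative assumptions only.

LATENCY_BUDGET_MS = 4.0  # upper bound cited for avoiding motion sickness


def within_latency_budget(stage_times_ms, budget_ms=LATENCY_BUDGET_MS):
    """Return (total_ms, ok): the summed pipeline latency and whether it fits."""
    total = sum(stage_times_ms.values())
    return total, total < budget_ms


# An assumed head-tracking-to-display pipeline, in milliseconds per stage.
pipeline = {
    "imu_sample": 0.5,       # read head-tracking sensors
    "pose_fusion": 0.7,      # fuse sensor data into a head pose
    "render": 2.0,           # render the frame for the new pose
    "display_scanout": 0.6,  # push pixels to the panel
}

total_ms, ok = within_latency_budget(pipeline)
print(f"total = {total_ms:.1f} ms, within budget: {ok}")
```

If any single stage grows (for example, a render pass that misses its slot), the sum exceeds the budget and the design must claw time back elsewhere, which is why energy efficiency and raw performance are in tension in these platforms.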
What is the VR market potential?
According to ABI Research, a technology research and advisory firm, 35 million VR head-mounted displays (HMDs) will be shipped in 2024, with market revenues reaching $25 billion. The revenue will come mostly from the consumer segment (media/entertainment), with notable enterprise revenues in education. Revenues from other segments are expected to grow in the future.
The SyncThink technology: Using VR for insight into the brain
In the early 2000s, neurosurgeon Jamshid Ghajar, MD, PhD, had an idea. He wanted to come up with a way to objectively measure people’s attention through eye movement. Early in his studies, he found eye movement abnormalities to be very common in patients with concussion. A few years later, he presented these findings to the US Department of Defense, was awarded several grants to study the military population, and began developing EYE-SYNC technology (Figure 2).
In 2014, after years of studying military personnel returning from the Middle East with deficits after traumatic brain injury (TBI), Ghajar arrived at Stanford to study student-athletes. There, he met Scott Anderson, at that time the head of the Stanford Sports Medicine program, which oversees the medical care provided to student-athletes. In 2015, Anderson began using a prototype device based on EYE-SYNC technology in clinical trials. The trials examined how eye movements, brain trauma, and cognitive functions relate to one another (Figure 3). Soon after, Anderson integrated the technology into the clinical care of more than 900 athletes across 36 athletic programs. In 2016, EYE-SYNC technology was awarded its first FDA clearance as a Class II medical device.
Ghajar later founded SyncThink and commercialized the EYE-SYNC technology as a product in 2017, following 15 years of clinical research. The EYE-SYNC product has evolved into an award-winning digital brain health platform. The platform includes wireless, Bluetooth-enabled, custom-designed VR goggles, a smartphone, and a tablet.
Figure 3. Top academic, research, clinical, and sports organizations across North America use EYE-SYNC.
The human brain interacts with its environment by predicting incoming visual information to determine how to react. Using the eyes to receive information, the brain constantly synchronizes a person’s behavior with the information it receives. If this synchronization is off by 0.25 seconds, a healthy brain will react a certain way; a brain with a problem will respond differently. The difference between the normal and abnormal reactions is called “variance” or “error,” and EYE-SYNC measures this variance. Over the years, SyncThink has developed a very comprehensive data set of eye signatures for clinicians to use in assessing brain function.
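The variance idea above can be sketched in code: record the gaze positions a patient produces while tracking a predictable target, then score how far the gaze deviates from the target path. This is only an illustrative model of the concept, not SyncThink’s actual algorithm; the traces, units, and function name are assumptions:

```python
# Illustrative sketch of the "variance" concept: score the deviation of a
# recorded gaze trace from a predictable target path using RMS error.
# The sample traces below are made up for demonstration purposes.
import math


def gaze_variance(target, gaze):
    """Root-mean-square error between target and gaze positions (same units)."""
    assert len(target) == len(gaze), "traces must be sampled in lockstep"
    squared = [(t - g) ** 2 for t, g in zip(target, gaze)]
    return math.sqrt(sum(squared) / len(squared))


# A target moving smoothly across five samples; a healthy trace tracks it
# closely, while an impaired trace lags behind the target's motion.
target = [0.0, 1.0, 2.0, 3.0, 4.0]
healthy = [0.1, 1.0, 2.1, 2.9, 4.0]
impaired = [0.0, 0.4, 1.2, 2.0, 3.1]

print(f"healthy variance:  {gaze_variance(target, healthy):.2f}")
print(f"impaired variance: {gaze_variance(target, impaired):.2f}")
```

Against a library of reference "eye signatures," a clinician could compare a measured variance to expected ranges, which is the spirit of what the article describes.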
“EYE-SYNC has given us a window to the brain. When a clinician gives the patient a set of instructions to follow, the patient will move his eyes accordingly. Using multiple built-in cameras, EYE-SYNC can detect the movement of the patient’s eyes. The captured data will then be compared to a set of well-defined eye signatures. With this new ability to compare findings, clinicians now have information that was once unavailable,” commented Anderson, Chief Clinical Officer of SyncThink. “It is indeed remarkable!”