Technology - A NextGen Pillar


VR, AR, IoT. A powerful collection of letters that may mean nothing to you but, to the tech world, represents the cutting edge.

The Now

Advancements in technology have allowed for the measurement and collection of people’s behaviors throughout their daily lives. Weight, physical activity, substance use: these are just some of the behaviors that employers and medical health professionals now hover over. While this kind of monitoring is typically very resource intensive, automated hovering offers a cost-effective way to collect frequent measures of health and other behaviors using your own phone’s multi-sensor capabilities (Asch, Muller, & Volpp, 2012). Your smartphone alone can track your location, approximate your physical activity (steps walked, distance biked, etc.), recognize your face or thumbprint, log your searches and app use, and much more. Combined with the widespread adoption of smartphones, that adds up to a lot of data about a lot of people. As of 2016, over three quarters of American adults owned a smartphone. That’s a lot of phones, resting in a lot of pockets, collecting a lot of data. There is no question about the untapped potential for behavior analysts to make an impact. One way this goldmine of data has been tapped involves ecological momentary interventions (EMIs): the use of multi-sensor devices to deliver intervention components in real time, in the individual’s natural environment (Heron & Smyth, 2010).
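
To make the EMI idea concrete, here is a minimal sketch of what a real-time, in-the-moment prompt might look like in code. It assumes a hypothetical get_step_count() helper standing in for the phone's pedometer and a hypothetical send_prompt() helper standing in for a push notification; the goal value and wording are illustrative only, not anything described in the article.

```python
import random
from datetime import datetime

STEP_GOAL_BY_3PM = 4000  # illustrative threshold, not from the article


def get_step_count() -> int:
    """Hypothetical stand-in for the phone's pedometer; returns a fake reading."""
    return random.randint(0, 8000)


def send_prompt(message: str) -> None:
    """Hypothetical stand-in for a push notification to the user."""
    print(f"[prompt] {message}")


def check_in() -> None:
    """One EMI 'moment': assess behavior now, intervene now, in the natural environment."""
    steps = get_step_count()
    if datetime.now().hour >= 15 and steps < STEP_GOAL_BY_3PM:
        send_prompt(f"You're at {steps} steps. A 10-minute walk keeps today's goal in reach.")


if __name__ == "__main__":
    check_in()
```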

Along more niche lines, emerging technologies have expanded these multi-sensor capabilities in unparalleled ways. While better known for their entertainment value, virtual reality (VR) and augmented reality (AR) have unique capabilities that may provide value to “forward-thinking” behavior analysts. VR, which can be mobile-phone based or PC-based, typically involves a headset that, when put on, immerses the user in whatever world is being virtually recreated. These systems can capture movement within the virtual environment, head movements, interactions with virtual stimuli, and eye-tracking. Current uses beyond entertainment include treatment for phobias, PTSD, and addiction (Morina et al., 2015; Rothbaum et al., 2001; Son et al., 2015). More recent developments include wearable gloves that create a more realistic interaction experience in the VR world (Read More Here).


Augmented reality, on the other hand, is a relatively new technology. The most common examples are Snapchat filters and Pokémon GO: augmented reality embeds virtual objects onto the image of the natural environment cast on your phone’s screen. The clinical implications could be immense. Some uses have, again, included treatment for phobias, since AR allows the user to experience the eliciting stimulus right in their natural environment (via some screen, of course). Some companies have worked toward incorporating AR into smart wearable lenses (see Google Glass or Glass-X) or even car windshields (see here). If combined with existing multi-sensor technology, such as eye-tracking and GPS, these devices could expand behavior-analytic applications beyond what most are currently considering.

While exciting, these technologies are developing at different rates, with different capabilities, for different applications. How can we create an integrative network to address problems of human affairs across all environments? The answer may lie in the Internet of Things (IoT). IoT refers to a network of multi-sensor devices capable of wirelessly sending data to each other, usually over Bluetooth or Wi-Fi. So far, IoT has mostly been used to create smart homes: everything from your lights to your thermostat to your home security can be connected on the same Wi-Fi network and controlled from your phone. Emerging, though, are applications beyond energy saving and home security. Smart fridges are already on the market and can track the contents of your fridge, letting you know which foods you’re running low on, which recipes you can cook, or which foods are about to spoil. With added sensor capabilities, such fridges may be able to track usage data as well: which foods you’re consuming, how often you open the fridge, and so on.
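
As a rough illustration of the “sending data to each other” part, here is a sketch of how a smart-fridge door sensor might publish a reading to a home hub using MQTT, a lightweight messaging protocol commonly used in IoT setups. This assumes the paho-mqtt Python library; the broker address, topic name, and payload fields are hypothetical.

```python
import json
import time

import paho.mqtt.publish as publish  # pip install paho-mqtt

# Hypothetical broker and topic; a real smart home would point at its own hub.
BROKER = "homehub.local"
TOPIC = "home/kitchen/fridge/door"


def report_door_event(opened: bool) -> None:
    """Publish one fridge-door event so other devices (or a dashboard) can react."""
    payload = json.dumps({
        "device": "fridge-door-sensor",
        "opened": opened,
        "timestamp": time.time(),
    })
    publish.single(TOPIC, payload, hostname=BROKER)


if __name__ == "__main__":
    report_door_event(opened=True)
```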

So, why do I care?

Picture this. You are pre-diabetic and your doctor recommends you should improve your diet and increase your physical activity as a preventive measure. Your interdisciplinary healthcare team, which includes your primary care physician, a nutritionist, a behavior analyst, and perhaps an endocrinologist, have all agreed on a diet and exercise plan, with contingencies in place to ensure you meet your goals. Accompanied by your smart fridge, your wearable fitness tracker, your smartphone, your smart lenses, and your weight scale (all connected via IoT), your healthcare team will be able to hover and track your progress with the data wirelessly transmitted to their computers daily. You’ll have on-the-go support with your smart lenses, advising you on smart food options, nudging you towards healthier choices.

Sounds like an episode of Black Mirror, right? Within our lifetime, this may not be too far off from reality.
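
To sketch how the “contingencies in place” might be checked against that daily data stream, here is an illustrative end-of-day summary that compares readings from several connected devices against agreed-upon goals. The device sources, goal values, and report format are all hypothetical, invented only to make the scenario concrete.

```python
from dataclasses import dataclass


@dataclass
class DailyReadings:
    """One day's data pulled from the connected devices (values are examples)."""
    steps: int                 # wearable fitness tracker
    weight_kg: float           # smart scale
    sugary_items_logged: int   # smart fridge / smart-lens food logging


# Hypothetical goals the healthcare team agreed on.
STEP_GOAL = 8000
SUGARY_ITEMS_LIMIT = 1


def daily_contingency_report(day: DailyReadings) -> str:
    """Summarize whether today's goals were met, for the team's dashboard."""
    met_steps = day.steps >= STEP_GOAL
    met_diet = day.sugary_items_logged <= SUGARY_ITEMS_LIMIT
    return (
        f"Steps: {day.steps} ({'goal met' if met_steps else 'below goal'}) | "
        f"Sugary items: {day.sugary_items_logged} ({'on plan' if met_diet else 'over limit'}) | "
        f"Weight: {day.weight_kg:.1f} kg"
    )


if __name__ == "__main__":
    print(daily_contingency_report(DailyReadings(steps=9200, weight_kg=81.4, sugary_items_logged=0)))
```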


Author

Andrea Villegas, BS

Affiliation: BehaviorMe


Bio: Andrea Villegas is a graduate student at the University of Florida where she is pursuing a Doctorate in Behavior Analysis, focusing on health behavior change, under the mentorship of Dr. Jesse Dallery. She received her B.S. in Behavioral and Cognitive Neuroscience from the University of Florida, during which she gained experience addressing severe problem behavior in research and clinical settings, primarily with children with Autism Spectrum Disorder and adults with Prader-Willi Syndrome. Andrea is also co-founder of BehaviorMe, a start-up dedicated to expanding the field of behavior analysis using cutting-edge technology.

 


References

Asch, D. A., Muller, R. W., & Volpp, K. G. (2012). Automated hovering in health care – Watching over the 5000 hours. The New England Journal of Medicine, 367, 1-3.

Heron, K. E., & Smyth, J. M. (2010). Ecological momentary interventions: Incorporating mobile technology into psychosocial and health behaviour treatments. British Journal of Health Psychology, 15, 1-39.

Morina, N., Ijntema, H., Meyerbröker, K., & Emmelkamp, P. M. G. (2015). Can virtual reality exposure therapy gains be generalized to real-life? A meta-analysis of studies applying behavioral assessments. Behaviour Research and Therapy, 74, 18-24.

Rothbaum, B., Hodges, L., Ready, D., Graap, K., & Alarcon, R. (2001). Virtual reality exposure therapy for Vietnam veterans with posttraumatic stress disorder. Journal of Clinical Psychiatry, 62, 617-622.

Son, J. H., Lee, S. H., Seok, J. W., Kee, B. S., Lee, H. W., Kim, H. J., Lee, T. K., & Han, D. H. (2015). Virtual reality therapy for the treatment of alcohol dependence: A preliminary investigation with positron emission tomography/computerized tomography. Journal of Studies on Alcohol and Drugs, 76(4), 620–627.


That's right! Next Gen Revolution Summit Miami and Online is ALL SET.  

Where:

Miami, Florida, USA

Address:

The LAB Miami

400 NW 26th St, Miami, FL 33127

When:

Friday, November 10th, 2017: Optional Meet 'n' Greet for all registered attendees

Saturday, November 11th, 2017: 8am-6pm - Official Revolution Summit Event for all registered attendees

Saturday, November 11th, 2017: 6pm - 10pm Wynwood Artwalk and Social open to the public. The Artwalk is for all generations and backgrounds, and just so happens to occur right after our event! Learn more here.


The Full Lineup



Check out the full program and buy tickets below.


LIVESTREAM AVAILABLE DAY OF AND ON-DEMAND 

That's right. You don't have to be in Miami to experience the awesomeness, and you can watch it on your own time afterward.

Direct Questions to info@nextgenrevolutionsummit.com | 775.639.8436

Instagram