“Virtual Reality, Machine Learning and Biosensing Advances Converging to Transform Healthcare and Beyond,” an Interview with Stanford University’s Walter Greenleaf

Walter Greenleaf, Neuroscientist at Stanford University’s Virtual Human Interaction Lab, talks with Tom Vogelsong, Start-Up Scout at K2X Technology and Life Science for the “Virtual Reality, Machine Learning and Biosensing Advances Converging to Transform Healthcare and Beyond” interview at the May 2025 Embedded Vision Summit.

In this wide-ranging interview, Greenleaf explains how advances in virtual and augmented reality, machine learning and agentic AI, and biosensing and embedded vision are converging to transform not only healthcare but human interaction as well. He details how this convergence will impact clinical care, disability solutions, and personal health and wellness.

Through real-time monitoring of physiological measurements, eye movements, voice tone, facial expressions and behavioral patterns, these integrated technologies are enabling sophisticated systems that can sense, analyze and adapt to our arousal levels, cognitive status and emotional state, and adjust to individual preferences and interaction styles. Greenleaf examines how this technological revolution will transform physical and mental health, as well as how humans interact with each other and with the world around us. You’ll learn how agentic AI and immersive visualization will unlock truly personalized experiences that reflect and enhance an individual’s physical and mental health.

Here you’ll find a wealth of practical technical insights and expert advice to help you bring AI and visual intelligence into your products without flying blind.
