Recent Publications

Multi-resolution Visual Sensing Architecture for High-Fidelity Vision at Low Power

Venkatesh Kodukula, Alexander Shearer, Van Nguyen, Srinivas Lingutla, Dr. Robert LiKamWa

ASPLOS '21

Enabling Haptic Perceptions of Virtual Fluids in Various Vessel Profiles Using a String-Driven Haptic Interface

Shahabedin Sagheb*, Frank Wencheng Liu*, Alex Vuong, Shiling Dai, Ryan Wirjadi, Yueming Bao, Dr. Robert LiKamWa

UIST '22

Visualizing Planetary Spectroscopy through Immersive On-site Rendering

Lauren Gold, Alireza Bahremand, Justin Hertzberg, Connor Richards, Kyle Sese, Zoe Purcell, Alexander Gonzalez, Kathryn Powell

IEEE VR 2021

News

October 2024

Hurricane Heroes (ASU Now) was deployed in GPH 214: Introduction to Meteorology and featured in local news coverage on ABC 15 Arizona and 3TV Arizona's Family.

The "Hurricane Heroes" virtual reality lab experience uses real-life hurricane data. Image courtesy of the Meteor Studio

August 2024

The USPTO granted ASU a patent for "Method and apparatus for time-domain crosstalk cancellation in spatial audio." This is ASU's fifth patent on Meteor Studio research.

The "Hurricane Heroes" virtual reality lab experience uses real-life hurricane data. Image courtesy of the Meteor Studio

Aug, 2024

USPTO granted ASU a patent for "Method and apparatus for time-domain crosstalk cancellation in spatial audio." This is ASU's 5th patent on Meteor Studio research.

The "Hurricane Heroes" virtual reality lab experience uses real-life hurricane data. Image courtesy of the Meteor Studio

Blending Spatial Realities Through Student Innovation

Advancing AR & VR systems and XR content development research

CONTACT

Electrical & Computer Engineering (ECE), Rice University

6100 Main St, Houston, TX 77005

Research Areas

Visual Computing Systems

We design and optimize software and hardware underlying XR platforms—from sensors and mobile platforms to rendering pipelines. This includes hardware-aware scheduling, perceptual capture, and OS-level integration for real-time immersive applications.

Immersive Data Visualization Interfaces

We create multi-scale, multi-user, and multimodal systems to help people interact with complex datasets. These interfaces support scientific exploration, collaborative sensemaking, and public storytelling in spatial computing contexts.

Creative Production Pipelines

We develop authoring frameworks and collaborative toolkits that empower artists, educators, and domain experts to create immersive experiences. This includes AI-assisted workflows, multi-user content creation, and custom pipelines for VR/AR storytelling.

Multi-Sensory Interaction Frameworks

We build responsive systems that extend XR experiences through touch, sound, smell, and environmental feedback. Our research integrates haptic actuation, spatialized audio, olfactory synthesis, and real-time control into game engines and physical spaces.
