Supporting Co-Located Collaboration in Large-Scale Hybrid Immersive Environments

These are videos from my talk for the 2016 Media Arts & Technology Seminar Series @ UCSB. While at UCSB, I had the chance to walk inside the AlloSphere: it's by far the most immersive large-scale system I've ever seen!


Abstract: In the domain of large-scale visualization instruments, Hybrid Reality Environments (HREs) are a recent innovation that combines the best-in-class capabilities of immersive environments with the best-in-class capabilities of ultra-high-resolution display walls. Co-located research groups in HREs tend to work on a variety of tasks during a research session (sometimes in parallel), and these tasks require 2D data views, 3D views, linking between them, and the ability to bring in (or hide) data quickly as needed. Addressing these needs requires a matching software infrastructure that fully leverages the technological affordances offered by these new instruments. I detail the requirements of such an infrastructure, outline the model of an operating system for Hybrid Reality Environments, and present an implementation of core components of this model, called Omegalib. Omegalib is designed to support dynamic reconfigurability of the display environment: areas of the display can be interactively allocated to 2D or 3D workspaces as needed.
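
To make the "dynamic reconfigurability" idea from the abstract concrete, here is a minimal, purely illustrative Python sketch of allocating regions of a display wall to 2D or 3D workspaces. All names here are hypothetical and invented for illustration; they are not the actual Omegalib API.

```python
# Toy model of workspace allocation on a tiled display wall.
# Hypothetical names for illustration only -- NOT the Omegalib API.
from dataclasses import dataclass, field
from enum import Enum


class WorkspaceKind(Enum):
    VIEW_2D = "2D"  # flat content: documents, charts, desktop mirrors
    VIEW_3D = "3D"  # immersive, head-tracked stereo content


@dataclass
class Workspace:
    name: str
    kind: WorkspaceKind
    # Region of the wall in normalized coordinates: (x, y, width, height).
    region: tuple


@dataclass
class DisplayEnvironment:
    """Tracks which screen regions are currently allocated to which workspace."""
    workspaces: list = field(default_factory=list)

    def allocate(self, name, kind, region):
        # In a real HRE this region would map onto physical tiles/projectors;
        # here we only record the assignment.
        ws = Workspace(name, kind, region)
        self.workspaces.append(ws)
        return ws

    def release(self, name):
        # Free a region so it can be reallocated to another task.
        self.workspaces = [w for w in self.workspaces if w.name != name]


# Example: a 3D immersive view on the left, a 2D analysis panel on the right,
# then reclaim the panel when it is no longer needed.
env = DisplayEnvironment()
env.allocate("volume-render", WorkspaceKind.VIEW_3D, (0.0, 0.0, 0.6, 1.0))
env.allocate("notes-panel", WorkspaceKind.VIEW_2D, (0.6, 0.0, 0.4, 1.0))
env.release("notes-panel")
```

The point of the sketch is simply that 2D and 3D content share one display surface and that the split between them can change during a session, which is the capability the talk attributes to Omegalib.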
 
Alessandro Febretti - full talk - part 1/2 from mat.ucsb on Vimeo.
Alessandro Febretti - full talk - part 2/2 from mat.ucsb on Vimeo.
