Advanced Graphics Lab

Consortium for Fulldome and Immersive Technology Development (Award #0917919)


Team Leaders: Dr. Pradeep Sen, Dr. Joe Kniss

Links: AGL home page

 

DomeGL (freeDome)

Partner: ARTS Lab

A multi-projector, multi-GPU framework for projecting onto hemispheric and other unusual surfaces. Dr. Joe Kniss led a team of graduate students (Jeff Bowles, Matthew Dosanjh, Vahid Normoofidi, Andrei Buium) in developing an OpenGL software platform for real-time rendering on curved surfaces. The basic idea was to take a typical OpenGL application and intercept its rendering code so that it rendered to an off-screen buffer rather than directly to the screen. This off-screen buffer was then distorted by virtually mapping it onto a spherical structure on the graphics hardware; the GPU used this mapping to compute what each projector in the dome should display so that the visualization appeared correct to a user standing at the center of the dome. The result was a framework that ran real-time rendering applications in dome environments without distortion; existing applications only needed to be modified to use the new rendering path. The project, called DomeGL, was a success, and it enabled the students to port several applications to the dome quickly and easily.

For more information see the domeGL wiki.
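The distortion step can be sketched in a few lines. The sketch below is plain Python rather than the GPU shader code DomeGL actually used, and the function name and the angular-fisheye convention are assumptions for illustration; it shows the core geometric lookup: mapping an output pixel of a dome projection to a viewing direction on the hemisphere.

```python
import math

def dome_direction(u, v):
    """Map a normalized fisheye pixel (u, v in [-1, 1]) to a unit
    direction on the hemisphere above the viewer.

    Assumes an angular ("equidistant") fisheye: distance from the
    image center is proportional to the angle from the zenith.
    Returns None for pixels outside the fisheye circle.
    """
    r = math.hypot(u, v)
    if r > 1.0:
        return None                      # outside the projected hemisphere
    theta = r * (math.pi / 2.0)          # angle from zenith: 0 at center, 90 deg at rim
    phi = math.atan2(v, u)               # azimuth around the dome
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

In the real system a lookup of this kind runs per pixel on the GPU: the intercepted application's frame sits in the off-screen buffer, and each projector's output image is filled by sampling that buffer along the computed directions.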


Calibration of projectors in multi-surface environments

Partner: Santa Fe Complex

While the freeDome API is a projection solution that can adapt to any type of projection surface, unusual and complex surfaces raise the problem of calibration.  The AGL developed a camera-projector pair system that calibrates the projection by displaying a reference pattern through a projector and comparing it to the image captured by a camera paired with that projector.  Gaps in camera coverage were resolved by using the GPU to compare the camera imagery with the reference texture and perform real-time resampling.  For more information on this process, please refer to the March 2011 Annual Report, Section 2.1.  Related work using alternative hardware and software solutions was carried out concurrently by the Santa Fe Complex.
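The report does not specify which reference patterns were used, but a common choice for camera-projector calibration is Gray-code structured light, in which each projector column index is encoded as a sequence of binary stripe patterns. A minimal sketch of that encoding and decoding follows (the use of Gray codes here is an assumption, and the function names are illustrative):

```python
def gray_encode(index):
    """Convert a projector column index to its Gray code, so that
    adjacent columns differ in exactly one displayed stripe pattern."""
    return index ^ (index >> 1)

def gray_decode(code):
    """Recover the column index from a Gray code value."""
    index = 0
    while code:
        index ^= code
        code >>= 1
    return index

def decode_pixel(observed_bits):
    """observed_bits: list of 0/1 values, one per projected pattern,
    most significant bit first; the bits a camera pixel observed
    across the pattern sequence. Returns the projector column that
    illuminates this camera pixel."""
    code = 0
    for bit in observed_bits:
        code = (code << 1) | bit
    return gray_decode(code)
```

Decoding every camera pixel this way yields a dense camera-to-projector correspondence map, which is the input the GPU resampling step described above would consume.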


Making Immersive Lighting a Reality with Smart Rooms

Partner: Rensselaer Polytechnic Institute (RPI)

One of the chief challenges of making immersive technology more accessible is bringing it to users easily without diluting it to the point that it is indistinguishable from other forms of media.  The AGL team has been working to bring Smart Lighting technology into conversation with immersive media, connecting the research from the projector-based methods with the other light sources typical of living and working spaces.  Dr. Sen of the AGL is also an investigator on an NSF Engineering Research Center grant at Rensselaer Polytechnic Institute working on immersive lighting experiences.  By using a combination of lights and sensors, in a manner similar to the camera-projector pairing used for calibration, the room can transform itself based on both the needs of the user and the most energy-efficient way to meet those needs.  For more information, please see the March 2011 Annual Report, Section 2.3, and the Smart Lighting Engineering Research Center.
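The lights-plus-sensors idea can be illustrated with a toy feedback loop. This is a hypothetical sketch, not the RPI system: the linear room model, the gain, and the function names are all assumptions. A sensor reads the room's illuminance, and a proportional controller nudges lamp power toward the lowest level that still meets the user's target.

```python
def adjust_power(power, sensed_lux, target_lux, gain=0.001):
    """One step of a proportional controller: raise lamp power when
    the room is too dim, lower it when light is being wasted."""
    power += gain * (target_lux - sensed_lux)
    return min(max(power, 0.0), 1.0)     # lamp power clamped to [0, 1]

def simulate(ambient_lux, lux_per_unit_power, target_lux, steps=200):
    """Toy room model: sensed light is ambient daylight plus a linear
    contribution from the lamp. Returns final power and illuminance."""
    power = 0.0
    for _ in range(steps):
        sensed = ambient_lux + lux_per_unit_power * power
        power = adjust_power(power, sensed, target_lux)
    return power, ambient_lux + lux_per_unit_power * power
```

When more daylight enters (higher ambient), the same loop dials lamp power down, which is the energy-efficiency side of the behavior described above; richer sensing would let the room adapt to the user's activity as well.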