Major Activities

Consortium for Fulldome and Immersive Technology Development (Award #0917919)

freeDome (aka domeGL)

domeGL, a reconfigurable, multi-projector, multi-surface programming library, has been the major effort at the ARTS Lab, led by collaborators from the University of New Mexico's Advanced Graphics Lab and the Computer Engineering and Computer Science departments. domeGL is an OpenGL-based framework designed to drive a multi-projector system from a single computer. As part of this project, students at UNM have developed several demo projects to showcase the framework's features and compatibility. It runs on OS X, Linux, and Windows with a variety of hardware, and we have demonstrated it with up to 8 channels of video in fulldome and other configurations. More information can be found at the domeGL wiki.
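
The single-computer, multi-channel idea behind domeGL can be illustrated outside the library itself. The sketch below is not the domeGL API; it is a minimal Python example, assuming the glfw and PyOpenGL bindings are available, that opens one OpenGL context per attached display and clears each "channel" with its own color where a real system would render its own view of the scene.

    # Illustrative sketch only: one fullscreen GL context per display, one "channel" each.
    # This is NOT the domeGL API; it only mimics the single-machine, multi-projector idea.
    import glfw
    from OpenGL.GL import glClearColor, glClear, GL_COLOR_BUFFER_BIT

    def draw_channel(index):
        # Placeholder per-channel render: tint each channel differently.
        glClearColor(0.1 * index, 0.2, 0.3, 1.0)
        glClear(GL_COLOR_BUFFER_BIT)

    def main():
        if not glfw.init():
            raise RuntimeError("GLFW failed to initialize")
        windows = []
        for i, monitor in enumerate(glfw.get_monitors()):
            mode = glfw.get_video_mode(monitor)
            # Share the first window's GL context so all channels can reuse the same assets.
            share = windows[0] if windows else None
            win = glfw.create_window(mode.size.width, mode.size.height,
                                     "channel %d" % i, monitor, share)
            windows.append(win)
        while not any(glfw.window_should_close(w) for w in windows):
            for i, win in enumerate(windows):
                glfw.make_context_current(win)
                draw_channel(i)
                glfw.swap_buffers(win)
            glfw.poll_events()
        glfw.terminate()

    if __name__ == "__main__":
        main()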

Projectors in Multi-Surface Environments

When content is projected onto arbitrary surfaces, such as the corner of a room, the calibration problem becomes much harder. The purpose of this part of the project is to calibrate the projector and camera automatically, so that the image viewed from the camera's point of view looks correct despite the distortions introduced by the room geometry. This problem was tackled by teams at both the AGL (Dr. Sen working with graduate student Vahid Noormofidi) and the Santa Fe Complex (Stephen Guerin working with Cody Smith, Steve Smith, Bjørn Swenson, August Swanson, Skyler Swanson, and Scott Wittenberg). (see Advanced Graphics Lab & Santa Fe Complex partnered research)
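
A much simplified version of this calibration can be sketched for a single projector-camera pair and a locally planar surface. The Python example below is an illustrative sketch using OpenCV and numpy, not the project's calibration pipeline; the file names, checkerboard layout, and projector resolution are assumptions. It detects a projected checkerboard in a camera image, estimates the camera-to-projector homography, and pre-warps content so it appears undistorted from the camera's point of view.

    # Illustrative sketch: planar projector-camera calibration via a homography.
    # Not the AGL / Santa Fe Complex pipeline; file names and pattern layout are assumptions.
    import cv2
    import numpy as np

    PATTERN = (9, 6)  # inner corners of the projected checkerboard

    # 1. Known corner positions in projector pixel coordinates (where the checkerboard was drawn).
    proj_pts = np.array([[100 + 80 * x, 100 + 80 * y]
                         for y in range(PATTERN[1])
                         for x in range(PATTERN[0])], dtype=np.float32)

    # 2. Detect the same corners in the camera's photo of the projection surface.
    camera_img = cv2.imread("camera_capture.png", cv2.IMREAD_GRAYSCALE)
    found, cam_pts = cv2.findChessboardCorners(camera_img, PATTERN)
    assert found, "checkerboard not visible to the camera"

    # 3. Homography mapping camera pixels to projector pixels.
    H, _ = cv2.findHomography(cam_pts.reshape(-1, 2), proj_pts, cv2.RANSAC)

    # 4. Pre-warp the content: an image laid out in camera (viewer) space is warped
    #    into projector space, so it looks correct from the camera's point of view.
    content = cv2.imread("content.png")
    proj_w, proj_h = 1920, 1080  # projector resolution (assumed)
    warped = cv2.warpPerspective(content, H, (proj_w, proj_h))
    cv2.imwrite("prewarped_for_projector.png", warped)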

Portable Dome Development

The ARTS Lab collaborated with the School of Architecture and Planning to build a small-scale, single-projector stand-up dome. Unlike larger fulldome installations, this model can be quickly disassembled and transported by a small crew. Drawing upon research by Paul Bourke at the University of Western Australia, architecture student Ian LeBlanc designed and led the fabrication of this dome with a team of staff and student volunteers. The skeleton of the dome was constructed from plywood cut on the CNC mill at the school's fabrication shop, and it was surfaced with a faceted paper skin that was painted and sanded to be nearly seamless.

The Santa Fe Complex contracted the local business Lumenscape to design and construct a dome based on geodesic designs and surfaced with a flexible fabric skin. (see Santa Fe Complex partnered research)

Blending between Multiple Projectors

When we aim multiple projectors at arbitrary angles onto non-uniform surfaces, such as the pleated Lumenscape dome or the surfaces of an ordinary room, we must warp the imagery in each projector to compensate for the complex surface geometry, as described in the previous section. We must also blend the overlapping projector images, even when their regions of intersection are extremely non-uniform. (see Santa Fe Complex partnered research)
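
A common way to handle the overlap is a per-projector blend mask that feathers each image toward zero near the edge of its coverage region, so the overlapping contributions sum to roughly full brightness. The sketch below is an illustrative numpy/scipy example, not the project's blending code; the feather width and gamma value are assumptions.

    # Illustrative edge-blending sketch: distance-based feathering with gamma correction.
    # Not the project's blending implementation; feather width and gamma are assumptions.
    import numpy as np
    import scipy.ndimage as ndi

    def blend_mask(coverage, feather_px=80, gamma=2.2):
        """coverage: boolean (H, W) array, True where this projector's pixels land
        inside the region it is responsible for. Returns a float mask in [0, 1]."""
        # Distance (in pixels) from each covered pixel to the edge of the coverage region.
        dist = ndi.distance_transform_edt(coverage)
        # Ramp linearly from 0 at the edge to 1 once we are feather_px inside.
        ramp = np.clip(dist / feather_px, 0.0, 1.0)
        # The ramp describes linear light; re-encode it for a gamma-encoded frame.
        return ramp ** (1.0 / gamma)

    def apply_mask(frame, mask):
        """frame: uint8 (H, W, 3) image for one projector; mask: float (H, W) blend mask."""
        out = frame.astype(np.float32) * mask[..., None]
        return out.astype(np.uint8)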

Input Devices for Immersive Environments

As part of the domeGL API, the team has repurposed several commercially available hardware controllers for use in immersive environments. The hardware is primarily Nintendo Wii controllers and sensors, used both in their original form and as deconstructed and reconstituted devices unique to this project. (see domeGL wiki)
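
Independent of the specific Wii hardware, a core task is turning a controller's pointing direction into a position on the dome surface. The sketch below is a pure-numpy illustration, assuming a unit hemispherical dome centered on the controller; it does not read real Wiimote data.

    # Illustrative sketch: map a controller's yaw/pitch reading to a point on a unit hemisphere.
    # Assumes the controller sits at the dome's center; does not read real Wiimote data.
    import numpy as np

    def aim_to_dome_point(yaw_deg, pitch_deg):
        """yaw: rotation about the vertical axis; pitch: elevation above the horizon.
        Returns (x, y, z) on a unit hemispherical dome (z >= 0) centered at the origin."""
        yaw, pitch = np.radians([yaw_deg, pitch_deg])
        direction = np.array([np.cos(pitch) * np.cos(yaw),
                              np.cos(pitch) * np.sin(yaw),
                              np.sin(pitch)])
        if direction[2] <= 0.0:
            return None  # pointing below the springline; no dome intersection
        return direction  # a unit ray from the center already lies on the unit dome

    # Example: pointing 30 degrees left of center and 45 degrees above the horizon.
    print(aim_to_dome_point(30.0, 45.0))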

Opening of the Institute of American Indian Arts Dome & Standing Up a Native-Focused Fulldome Curriculum

The Institute of American Indian Arts has opened a movable, articulating fulldome theater for use in classroom instruction and student production. (see IAIA partnered research)

Undergraduate Instruction in Immersive Media, IFDM course 491

As part of the Interdisciplinary Film & Digital Media Program at the University of New Mexico, a course in Immersive Media and Visualization for student production was started in the spring of 2011. The course will be offered on a recurring basis by ARTS Lab instructors.

Immersive & Smart Lighting with Rensselaer Polytechnic Institute

Dr. Pradeep Sen and his team at the AGL have been studying how Smart Lighting rooms can be used to create immersive lighting experiences for users. The basic idea is that Smart Lighting rooms combine light sources with sensors, much like the camera-projector pairs described in the calibration section. This is especially interesting because it motivates the use of immersive lighting environments for a typical user: rather than the applications we usually think about (active visualization, entertainment, etc.), the user might simply want to illuminate the room while using as little power as possible. The lighting can also be used to track users or to create other forms of interaction between the user and the lighting. The whole room can be controlled by a computer similar to those used in the multi-surface calibration work. (see Advanced Graphics Lab partnered research)
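
The power-conscious illumination idea can be phrased as a small linear model: a light transport matrix A, whose entry A[i, j] gives the illumination sensor i receives per unit intensity of fixture j, and a target illumination vector b. The sketch below is an illustrative formulation using scipy's nonnegative least squares, not the AGL's method, and the matrix and target values are made up; a true minimum-power variant would also penalize total intensity, but the structure is the same.

    # Illustrative sketch: choose nonnegative fixture intensities that meet sensor targets.
    # Not the AGL's algorithm; the transport matrix A and targets b are made-up numbers.
    import numpy as np
    from scipy.optimize import nnls

    # Light transport: rows = sensors, columns = fixtures.
    # A[i, j] = illumination sensor i receives per unit intensity of fixture j.
    A = np.array([[0.8, 0.1, 0.0],
                  [0.2, 0.7, 0.1],
                  [0.0, 0.2, 0.9]])

    # Desired illumination level at each sensor.
    b = np.array([300.0, 250.0, 400.0])

    # Nonnegative least squares: intensities cannot be negative, and the residual
    # measures how far the room is from the target illumination.
    intensities, residual = nnls(A, b)
    print("fixture intensities:", intensities)
    print("total output (power proxy):", intensities.sum(), "residual:", residual)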