Rensselaer Launches Center To Explore Intersections of Science and Humanities

Nov. 13 Launch of Center for Cognition, Communication, and Culture Features Demonstrations of Current Research

November 12, 2012


Rensselaer Polytechnic Institute today launched the Center for Cognition, Communication, and Culture (CCC) with a series of multimedia presentations and demonstrations of research in the initial core research areas: cross-modal displays, mixed reality, and synthetic characters.

Now one of the 10 Institute-wide research centers at Rensselaer, CCC will focus on the intersections and interdependency of cognition, communication, and culture in the context of contemporary research, technology, and society. The center’s interdisciplinary research activities will draw on the arts, design, engineering, humanities, science, and social sciences.

“The Rensselaer Plan is bringing a continued expansion of interdisciplinary research, and the launch of the Rensselaer Center for Cognition, Communication, and Culture is an important milestone in support of that priority,” said Rensselaer President Shirley Ann Jackson. “This new center represents a new frontier of both research and pedagogy, and their intersection. The center will bring together researchers from such seemingly diverse arenas as the arts, computer science, cognitive science, and game design to forge new tools at the intersections of the cognitive, cyber, and physical worlds. By making it possible for us to interact with and manipulate vast quantities of data on a human scale, their work will help us to meet our social and technological challenges.”

The initial core research areas within CCC are: cross-modal displays—which seek to employ all human senses in understanding and exploring data; mixed reality—in which data overlaid on the real world enriches learning and research environments; and synthetic characters—computer programs intended to simulate an independent individual, according to Jonas Braasch, director of the new center.

“Through CCC we hope to tackle some of the emerging challenges and opportunities that life in our growing parallel digital universe has brought up,” said Braasch, also an associate professor of architecture. “Initially, the center will focus on virtual reality-based narrative and game playing to develop better ways to learn languages in a more natural and entertaining way, work on the design of next-generation synthetic intelligent characters that can interact with us and enrich our social life, and on cross-modal scientific displays that take into account how humans integrate all their senses to explore and understand big data sets produced by supercomputers like CCNI, the Rensselaer supercomputing center.”

To support this work, CCC will host the new Emergent Reality Lab, a large-scale, room-sized advanced virtual reality system that combines high-resolution, stereoscopic projection and 3-D computer graphics to create a complete sense of presence in a virtual environment. The system is currently under construction in the Rensselaer Technology Park.

To help celebrate the launch of CCC, four Rensselaer faculty members presented today on their interdisciplinary, leading-edge research:

  • Cogito—Selmer Bringsjord, professor and head of the Department of Cognitive Science, with doctoral student Naveen Sundar Govindarajulu, discussed his work in the creation of a “self-conscious” synthetic character: one that has a genuine personality, knows that it has it, can come to know “your” personality, and can interact with you as a genuine individual.
  • Augmented Reality & Data Visualization—Barb Cutler, associate professor of computer science, with doctoral student Tyler Sammann, has assembled a series of applications that demonstrate how augmented reality and data visualization can aid collaborative processes. In one such application, multiple users armed with laser pointers work together to assemble the pieces of a puzzle projected onto a large screen. As Cutler explained, the same technology offers tremendous promise in any collaborative process, from engineering and design to emergency response.
  • Emergent Reality: The Lost Manuscript—Ben Chang, associate professor of arts and co-director of the Games and Simulation Arts and Sciences Program, reviewed a pilot project in which researchers use a mixture of augmented and virtual reality, narrative, and game design to teach Mandarin Chinese.
  • Embodied Conversational Agents—Mei Si, assistant professor of cognitive science, introduced embodied conversational agents: what they are, what they can do, and the challenges and fun in creating them. Si gave brief demonstrations of such agents in three different forms: 3-D human-realistic digital characters, 2-D cartoon-styled digital characters, and robotic characters created by combining a robot lower body with a digital upper body. Si also presented two different AI approaches for modeling the mind and the behaviors of the characters.

Following the presentations, CCC hosted an open house of its facilities within EMPAC. The open house included additional demonstrations and poster presentations of student research internships held during the summer of 2012 with funding from the Rensselaer Office for Research.

Faculty demonstrations included:

  • Deep Listening (CCC Studio 1) – This year, renowned musician and Rensselaer Professor of Practice Pauline Oliveros, with the help of software written by Braasch, recreated in concert the acoustic experience of her iconic 1989 album “Deep Listening,” recorded in a cistern with astonishing acoustic properties. The concert took place in a studio within the CCC facilities studded with 64 loudspeakers, which mimic the cistern’s immersive sound and its 45-second reverberation at low frequencies.
  • FILTER (CCC Studio 2) – Doug Van Nort, a Rensselaer graduate, research associate, and instructor, demonstrated FILTER, an intelligent system for improvisation that autonomously listens and learns sound structures during a live performance, reacting to musical context and transforming the sounds of a performance partner. FILTER is one of the technologies being used in the Creative Artificially Intuitive Reasoning Agent project, which is attempting to create a digital conductor of live avant-garde musical performances.
  • Social Robots (CCC Studio 3) – Si presented video of a project to determine how robots with limited physical abilities can express emotions through movement. The project is part of a longer term goal of developing an AI robot capable of supporting an interactive narrative with human users.
  • Telematic Performance (EMPAC Theater) – Performing from two locations linked through LOLA – a low-latency audio and video conferencing technology that enables real-time, simultaneous, live musical performances over advanced research and education networks such as Internet2 – Mary Simoni, dean of the School of Humanities, Arts, and Social Sciences, and Marjorie Bagley, assistant professor at the University of North Carolina at Greensboro, performed Aaron Copland’s “Hoe-down.”

Student research projects displayed included:

  • Binaural microphone system for HUBO – Cameron Fackler developed a concept for microphone inputs mounted in artificial ears that would allow HUBO, a humanoid robot developed at the Korea Advanced Institute of Science and Technology, to listen to its surroundings in a human-like manner.
  • Company – Allison Berkoy developed an interactive multimedia installation in which participants interact with a life-sized, three-dimensional video projection that varies a narrative based on the audience response.
  • Video Game Theory for STEM Education – Laquana Cooke merged techniques from game design into the “Culturally Situated Design Tools,” a series of educational tools developed in an NSF-funded STEM education initiative. The educational tools build upon the math and computing knowledge embedded in cultural practices such as Native American beadwork, Latino percussion rhythms, and skateboarding.
  • Interactive storytelling with cognitive robots – Michael Barron developed a cognitive robot (a robot programmed with goals, behavior, and emotions) to engage students of Mandarin Chinese, and aid in language recognition, while telling a Chinese fairy tale called “The Painted Skin.”
  • Handel – Simon Ellis developed software exploring artificial musical creativity. The program takes the role of a conductor or music tutor, “listening” to a performance and offering a critique based on a comparison with what it “knows.”
  • Creating Accurate Spherical Sound – Sam Clapp designed and built a spherical microphone array that is capable of decomposing a sound field recorded with omni-directional microphones into its spatial components, known as spherical harmonics, which can then be used to accurately recreate the sound field through a sophisticated loudspeaker array.
Press Contact: Mary L. Martialay