We are three weeks from our first dress rehearsal! Since our last update, we have a production team and a cast! We look forward to continuing the collaboration among the RPI departments and organizations working with us to stage this project, including the Department of the Arts, the Department of Cognitive Science, the Department of Communication and Media, the RPI Players, the RPI Union, EMPAC, and others! More updates will be coming in the next few weeks as we ramp up production. Stay tuned!
Producer, Our Town 360
Using documentation of a different painting by Clare (it's called Replacement Cities, and it is not related to Our Town), Marc has come up with a method for animating the individual layers of paint to visualize the painting process. This will be used to visualize the panoramic painting for Act 3, during Emily's posthumous memory. Check out the demo video.
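Marc's actual implementation isn't described here, but the general idea of animating a painting layer by layer can be sketched in a few lines. In this illustrative Python sketch (the function name, timing scheme, and flat NumPy arrays standing in for scanned layer images are all assumptions, not the real code), each paint layer fades in over the previous ones using simple alpha compositing:

```python
import numpy as np

def blend_layers(layers, t, fade=1.0):
    """Composite paint layers at time t.

    Layer i begins fading in at t = i * fade and is fully opaque by
    t = (i + 1) * fade; later layers paint over earlier ones
    (simple "over" compositing on grayscale stand-in images).
    """
    canvas = np.zeros_like(layers[0], dtype=float)
    for i, layer in enumerate(layers):
        # Linear alpha ramp for this layer, clamped to [0, 1].
        alpha = min(max((t - i * fade) / fade, 0.0), 1.0)
        canvas = layer * alpha + canvas * (1.0 - alpha)
    return canvas

# Two flat stand-in "layers": a light underpainting, then a darker pass.
layers = [np.full((2, 2), 0.2), np.full((2, 2), 0.8)]
```

Rendering the animation is then just a matter of calling `blend_layers` for increasing values of `t` and displaying each frame.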
These images are screen captures from our VR simulation (created by Marc), giving a sense of the audience perspective during the Our Town production. They show the scope of what a seated audience member would see from their point of view. The screen shows a simple square grid pattern, just under 3 squares tall and 20 squares around, designed to aid our visual artist, Clare, as she develops the panoramic ink drawings for the project.
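For anyone curious, a calibration grid like the one in these captures is easy to generate. This is only an illustrative Python sketch (the cell size and line width are assumptions, not the production values): it builds a 3-squares-tall by 20-squares-around grid texture as a NumPy image, sized so the 20 columns wrap the full circle of the cylindrical screen:

```python
import numpy as np

def make_grid_texture(rows=3, cols=20, cell=100, line=2):
    """White texture with black grid lines, `rows` x `cols` squares.

    Wrapped around a cylindrical 360-degree screen, `cols` squares
    span the full circle, so the right edge meets the left edge.
    """
    h, w = rows * cell, cols * cell
    img = np.full((h, w), 255, dtype=np.uint8)
    for r in range(rows + 1):
        y = min(r * cell, h - line)  # keep the last line inside the image
        img[y:y + line, :] = 0
    for c in range(cols + 1):
        x = min(c * cell, w - line)
        img[:, x:x + line] = 0
    return img
```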
Clare and Rebecca worked together to develop this set of cut-and-paste storyboards for the show, outlining basic concepts for the artwork on the screen and the mise en scène on stage for each scene and/or crucial moment in the play.
This video demonstrates a new algorithmic technique Marc has engineered for using gestures in real time to reveal and create layers, areas, and elements of the panoramic painting for Emily's memory in Act 3. The painting in this test video is not the painting to be created for the play, but is by the same artist, Clare Johnson.
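The post doesn't spell out Marc's algorithm, so this is only a hedged sketch of one common way to let a gesture "reveal" areas of an image in real time: a persistent reveal mask that accumulates wherever the tracked hand passes. The Python below assumes hand positions already projected into image coordinates, and the function and parameter names are hypothetical:

```python
import numpy as np

def update_reveal_mask(mask, hand_xy, radius):
    """Permanently reveal pixels within `radius` of the hand position.

    `mask` is a boolean image the size of the painting; revealed pixels
    stay revealed, so sweeping the hand gradually uncovers the artwork.
    """
    h, w = mask.shape
    ys, xs = np.ogrid[:h, :w]
    dist2 = (xs - hand_xy[0]) ** 2 + (ys - hand_xy[1]) ** 2
    mask |= dist2 <= radius ** 2
    return mask
```

Each frame, a renderer would then show the painting layer wherever the mask is True and the underlying background elsewhere.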
Here's a demo video of the terrific VR simulation of the performance space, complete with a responsive gesture system, that Marc created for us as an ingenious solution for rehearsing before we have access to the performance space. Clare's concept sketches are featured too, and the music is from a previous collaboration with our composer, Brendan Padgett, on a project called Ascent.
The visual artist who is creating the artworks for the production, Clare Johnson, has also developed some sketches to help illustrate the concept for our re-imagining of Our Town.
Not only was Marc able to model the 360-degree screen and audience configuration in Studio 1 in Maya ... he also created a functioning prototype of the gesture interface in Unity, using the Kinect to control the screen. This is incredibly helpful, not only as a prototype, but as a rehearsal tool. Right now we're using a VR HMD, but we have an AR HMD on order. This means the actor playing the Stage Manager can rehearse wearing an AR HMD: practicing the gestures, seeing a simulation of the Studio 1 360-degree screen respond in real time, and, thanks to optical see-through AR technology, seeing the other actors rehearsing with him at the same time. We'll post a demo video of this soon - in the meantime, check out some screen captures.
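As a rough illustration of one thing the gesture interface has to do, here is how a tracked hand direction might be mapped onto a 360-degree cylindrical screen. This is a sketch in Python rather than the actual Unity/Kinect code, and the coordinate conventions (x to the performer's right, z forward, measured in the performer's frame) are assumptions:

```python
import math

def hand_to_panorama_column(hand_x, hand_z, panorama_width):
    """Map the hand's horizontal direction to a pixel column on the panorama.

    Pointing straight ahead (+z) maps to column 0; the column index
    increases as the hand sweeps toward +x, wrapping the full 360 degrees.
    """
    azimuth = math.atan2(hand_x, hand_z)              # radians in (-pi, pi]
    frac = (azimuth % (2 * math.pi)) / (2 * math.pi)  # 0..1 around the circle
    return int(frac * panorama_width) % panorama_width
```

With a 3600-pixel-wide panorama, for instance, a hand pointed straight right lands a quarter of the way around the screen.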
We got official confirmation from the EMPAC folks today that we are on their calendar for our Spring tech week and performance dates in March 2017! Wow!!
We had a really productive technical planning meeting with the very talented staff at EMPAC today. Great discussion about technical constraints, calendar, personnel, and expertise needed. It should be possible for Marc to model the whole performance space, including the 360-degree screen, in Maya, and the EMPAC folks even have a model of Studio 1 already that we can work from.

The major design innovation from this meeting is the idea of suspending the 360-degree screen from above, so that it hangs at about 8 or 10 feet, well above the actors' heads. This solves several problems for us: it keeps performers and audience members from occluding the screen, and it gives us more flexible staging possibilities in terms of blocking. With the screen on the floor, we would have access to only one entrance/exit - lousy for blocking, and also restrictive in terms of fire code and the number of people who could be in the audience. So this is very exciting! We also had a chance to begin discussing strategies for theatrical lighting, given the presence of the 360-degree screen, as well as possibilities for producing video documentation of the performance.

One other exciting piece of news: we learned from Sr. Research Engineer Eric Ameres that he has developed a much smaller version of his CampFire interface. This smaller version is about 1.5 feet in diameter and is called the CandyDish - it would be fantastic as an interface for our Stage Manager to use gesture to control the images on the large screen.