Week 4: 1/4s

We opened this week with our 1/4s presentation, given over Zoom to any faculty who dropped in, which gave us our first taste of ETC feedback. The faculty pointed us to a number of helpful resources and suggested ways to solve some of the primary issues we were facing in our design. Overall, though, the response was that we were in a good place for this point in the project, so we're right on track for the time being.

At sitdowns on Friday we met with Mo and Mike. Mo talked us through a previous AR project that he worked on and, drawing on his findings from that experience, suggested we include different binary on/off devices driven by an Arduino to create interesting moments of crossover. He also pushed us to map out the experience as a storyboard to give a better visual sense of the interactions. Mike wanted to talk through how we are incorporating ARENA into our project and how we might hand off what we learn to the ETC so later teams can use it after the project ends.

With this feedback in mind, we drew out the first interaction loop we are designing in more detail, to clarify exactly what happens in it. In this loop, guests walk up and interact with a physical typewriter in the space, which simultaneously triggers both another physical reaction and an AR reaction. We're working to balance slow and fast reactions so that the action stays within the guest's viewport and we don't lose their attention with too many things going on at once.

On the art side, we identified an art style we would like to pursue for our AR elements and began modeling the bird first, because of its relative complexity and its importance to the reactions. We drew primary inspiration for its shape from the Ikki bird that regularly appears in Rube Goldberg cartoons.

On the tech side, we worked out the pipeline for exporting 3D assets in glTF format for use in ARENA. glTF is the only file type ARENA supports, and it comes with its own limitations and challenges around texture support that we needed to learn about. We also figured out the pipeline for triggering actual animation clips to play, rather than hard-coding translations to animate objects.
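For reference, here is a minimal sketch of that load-and-animate flow using the arena-py Python client. The host, scene name, model URL, and the "Flap" clip name are placeholders for illustration, not our actual setup:

```python
# Minimal sketch using the arena-py client (pip install arena-py).
# Host, scene name, model URL, and clip name are placeholders.
from arena import Scene, GLTF, Position, Scale, AnimationMixer

scene = Scene(host="arenaxr.org", scene="example")

# Load a glTF binary exported from our modeling tool.
bird = GLTF(
    object_id="bird",
    url="https://example.com/models/bird.glb",
    position=Position(0, 1, -2),
    scale=Scale(1, 1, 1),
)

@scene.run_once
def setup():
    scene.add_object(bird)
    # Play a named animation clip baked into the glTF file,
    # rather than hard-coding translation updates each frame.
    scene.update_object(
        bird, animation_mixer=AnimationMixer(clip="Flap", loop="repeat")
    )

scene.run_tasks()
```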

We also started to incorporate a Raspberry Pi into the pipeline and put together a demo in which a physical block moves forward to push a virtual ball, using a distance sensor to detect the block. This kind of sensor will let us trigger animations to play as we begin incorporating them into the project.
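As a rough illustration of the demo's shape, here is a hedged sketch assuming an HC-SR04-style ultrasonic sensor read through the gpiozero library; the GPIO pins, the 20 cm threshold, and the callback body are illustrative, not our exact wiring or code:

```python
# Sketch of the distance-sensor trigger on the Raspberry Pi, assuming
# an HC-SR04-style ultrasonic sensor and the gpiozero library.
# Pin numbers and the 0.2 m threshold are illustrative placeholders.
from gpiozero import DistanceSensor
from signal import pause

sensor = DistanceSensor(echo=24, trigger=23, threshold_distance=0.2)

def on_block_close():
    # In the real demo, this is where we'd message ARENA so the
    # virtual ball's "pushed" animation starts playing.
    print("Block within 20 cm: trigger the virtual ball animation")

sensor.when_in_range = on_block_close  # fires when distance < threshold

pause()  # keep the script alive, waiting for sensor events
```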

Next week we will map out the second interaction and begin acquiring some of its physical pieces. We'll also continue piecing together the first interaction loop and try to work the triggering of a fan into the process.
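Since the fan will likely be switched from the Pi as well, here is a speculative sketch of one way it could work, assuming a relay (or transistor driver) on a GPIO pin controlled via gpiozero; the pin number and run time are placeholders:

```python
# Speculative sketch: switching a fan on and off from the Pi through
# a relay on a GPIO pin, via gpiozero.
# GPIO 17 and the 5-second run time are illustrative placeholders.
from gpiozero import OutputDevice
from time import sleep

fan_relay = OutputDevice(17)  # relay input wired to GPIO 17

def blow_fan(seconds=5):
    """Run the fan briefly as one beat in the interaction loop."""
    fan_relay.on()
    sleep(seconds)
    fan_relay.off()

blow_fan()
```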