Cars That Feel – VIVID Sydney

VIVID Sydney is a unique annual event of light and ideas that happens all around Sydney’s CBD. It’s Sydney’s largest tourism event, attracting over 1,400,000 visitors. Soap Creative was chosen as one of the 50 artists to create a light installation. Our piece, ‘Cars That Feel’, was created in collaboration with Toyota to showcase the entire Prius range.

Cars That Feel

Concept

Cars That Feel explores anthropomorphism and the relationship we can have with inanimate objects. The piece was born of the idea that people instinctively look for human qualities in things, and will behave differently if that thing is perceived to be “alive”. What if cars had feelings? What would you say to one? How would you react?

We wanted people to have an emotional one-on-one connection with the cars, not just stand back and watch.

The Challenge

The first challenge was to project video onto every window of a car. We also had to create an invisible system that would let a person interact with the car through touch, voice and movement; a keyboard or microphone would have ruined the illusion we wanted to create. And we had to do all of this without altering the cars in any way: they needed to be resold at a later date, so large mechanical or technical alterations were out of the question.

To quote Walt Disney: “If you can dream it, you can do it.”

The Team

The team was a cross-section of the entire Soap office, with everything except sound being done in house. It was important for us to learn with this project rather than pull in external contractors. Everyone worked collaboratively, with the animators helping the Air developer on the early proofs of concept and everyone pitching in to help screw and nail the projectors into place. We can’t thank our IT guy enough for tracking down and setting up all the hardware, too.

Creative Director, Art Director/Designer, Animators x 2, Air Developer, Back-end/Systems developer, Hardware/Technical Developer, Technical Director x 2, Producer, IT guy + Noise for sound.

During the event we also called in volunteers from the Soap team to man the cars each night.

Technical Approach

We went into this project quite naive, thinking that projecting onto all the windows of the car would be simple. The small interior (compared to the normal operating space for a projector) and the very wide windscreen of the Prius had us learning more about Ultra Wide Throw projectors than we’ll ever need again.

The invisible interaction between people and the cars was probably the biggest, and simplest, innovation of the project. Conductive paint, Kinect and pressure sensors were all discussed before we realised we could just treat the cars like puppets: the “Wizard of Oz” effect. Operators would use tablets to control the cars as “puppets”, selecting the appropriate emotional response and steering the eyes of the car. They’d stand a few meters back and mingle with the crowd. The effect allowed us to give the cars a convincingly “human” response to whatever people threw at the car.

Creative

The creative process was quite fast, with two directions investigated, both inspired by Japanese culture to give a big wink to Toyota’s heritage.

The first was more animal-like, taking a Hayao Miyazaki approach.

The second was an abstract minimal direction inspired by LA artist FriendsWithYou.

Concept Faces

The more abstract style was chosen because it felt more “universal”, but it also allowed greater freedom with the imagery we could bring in to create the emotions. It was also much easier to produce, so we could offer more options in the same amount of time.

The next step was to map the range of emotions and responses we felt we’d need. It turned into a colour wheel of emotions. This quick colour wheel let us discuss how and what would make a car fly into a rage or act suspicious. It also highlighted that we’d need more than just emotions: common replies such as Yes, No and Don’t Know.

Emotions Map

Proof of concept

The first stage of the project was to develop a proof of concept and have a base working model. Our initial questions were:

  • Can we clearly animate an image on a window?
  • Can we animate other features of the car?
  • Can we control all of this remotely, in a timely fashion?

The proof of concept was to use a projector for the car windows, but we doubted we could achieve total coverage without the projectors bleeding into each other. The space inside the car, with the seats and steering column, is very small, and with the front and rear windscreens being wide and at an extreme angle it wasn’t as simple as we first thought.

We bought an Ultra Wide throw projector (0.53 throw ratio was the best we could find) and quickly established that it was possible, albeit likely to need some nifty engineering to prevent bleed.

To drive the projectors we used multiple computers running Windows and an Adobe Air (Flash lives!) application to draw animations and play movies.

To drive other features (such as LED lights under the car for better effect) we used a Raspberry Pi running a C# application to control its GPIO pins.

Hooking this up to a controller required a central server that could receive instructions. To keep lag to a minimum we tested socket connections from browsers to the server, and then socket connections from the server to the various devices. The server could receive a single command and distribute it to multiple devices. Fault tolerance and logging were tested so that communications could break and recover.
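
In ActionScript terms, the fan-out idea looks something like the sketch below. The production server was PHP, so this is purely illustrative, and the port number is an assumption.

```actionscript
package {
    import flash.display.Sprite;
    import flash.events.ProgressEvent;
    import flash.events.ServerSocketConnectEvent;
    import flash.net.ServerSocket;
    import flash.net.Socket;

    // Illustrative sketch of the fan-out: one incoming command is relayed
    // to every connected device socket. (A real server would also prune
    // closed sockets.)
    public class RelaySketch extends Sprite {
        private var server:ServerSocket = new ServerSocket();
        private var devices:Vector.<Socket> = new Vector.<Socket>();

        public function RelaySketch() {
            server.bind(9000); // port is an assumption
            server.addEventListener(ServerSocketConnectEvent.CONNECT, onConnect);
            server.listen();
        }

        private function onConnect(e:ServerSocketConnectEvent):void {
            devices.push(e.socket);
            e.socket.addEventListener(ProgressEvent.SOCKET_DATA, onData);
        }

        private function onData(e:ProgressEvent):void {
            var src:Socket = Socket(e.target);
            var cmd:String = src.readUTFBytes(src.bytesAvailable);
            // Relay the command to every device except the sender.
            for each (var d:Socket in devices) {
                if (d != src && d.connected) {
                    d.writeUTFBytes(cmd);
                    d.flush();
                }
            }
        }
    }
}
```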

At this stage (after around three weeks of development) we had a proof of concept, and the hard part began: making it look good.

Toyota’s Windows

Confident that we could make compelling animations to display on the windows of cars, we just needed a way to get them there. We were used to working with standard PC monitors, so this was very new for us. The problem with projectors is that they work best in room-sized spaces with perpendicular surfaces to bounce the light off. Car windows, it turns out, are a lot trickier.

Challenges

  • finding the right glass covering;
  • getting enough ‘throw’ from a short distance;
  • mounting the projectors;
  • avoiding cast shadows;
  • ‘spill’ from one window to another;
  • image distortion;
  • heat build-up (six projectors + a PC creates a LOT of heat).

Prototyping

We were unable to get a Prius into the office for the first month of the project, so we had to improvise. We traced the window shapes from the cars and created a shabby replica car in the studio, nicknamed the ‘ghetto car’.

Of particular concern was projecting onto the front windscreen. Only a position quite near the roof, in the center of the car, would avoid shadows from the dashboard (we were not allowed to remove the steering column), but this position caused severe distortion, with the windscreen angled away from the projector.

Side windows were far easier to get a good projection on, but not at the same time – the ideal position for one side would cast a shadow on the other. It became apparent that we would need six projectors for each car: two ultra wide throws for the front and back and four standard throws for the sides.

Ghetto Car

Flash App Development

We chose to continue developing the application in Adobe Air. The Flash runtime provides good video and animation capabilities, and Air has good support for multi-windowed applications.

We discovered an AMD Radeon graphics card which can support six monitors (seriously, someone uses six monitors?!). However, could the application handle six full-screen windows and still be performant? We had doubts. Maybe we would need more than one PC/app in each car. On the other hand, the single-app solution would be simpler to develop and control.
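
Opening one borderless window per monitor is straightforward in Air. A minimal sketch (chrome and window-ordering details omitted):

```actionscript
package {
    import flash.display.NativeWindow;
    import flash.display.NativeWindowInitOptions;
    import flash.display.NativeWindowSystemChrome;
    import flash.display.Screen;
    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;

    // Sketch: open one borderless Air window per attached screen.
    public class SixWindows extends Sprite {
        public function SixWindows() {
            for each (var screen:Screen in Screen.screens) {
                var options:NativeWindowInitOptions = new NativeWindowInitOptions();
                options.systemChrome = NativeWindowSystemChrome.NONE;
                var win:NativeWindow = new NativeWindow(options);
                win.bounds = screen.bounds; // fill that monitor exactly
                win.stage.align = StageAlign.TOP_LEFT;
                win.stage.scaleMode = StageScaleMode.NO_SCALE;
                win.activate();
            }
        }
    }
}
```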

Our animations consisted of background video overlaid with .swf files containing the face animation. We needed to keep the face element dynamic so our operators could control its position. If the operator was not controlling the face position, it would follow the timeline animation, as seen in the following video.

This gave us the greatest flexibility. Note to any robot puppeteers of the future: without control of the eyes, the cars felt much more like sterile robots than living creatures.

We experimented with different source resolutions to find one with acceptable quality that Air could perform well with while throwing six screens around. We settled on a background video of 3072×1152 (six windows at 1024×576 each).

Admin Window

GPU

Using the AMD Radeon graphics cards with six monitors presented a small challenge. It turns out that to get six monitors working you need to use active Mini DisplayPort-to-HDMI adapters (and not the passive adapters provided, as seen in the next image).

Six outputs

We looked into the possibility of rendering video at the hardware level with one of Flash’s newer rendering modes (i.e. direct). However, to use hardware video rendering we would have had to give up any pixel processing on the video, which was essential for correcting the distortion in the projected image. Instead, using fullScreenSourceRect we upscaled our smallish windows to full screen without any extra draw on the CPU.
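
A minimal sketch of that trick, assuming a 1024×576 render region per window:

```actionscript
package {
    import flash.display.Sprite;
    import flash.display.StageDisplayState;
    import flash.geom.Rectangle;

    // Sketch: draw into a small 1024x576 region and let fullScreenSourceRect
    // upscale it to the projector's full resolution in hardware, with no
    // per-frame CPU cost.
    public class FullScreenUpscale extends Sprite {
        public function FullScreenUpscale() {
            // Must be set before switching the display state.
            stage.fullScreenSourceRect = new Rectangle(0, 0, 1024, 576);
            stage.displayState = StageDisplayState.FULL_SCREEN_INTERACTIVE;
        }
    }
}
```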

Rendering pipeline

The background video is played and drawn (blitted) every frame into six BitmapData instances, one for each window. For the front window, the face animation is also drawn on top. For simple image display (no distortion correction) this BitmapData can update a Bitmap instance directly. Where we needed to correct the projected image’s distortion, we used a Bezier surface algorithm and the drawTriangles method to draw a mesh that compensates for the distortion.
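
Roughly, the per-window step looks like the sketch below. The region layout and mesh data are placeholders; the real mesh came from the Bezier surface fitted to each window.

```actionscript
package {
    import flash.display.BitmapData;
    import flash.display.Shape;
    import flash.geom.Point;
    import flash.geom.Rectangle;

    // Sketch of the per-frame pipeline described above.
    public class WindowRenderer {
        private static const ORIGIN:Point = new Point(0, 0);

        // Blit one window's slice of the 3072x1152 background video.
        public static function blit(source:BitmapData, region:Rectangle,
                                    target:BitmapData):void {
            target.copyPixels(source, region, ORIGIN);
        }

        // Draw the blitted texture through a triangle mesh to compensate
        // for projection distortion (beginBitmapFill + drawTriangles).
        public static function drawWarped(out:Shape, tex:BitmapData,
                                          vertices:Vector.<Number>,
                                          indices:Vector.<int>,
                                          uvt:Vector.<Number>):void {
            out.graphics.clear();
            out.graphics.beginBitmapFill(tex, null, false, true);
            out.graphics.drawTriangles(vertices, indices, uvt);
            out.graphics.endFill();
        }
    }
}
```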

Warp correction and the projected result

Position/Scale/Rotation controls

Mask Shape

This is where the full-resolution video ended up. The animators would work to the full-resolution file and use these masks as reference. This was much quicker and easier than working across six separate files.

Arduino

We realised it would be difficult to control existing car features such as the lights, wipers and horn, as it would mean dismantling parts of the car. That was not an option: the three cars were on loan, not ours to destroy.

To add extra impact we put LED lights on the underside of the car, mirroring the colour seen on the windows for a more interesting visual effect.

The LEDs would still need a hardwired controller. The proof of concept used a Raspberry Pi to control the lights, but we realised we could hook up an Arduino instead and have finer control of colours and timings. We reproduced the socket communications on an Arduino to drive the LED lights. As the Air application progressed, we established that if we could hook the Arduino up to the Air application, we would end up with the Air application controlling all features from one device, which is easier to command and sync than two devices.

After struggling to get Air Native Extensions to control the Arduino directly, we opted to re-employ the socket server to listen for updates from the Air app and forward instructions to the Arduino.
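
The app side of that bridge is simple. A sketch, with an assumed port and a bare three-byte RGB message rather than the actual production protocol:

```actionscript
package {
    import flash.net.Socket;

    // Sketch: the Air app pushes RGB values for the under-car LEDs to the
    // local socket server, which forwards them on to the Arduino. Port and
    // message format are assumptions.
    public class LedBridge {
        private var socket:Socket = new Socket();

        public function LedBridge(host:String = "127.0.0.1", port:int = 9001) {
            socket.connect(host, port);
        }

        public function sendColour(r:uint, g:uint, b:uint):void {
            if (!socket.connected) return;
            socket.writeByte(r);
            socket.writeByte(g);
            socket.writeByte(b);
            socket.flush(); // send immediately to stay in sync with the windows
        }
    }
}
```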

Starting with a basic colour configuration, the Arduino control in the app evolved into a UI for composing and previewing instructions to be sent to the Arduino/LEDs.

Arduino UI

Installation

The arrival of the first car, over a month and a half after development started, meant we could now explore mounting the projectors.

Ripping out the seats and steering wheel would have made this much easier but we like a challenge!

After some failed attempts with camera tripods, we turned to ceiling mounting kits, flipped upside down and screwed into wooden boards custom-fitted around the seats. We were able to find positions for the side projectors where they did not cast shadows on each other, although this generally meant positions with more acute angles that produced more severe distortion. Warp correction turned out to be very necessary for most projections.

It was like a Rubik’s puzzle with no instructions.

Projector throw

The placement of the front windscreen projector was proving very difficult. We ended up just holding the projector in our hands as we searched for the optimal position. Eventually we discovered that the projection worked considerably better when the projector was upside down. (There’s a technical reason for this, but at that point we didn’t care why; we were just happy it worked.)

The wide throw projector mounted upside down

OK, great – now we just had to find a way to mount a projector upside down in a car! A trip to the local hardware store yielded an unlikely accessory for upside-down projector mounting: the heavy-duty post anchor!

For each car, the hardware setup looked like this:

Hardware setup

Obtaining the projectors was not as simple as we thought. The first projector we bought as a test went out of stock, requiring us to get a slightly different model. When we were ready to order the bulk of them, the price jumped $300. When you’re about to order 21 projectors, that’s quite a surprise.

We were also lucky enough to get a contact at BenQ Australia (through the people we bought the projector mounts from), who supplied us with enough projectors to fill the cars and gave us a safety net in case any projectors broke down.

Side note: we saved on postage from Amazon by ordering everything separately. 1 x projector = $50 delivery; 2 x projectors = $200+?!

Window covering

One element we thought would be no issue was the projection film on the windows. We looked at the projection film on the market and found that 3M’s Vikuiti film was the best. However, it wasn’t that easy to order, so we went to Alibaba and got a range of samples from China.

It’s surprising the difference in quality you get. We ordered the film and it arrived in a massive tube.

However, no window tinter would agree to apply the film to the cars because it would make them unroadworthy. To top it off, the film we ordered wasn’t suitable for “heat transfer”, which meant we’d have nasty cut marks and gaps in making the film fit the curved surface of the windscreen.

It was a nerve-racking few days as we went around in circles, until we strong-armed (i.e. pleaded, almost in tears with) one company into helping us, on the grounds that the cars would be “art”, not for driving.

The tinters also helped us identify the right 3M film for heat transfer, and who to order it from.

The quality of the 3M film was amazing; it diffused the light many times better than the film we had bought from China. You could get right up close and not see where the point of light from the projector was coming from. This had a huge effect on the final quality of the piece, as people got up really close to the cars trying to guess what was inside.

Side note: when we initially found out we couldn’t apply film, we experimented with spray frosting on glass to see if we could get the effect we needed. For those wondering: it does work, but the effect isn’t great.

Here’s what 9 different projection film samples looked like. The difference is huge.

Side note 2: all Priuses come with some form of built-in tinting, which meant the back two windows had a different colour from the other windows.

Of course, once the film was applied the cars could not be driven on roads, so they were transported to the event by truck.

LED lights

“We’ll just tape the LED lights to the underside of the car! Simple!” Here’s another thing we learned about cars: they are built in a way that stops things sticking to their underside, especially around the wheel hubs. So after our taping failed, we constructed a PVC pipe frame to attach the lights to.

The PVC LED rig

Sound Design

We were very lucky to have the guys at Noise International take over all sound and music design for the project. They also created an original music piece which the three cars would play together.

Sound Setup

For sound we used two UE Boom speakers at the front (one in each wheel hub) and a smaller UE Mini Boom at the back. These were much cheaper than a sound bar, we could reuse them for general office use after the project, and they were also waterproof (or at least water resistant).

We had to wire these up because of the delay Bluetooth introduces and our requirement for synced sound. We also ran them off mains power rather than battery, because we had strapped the speakers in so they wouldn’t be stolen overnight, which made nightly charging impractical.

Side note: the day before launch we had trouble getting sound out of one of the speakers. After a while we realised it was synced to a team member’s phone, and he was on a call, confused about why he couldn’t hear the person on the other end.

Communications Protocols

Software setup

We chose to retain the Apache/PHP application, as it had proved reliable in testing for socket communications and also provided a robust technology (HTTP) to fall back on in an emergency. Originally the solution was expected to control multiple devices in one car, and to control multiple cars. However, we eventually discovered we could simplify things by isolating each car and running a server setup per car.

As we simplified the solution, we left the full server solution in for fault tolerance: we could have a server (or servers) fail and do some rewiring to have two or more cars controlled by one server. We also had a spare back-up server set up, ready to roll if needed.

The communications were split into three primary elements:

  1. HTTP servers were used to receive information about available devices (Air applications, Raspberry Pi etc.); each device was set to broadcast its availability to this server.
  2. A socket server (the “Console Server”) was created to receive instructions from the controllers and place them into a database.
  3. Each device contained a socket server to receive instructions from the central “Instruction Server”.

1) HTTP Server

As each device came online, it was set to broadcast its name, the car it was in and its available communications port over the local network. When a server responded, the application would lock onto that server for all future communications. The server would record the device details in a database used by the Console Server and the Instruction Server. After receiving confirmation, the device would repeatedly “ping” the same information. This gave us fault tolerance: if a server was rebooted or lost its connection, it would re-establish itself within 30 seconds. We also used the HTTP server to deliver the console application over HTML (see “Console” below) and to provide status information over web pages (see “process monitor” below).
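
From the device side, the “ping” is just a repeated HTTP POST. A sketch, with an assumed URL, field names and interval (the write-up above only fixes the 30-second recovery window):

```actionscript
package {
    import flash.events.TimerEvent;
    import flash.net.URLLoader;
    import flash.net.URLRequest;
    import flash.net.URLRequestMethod;
    import flash.net.URLVariables;
    import flash.utils.Timer;

    // Sketch: every few seconds, POST this device's name, car and socket
    // port to the HTTP server so it can be (re)registered after a failure.
    public class AvailabilityPing {
        private var timer:Timer = new Timer(5000); // interval is an assumption

        public function AvailabilityPing(device:String, car:String, port:int) {
            timer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
                var vars:URLVariables = new URLVariables();
                vars.device = device; // e.g. "air-app"   (hypothetical)
                vars.car = car;       // e.g. "car-1"     (hypothetical)
                vars.port = port;     // this device's own socket port
                var req:URLRequest = new URLRequest("http://server.local/register"); // assumed URL
                req.method = URLRequestMethod.POST;
                req.data = vars;
                new URLLoader().load(req);
            });
            timer.start();
        }
    }
}
```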

2) “Console Server”

The servers ran a PHP application that opened a socket server ready to receive instructions from the controllers; this also meant other status pages could be developed quickly. On pressing, moving or releasing a control in the browser, an instruction was sent through the socket. Each instruction was a minimally crafted series of bytes specifying the car, the instruction name and up to three parameters. To save CPU, the PHP application queried each socket every 10 milliseconds with a pause in between; 10 milliseconds proved the optimal loop to ensure all instructions were read. Repeated instructions in each loop were dropped and the remainder saved to a database. We used a database because, as fast as new instructions arrived at the end of the queue, older instructions were being processed; a database was ideal for handling the inputs and updates simultaneously. To remove unnecessary database calls, a memcached server was employed to store the latest update time for each car.
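
A sketch of what such a minimally crafted instruction could look like on the wire; the exact byte layout here is an assumption based on the description above, not the documented format:

```actionscript
package {
    import flash.net.Socket;
    import flash.utils.ByteArray;

    // Sketch: one byte for the car, a length-prefixed instruction name,
    // and up to three 16-bit parameters.
    public class InstructionWriter {
        public static function send(sock:Socket, car:uint, name:String,
                                    p1:int = 0, p2:int = 0, p3:int = 0):void {
            var msg:ByteArray = new ByteArray();
            msg.writeByte(car);   // which car
            msg.writeUTF(name);   // 16-bit length prefix + UTF-8 name
            msg.writeShort(p1);   // up to three parameters
            msg.writeShort(p2);
            msg.writeShort(p3);
            sock.writeBytes(msg);
            sock.flush();
        }
    }
}
```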

3) “Instruction Server”

The Air application (and originally the Raspberry Pi and Arduino) was programmed with a socket server that could receive instructions. It would open the socket and broadcast its identity, the car it was located in and its open port through an HTTP call to servers on the local network, as noted above. A second PHP application was written that also looped every 10 milliseconds: it read the memcached server to see if new instructions were waiting, read the queue of instructions from the database, removed any duplicates, mapped those instructions into a new series of instructions for each device, and sent the newly mapped instructions to the Air application. The response from each device was logged to provide feedback. As the Air application developed we simplified communications: we removed the mapping from the server (putting it in the app instead) and could pass instructions directly to the application. We attempted to pass all instructions, unfiltered by PHP/database, directly into the Air app, which would have let us do away with the server setup altogether. Unfortunately too many instructions were being dropped, so we kept the PHP server to throttle the instructions passed to Air.
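
And the matching read on the device side, assuming the layout from the previous sketch. A production reader would also buffer partial messages; this assumes whole instructions arrive per event:

```actionscript
package {
    import flash.events.ProgressEvent;
    import flash.net.Socket;

    // Sketch: parse incoming instructions and hand them to the animation
    // layer via the supplied callback.
    public class InstructionReader {
        public function InstructionReader(sock:Socket, onInstruction:Function) {
            sock.addEventListener(ProgressEvent.SOCKET_DATA,
                function(e:ProgressEvent):void {
                    while (sock.bytesAvailable > 0) {
                        var car:uint = sock.readUnsignedByte();
                        var name:String = sock.readUTF();
                        var p1:int = sock.readShort();
                        var p2:int = sock.readShort();
                        var p3:int = sock.readShort();
                        onInstruction(car, name, p1, p2, p3);
                    }
                });
        }
    }
}
```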

Console

The console consists of buttons for each animation/emotion, a colour chooser and a control for positioning the face. The face animations are structured in such a way that an operator can take and release control of the face animation’s timeline at any time. We optimised the UI for touch controls and used screenshots of the animations to make the buttons more distinct.

Console dashboard and colour chooser

In case of emergency

We created a service manager application that runs alongside the app and the servers, and can restart the PHP services, the Arduino sockets app and the Air app. Through a web interface on the HTTP server, the operators could access the service manager and quickly restart a component if required.

Service Manager

The Event

The event ran from May 22 to June 8, from 6pm to midnight. That’s right in the middle of Sydney’s winter, which meant a lot of cold nights out with the cars. Luckily it only rained on 3 of those nights.

Each car was controlled by a single operator to allow a personal touch to each car. The Soap team had a volunteer roster across the 18 days.

We had people hugging, kissing and tickling the cars constantly. The number of smiles from people young and old made us feel like Walt Disney.

During peak times (Fridays and Saturdays) there would be over 100 people in the area, with several people all talking to a car at once. The operator had to focus on whoever they thought would have the best experience. We didn’t want to limit the interaction to a single person and form a line, because that breaks the spirit of the event. We were surprised that, even though the focus would be on someone at the front of the car, people looking at the back window still felt like they were talking to the car.

Evolution

We kept the animation assets external to the app. Initially this allowed the animators to test new animations in the app without requiring it to be rebuilt, but it also made it easy to add new animations to the cars, which remained on site throughout the event.

After the first weekend of huge crowds we soon realised that the cars could do with some more conversational elements. Animations such as ‘tickle me’, ‘hug me’, ‘selfie?’ and ‘come closer’ were added after the event started.

Results

Cars That Feel was the most interactive piece at VIVID 2014 and a real crowd-pleaser.

We estimate around 50,000 people directly interacted with the cars, delivering 1.8 million interactions during the event. The cars received so many hugs, kisses and tickles each night that we had to get them cleaned daily. A further 200,000 people saw the cars on location, right under the iconic Sydney Harbour Bridge.

We had a great response on Instagram and other social channels.

To further extend the campaign we took the cars to two AFL games as part of Toyota’s sponsorship of the league. The cars were given teams to support and had custom animations created, including the team songs.

Learnings

Here are a few things we took away from this project about human behaviour:

  • The one-minute musical set piece we created led to people getting bored. As they were not interacting, just watching, they wouldn’t stick around; the personal interaction with the cars is what made the piece so powerful.
  • Never ask someone to kiss a car straight away. Always go for a tickle first.
  • Some people just want to make the cars angry. Mostly older women.
  • It’s impossible for a car to have a meaningful conversation with 15 people at once.
  • Most people will not hashtag a photo, and if they do, they will not use the official hashtag for the event.
  • Asking someone for their “#” means “phone number” to someone aged 30+ and “hashtag” to everyone else.
  • The crowds at VIVID are very open to experimenting and interacting, while the crowds at the AFL are focused on getting to their seats.
  • People LOVE taking selfies.
  • Very few people (<20) noticed the cars were being controlled by people with iPads, even when the crowds thinned out. A person hunched over a device has become so commonplace.