We staged Tokyo Fury Live, an augmented reality experience, in Melbourne’s iconic Federation Square from 17–19 July 2013.
Players battled waves of on-screen ninjas to win tickets to an advance screening of ‘The Wolverine’ and free samples.
Every battle was captured and uploaded to V-Energy’s YouTube channel, and participants were emailed the link as a virtual memento.
Creating an augmented reality experience in a large outdoor space presents some problems:
- The 11m distance from the camera to the player was greater than Kinect’s supported range;
- Kinect does not work in direct sunlight;
- In a large open space players are unlikely to stay in one spot.
We opted for a ‘Kinect-like’ motion detection technology: a human! While that might not sound groundbreaking from a technological point of view, it did provide some benefits:
- Humans can anticipate player movements, providing more fluid responsiveness
- Players have freedom to attack with their feet / skateboard / whatever
- Easy to cope with variable conditions, players moving away from target spot
- Multiple players – no problem
We developed a dual-window application in Adobe AIR: one window for an administrator and a second output to the big screen for the player.
The application used the MVC pattern to provide separation between the game data and the multiple views, and ensure consistency between the views.
Admin controls were refined so the admin could react quickly: mouse interaction was replaced with faster keyboard controls, and color-coding the enemies made their positions easier to track. The admin could also award style points when a battle finished.
In addition to controlling gameplay, the admin UI allowed switching to an ‘idle’ mode in which the trailer and high scores looped.
Initially we expected a webcam feed would work best. After all, we only needed one computer to receive the feed.
However we found the following limitations with webcams:
- they compromise either quality or frame rate;
- USB power doesn’t work over long cable lengths.
We ended up ditching the webcam approach for a live video stream via an Adobe Flash Media Live Encoder / Flash Media Server setup. The encoder settings we ended up using were:
- input resolution 1920×1080
- output resolution 960×540
- output frame rate 25fps
- format VP6
- bitrate 1500 kbps
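For illustration, roughly the same downscale/frame-rate/bitrate settings can be expressed as an FFmpeg command. This is only a sketch, not our actual pipeline: FFmpeg cannot encode VP6, so H.264 stands in, and the input device and RTMP URL below are hypothetical placeholders.

```shell
#!/bin/sh
# Dry-run sketch: build (but don't execute) an encode command mirroring
# the settings above. IN and OUT are hypothetical placeholders.
IN="/dev/video0"
OUT="rtmp://example.com/live/tokyofury"

CMD="ffmpeg -i $IN -vf scale=960:540 -r 25 -c:v libx264 -b:v 1500k -f flv $OUT"

# Print the command so the pipeline can be reviewed before running it.
echo "$CMD"
```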
A requirement of Federation Square’s big screen was that the video feed be delivered over SDI (Serial Digital Interface). ‘What’s SDI?’ we initially thought. We quickly found out that SDI is generally only supported by professional cameras costing many thousands of dollars.
Instead we used a humble Sony CX220 handycam and a set of Blackmagic HDMI-to-SDI converters. A Blackmagic ‘Intensity Extreme’ box converted the application output to SDI for the big screen, and also provided a Thunderbolt output for our game capture card.
In addition to the admin’s machine running the application, three other machines were dedicated to camera encoding, streaming, and final game-footage capture.
The final setup got a little complex.
The potential issue with our setup was latency, but through experimentation we arrived at a solution with very low latency and acceptable resolution and frame rate.
The ninja characters were rigged with biped skeletons to allow animation to be easily transferred between the male and female ninjas.
To save extra time during animation, poses and gestures such as fists and crouches were stored and re-used.
Because the location of the camera was fixed, we were able to ‘bake in’ the perspective and positions of the enemy animation sequences, eliminating complex 3d math in the application. With the freedom that a desktop application allows, we were able to render a lot of animation frames.
File size wasn’t a concern, but rendering full-screen PNG frames would have hindered performance, so we used the Shoebox tool to crop animation frames to their minimum size and to generate JSFL code that imported and positioned them correctly in the Flash IDE.
We also rendered a dummy object that tracked each character’s position on the ground, giving us a reference point for the character’s position in 3D space:
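Shoebox handled the cropping for us, but the step itself is essentially a bounding-box trim. A minimal dry-run sketch of the idea, assuming ImageMagick’s `convert` is available and the hypothetical directory names below (the commands are echoed rather than executed):

```shell
#!/bin/sh
# Sketch: trim each rendered PNG frame to its minimum bounding box using
# ImageMagick's convert (assumed installed). Directory names are
# hypothetical. Dry run: commands are echoed — drop the echo to crop.
crop_frames() {
  for f in "$1"/*.png ; do
    [ -e "$f" ] || continue   # skip when the glob matches nothing
    echo convert "$f" -trim +repage "$2/$(basename "$f")"
  done
}

# Example usage: crop_frames rendered cropped
```

Shoebox additionally records each frame’s offset so the cropped image can be re-positioned exactly, which is what the generated JSFL took care of for us.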
Male and female ninjas were rendered at 2 angles (flipped to make 4), with the following animated sequences:
- Entry (4)
- Attack (3)
- Death (3)
- Idle loop
This gave a total of 88 sequences (2 characters × 4 angles × 11 sequences each). To review the animations and check that the in and out points lined up, we created a custom tool:
Editing the footage
To turn the videos around quickly for YouTube upload, we needed to:
- trim the battle videos quickly;
- top and tail each video with branding segments.
For basic trimming we used QuickTime Pro.
We then ran a custom FFMPEG script to:
- conform captured footage to match branding segment format;
- concatenate the video segments together.
The final script we used was this:
# This will loop through the contents of original/, convert each mov file into
# the standard mov format we use, then convert that file into .ts.
# The .ts version of the video is concatenated with top.ts and tail.ts,
# the result is saved, and the temp files are deleted.
for s in original/*.mov ; do
  filename=$(basename "$s" .mov)
  ffmpeg -i "original/$filename.mov" -r 30 -b:v 1372k -bufsize 1372k "temp/$filename.mov"
  ffmpeg -i "temp/$filename.mov" -c copy -bsf:v h264_mp4toannexb -f mpegts "temp/$filename.ts"
  ffmpeg -i "concat:top.ts|temp/$filename.ts|tail.ts" -c copy -bsf:a aac_adtstoasc "output/$filename.mov"
  rm -f "temp/$filename.mov" "temp/$filename.ts"
done
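Stream-copy concatenation only works when every segment shares the same codec parameters, which is why the first pass conforms the captured footage. A quick preflight sketch, assuming `ffprobe` is installed, that builds (dry-run) the probe command we’d use to compare a segment’s codec, resolution and frame rate against the branding segments:

```shell
#!/bin/sh
# Sketch: print the ffprobe invocation that reports a file's video codec,
# resolution and frame rate. Comparing this output for each segment
# confirms a '-c copy' concat is safe. Dry run: the command is echoed.
probe_cmd() {
  echo "ffprobe -v error -select_streams v:0 \
-show_entries stream=codec_name,width,height,r_frame_rate \
-of csv=p=0 $1"
}

probe_cmd top.ts
```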
Fox supplied us with some very real-looking (and very expensive) replica claws. Rather than risk someone losing an eye, we opted for safer foam claws.
In the end this let people have more fun, and we didn’t need to worry about anyone making off with the impossible-to-replace replica claws.
To personalise the videos we added the ability to display the player’s name in the battle intro and score screen.
We needed a way for the promotional staff (Ambient) to communicate players’ names and details to the admins in the control tent.
Walkie-talkies seemed error-prone, so instead we created a Google form that the promo staff could display to players on a networked iPad.
Thanks to the Fed Square AV staff for helping us set up and troubleshoot, and to Ambient for running the sampling and the talent on the day.
- 3×4hrs of operation
- 75,000+ people exposed to the event
- 4,250 V graphite cans distributed
- 368 battles captured
- 259 videos uploaded to Youtube
- 38,659 ninjas defeated
- 1 happy client