It’s been nearly 5 years since the last London Vive Developer Jam, and Triangular Pixels was once again welcomed to take part. Using the latest unreleased VR hardware, two extra friends, and only around 26 hours, we created Munch Mania!
Last time we did this Jam, we created Unseen Diplomacy – so expectations were high and the pressure was on! The Jam was at Goldsmiths University, London, and started on the Friday evening (24th Jan 2020). We had a few rounds of talks before team building. We didn’t manage to recruit an artist, producer, or audio designer, but two great programmers did join us – John Hill and Thiago Escudeiro Craveiro – bringing our team up to 4. We did the Jam on hard mode too, with our 4-month-old baby with us!
The hardware/SDKs available were:
Admix (for adding adverts into your game)
Tg0 etee (like a Valve Index controller, but 3DoF)
Vive Pro Eye Tracking
Vive Trackers (the pucks you can attach to things)
Vive Lip-Tracking Modules
Hand Tracking SDK (Vive)
SRWorks – Passthrough Camera (Vive)
Picking an Idea
We wanted to focus on the new hardware features we’d not used before – tracking eyes and mouths. We brainstormed a few ideas around which would make the more compelling input. Eyes would be awesome, being able to show emotions – but with no artist, we didn’t have the skills to model and rig something in a day to support our concepts. So we went with the unreleased Lip-Tracking Modules as the main input. For those, we thought that “eating” would be the most amusing thing for people to do – and for others watching in the room.
We went through a few concepts on the trip to the event, but once we’d found our team we settled on one we knew we could do in a day with the skills we had. The idea came from eating a long trail of food flying into your mouth – and a lot of bad puns:
Dance Pranz Ravioli
Mochi mochi panic
Audiosurf n turf
Bite Trip Eater
Elite Beat Chefs
But we settled on something a little less copyright infringy – Munch Mania
About The Game
Satisfy those munchies!
Fill your belly with as much as possible before your all-you-can-eat time ends.
Munch Mania is a VR rhythm action game in the style of Elite Beat Agents. The only controls you need to play are your mouth and head movement!
Food comes towards you with the beat, and you have to eat as much as possible before the end of the song. Some notes/food take just one bite, others several. If a piece of food doesn’t get bitten before it passes you, it removes points from your score. If you bite into a piece of food, it hangs around in front of you, blocking your view, until you finish it.
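The note/food rules above boil down to a small bit of state per food item. Here’s a minimal sketch of that logic – purely illustrative, not our actual Unity code, and the class names and point values are made up:

```python
# Sketch of Munch Mania's food/note scoring rules.
# Class names and point values are invented for illustration.

class Food:
    """A note: a piece of food needing one or more bites to finish."""
    def __init__(self, bites_needed):
        self.bites_needed = bites_needed
        self.bites_taken = 0

    @property
    def finished(self):
        # Until finished, the food lingers in front of the player,
        # blocking their view.
        return self.bites_taken >= self.bites_needed

def bite(food, score, points_per_bite=10):
    """Player bites a piece of food; each bite scores points."""
    food.bites_taken += 1
    return score + points_per_bite

def food_missed(score, penalty=25):
    """Food passed the player without being bitten: lose points."""
    return max(0, score - penalty)

score = 0
burger = Food(bites_needed=3)      # a multi-bite note
for _ in range(3):
    score = bite(burger, score)    # three bites -> 30 points, burger finished
score = food_missed(score)         # a missed tomato -> lose 25 points
```

In a real build the bite events would come from the lip-tracking SDK rather than function calls, but the scoring state machine is the same shape.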
As the song progresses, you go through starters, mains and desserts of course!
We used a set of Index controllers we had, and a real bowl, to allow friends outside VR to throw extra tomatoes at the player for bonus points. These non-VR players don’t need to see the screen, but they do have to be careful! Sometimes they can throw a chilli! If the VR player eats a chilli pepper they get extra bonus points, but flames come out of their mouth, blocking their view. Someone can come running in with the milk controller, though, and feed that to the player – it’s a Vive Tracker on top of one of our baby’s milk bottles!
We had a smoothly panning, tracking selfie camera that the VR player can place to stream the action in third person to the TV screen – and of course we added the game logo to that and the main menu.
We used Unity 2018 LTS – nice and stable, no need for extra risk! It worked fine for everything we needed. Art assets were remixed from assets found on the Unity Asset Store.
The eye tracking worked well once calibrated per person – so it’s not so great when you pass the HMD between people. In the game, some food flies high above the player; we used the eye tracking so that if you looked at that food and blinked, it would get eaten. This didn’t work too well though – it turns out that as you blink, you lose focus on what you are looking at, so the tracking flakes out a bit. We flagged this, so hopefully something can be done on the SDK side to allow blinking as a form of input, as it would be great for accessibility!
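One workaround for the blink problem is to remember the last stable gaze target, so that when the blink lands (and the gaze data drops out) you still know what the player was fixating on. A rough sketch of that idea – not real SRanipal SDK code, just the pattern:

```python
# Sketch: blink-as-input with a cached gaze target.
# Gaze data goes unreliable mid-blink, so we only trust samples taken
# while the eyes are open, and act on the cached target when the blink fires.

class BlinkEater:
    def __init__(self):
        self.last_target = None  # last food the player was fixated on

    def on_gaze_sample(self, target, eye_open):
        """Called per frame with the current gaze hit (or None)."""
        if eye_open and target is not None:
            self.last_target = target  # only cache while eyes are open

    def on_blink(self):
        """Eat whatever was fixated just before the eyelid closed."""
        eaten, self.last_target = self.last_target, None
        return eaten

eater = BlinkEater()
eater.on_gaze_sample("donut", eye_open=True)
eater.on_gaze_sample(None, eye_open=False)  # mid-blink: tracking drops out
snack = eater.on_blink()                    # -> "donut"
```

Caching sidesteps the mid-blink dropout, though a real implementation would also want a timeout so a stale target can’t be eaten seconds later.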
The Trackers worked well as always. They’re an easy win for a Jam game, though they do have issues with device ID assignment, which we could have handled more slickly given a bit more time.
The mouth input worked very well and didn’t need calibrating between people, and the SDK was simple to use. It will work really well for those who need more accessible forms of input – as long as developers, or the software, allow buttons to be remapped to gestures. We’re looking forward to a commercial release, and hope they build this into the next HMD.
The game itself has a LOT of potential (and we won a prize with it!). It’s fun to play single player, in local co-op with as many devices as OpenVR supports, or just watching the chaos. It’s really expandable (we had to drop loads of mechanics!), it works for in-home and location-based VR, and we have plenty of other ideas for it. Keep your eyes peeled for what comes next – but the build we have is now up on our itch.io game page!
We had a very tiring but rewarding time. It was wonderful to work with our new team members and the Vive team and university did a great job of looking after us and our little one. Thanks to all!