Making CAKE: from Object-Based Production to Experience

Published: 12 December 2016
  • Jasmine Cox (MSc)

    Development Producer

Watch as the team take you through their work so far on our CAKE project and read about our object-based studio shoot.

Our desire to make an engaging interactive experience meant we had a great deal to learn in order to plan and undertake a daunting but exciting two-day multi-camera shoot. Much of our pre-production dealt with planning the data model, but it also followed a similar process to traditional telly-making.

The kitchen set was built in collaboration with our academic partners at Newcastle University’s Open Lab. They worked closely with us and external set designers to integrate a fully sensor-loaded central island, where the work surface, appliances, and utensils all employed a rich array of embedded sensors.

Open Lab carries out experimental research into approaches to assisted living with connected devices in the home, and in our scenario this allowed us to tag the footage we were capturing with synchronised data about the use of kitchen utensils.

For example, when an orange is sliced, the blender is turned on, or a fish is taken out of the fridge, we have recorded movement data for the tools being used that precisely synchronises with the events happening on screen.

This additional information is the key to enabling the kinds of data-driven content we want to experiment with, and unlocks a wide range of possibilities for inferring real-world actions - if the programme knows what professional chopping looks like, and infers that someone needs a bit of help to master that technique, it has enough information to perhaps offer an assistive video clip.
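To make that concrete, here is a minimal Python sketch of the kind of rule the synchronised sensor data could drive. The field names, thresholds, and the crude chop-counting heuristic are all illustrative assumptions of ours, not the project's actual code:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    timestamp: float      # seconds, synchronised with the footage timecode
    acceleration: float   # movement magnitude reported by the embedded knife sensor

def chops_per_second(samples, threshold=2.5):
    """Count samples above an acceleration threshold as a crude proxy for chops."""
    if len(samples) < 2:
        return 0.0
    strong_movements = [s for s in samples if s.acceleration > threshold]
    duration = samples[-1].timestamp - samples[0].timestamp
    return len(strong_movements) / duration if duration > 0 else 0.0

def should_offer_help(viewer_samples, professional_rate=3.0, tolerance=0.5):
    """Suggest an assistive clip when the viewer's rhythm falls well below the reference."""
    return chops_per_second(viewer_samples) < professional_rate * tolerance
```

A real system would need a far richer model of what "professional chopping" looks like, but the shape of the decision - sensor events in, a playback suggestion out - is the same.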

The set was installed in a studio unit at The PieFactory in MediaCityUK, and we actually did two complete shoots during the two days. On the first day we shot what we referred to as the “linear” shoot, a relatively unscripted run-through of all the recipes, using four static 4K cameras similar to the capture rig we have used for our Nearly Live Production experiments. This served two purposes: firstly, it allowed our partners to capture a full sensor and AV log of the linearly-filmed recipes without any of the stop/start interruptions of the object-based version; secondly, it gave us a more conventionally shot version of the programme that we hope will be useful as a comparison when evaluating the object-based variants.

On our second day in the studio the full object-based shoot got underway. Our pre-production planning meant that we could export a shooting script and other helpful documents straight from our data model. This was challenging, as we had big ambitions and only a day to capture all of our object assets; the plan was to stick to our script, which detailed parts of the recipe as individual elements. This involved frequent stopping and starting, meaning our home economists were crucial to ensuring we had tasty-looking nosh at all times!

We devised a notation for shots and objects to keep track of the current shot, and this mapped back directly to the data model.

We planned our shoot so that we could record single, step-by-step recipes only once. This meant we didn’t have to film endless variations; instead, the application would create a personalised cooking programme for individual audience members.
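As a rough sketch of what "record once, assemble per viewer" could mean in code, here's a minimal Python example; the Step fields and the longest-step-first ordering are illustrative assumptions rather than the real CAKE data model:

```python
from dataclasses import dataclass

@dataclass
class Step:
    step_id: str      # e.g. "fish-descale", mapping back to our shot notation
    recipe: str       # which dish the step belongs to
    duration_s: int   # roughly how long the step takes to carry out
    clip: str         # the single video object recorded for this step

def personalised_programme(steps_by_recipe, chosen_recipes):
    """Assemble one running order from the steps of the viewer's chosen recipes.

    Each step was only filmed once; here we simply start the longest-running
    steps first, a deliberately naive stand-in for real scheduling rules.
    """
    steps = [step for recipe in chosen_recipes for step in steps_by_recipe[recipe]]
    return sorted(steps, key=lambda step: step.duration_s, reverse=True)
```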

However, additional challenges arose from shooting in this way; a biggie was continuity. All of the individual scenes had to be shot in such a way that they would make sense when presented in a different order, or when combined with other dishes. Our concession to recording multiple variants extended only to the introductions and the presentation of finished plates. For example, every fish recipe had to be introduced verbally in the context of it being part of a larger meal, and then the same intro was shot again for each one, scripted differently, so it would also make sense when the fish recipe is viewed in isolation. Another interesting example was presenting the food at the end of each recipe - we captured many variations on putting dishes away for later prep (and subsequently bringing them back out again), so that we had enough configurable elements to reassemble everything logically.

Interestingly, language played a large part in maintaining continuity too. We asked our presenter, Bella, to avoid referring to the other recipes she was cooking, so that when you choose a combination of recipes in the application the steps interleave neatly. She also doesn’t mention quantities of ingredients, so the recipes can scale to however many people you’re cooking for.

One of the major changeable elements in our object-based script was in the preparation of the fish: we scripted and shot three possible ways to prepare the same fish (a tasty red mullet). We really wanted to be able to offer the viewer a few different options for this ingredient, which could be chosen based on preference, or even on factors like the time that someone has available to spend cooking. The preparation time for these three options ranged from a few minutes (pan frying) through to over half an hour (if oven roasting the fish). Shooting these meant capturing three different sequences using physically identical fish, while scenes that were common to all three (for example, de-scaling the fish prior to cooking) could all utilise the same single video element.
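One way to picture this is as a simple variant-selection rule. The sketch below is entirely illustrative - the durations, clip names and the placeholder name for the third preparation method are our own assumptions - but it shows how preference and available time could drive the choice:

```python
# Hypothetical variants of the "prepare the red mullet" step. Only pan frying and
# oven roasting are named above; "third-method" is a placeholder. The shared
# de-scaling clip appears in every variant but only needed to be shot once.
FISH_VARIANTS = {
    "pan-fry":      {"prep_minutes": 5,  "clips": ["fish-descale", "fish-panfry"]},
    "third-method": {"prep_minutes": 15, "clips": ["fish-descale", "fish-third"]},
    "oven-roast":   {"prep_minutes": 35, "clips": ["fish-descale", "fish-roast"]},
}

def choose_fish_variant(available_minutes, preference=None):
    """Pick a preparation that fits the viewer's time budget, honouring a preference."""
    if preference and FISH_VARIANTS[preference]["prep_minutes"] <= available_minutes:
        return preference
    fits = [name for name, v in FISH_VARIANTS.items()
            if v["prep_minutes"] <= available_minutes]
    if fits:
        # Use the most involved preparation that still fits the time available.
        return max(fits, key=lambda name: FISH_VARIANTS[name]["prep_minutes"])
    # Nothing fits, so fall back to the quickest option.
    return min(FISH_VARIANTS, key=lambda name: FISH_VARIANTS[name]["prep_minutes"])
```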

Shooting in a non-traditional way was a unique opportunity for everyone on set - as well as the project team getting to grips with production roles, we also worked with experienced TV & film professionals used to a specific manner of working. Studying and refining the way in which we steered the production was a really rich learning experience itself, bearing in mind that part of this research is of course focused on the journey as well as the end product. 

As we undertook a somewhat bespoke experimental process, the production was significantly slower when shooting our object-based script: there was a high degree of stopping and starting to ensure we were capturing everything properly. But marking up the footage in an object-based way opens up so many more possibilities for how the content can be used and reused.

Following the shoot, we took the complete raw footage set into an edit session, to transform it from raw video and audio into a balanced, consistent set of objects to work with. This process included combining the multiple angles of video with the recorded audio streams, cropping footage, colour grading everything for consistency, and logging the finished video segments as individual objects and metadata. As with the shoot itself, this process was of course driven by, and documented in, the existing data model that we’d established at the start of the process.
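As a small illustration of what logging the finished segments "as individual objects and metadata" could look like, here's a hedged sketch; the field names and file names are our own assumptions rather than the project's actual schema:

```python
import json

# Illustrative metadata record for one edited object: the recipe step it covers,
# the graded clip for each camera angle, and the trimmed in/out points.
segment = {
    "object_id": "fish-descale",
    "recipe": "red-mullet",
    "angles": {
        "overhead": "fish-descale_overhead_graded.mxf",
        "wide": "fish-descale_wide_graded.mxf",
    },
    "in_point_s": 0.0,
    "out_point_s": 42.5,
}

# Stored alongside the footage so the application can look the object up later.
with open("fish-descale.json", "w") as handle:
    json.dump(segment, handle, indent=2)
```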

The end result of this approach was our final object-based footage set: every part of the recipe accounted for and organised as an object in data, with beautifully edited footage for each one from multiple angles (including the particularly gorgeous overhead view shown in the photo at the top of this post).

Making an object-based experience doesn’t have to have any impact on a production at all. We can still shoot as we would normally; marking up the footage in an object-based way simply means there are many more possibilities for how the content can be used and reused.

The next stage was to combine these assets with logic from our data model structure into our designed application. This phase was an iterative design and engineering exercise, where we utilised some of the building blocks underpinning the department's IP Studio work and made some new ones. Check out the OPTiC project for more info on these production tools and services for making object-based applications.

The key to the responsiveness of our application is the object-based nature of the content we captured on our studio shoot, the way we model the recipe data, and the rules we establish about how it can behave in response to someone cooking at home.

You can read the first part of the CAKE story if you haven't already, and we'll talk more about how we designed the app when it launches on Taster in early 2017 and you can play with it yourself!
