Fuzzy Door’s ViewScreen on-set AR puts CG characters and locations in the viewfinder
Practically every TV and film production uses CG these days, but a show with a fully digital character takes it to another level. Seth MacFarlane’s ‘Ted’ is one of those, and the tech division of his production company Fuzzy Door has built a suite of on-set augmented reality tools called ViewScreen, turning this potentially awkward process into an opportunity for collaboration and improvisation.
Working with a CG character or environment is tough for both actors and crew. Imagine talking to an empty place marker while someone does dialogue off-camera, or pretending a tennis ball on a stick is a shuttlecraft coming into the landing bay. Until the whole production takes place in a holodeck, these CG assets will remain invisible, but ViewScreen at least lets everyone work with them in-camera, in real time.
The usual process for shooting with CG assets takes place almost entirely after the cameras are off. You shoot the scene with a stand-in for the character, be it a tennis ball or a mocap-rigged performer, and give actors and camera operators marks and framing for how you expect it to play out. Then you send your footage to the VFX people, who return a rough cut, which then must be adjusted to taste or redone. It’s an iterative, traditionally executed process that leaves little room for spontaneity and often makes these shoots tedious and complex.
‘Basically, this came from my need as the VFX supervisor to show the invisible thing that everybody’s supposed to be interacting with,’ said Brandon Fayette, co-founder of Fuzzy Door Tech, a division of the production company. ‘It’s darn hard to film things that have digital elements, because they’re not there. It’s hard for the director, the camera operator has trouble framing, the gaffers, the lighting people can’t get the lighting to work properly on the digital element. Imagine if you could actually see the imaginary things on the set, on the day.’
You might say, ‘I can do that with my iPhone right now. Ever hear of ARKit?’ But although the technology involved is similar — and in fact ViewScreen does use iPhones — the difference is that one is a toy, the other a tool. Sure, you can drop a virtual character onto a set in your AR app. But the real cameras don’t see it; the on-set monitors don’t show it; the voice actor doesn’t sync with it; the VFX crew can’t base final shots on it — and so on. It’s not a question of putting a digital character in a scene, but doing so while integrating with modern production standards.
ViewScreen Studio wirelessly syncs multiple cameras (real ones, like Sony’s Venice line) and can integrate multiple streams of data simultaneously via a central 3D compositing and positioning box. They call it ProVis, or production visualization, a middle ground between previs and post.
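Fuzzy Door hasn’t published how that central box works, but conceptually its core job is to line up several timecoded streams (camera frames, tracker poses, performance capture) so each video frame is composited against the asset state from that same instant. Here’s a minimal sketch of that idea, with invented names and data shapes, not ViewScreen’s actual architecture:

```python
from dataclasses import dataclass, field
from bisect import bisect_left

# Hypothetical sketch of what a "ProVis" hub has to do: align timecoded
# samples from several sources so a video frame can be matched with the
# tracker pose and performance data current at that moment.

@dataclass
class Stream:
    name: str
    samples: list = field(default_factory=list)   # (timecode_seconds, payload), appended in order

    def add(self, timecode, payload):
        self.samples.append((timecode, payload))

    def latest_at(self, timecode):
        """Return the most recent sample at or before `timecode`."""
        times = [t for t, _ in self.samples]
        i = bisect_left(times, timecode)
        if i < len(times) and times[i] == timecode:
            return self.samples[i][1]
        return self.samples[i - 1][1] if i > 0 else None

def composite_frame(frame_timecode, tracker: Stream, performance: Stream):
    """Gather the camera pose and performance data matching one video frame."""
    return {
        "frame_timecode": frame_timecode,
        "camera_pose": tracker.latest_at(frame_timecode),
        "performance": performance.latest_at(frame_timecode),
    }

# Usage: two data streams feeding a composite for a frame ~1/24 s later.
tracker = Stream("iphone_tracker")
face = Stream("face_capture")
tracker.add(10.000, {"pos": (0, 1.2, 3.0)})
face.add(10.010, {"jaw_open": 0.4})
print(composite_frame(10.041, tracker, face))
```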
For a shot in ‘Ted,’ for instance, two cameras might have the wide and close shots of the bear, which is being controlled by someone on set with a gamepad or iPhone. His voice and gestures are being done by MacFarlane live, while a behavioral AI keeps the character’s positions and gaze on target. Fayette demonstrated it for me live on a small scale, positioning an animated version of Ted next to himself that included live face capture and free movement.
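The “behavioral AI keeps the gaze on target” part is, at its simplest, look-at math: given where the character’s head is and where the eyeline target is, compute the turn and tilt, then ease toward it so the motion reads as natural. A minimal, generic sketch of that behavior (not Fuzzy Door’s implementation) looks like this:

```python
import math

# Generic look-at sketch: compute the yaw/pitch a character's head needs
# to face a target point, and ease toward it a few degrees per frame.

def look_at_angles(head_pos, target_pos):
    dx = target_pos[0] - head_pos[0]
    dy = target_pos[1] - head_pos[1]   # up axis
    dz = target_pos[2] - head_pos[2]
    yaw = math.degrees(math.atan2(dx, dz))                     # turn left/right
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # tilt up/down
    return yaw, pitch

def step_toward(current, target, max_step=3.0):
    """Move at most `max_step` degrees per frame so the gaze doesn't snap."""
    delta = max(-max_step, min(max_step, target - current))
    return current + delta

# Ted's head at (0, 0.9, 0); the actor's face at (1.5, 1.7, 2.0).
yaw, pitch = look_at_angles((0, 0.9, 0), (1.5, 1.7, 2.0))
print(round(yaw, 1), round(pitch, 1))   # ~36.9 degrees yaw, ~17.7 degrees pitch
```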
Meanwhile, the cameras and computer are laying down clean footage, clean VFX and a live composite both in-viewfinder and on monitors everyone can see, all timecoded and ready for the rest of the production process.
Assets can be given new instructions or attributes live, like waypoints or lighting. A virtual camera can be walked through the scene, letting alternative shots and scenarios emerge naturally. A planned movement path can be shown only in the viewfinder of the camera that will follow it, so the operator can plan the shot.
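To make the waypoint idea concrete: under the hood, “walk the character along this new path” can be as simple as interpolating a position along a list of points at a chosen speed. This is a hypothetical illustration with invented names, not ViewScreen’s API:

```python
# Sketch of driving an asset along waypoints set live on set: return the
# 3D point `distance` meters along a polyline path.

def position_along_path(waypoints, distance):
    for (x0, y0, z0), (x1, y1, z1) in zip(waypoints, waypoints[1:]):
        seg = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        if seg > 0 and distance <= seg:
            t = distance / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0), z0 + t * (z1 - z0))
        distance -= seg
    return waypoints[-1]  # past the end: park at the last waypoint

# Couch -> rug -> doorway, queried at 1.2 m/s, two seconds into a take.
path = [(0.0, 0.4, 0.0), (1.0, 0.0, 0.5), (3.0, 0.0, 2.0)]
print(position_along_path(path, 1.2 * 2.0))
```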
What if the director decides the titular stuffed bear Ted should hop off the couch and walk around? Or what if they want to try a more dynamic camera movement to highlight an alien landscape in ‘The Orville’? That’s just not something that you could do in the pre-baked process normally used for this stuff.
Of course, virtual productions shot on LED volumes address some of these issues, but they bring constraints of their own. You get creative freedom with dynamic backgrounds and lighting, but much of the scene still has to be locked in tightly because of how these giant sets work.
‘Just to do one setup for ‘The Orville’ of a shuttle landing would be about seven takes and would take 15-20 minutes. Now we’re getting them in two takes, and it’s three minutes,’ said Fayette. ‘We found ourselves not only having shorter days, but trying new things — we can play a little. It helps take the technical out of it and allows the creative to take over… the technical will always be there, but when you let the creatives create, the quality of the shots gets so much more advanced and fun. And it makes people feel more like the characters are real — we’re not staring at a vacuum.’
It’s not just theoretical, either — he said they shot ‘Ted’ this way, ‘the entire production, for about 3,000 shots.’ Traditional VFX artists take over eventually for final-quality effects, but they’re not being tapped every few hours to render some new variant that might go straight in the trash.
If you’re in the business, you might want to know about the four specific modules of the Studio product, straight from Fuzzy Door Tech:
- Tracker (iOS): Streams position data of an asset from an iPhone mounted to a cinematographer’s camera and sends it to Compositor.
- Compositor (Windows/macOS): Combines the video feed from a cinema camera with position data from Tracker to composite VFX/CG elements into the video (a rough sketch of this handoff follows the list).
- Exporter (Windows/macOS): Collects and compiles frames, metadata, and all other data from Compositor to deliver standard camera files at the end of the day.
- Motion (iOS): Streams an actor’s facial animation and body movements live, on set, to a digital character using an iPhone. Motion is completely markerless and suitless: no fancy equipment required.
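Fuzzy Door doesn’t describe the wire protocol between Tracker and Compositor, but the handoff amounts to the phone on the camera rig streaming a timecoded pose every frame so the compositing app can render CG from the matching viewpoint. Here’s a minimal sketch under that assumption; the packet format, port, and address are invented for illustration:

```python
import json
import socket

# Hypothetical Tracker -> Compositor handoff: one timecoded camera-pose
# datagram per video frame. Not ViewScreen's actual protocol.

COMPOSITOR_ADDR = ("192.168.1.50", 9000)  # assumed address of the Compositor machine

def send_pose(sock, timecode, position, rotation):
    """Send one camera-pose sample as a JSON datagram."""
    packet = {
        "timecode": timecode,    # e.g. seconds since midnight
        "position": position,    # meters, stage coordinates
        "rotation": rotation,    # yaw/pitch/roll in degrees
    }
    sock.sendto(json.dumps(packet).encode("utf-8"), COMPOSITOR_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# At 24 fps this would run roughly 24 times a second.
send_pose(sock, 36000.041, [1.2, 1.6, -3.0], [12.0, -4.5, 0.0])
```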
ViewScreen also has a mobile app, Scout, for doing a similar thing on location. This is closer to your average AR app, but again includes the kind of metadata and tools that you’d want if you were planning a location shoot.
‘When we were scouting for The Orville, we used ViewScreen Scout to visualize how a spaceship or character would look on location. Our VFX Supervisor would text shots to me and I’d give feedback right away. In the past, this could have taken weeks,’ said MacFarlane.
Bringing in official assets and animating them while on a scout cuts time and costs like crazy, Fayette said. ‘The director, photographer, [assistant directors], can all see the same thing, we can import and change things live. For The Orville we had to put this creature moving around in the background, and we could bring in the animation right into Scout and be like, ‘OK that’s a little too fast, maybe we need a crane.’ It allows us to find answers to scouting problems very quickly.’
Fuzzy Door Tech is officially making its tools available today, but it has already worked with a few studios and productions. ‘The way we sell them, it’s all custom,’ explained Faith Sedlin, the company’s president. ‘Every show has different needs, so we partner with studios, read their scripts. Sometimes they care more about the set than they do about the characters — but if it’s digital, we can do it.’