UXDI | Case Study

Paramount+ Streaming Event


Live Stream Event


Overview

ViacomCBS hired Piranha, where I led the CG team, to help create a virtual event for one of its most important launches to date. Paramount+ unveiled its one-of-a-kind streaming service, showcasing an extensive content library along with financial projections, during a three-hour live stream presented to media and investors.


Event Highlights

To showcase some of my visualizations from the three-hour stream, I've created this short 30-second clip that takes you on a quick journey through the eyes of a digital artist responsible for the 3D content and some of the compositing.


Understanding the Project Scope

The unique nature of this project required considerable coordination and collaboration between multiple agencies working in tandem, both remotely and on-site, in the midst of the Covid-19 pandemic. Creating a full three hours of high-quality content on very tight deadlines was a major undertaking.

Working remotely in a small-team environment created many difficulties. However, by utilizing all of the available tools (Zoom, Slack, Dropbox, and high-speed internet), we were able to gain traction. The entire project team had to commit to a standard operating rhythm while remaining flexible enough to adapt along the way.

Diving Into Design & Development

Using LiDAR surveying technology, the Paramount lot was fully scanned, allowing us to reproduce the actual sets as photo-realistic CGI through digital modeling, texturing, and lighting.

Most of the presenters' content was shot on green screens in NYC and LA, where we worked alongside world-famous stage designers (the Emmys, Grammys, and VMAs crews, to be precise) to construct a virtual set that made the presenters feel as if they were working together in one space. Using cutting-edge Stype camera tracking technology, we were able to sync 3D camera data and match the perspectives of the captured green-screen footage.

Utilizing The Unreal Engine

Despite our deepest cravings to film the entire production in real time in Unreal Engine 4, we were limited by the tools available in the engine at that time. However, all of the green-screen and camera tracking data captured in both NY and LA was shot through Unreal Engine using Stype technology. We were able to decode that data and convert it for use with 3ds Max, creating fully flexible, pre-rendered digital presets that could be dropped in plug-and-play throughout the entire presentation.
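As a rough illustration of the kind of conversion step involved (this is a hypothetical sketch, not our actual production pipeline, and the CSV layout and field names are assumptions), per-frame camera tracking samples can be parsed into generic keyframes for a DCC package like 3ds Max. One common wrinkle is that Unreal Engine uses a left-handed, Z-up coordinate system while 3ds Max is right-handed and Z-up, so the Y axis is typically negated along the way:

```python
# Hypothetical sketch: converting per-frame tracked-camera samples into
# keyframe dicts for a DCC tool. CSV layout and field names are assumed.
import csv
import io

def parse_camera_track(csv_text):
    """Parse 'frame,x,y,z,pitch,yaw,roll,focal' rows into keyframe dicts."""
    keyframes = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        keyframes.append({
            "frame": int(row["frame"]),
            # Negate Y to flip from a left-handed to a right-handed system.
            "position": (float(row["x"]), -float(row["y"]), float(row["z"])),
            # Euler angles kept as-is here; a real converter would also
            # remap rotation order, which depends on each app's convention.
            "rotation": (float(row["pitch"]), float(row["yaw"]), float(row["roll"])),
            "focal_mm": float(row["focal"]),
        })
    return keyframes

sample = """frame,x,y,z,pitch,yaw,roll,focal
1,100.0,50.0,180.0,0.0,90.0,0.0,35.0
2,102.5,50.0,180.0,0.0,91.5,0.0,35.0
"""
track = parse_camera_track(sample)
print(len(track), track[0]["position"])
```

In practice the decoded keyframes would then be written out in whatever format the target package imports (FBX, or a script that sets keys directly), which is what made the pre-rendered presets fully flexible.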



Contact Me

Have any questions? Please feel free to reach out!