Havas
Video production
Social budgets limited creatives to static shots. Layering AI over basic 3D models now turns single frames into complex 1-minute videos.
- 1-minute video produced vs single static shot
Spec ads took three weeks across separate systems. Gen-2 now generates full video from text, bypassing physical production.
An award-winning creative agency serving clients like Google and Budweiser established a dedicated AI-first studio operating across four global cities.
Traditional production relied on expensive location shoots, and early AI experiments suffered from fragmented toolchains. Creating a single spec ad...
“Using Runway was the moment when we really got it – that we could absolutely do our job while avoiding traditional shootings. Our 'aha moment' came with Gen-2. We participated in Gen:48 and absolutely loved the process and started to realize that we really could do our job with AI. The game-changer was when we moved from static images to video. In the beginning, we were cheating a little bit – shooting a face and adding a tree in the background, then animating the tree with wind to create this illusion of reality. But now, with the improvements in Runway, we don't even need those tricks because the quality keeps getting better and better.”
Independent creative agency network for advertising and brand strategy.
Generative AI platform for professional video creation and world simulation.
Related implementations across industries and use cases
Social budgets limited creatives to static shots. Layering AI over basic 3D models now turns single frames into complex 1-minute videos.
Stitched stock photos failed to sell ideas. Now, AI generates animated concepts instantly and automates video localization at scale.
Manual ad production took months. Now, teams use AI to predict performance and generate creative in just 5 days.
Manually illustrating vintage assets would have tripled production time. Now, a 3-person team uses AI to generate and animate the visuals.
Custom analytics required months of full-stack development. Now, self-serve AI apps connect analysts directly to data models.
Data scattered across systems forced generic content. Now, AI agents surface personalized stats and videos via chat during live matches.
Manual workflows delayed global campaigns for weeks. Now, local teams use GenAI to instantly draft localized, regulatory-compliant copy.