Having survived more than ten years of Vodka Cruisers being hurled our way while producing content for some of Australia’s most successful music festivals, we’ve watched the industry evolve first-hand.

From the humble Canon 5D to drones carrying 360 cameras, the process of documenting and promoting these events has changed dramatically. But all of those changes predate the dawn of generative AI, a transformative innovation that will drive a monumental shift in content production and render countless creative jobs obsolete.

In order to gauge where the technology currently stands, we decided to simulate a brief with which we’re intimately familiar: create a 30-second festival aftermovie with unlimited creative scope. From inception to completion, the entire process took one Enamoured Iris team member a single twelve-hour working day. Keep in mind that this also included identifying platforms and designing a coherent workflow. Future projects will be substantially faster, particularly if image upscaling is not required.

If you find this article useful we’d love for you to stick your email in the form and subscribe to future tutorials. 

[Image: Close-up of an androgynous musician singing at a music festival, from an Enamoured Iris video production in Melbourne]

What Is The Use Case For Generative AI Video?

With the technology in its infancy, it’s difficult to predict how generative AI will be used by video producers. Aside from deepfake propaganda, and our prediction that brands will soon choose from a stable of AI influencers rather than deal with human egos, we are discovering the applications as we experiment.

Within the parameters of music festival video content, we’ve identified the following three use cases that can be achieved today with the current generative AI tech stack:

  1. New festival or event promoters who (a) have no existing library of video assets and (b) lack the budget for live-action content, but still need promotional material for a forthcoming festival.
  2. Festival promoters who want to preselect/curate “avatars” that best represent their brand in advance of the event, who will then be digitally inserted into the final aftermovie (previsualisation).
  3. Video producers who missed a key moment during live production and want to replicate the shot after the fact.

[Image: Medium shot of a woman raising her hands at sunset during a music festival, from an Enamoured Iris video production in Melbourne]

How Did You Create It?

Without giving too much away, we trained an AI model with a smorgasbord of our past festival footage, mood boards, and even the odd hallucinogenic-inspired artwork. What emerged were images nothing short of a cybernetic fever dream. Like a digital Picasso on a binary binge, the AI crafted cinematic visuals that accurately captured the raw emotion of a live festival. We then took these image assets, upscaled them and added motion before passing the video content through our traditional Adobe workflow.
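We’re keeping the exact platforms to ourselves, but the final step of that pipeline, sequencing the upscaled, animated stills into a timed edit, can be sketched in plain Python. Everything below is illustrative only: the `Shot` structure, the prompt wording, and the 24 fps delivery assumption are ours for the example, not Enamoured Iris’s actual tooling.

```python
from dataclasses import dataclass

FPS = 24  # assumed delivery frame rate for the example


@dataclass
class Shot:
    prompt: str        # text prompt used to generate the still
    duration_s: float  # how long the animated still holds on screen


def plan_timeline(shots, target_s):
    """Check a shot list against the brief's target running time
    and return the frame count each shot needs at FPS."""
    total = sum(s.duration_s for s in shots)
    if abs(total - target_s) > 1e-9:
        raise ValueError(f"shot list runs {total}s, brief asks for {target_s}s")
    return [round(s.duration_s * FPS) for s in shots]


# A hypothetical 30-second brief broken into six 5-second shots.
shots = [
    Shot("crowd raising hands at sunset, confetti falling", 5.0),
    Shot("close-up of a drummer mid-hit, stage haze", 5.0),
    Shot("androgynous vocalist, tight close-up, neon wash", 5.0),
    Shot("drone pull-back over the festival grounds", 5.0),
    Shot("laser grid sweeping over a dancing crowd", 5.0),
    Shot("slow-motion confetti burst, closing logo plate", 5.0),
]
frames = plan_timeline(shots, target_s=30.0)
print(frames)  # 120 frames per 5-second shot at 24 fps
```

In practice each prompt would feed a text-to-image model, the resulting still would be upscaled and animated, and the per-shot frame counts would drive the conform inside a traditional Adobe edit.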

The result? A 20-second aftermovie that, to the untrained eye, is indistinguishable from “organic” content, particularly on a small screen. We’re already producing short films with the same technology, with feature-length AI films in the pipeline. To think where this technology will be within the next 12 months is mind-blowing.

[Image: Close-up of a drummer hitting a drum at a music festival, from an Enamoured Iris video production in Melbourne]

What Are The Implications?

As with any seismic shift, this technological leap comes with its own set of ramifications. The first, and most glaring, is the existential threat it poses to cinematographers. Why slave under the Australian sun hauling heavy camera equipment when AI can churn out a masterpiece in the time it takes to switch lenses?

The other implication is for festival organisers. Now armed with the power to fabricate entire events from the ether, they face a new dilemma: do they even need live action at all? Or could a hybrid production deliver the best of both worlds? The line between actuality and artifice grows ever thinner.

Then there's the audience. Are we nurturing untapped creative potential or is pixelated perfection creating unrealistic expectations of reality? Will the emergence of synthetic content lead us to become distrustful of our own eyes?

[Image: Falling confetti over the crowd, from a festival aftermovie produced by Melbourne video production company Enamoured Iris]

The Final Cut

As manipulated media starts flooding our screens, one can’t help but wonder: are we pioneering a new art form, or are we digging the grave of authenticity in festival cinematography? The aftermovie we’ve created is a testament to the boundless potential of AI in the realm of video production. It’s a siren song, luring us with its promise of efficiency and innovation. But the algorithm can’t yet substitute for the human touch, the organic chaos, the genuine moments.

Generative AI in festival aftermovies is not just a technological leap; it's a cultural cannonball into untested waters. It promises a future where the line between the real and the rendered is not just blurred, but obliterated. As content creators, we wield a double-edged sword; let us be mindful of the edge we choose to sharpen.