As an early adopter of new technologies, I have made artificial intelligence part of my workflow for over two and a half years. Its role extends beyond generative AI, playing an active part in scripting, development, budgeting, and scheduling.
In post-production, AI tools now routinely assist with footage management, string-outs, and rough cuts, as well as audio mixing and color correction.
Generative AI has been especially valuable: used for AR mapping in graphic effects, creating voiceovers with existing talent for reality and unscripted content, and supporting scripted development.
Adobe Firefly has helped extend shots in broadcast television, while generative tools have been combined with traditional techniques like green screen in music videos. AI-generated music tracks have also supported development, including a custom-written song tailored to a specific project.
Most recently, AI was used to generate both pre-show and in-show video content for a theatrical production created for the Sturniolo Triplets.
This hands-on experience has naturally led to consulting and teaching roles with production companies, agencies, post houses, and television networks—helping them understand how AI will shape the future of their work.
And when it comes to implementation, this is more than experimentation: every prompt, every result, and every edit has been crafted personally, from prompt engineering to operating the tools to delivering the finished audio and video.
Adobe, including Premiere, Photoshop, Audition, and Firefly tools
RunwayML
Google Veo
Midjourney
Kling AI
ElevenLabs
Suno AI
Udio
Canva
Topaz
Meta tools, including AudioCraft
Nolan Finabilt
Studiovity
ChatGPT
...and because the field is changing so quickly, I spend time following the latest updates daily.
The Sturniolo Triplets (40M+ followers) went on a stage tour through 15 cities over 30 days in 2025. The stage show included a 20-foot LED screen at center stage that displayed images and video.
In addition to producing the show and content for social and YouTube, we created a multitude of generative AI images and videos, including a countdown that hyped the live audience before the Triplets took the stage and video segments that introduced each element of the show.
In addition to using an AI-driven post workflow, this pilot episode for MTV/Paramount+, called "Collectors Piece," used AR technology to digitally create the host talent for all of our special effects, as seen in the video below.
The episode also included AI voice-overs that perfectly mimicked the voices of the lead talent and other characters.
The AI post workflow was driven by new technologies in Premiere, including AI syncing, string-outs, and first-cut edits made through transcript selection.
Creating sizzle or demo reels from found footage is a skill that has been greatly enhanced by AI. For this project, AI was used to extract the interview audio from YouTube videos, and stem-separation software split the dialogue from the music in the original video. Then we created an original song in Udio using our board game lyrics.
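For readers curious about the mechanics, here is a minimal sketch of that extract-and-separate step, assuming the open-source tools yt-dlp and Demucs as stand-ins for the software used on this project (the actual toolchain is not named above), with a hypothetical placeholder URL:

```python
# A minimal sketch of the extract-and-separate step described above.
# Assumptions: yt-dlp (audio extraction) and Demucs (stem separation)
# stand in for the unnamed software used on the project, and the URL
# below is a hypothetical placeholder, not a real source.
import subprocess

VIDEO_URL = "https://www.youtube.com/watch?v=EXAMPLE"  # placeholder

# 1. Pull only the audio track from the YouTube source and convert it to WAV.
subprocess.run(
    ["yt-dlp", "-x", "--audio-format", "wav", "-o", "interview.%(ext)s", VIDEO_URL],
    check=True,
)

# 2. Split the recording into two stems: vocals (the interview dialogue)
#    and everything else (music and effects). Demucs writes its output
#    under ./separated/<model>/interview/.
subprocess.run(
    ["demucs", "--two-stems=vocals", "interview.wav"],
    check=True,
)

# The clean dialogue stem (vocals.wav) can now be cut against the new Udio track.
```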
For BMI Music London, we directed and created imagery for a suite of music videos and social media posts. AI-generated video was combined with green screen as backgrounds; it was also used to alter the original footage (some shot more than five years ago); and blending techniques mixed AI footage, green-screen footage, and non-green-screen footage for various visual effects. Some footage used AI rotoscoping.
Extensive testing on multiple generative AI platforms for client approvals helped develop creative concepts and probe the limits of AI at that moment (late 2024).