Welcome to the Vitrina Podcast. Today we are honored to host two remarkable leaders in the media and entertainment industry: Craig German and Seth Hallen. With their wealth of experience and expertise, they bring invaluable insights into the ever-evolving landscape of content creation, distribution, and technological innovation. Join us as we delve into their journeys, discuss industry trends, and explore the future of storytelling in the digital age. Welcome, Craig and Seth!
Seth Hallen: Screenwriting is an iterative process, involving drafting, seeking feedback, and script breakdowns for production. AI tools now provide quick feedback within seconds, analyzing sentiment and suggesting dialogue alternatives. This accelerates the feedback loop compared to traditional methods. While some tasks may be automated, these tools enhance efficiency, allowing creators to focus on tasks requiring human creativity.
In this podcast episode, Atul Phadnis interviews Seth Hallen and Craig German about the impact of AI on the entertainment supply chain. They discuss various areas of the industry that are already being affected by AI, such as localization, scriptwriting, and post-production. They also explore the potential of AI in animation, sound, and distribution. The conversation highlights the need for demystification and collaboration in the industry and the importance of considering the broader applications of AI beyond picture and sound.
Podcast Chapters
00:00 | Introduction of Panelists
08:50 | AI in Screenwriting and Script Breakdowns |
15:58 | Expanding the Applications of AI in the Entertainment Industry |
30:48 | AI in a Post-COVID Entertainment Industry |
40:32 | Introduction to HPA AI Steering Committee |
49:27 | Exciting Developments and AI Filmmaking Competitions |
Craig German: The AI that most of us are learning about is currently limited in its data sources. It just has what we feed it from the internet or specific text, image, and video data we train it with. What it doesn’t have is sensory inputs or real-world system models, like what it feels like when you wobble as you get going on a bike. But we have other ways to model behaviors and systems to augment what AI can do for us.
Key Takeaways
- AI is already impacting various areas of the entertainment supply chain, such as localization and scriptwriting.
- Tools like AI-powered dubbing and deepfakes for lip sync are being developed to enhance the localization process.
- AI can assist in script breakdowns, making the process more efficient and allowing human writers to focus on creative aspects.
- AI is being used in post-production and VFX for tasks like outpainting, image replacement, and color application.
- AI has the potential to improve efficiency in HR, finance, legal document review, marketing, data management, and lead generation.
- The HPA AI Steering Committee aims to demystify AI and guide the industry in adopting AI in a sustainable and ethical manner.
- The future of AI in the entertainment industry is evolving rapidly, with new tools and features constantly being developed.
- AI filmmaking competitions, like the one using the Leonardo tool, are becoming popular and showcase the creative potential of AI.
Atul: All right, so welcome everyone to the Vitrina podcast, and a special edition, at that, of the Vitrina podcast. A very exciting set of panelists today, discussing a very hot topic: the AI impact on the entertainment supply chain. I couldn’t have thought of two bigger speakers on this topic, so let me introduce both of our panelists. With us today is Seth Hallen, Managing Director of Light Iron and the President of the HPA. Seth, welcome to the Vitrina podcast. We’re doing this for the industry: 72,000 leaders, executives, and experts around the world tune into the Vitrina podcast and all of our newsletters and supply chain digests, looking for trends, patterns, and key developments within the entertainment supply chain. So welcome to this edition of the Vitrina podcast. And welcome to Craig German as well. Craig is a global leader in post-production on the studio side, had beginnings with Amazon Studios; his last profile was at Crafty Apes as chief executive officer, and he’s a board member at the HPA. I know both of you have dabbled a lot in AI recently, both in your day jobs and from the HPA perspective, so I couldn’t have thought of two bigger experts to look at the entertainment supply chain and how Hollywood is looking at everything that’s happening with AI.
And then you’re both fresh from NAB, so there’s lots to discuss today. But I just want to welcome both of you on behalf of Vitrina and our entire Vitrina Business Network. Thank you.
Seth: Thanks for having us, Atul.
Atul: Fantastic. So let’s get started on this topic right away. And I want to start with a very macro picture, a macro view. Over the last six months, we’ve been seeing a host of new developments in different aspects of the industry. There are new solutions happening, new innovations being attempted. In some cases, there is a fair amount of buzz; in some cases, we are actually seeing solutions roll out. And there is a specific visual I want to queue up, just to understand which aspects we think are most affected by AI right away. This is a view of the census that Vitrina has of the global entertainment supply chain. At this point, Vitrina has 125,000 companies within the entertainment industry profiled, and the breakup of those 125,000 companies by primary focus area is such that you’ve got 2,200 companies in pre-production financing, more than 81,000 production houses, and so on and so forth. One of the things we are faced with, and both of you will laugh at this, is that a host of new services and solutions are being launched by companies in each of these areas, which is getting Vitrina to expand our taxonomy and classifications of services. Some of these are brand new services within our industry, and they are replacing, appending, or modifying some of the more traditional offerings. I’ll give you a couple of examples, right? Localization is one place where we are seeing a host of new services being launched: synthetic voices.
AI-powered dubbing. We’re seeing one big trend of deepfakes for lip sync. And every other month, we’re seeing a very rapid set of new services being launched. But I wanted to take a pause here. I’ve mentioned the localization example, but as you look at this map of the supply chain, which are the areas that are already affected by some innovative solutions?
Craig: Well, yeah, sure. The localization one, obviously, people have been doing a lot of experimentation with. There are, as you know, a lot of challenges there: challenges in terms of talent and guild relations, challenges in terms of cultural sensitivity of translations, and concerns for the general job market. But there are also challenges that I frankly wasn’t aware of until I was working closely with the Amazon Studios localization team, in terms of what’s really done in adaptations for dubbing and the other musical or sound aspects of a television show or movie, which go well beyond straight translation. Because yes, I’ve been shocked over the past year at how many tools there are out there that say, just give us five minutes of your voice and we’ll be able to create your avatar in 20 languages. You can almost have a Tower of Babel translation, dynamically even. But those are straight translations, which, as we know, is very different from a professional production. So it’s not as straightforward. Having said that, one of the areas a lot of us at Amazon Studios found most interesting was back catalog. Any broadcaster or streamer licenses a large catalog of content, and that back catalog wasn’t controlled through the same production process you’d have for your originals, whether Amazon Studios originals or Netflix originals or anybody else’s. The net effect was that a lot of times they didn’t do the dubbing or subtitling at all; it was too costly to go create it upfront. That was true even for some originals, but for other licensed material, they weren’t doing translations or dubbing at all.
If the choice is between not having it at all in your local language versus having it in your local language but not perfect, a lot of people would argue the latter is preferable. So that’s, I think, a net positive: things that wouldn’t happen otherwise. But I do think the adaptation piece is much more nuanced. They do song replacements, even song rewrites. They have to avoid certain sensitivities, certain words. What we discovered, without going into too many details of Rings of Power, is that there were historical translations of the original books that you would want to preserve in a given language that don’t come through in more modern translations. So there are a host of challenges that don’t just come from straight learning, from watching and scraping the internet, or even training on typical models.
Seth: I totally agree. Just to pile on on that one: localization is different from translation. Translation is a part of the process of localization. Localization is not just translating one language to another; it’s actually making sure the cultural implication is captured in the new language, and those are very, shall we say, creative endeavors. Currently, and I’m sure moving forward there could be models trained to understand all of those nuances, sometimes certain scenes or certain bits of dialogue have to be completely rewritten to relate in a particular market. One little example I like to use: in America, we like to say cats have nine lives. Well, in Spanish, Italian, Greek, and other languages, cats actually have seven lives. So you can imagine if you take a one-for-one translation, people will think it’s just wrong, right? And that’s just one simple example of countless cultural adaptations of the dialogue. So even though, Atul, I think what you’re honing in on is that AI is enabling new tools, new methodologies to provide these services, and that’s true, different efficiencies are brought in, there’s going to be, I believe, and what we’re seeing is, a sort of evolution or change in the human role with these tools. Humans are going to continue to very much be in the loop, but in different ways, just like every technological revolution has brought to this business and every industry.
Atul: Fascinating. And look, I interviewed a Dutch company’s CEO a couple of weeks ago; they now do end-to-end localization, completely AI-powered and fully automated, including the dubbing. So as a customer of theirs, you can select the type of voice you want to read this in Dutch or, say, Spanish. You can do live testing, and once you figure out that this is the right voice for this particular actor, this face on the screen, you can make that selection and move forward. Just the speed with which that part seems to be moving is quite amazing, actually. We’re definitely seeing a lot of movement in that area.
One of the exciting things, just to relate the experience at the HPA Tech Retreat this year: Seth, you chaired a panel with a company that was applying a fair amount of AI and LLMs in pre-production. The demo we saw was a book converted into a full-fledged screenplay within minutes. It was just absolutely breathtaking to see what kind of new innovations can happen. Clearly, that area seems to be getting impacted as well.
Seth: Yeah, I think in a very positive way, actually. So think of some of these tools as co-pilots. And for screenwriting, it’s no different. I’m not a writer, but I’m friends with writers and I have some understanding of the process they go through. It’s very iterative. It starts with conceptualizing certain characters, certain situations, certain scenarios, coming up with your story, and writing the drafts of those stories. Then in many cases, almost every case I’m aware of with the writers I know, they’ll pass that early draft to a mentor or a friend or a partner to read and give them notes. Sometimes those notes are creative feedback, but a lot of times they’re technical notes: story structure, character arc, continuity, things such as this. What’s amazing about some of these tools is you can drop that script, a human-written script, into a tool and have your AI co-pilot do that first pass and give those first notes. Within seconds it can come up with some really interesting feedback. We know large language models are really good at sentiment analysis, so when it looks at character arcs, it can maybe pick out a bit of dialogue that might not be in sync with the arc it observed you might be going for, and it’ll give you a few alternative versions of that dialogue to consider. That’s very similar to what a human partner or mentor would do for that writer, except it might take a few days or a week for your friend to get back to you, and this is done within seconds.
Script breakdowns are very time-consuming processes. When you’re going to turn a script into an actual production, there’s a lot of work to understand: What are all the locations? What’s the wardrobe? What are all the props that are needed? How about the character summaries for each character? How about a synopsis of the story? These are things that large language models, and the tools built on them with UIs for very specific use cases, are really good at doing. Again, within seconds, they can do a full script breakdown and really aid the process. So we might ask, who used to do that script breakdown? Well, many times interns or people working in the writing department.
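An aside for readers: the mechanical core of the breakdown Seth describes, pulling out locations and speaking characters, can be sketched without any AI at all, relying on standard screenplay formatting conventions (scene headings start with INT. or EXT.; character cues are short all-caps lines). The sample script and regular expression below are invented for illustration; AI-assisted tools layer an LLM on top of this kind of structure to also extract props, wardrobe, and summaries.

```python
import re

# Scene headings follow the convention "INT./EXT. LOCATION - TIME".
SCENE_RE = re.compile(r"^(INT\.|EXT\.)\s+(.*?)(?:\s+-\s+\w+)?$")

def breakdown(script: str) -> dict:
    """Extract locations and speaking characters from a formatted script."""
    locations, characters = [], []
    for line in script.splitlines():
        line = line.strip()
        m = SCENE_RE.match(line)
        if m:
            locations.append(m.group(2))
        elif line.isupper() and 0 < len(line.split()) <= 3:
            # Short all-caps lines that aren't scene headings are
            # treated as character cues (a simplification).
            characters.append(line)
    return {"locations": sorted(set(locations)),
            "characters": sorted(set(characters))}

sample = """\
INT. DINER - NIGHT
MARIA
I told you not to come here.
EXT. PARKING LOT - NIGHT
JACK
You never listen.
"""
print(breakdown(sample))
```

A real breakdown also covers props, wardrobe, and per-character summaries, which is exactly where the language model earns its keep over pattern matching.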
Okay, those tasks are gonna get replaced, but I argue that these tools will make writers more efficient, and let some of those other folks, writing apprentices, say, skip the more mundane tasks that machines are much better at doing, so that people in the script department can focus on things humans do much better. And…
I think it elevates us. Craig and I get into some debates about this, and sometimes he accuses me of painting too rosy a picture. Of course there are exceptions to what I’m saying, but overall I see all these things as tools that are elevating and enabling the human creative process, and really all processes; we just have to look out for, and try to conceive of, how the human role evolves.
Atul: Fascinating.
Craig: And to support all of what Seth just said, I think there’s a lot of unfortunate friction in the process introduced by things as simple as plot inconsistencies between episodes, right? People can sometimes lose track, and that could be part of the co-pilot, the scriptwriter’s co-pilot or the writers’ room co-pilot. It can help you find those inconsistencies in story, those inconsistencies with character profiles, things like that, and can say, hey, are you sure this person would really say this, that kind of thing. And then one of the talks I sat in on, Seth, I was telling you guys earlier, my last talk before I headed to the airport from NAB, was put on by Amazon Web Services with Chris Delconte from the VFX department at Amazon Studios, plus two of the principals at Untold Studios.
They were asked, what are you using AI for? And it wasn’t to create visual effects. It was to do script breakdowns, expanded across much more than VFX: they hit casting, they hit production, they hit post, they hit wardrobe, everybody, right? And we know how many iterations you go through on a script in development and pre-production. It could be 20; it’s a ton. And every time, who knows how many pages have been touched.
And it’s hard to go and find the implication on your schedule, on what you’d already chosen in terms of wardrobe changes, in terms of your budget. All of those things could be assisted; otherwise people are doing mad scrambles to figure it out. Every department has to go figure that out every time, or not every department, sometimes only one or two are affected. So, really valuable. And just to put a cap on that, the Untold Studios guys.
They were saying, well, you know, one of the first AI applications we did was developing an intelligent Slack channel for our team, where we could actually query things about a production. So it has nothing to do with what people are worried about in terms of which job functions AI will replace.
Atul: Fascinating, fascinating. So right from stories to screenplay, screenwriting, and then of course selection of talent, even suggestions. We saw a couple of demos relating to suggestions: for this particular character in this story, who would you recommend? And it did not give a definitive answer of, this is the exact star; it actually gave a set of options, at least a set of suggestions to get you started.
Seth: Yeah, well, actually, at the Tech Retreat, the demo that you’re talking about, that’s right, there were two writers in the audience who were kind enough to donate the script they wrote for the demo. And in the presentation, in the demonstration of this concept-art app, which was being delivered by its founder Christian Cantrell, the app did suggest certain actresses for the main character, for the lead role. In fact, the one that came up as the number one recommendation to play that leading role was who they had in mind when they were scripting; wasn’t that amazing? Unfortunately, I didn’t find that out until after, when the writers came up to me all excited. I wish it had come out in front of the audience; that would have been really dramatic. But that was pretty amazing, because it will suggest actual actors, and it will then generate images, using a Midjourney plugin, with those particular actors in costume, in those settings. It’ll start to storyboard and do all of this pre-visualization of some of this concept work early on.
Atul: That’s right. That’s right. And as the demo kept going forward, in that same HPA panel discussion and demo, it got into selection of locations: where should the shoot happen for this sort of story? It then moved into dates, and which characters were occurring most often with each other.
And that has a scheduling impact. It’s just astonishing. A lot of these are actually productivity enhancers; they just make the whole schedule run much more efficiently, a tight ship, is what I’m guessing. Right.
Craig: Yeah. And I think my experience seeing it from a post-production perspective, interacting with a whole bunch of other teams, was that a lot of times it was a mad scramble.
Somebody wouldn’t know that a decision had been made or that something had been changed; they’d go down the line on a certain assumption and have to go back and backtrack. So I agree, I think it removes a lot of the chaos of the process, so people can really focus on what people do best. If it’s casting, they’re going to have the conversations with those agents: hey, are they available? Would they be interested? What are they trying to do with their next movie?
If it’s wardrobe, they can focus completely on changes and the logistics of how they manage wardrobe changes and transport and all that. So there’s just a lot of, I won’t call it hidden waste, because everybody knows it’s happening, but it’s the stuff people do that they’re probably not excited about: figuring out the impact on the whole of an entire production.
Atul: Fascinating. So there’s a raft of supply chain elements where AI could enhance or increase productivity or just make things faster. Again, both of you are experts in post and VFX and that entire range of services. What developments are you seeing in those areas relating to post-production and VFX?
I know there’s both the impact of AI and very sophisticated computing and rendering breakthroughs happening. You’ve got gaming engines playing a big role now in special effects. So talk a little bit about those developments. Are dots connecting between gaming, cloud, and gen AI in weird ways that you hadn’t seen coming?
And Seth, you have a perspective on this?
Seth: Yeah, that’s an interesting thought. I mean, obviously we know how the Unreal Engine and things like that have kind of crossed over between gaming and scripted television and feature content. I’m not sure I’ve seen many new tools there, but maybe Craig
has; I haven’t. You know, when you talk about post, and Craig can speak better on VFX, there are certainly some really interesting tools out there. Runway, as an example, is very robust, very feature-rich, well funded, and they continue to develop. It’s a pretty amazing new tool: an editing system with timelines, where you’re able to bring in video and do editing and effects. And they have this whole suite of AI Magic Tools powered by generative AI that you can use to do all sorts of things: outpainting, image replacement, even color application, things like that. And then you can easily do text to image, text to video, and bring those into the timeline. They were actually one of the first to come out with text-to-video technology, in their Gen-2 last June, I believe. So we’re looking at those things very closely, and in fact starting to use some of those AI Magic Tools, well, at least in terms of our testing; I’m not sure we’ve actually used them on a live project yet.
And then there was an announcement just the other day: Adobe announced that they’re integrating Sora, Pika, and Runway into Adobe. So when you’re working in Premiere, I believe it is, you’ll be able to create a space for a bit of video, hit the Sora button, enter a text prompt, it’ll give you a few images to choose from as a starting image, and then it’ll generate that video and you can drop it right into the timeline. So they’re kind of setting the stage for us to imagine where it’s going. But Craig, maybe you can speak to VFX a little better. What have you seen?
Craig: Sure. Well, first I just wanted to say I actually got a chance to go by the Adobe booth and look at the Firefly area, where they were talking about these yet-to-be-released features. They don’t have a date for them, by the way, of course, and they weren’t even comfortable saying yet that any of those three are partners; they are in talks. But they did have the dropdowns implemented, so you could go in and say you want to generate this. And one of the most impressive things I found about Adobe, beyond just being who they are, is that they’ve made a conscious choice to be what they would call an ethical tools developer. What that means is they have artists going out and generating original training data to train their Firefly video model, so it complements the other three we talked about. And they are part of, I think it’s called, the Content Authenticity Initiative; they are signatories, along with, I think, Microsoft and The New York Times, ironically, because Microsoft is being sued by The New York Times for taking their data. But anyway, they’re all in there, talking about how to ensure that this training is done properly. It was really cool. The features they talked about creating on their own within the tool will include generating basically stock footage, well, replacement stock footage, B-roll kind of footage.
Adding and subtracting objects, like Seth was saying, and extending shots if you need to time something out longer. So, really cool things, and obviously there’s going to be a lot more to come from these innovators. On the VFX side, you know, when I was back at Crafty Apes, one of the big things they were focusing on was de-aging. That was a big focus. And there are definitely a bunch of companies there, like I think
one of the ones focusing a lot on that was MARZ. Obviously last year decimated a lot of different companies, and MARZ is still around; I think they’re focusing mostly on their IP and not on their services, or maybe they’re now focusing on services again. But de-aging was one area where, honestly, for the quality you’re trying to generate for Hollywood productions, it was still not there. Often you’d end up having to do a lot of artistry to correct the AI. Yeah.
Atul: I heard the same. Just to add to what you’re saying, I heard the same thing on the localization piece, for deepfakes for lip sync. The place where AI is breaking down, not able to get the right perspective, is when you have a sideways shot, right? If the mouth is moving, it breaks, and it’s not very smooth. So there are limitations in some of these cases, right? Of how…
Craig: But I mean, when you look at this, and I think Seth has really been diving into this a lot more deeply than I have, but the cool thing that’s coming, I think, is the marrying of the different AI frameworks.
So we’ve been talking about generative AI because all of a sudden you had this confluence of compute power, new models that work much more efficiently, and people who said, let’s go grab the internet, for good or bad. All of a sudden it’s accessible to everybody. We can all relate because you don’t have to be an AI expert; you can type something in a chat or draw something. Seth will talk about that a little bit, like with Leonardo at NAB.
But it’s accessible to the common person, which means it’s in all of our minds, so all of a sudden people are saying, how can we use it more? But gen AI can only go so far without spending a lot of money and, probably, a lot more data. You can also build alternative models that are much more structurally based, symbolically based, and marry them up. Everybody’s looking at RAG, retrieval-augmented generation, where you can actually go and validate some content and say, does this actually make sense, basically, or look up some information that’s missing. So they’re going to get there with the structural pieces; it takes a lot longer if you don’t. I think it’s Yann LeCun who has been talking about how they’re never going to get there with LLMs alone; just look at how long it would take AI to learn what a child can learn. A child can learn to ride a bike in just a couple of hours; AI won’t be able to do that unless it tries for a year, and I’m just exaggerating here. So there are missing data sources that AI doesn’t currently have. It just has what you feed it off the internet or out of documents. It doesn’t have sensory inputs, like what it feels like when you wobble on a bike. But you have other ways to model that, which you can feed in for specific kinds of capabilities.
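An aside for readers: the RAG pattern Craig mentions, retrieve supporting text first, then hand it to the model alongside the question, can be sketched in a few lines. This toy uses keyword overlap in place of the embeddings and vector store a real system would use; the documents and function names are invented for illustration.

```python
# Toy retrieval-augmented generation (RAG) sketch: before generating,
# fetch the most relevant supporting text and prepend it to the prompt
# so the model can look up facts instead of hallucinating them.
def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(terms & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Assemble the augmented prompt an LLM would actually receive."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Episode 3 was shot at the harbor location in March.",
    "The lead character drives a blue 1978 pickup truck.",
]
print(build_prompt("What truck does the lead character drive?", docs))
```

Production systems swap the keyword overlap for embedding similarity, but the shape, retrieve then generate, is the same.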
Atul: One aspect I was curious about is the post-COVID world within the entertainment industry. Had COVID not happened, I don’t think this much video would have ended up in the cloud, especially given the remote workflows introduced at the time out of the compulsions of COVID. Is that creating lots more video data now? Because with a lot of video stored in the cloud, you can run all kinds of metadata extraction: object identification, place identification, person identification, and so on and so forth. I just wanted to see if that has had an effect.
Craig: I think you’re right. I know that at AWS, well, it’s not really directly AWS, but AWS’s clients, people who build on that infrastructure, those service companies like, let’s say, Deluxe.
They have a large footprint around the world and a lot of clients. So with permission from their customers, they had a unique vantage point where they said, hey, we’ve got all this data in the cloud and we want to run some analysis on it; we’ll show you how we’re going to do it, and we’ll tell you how we’re protecting it, because obviously that was a big hurdle. It was a very hard choice for a lot of people to say, we’re going to put all of our crown jewels in the cloud. People were pretty careful about it, obviously, to make sure it’s a legitimate process, but then they do have the ability to do large-scale analysis. And I’m just picking AWS; obviously Microsoft has been doing it on Azure, then you have Google Cloud, and then you have other separate contained clouds and local infrastructure where people are able to do that. But yes, to answer your original question, I do think it gave certain service providers an ability they might not have had as easily if that step hadn’t been taken during COVID.
Atul: Fascinating. Fascinating. And just before we move to another very hot topic, a couple of initiatives the HPA is doing in AI, one other question, again on the meat-and-potatoes stuff. I spoke to Chris from Crafty Apes, and one of the things he talked about was the meat-and-potatoes stuff, a lot of the regular work that otherwise happens in post, and how AI tooling is really reducing the time and effort now required to do some of it. Is that a big contribution that’s already mainstream in post?
Craig: Yeah, like roto. I think that is definitely something that is quickly going to be mostly, if not completely, machine; maybe there are outlier cases. And matte painting, like Seth was talking about, paint-in or paint-out: those, absolutely. But even, you know, I stopped by the MTI booth with Larry Chernoff and Cortex, and he showed some new features they were doing with AI. The funny thing was, some of the features they were showing and talking about as AI, now that everybody’s talking about it, they were already doing a year ago. There are some new ones they showed, like jump-cut smoothing and black removal for ads, ad insertion, things like that. Really cool when you look at it: we’ve all seen, if you’re in an edit suite and you slow down and stutter through frame by frame, you see some jumpiness sometimes.
And when you watch what they do, they basically reconceptualize the flow and smooth it out. But as Larry was pointing out, there were some artifacts that came from the way the AI was doing that, from the way it's trained. Like, there was a thinness to a woman's leg coming through a revolving door that you could manually paint in and then carry through. But yeah, I think there are definitely AI assists going on in the tooling that people are using. And I think the key thing will be, I wouldn't even want to hazard a guess at how many individual tools there are to do all kinds of AI corrections, but I think they will probably coalesce around some core groups of features that particular tool vendors either buy up, implement on their own, or integrate with.
Seth: Before we move on from this topic, I want to jump in and say one more thing that I've been thinking about a lot. In our industry, we're in the picture and sound business. And how much fun it is to play with these tools that have captured the world's imagination in terms of creating pictures, creating sounds, creating voices, creating music, creating videos, animations. Of course that speaks right to the heart of what we do in production and post. I think it's important, however, for us to remember that we are at a time in our industry when the supply chain is challenged, and anybody in the supply chain has to reduce their costs and create efficiency and scale. We've always known, all through the years, that one of the best ways to do that is through technology. I think it's important for us to remember that AI does a whole heck of a lot more for our businesses than the picture and sound part of things.
And so for certain things that every company needs, AI can play a huge role in making them more efficient. So I've been focusing on a lot of the, let's say, not-sexy stuff: things AI can do to make the HR process efficient, reading resumes, recruiting; finance, analyzing spreadsheets, creating graphs and charts of data. How about legal document review? How about marketing and copy and those sorts of things? How about data management and security? How about lead generation and market analytics? It goes on and on. And to be honest, if we're talking about how these tools can help the supply chain, I think some of that is actually even lower-hanging fruit. Those more mundane things that every business needs are where I think we all need to look just as hard, if not even harder, than at how it's going to impact picture and sound creation. Just thought I'd throw that in.
Craig: And I can't agree more. Seth and I both took this class that I think everybody in the universe is...
Seth: Actually, we can say all three of us have taken that class on AI and business strategy.
Craig: That really got me thinking exactly the way Seth was talking about. And I would take it even one step further: if you look at every single person in a company, whether you've got a hundred-person company or a thousand-person company, and you say, I'm going to improve their daily efficiency by 5%, look at the cost efficiencies you've just introduced into your entire company by using, let's say, Microsoft Copilot. I spent a bit of time playing with that at NAB too, nothing at all to do with content creation, but much more about daily efficiency: being able to ask what another group is doing within your company if you've got an enterprise release. Or, as I think Seth has also been doing a lot of, capturing the notes from a Zoom call and translating that into action items and scheduled meetings and follow-ups and emails and summaries, all of that. I know I used to spend two hours a day doing that stuff, and for what was such a big job, I would imagine it now takes minutes just to go review it and say, does this look right? Are there any weirdnesses in it? Actually, one of the big takeaways I had from some of the talks at NAB was that one of the more important parts is the QC aspect of everything. And that's where you need experts who can go through and say what an efficient way to do it is. Prompt engineering sounds like it's been overused as a potential role or strategy, but whatever you're guiding your AI partner to do is going to have to happen through a range of interactions that are like prompts, basically, right?
Atul: Fascinating. I mean, there is just so much potential in that area. In fact, those tools, because they are applied horizontally across industries, are moving faster in their evolution and their development.
Craig: Yeah, they get broad use-case exposure, so they can suss out the needs of the broader market.
Atul: Exactly, exactly. Fascinating. This is super inspiring. So I'm talking to two HPA board members and a president, and I know that HPA recently instituted an AI steering committee. I was curious about the key driver of that. Is it to look at new technologies coming in? Is it to guide some of the industry's adoption of AI? I'd love to get a little sense of that.
Seth: Yeah, we haven’t had our first meeting yet. That’s in the works of scheduling that, but we have a number of really amazing professionals that are going to join us for our first meeting. It’s the HPA AI Working Group.
And the idea is that there's so much demystification that needs to happen on this topic, and not just for the community, but for ourselves as well. Since there's so much uncertainty moving forward, and this world is moving so quickly, with new things coming out every day, every week, I think it's extremely important that we are mindful that communication and collaboration are more important than ever. So to have a group of people, organized in a meaningful way, to have these conversations, to start to think about how these tools can be integrated into our workflows and into our companies in sustainable and ethical ways, to try to help the industry by shining a light on the path forward, which has not been paved, right? To be part of the leadership of our industry that can find that way forward, I think, is so important. And that's the general idea. So we're going to start with an organized agenda, start having these conversations, and see if together we can really be a part of continuing the success of our future in this area.
Atul: Fascinating. No, look, there is a big push required within the industry just to understand what's happening, and for experts to parse out where some of these aspects are headed from a tooling perspective and an adoption perspective. As you would know, we have a host of members who are not from the US and not from Hollywood. I'd love to get a little plug for HPA: what does it stand for, and what is its charter? Either of you, if you can take that for the benefit of our users dialing in from around the world.
Seth: Well, correct. It's just as much yours as anyone's. But OK: the Hollywood Professional Association is the nonprofit trade organization for production and post. Traditionally it's been focused more on the post-production community and industry, but it covers the tools and technology, the creative, and really even the commercial aspects of the supply chain of the production and post industry. We like to think of ourselves as having three pillars: community, knowledge exchange, and recognition. Through a series of initiatives and events, we exercise those three pillars, including our two tent-pole, flagship events. The first is the HPA Tech Retreat, which we were just talking about, which happens every February in Palm Springs.
Then there are the HPA Awards in November, which celebrate the creatives, the artists, and even some of the engineering accomplishments of our industry. And then we have various other initiatives, like Women in Post, Young Entertainment Professionals, HPA All, and HPA Net, that really help bring value to the community and to the industry. Ultimately, the HPA is a platform for anybody in the community and the industry to exercise leadership and thought leadership, and to participate in the community and in the industry in a really meaningful way.
Atul: Got it. Got it. Fantastic. You know, look, a couple of aspects. One is that we covered a lot of ground, but we didn't cover certain aspects, so it looks like a part two of this is required very soon.
I think the aspects we would love to dial into are animation and sound; I think we talked a little bit about those. Is there a distribution angle from a curation point of view? There's a lot of speculation about more sophisticated recommendation engines and personalization on streaming platforms, which are structured very differently now because of all the developments happening in the AI space. Just last thoughts, for the purpose of this conversation: in the next few days, what are you most excited about? Are you testing some new software, meeting new companies? What's the most exciting aspect? Or maybe waiting for some new launch?
Craig: I think there’s no such thing as waiting for a launch with AI anymore. When I was first thinking, okay, I need to get more savvy on AI, I would go, I think I subscribed to TLDR and maybe I think I looked at a couple of others that were particular individuals like Azeem Azhar. And then every time I’d read an article, a medium or something, I’d say, I think that person knows what they’re talking about.
And so now I've got this explosion of emails every day. There's a lot of repetition, so I've been trying to cull it down. But there was something, and I don't want to overdo it, Sam Altman gets way too much exposure, I know that, but there was a quote I saw from an interview he gave, probably in the past couple of days, where he said: look, if you're a startup and you're focused on ChatGPT-4's features, you're going to be left behind. What you should be looking at is how you keep current with the evolution of the tools, use the newest features, and plan as if the improvements you can imagine are going to be made. Because, he said, basically every time a release happens, it's a hundred-x improvement in some dimension. So when you think of it that way, obviously the downside I'm really curious about with AI is what's going to happen with the impact of these lawsuits about provenance, data provenance and fair use and things like that. Because if they do have to go back and do something, hopefully it's going to be a middle ground where they pay some aggregate amount into a fund and start paying for access to training data, which is what they're starting to do, like with Reddit. Reddit got that massive commitment of $200 million over the next few years for their data, and there are other commitments like that. So I think that's going to be really interesting. But it is evolving so quickly. I just have to say, and Seth could describe it better, but the most fun thing I saw at NAB was the AI filmmaking party slash e-sports competition, where they would have two teams, unfortunate coloring, red and blue, and both teams would be using Leonardo, and they would get a prompt of a scene or an idea.
And then they would draw what they imagined would best create the winning output. It's sometimes stick-figure-ish, or mostly stick-figure-ish, I would say, and then you'd end up with these amazing images on the other side, split screen for both teams. It appeals to me as a failed Pictionary player from many years ago: I'm terrible at drawing, and I'm encouraged that I could actually make something amazing happen. Maybe not world class, but at least satisfying. And it's so fun that it's a husband-and-wife team, Caleb and Shelby, from Curious Refuge, whom Seth has gotten to know really well. And they would rate every team each round; I don't know how long that part of the competition went. I think that concept is going to explode. I think that's going to be something people do everywhere now.
Seth: Groundbreaking. Yeah.
Craig: First one of its type. Wow.
Atul: I’m going to dial you actually, Craig, on that one to understand that a little bit more. But this is fascinating. Look, as I mentioned, we covered a lot of ground. Looks like there is a part two of this, maybe part three, just with all the developments that are happening. But one of the things that we will do is we will summarize this conversation for our users as well and try and link it with some of our observations on these new services and new tools, even as Vitrina’s tracking those as part of the supply chain mapping that we’re doing globally.
But just on behalf of Vitrina and our entire business network of 72,000 leaders, execs, and experts, we can't thank you enough for your time today and for an incredible conversation. Lots of ideas in there.
Seth: Thank you.
Craig: Yeah, thank you for having us.
Atul: All right.