Redefining CyberSecurity

Deepfakes, Publicity Rights, and the ELVIS Act: The Intersection of Intellectual Property, AI, and Your Likeness | A Conversation with JC Heinbockel | Redefining CyberSecurity with Sean Martin

Episode Summary

Dive into this episode of Redefining CyberSecurity, where host Sean Martin and intellectual property lawyer JC Heinbockel explore the complexities of the ELVIS Act and the evolving landscape of rights of publicity in the age of AI. Discover how businesses can navigate the legal and ethical challenges of deepfake technology and protect their intellectual property.

Episode Notes

Guest: JC Heinbockel, Associate, Seyfarth Shaw LLP

On LinkedIn | https://www.linkedin.com/in/j-c-heinbockel-6563996a/

____________________________

Host: Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/sean-martin

View This Show's Sponsors

___________________________

In the latest episode of Redefining CyberSecurity, Sean Martin delves into an intriguing conversation with JC Heinbockel, an intellectual property lawyer specializing in brand protection. The episode focuses primarily on the intersection of the ELVIS Act and rights of publicity in the age of AI.

The discussion kicked off with JC Heinbockel providing a primer on intellectual property and the rights of publicity. He explained that while intellectual property encompasses discrete categories such as copyrights, patents, and trademarks, the right of publicity is more nuanced and often intertwined with personal privacy rights. Essentially, the right of publicity allows individuals to exploit their likenesses for commercial purposes or prevent others from doing so without permission. Heinbockel emphasized that the right of publicity is particularly relevant to celebrities and public figures whose likenesses hold significant market value. However, with the advent of generative AI and deepfake technology, protecting one's likeness has become more complicated.

The new ELVIS Act (the Ensuring Likeness, Voice, and Image Security Act) in Tennessee is designed to address these challenges by extending the right of publicity to include voices and by explicitly targeting the misuse of likenesses through deepfake technology. The episode also touched on various instances where deepfake technology has already led to unauthorized use of celebrity likenesses. JC Heinbockel cited examples like deepfake ads featuring Clint Eastwood and Tom Hanks, highlighting the legal and ethical complications these technologies introduce.

The ELVIS Act serves as a legislative response to these advancements, aiming to protect individuals' likenesses from unauthorized commercial exploitation. For business leaders and security professionals, the conversation underscored the need to develop robust AI policies, especially within marketing and advertising departments. Heinbockel urged organizations to carefully navigate the use of AI in creating content, as both the input and output of AI-generated material need to be scrutinized for compliance with existing laws and ethical standards. Moreover, the potential pitfalls of using generative AI extend beyond marketing to areas such as customer support and even internal operations.

Heinbockel warned of the risks associated with using AI platforms that might inadvertently disclose confidential information or generate legally dubious content. He emphasized the necessity of setting strict guidelines and having comprehensive policies in place to mitigate these risks.
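
As a purely illustrative sketch of that guidance, here is what one such guardrail might look like in practice: a pre-submission check that masks obvious confidential markers before a prompt ever leaves the organization. The patterns, the redact_confidential helper, and the example code name are hypothetical illustrations, not tools or policies discussed in the episode.

import re

# Hypothetical patterns only; a real policy would list its own confidential
# markers (client names, project code names, credential formats, and so on).
CONFIDENTIAL_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-ID]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),
    (re.compile(r"(?i)\bproject\s+nightingale\b"), "[REDACTED-CODENAME]"),
]

def redact_confidential(prompt: str) -> tuple[str, list[str]]:
    """Mask known confidential markers and report which kinds were found."""
    findings = []
    for pattern, label in CONFIDENTIAL_PATTERNS:
        if pattern.search(prompt):
            findings.append(label)
            prompt = pattern.sub(label, prompt)
    return prompt, findings

if __name__ == "__main__":
    raw = "Draft ad copy for Project Nightingale; questions to jane.doe@example.com."
    cleaned, hits = redact_confidential(raw)
    print(cleaned)  # the masked prompt is what would be sent to an external AI platform
    print(hits)     # an audit trail for whoever owns the AI-use policy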

The episode concluded with a call to action for companies to be proactive in understanding the implications of using AI and to plan accordingly. By doing so, they can better navigate the complex legal landscape surrounding intellectual property and publicity rights in the digital age. This timely discussion with JC Heinbockel highlights not just the challenges but also the opportunities for businesses to adapt and thrive in this evolving technological environment.

___________________________

Watch this and other videos on ITSPmagazine's YouTube Channel

Redefining CyberSecurity Podcast with Sean Martin, CISSP playlist:

📺 https://www.youtube.com/playlist?list=PLnYu0psdcllS9aVGdiakVss9u7xgYDKYq

ITSPmagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

___________________________

Resources

The Gadgets, Gigabytes, & Goodwill Blog: https://www.gadgetsgigabytesandgoodwill.com/

___________________________

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit: 

https://www.itspmagazine.com/redefining-cybersecurity-podcast

Are you interested in sponsoring this show with an ad placement in the podcast?

Learn More 👉 https://itspm.ag/podadplc

Episode Transcription

Deepfakes, Publicity Rights, and the ELVIS Act: The Intersection of Intellectual Property, AI, and Your Likeness | A Conversation with JC Heinbockel | Redefining CyberSecurity with Sean Martin

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.

_________________________________________

Sean Martin: [00:00:00] And hello everybody. You're very welcome to a new episode of Redefining CyberSecurity on ITSPmagazine. I'm your host, Sean Martin, of course, where, as you know, if you listen to the show, I get to talk to all kinds of cool people about lots of cool things, security and privacy related. And, uh, yeah, today's no different. 
 

I have a really cool guest on and the topic I think is going to tweak our minds just a little bit because it's going to force us to think about stuff differently. And, uh, I'm thrilled to have JC Heinbockel on to talk about IP and, and privacy and, you know, perhaps some deepfakes and impersonations and who has rights to what and who has legal control over things and legal recourse if they're being abused or, or whatever, online or however. 
 

I guess it doesn't have to be online even, but JC, I'm, I'm thrilled to, uh, thrilled to connect with you, and a shout out to [00:01:00] Pouya for making that connection. Thanks for reminding me. 
 

But yeah, thrilled. So, um, we're gonna just see. The idea of, uh, something being called the ELVIS Act coming out of Tennessee, that, that was intriguing all in itself. So, but anyway, here we are. Um, we're going to have some fun today. Tell us a little bit about, uh, who you are, what you're up to. I know you do some, some legal work, obviously some, some IP stuff, uh, working with Pouya, but, uh, who's JC? 
 

JC Heinbockel: Sure. Um, well, you, you, you started with, uh, with the career, so I guess that's where I, I'll start. I, I'm a lawyer by trade.  
 

Sean Martin: It's a very American thing to do, isn't it?  
 

JC Heinbockel: Isn't it? 
 

Sean Martin: Who are you? And that means where do you work? 
 

JC Heinbockel: Well, uh, I, I, uh, I'm an intellectual property lawyer. I focus on brand protection, typically. Um, I, I came to IP [00:02:00] protection because before I went to law school, I was myself a, an amateur artist of sorts. I played in a rock band while I lived overseas in China. And that was a fun, exciting, uh, time to be in a creative community. 
 

There were all sorts of people doing fun, interesting things. You could start a business or a project at the drop of a hat. And I was around all these creative people and I was doing creative things. I was making no money and I was trying to think of, you know, where, where does my career go from there and applying my skills and my interests. 
 

And that's how I got into intellectual property and intellectual property law. And, uh, it's a fun journey. It's a really exciting field, a lot of fast moving developments. You get to learn about so many cool, uh, businesses and brands and projects and inventions and, uh, fun laws like the, the ELVIS Act that, that's brought us together. 
 

That's right.  
 

Sean Martin: Yeah. Oh, [00:03:00] no, that's, that's Scooby Doo anyway. That was a very bad impersonation. Stop right there. Um, let's, let's paint a big picture first. Um, because one of the things we wanted to talk about is kind of the rights of publicity and how those interact with intellectual property rights. And the first thing that comes to mind for me there is celebrity and paparazzi and the never ending battle there. 
 

And you, and you mix in brands and influencers, and I mean, we can start to tack on all kinds of layers on this, but how does, because I mean, an influencer to celebrity, ultimately as a brand, perhaps they have, they have IP. I saw some, some posts the other day, it was another one from Pouya, funny enough, about, uh, uh, Taylor Swift has a lot of patents and things or trademarks or IP on. 
 

Stuff that she's, terms she's [00:04:00] created or whatever. Yes. Anyway, what's, what's this big picture? I don't, I mean, when I think IP as a security business professional, I immediately go to somebody building some software, right, because that's my mind. But it seems like it's much, much bigger. So what, what are some of the things you look at? 
 

JC Heinbockel: Yeah, well, so you're, you're right that intellectual property tends to think of certain, you know, discrete categories of things, right? You've got your, your copyright, which is protection in, in creative works. You've got patents, which are protections over inventions, and you have trademark law, which is protections around, uh, branding, source of goods and services. 
 

The right of publicity is adjacent to all of that, but it really comes out of kind of more personal privacy rights. Um, it's a patchwork of laws that span across all 50 states, where it's a different regime. And it [00:05:00] originally kind of comes from this idea of the right of privacy, right? That you, you have your inherent right to protect your own image and likeness from, from misuse. 
 

The right of publicity spins off of that to commercial uses, and it differs in every state, right? Not everyone has a right of publicity, as it were, in every state. Right.  
 

Sean Martin: Can you, I'm not, can you describe right? It would, yeah, shall I say, the legal description of right? Well, so yeah, so long. So maybe an, an abridged version. 
 

JC Heinbockel: What, yeah. What I, what I was trying to get to is that the right of publicity is essentially the right to exploit your likeness for commercial purposes, right? Or, in another way, the right to exclude others from using your likeness for commercial purposes. Likeness is maybe an ambiguous term, but that's almost intentional. 
 

The likeness is typically going to be a person's image, right? A photograph of you or, you know, [00:06:00] moving image of you. It might be your name. Um, it might be, you know, your, your signature or something like that. Um, in some places, your voice is included. And in fact, the ELVIS Act is a statute that is meant to explicitly include voices within the scope of likenesses, at least in the state of Tennessee. 
 

But so that's, that's really what we're talking about. It's this right to, to use and to prevent others from using you for commercial purposes, whether that's advertising, mostly advertising, but other sorts of, of, you know, promotion or commercial ideas.  
 

Sean Martin: So would a scenario be, I'm sitting in this really cool, swanky place drinking a name brand beverage? 
 

I post that on Instagram or something, and would it be that I have the right to publicity, but only for [00:07:00] myself, that could prevent the name brand of that beverage that owns that, Coca-Cola, or, I just said it anyway, the name brand of the, of the drink, preventing them from using my likeness in that image to promote their... 
 

JC Heinbockel: That's, that's, that's a big part of the idea. Yeah. We actually see a lot of cases right now, more, more so actually involving copyrights, but, uh, where brands find pictures of celebrities wearing their clothes taken by paparazzi and they say, oh, look at this, so and so is wearing our, you know, our, our famous sweatshirt, check it out. 
 

And then there's a lawsuit incoming, right? The paparazzi has a copyright. They took the photo, they own that, and they have the right to prevent other people from, from reproducing it. But the celebrity might not want to be, you know, an endorser of that brand, and so they might be able to say, hey, you don't have the right to use me in a post promoting the sale of this sweatshirt just because I'm wearing it. [00:08:00] So that's, that's publicity rights. Um, we've seen it a lot in, in advertising contexts, more formally than social media, right? 
 

One of the kind of foundational cases that I think almost every law student reads in just their general property class, even if they're not studying intellectual property, um, was a case involving Vanna White, uh, where there was, uh, a brand that used a robot that kind of looked and acted like Vanna White, you know, turning the letters on Wheel of Fortune. 
 

And she said, well, hey, wait a second. Like, there's, there's value in that. I, you know, I sign a lot of contracts and do a lot of ads. You're not paying me. You're violating my likeness, my persona. Um, you know, that's, that's kind of how it all starts.  
 

Sean Martin: Interesting. Interesting. And I, I get to dabble in some, some stuff that takes me to premieres and openings and red carpets and things like that, where [00:09:00] plenty of people are standing next to a step and repeat with tons of brands. 
 

JC Heinbockel: Right, right. Yeah. It goes both ways, I guess. You know, the brand has trademark rights, right, of someone creating an association with them. Just like someone with a commercially valuable image has a right to prevent someone else from creating an association with them that they don't want.  
 

Sean Martin: All right. So earlier today, and I mentioned this before we started recording, we were talking to some folks out of the European Union about, uh, data privacy and the AI Act. 
 

And one of the things that was triggered for me during the conversation was the idea of, so I think what, what we just described is a real person wearing a real brand used by a real brand to benefit from that. Um, but the thought that I had was kind of this misinformation and deepfakes, where [00:10:00] brands or others, um, are misrepresenting people and the use of their brand. 
 

Maybe, I don't know, manipulating images, creating images, voices, whatever, creating ads. What are you seeing in that regard?  
 

JC Heinbockel: There's a lot of proliferation around that right now. Um, and that's one of the reasons why, you know, uh, the state of Tennessee ultimately decided to pass this ELVIS Act, which we keep talking about. 
 

I guess I should mention Tennessee has had, uh, Tennessee has had, uh, laws on the books for the right of publicity for, uh, at this point, almost, uh, actually almost exactly 40 years. And part of that came from the need to protect the Elvis Presley estate, um, you know, and extending postmortem rights to, you know, celebrity likenesses. 
 

But the general idea was this was, this was a law on the books, as opposed to the type that is just [00:11:00] kind of the court looking at general principles of law and creating a scope of rights that way. And the idea was that it protected the estate from the unauthorized use of a person's likeness for commercial purposes and in particular for advertising. 
 

So you couldn't say, oh, you know, this was Elvis's favorite peanut butter and bacon sandwich right here, right? That's, that's against the law. The ELVIS Act is now an extension of that, is to essentially prevent, uh, a lot broader use of likeness and in particular voice. And that's a response to deepfakes. 
 

We've seen a lot of action on deepfakes from a personal privacy standpoint already. A lot of states have taken action because of the potential for deepfakes to create, uh, sexually explicit content involving people who, you know, don't want to have their [00:12:00] images used for that, right? For obvious reasons, without their consent. 
 

Um, this is the first law here in Tennessee now that is applying the similar deepfake principles to commercial purposes. But that's also a response to the fact that we're already seeing it happening. Over the last year, we've seen a number of deepfake ads, um, some of which have led to lawsuits and some of which haven't, but I think we've seen, for example, uh, Clint Eastwood in ads for CBD gummies, which he took no part in. 
 

Uh, there was a story about Tom Hanks.  
 

Sean Martin: Right,  
 

JC Heinbockel: right. Um, there was a story where Tom Hanks had to disclaim, on social media, his involvement, uh, because, uh, there was a dentist's office that I think was, was using deepfake Tom Hanks in ads for their practice. Um, you know, it's becoming a bigger problem, and it's one thing for it in a commercial [00:13:00] context, right? 
 

You know, people have valuable images that they're, they, they want to, you know, use and license and endorse things with, but you're also seeing it on the flip side in more troublesome contexts, right? And probably we'll see more of it over the next six months, uh, in the political realm because these have gotten so convincing. 
 

And so states are really concerned about this, and there's no national law, really, that prohibits it beyond general principles of, of unfair competition and advertising, at least in the commercial space. Um, and so the federal government, and that's partly constitutional, to get really in the weeds, but they've left it up to the states to kind of create this patchwork and, and regulate their, their own jurisdictions on this type of stuff. 
 

Sean Martin: Ah, and I, there's so much here. Um, I think, yeah, I left you with a lot. Cause the other thing I'm, I'm thinking is, um, [00:14:00] yeah, I don't know how, how this compares to individuals. I'm thinking like the SAG-AFTRA stuff from, um, where there's, there's an agreement, but if they happen to be in, in the state of, you know, well, I guess the agreement ended up being that they couldn't use the likeness anyway, but if the agreement is, we grant you the use of our likeness. 
 

But you're operating out of Tennessee. There might be some, some conflict there, perhaps. I don't know.  
 

JC Heinbockel: I think probably not, right? Because the key is unauthorized use of a person's image. And so, you know, I'm not familiar with the details of the SAG agreement. But I guess the general idea is that you're, if there's consent to using it, right? 
 

If you're consenting to let the studio scan your body to create, uh, duplications for, for different purposes, then that is outside the scope of it, [00:15:00] right? You, you've said that's okay. Um, now if they then went ahead maybe and sold it to, you know, a potato chip company to use in a commercial, maybe that was outside the scope of, of what was permitted in that agreement. 
 

And that could be a problem. Um, so, so there's nuance there.  
 

Sean Martin: Okay. So bear with me now, because we, my audience is generally very heavy, very interested in privacy, of course, um, but, and very business oriented. So business leaders, security leaders, privacy leaders. And so what I want to do is, maybe this is a great overview of kind of what's going on there. 
 

What I want to do is maybe speak to them a little bit in terms of some of the things they might have happening in the organization that maybe they should be paying attention to. And of course, uh, they have legal counterparts there that hopefully, they're listening to this, they're bringing [00:16:00] them into the conversation. 
 

Uh, I'm just thinking, so clearly marketing and advertising, right. The use of these technologies, um, could potentially put, put the company at risk. But I'm thinking in terms of just, so those are pretty open and overt and obvious, perhaps. Yeah. I'm thinking more now of things like customer support, chatbots, or, um, I don't know, things like Meta that may, that may have services that, that use information and data to generate something else. 
 

I don't know. I'm just wondering, what, what are some signs that perhaps an organization might be close to stepping over the lines in some of their business operations, that could put them in jeopardy of one or more of these laws, ELVIS or others? 
 

JC Heinbockel: Sure. Well, there's a, I mean, the obvious [00:17:00] one, since we're, we started talking about it is use of, of, you know, other people's likenesses. 
 

Right. That's, that's a big pitfall. Um, a lot of brands have been wanting to use celebrity likenesses or, you know, public figures' likenesses in advertising before AI came about, and they could manipulate it, right? We, we counsel clients all the time on, on how to use images that include, uh, you know, easily recognizable faces. 
 

Um, and it, it's gonna, again, be a kind of a complicated question, but the general idea is that you have to be mindful of where you're gonna be using those images and whose images they are. And all of that affects what rights those people might have. Are they living or are they deceased? Um, were they people who use their images to endorse products or were they simply, I don't know, politicians? 
 

Um, you know, what is, what are you suggesting something that's, that's not true about either those people or the connection between them [00:18:00] and, and your business? Um, those are all big questions that you're going to need to ask yourself. We also see some other things come up. Um, you know, AI and especially generative AI, we see, um, its potential to create really, uh, unique images, um, not just in involving people, but we have to remember that those, those AI images are trained on, uh, other works, and there have been a lot of lawsuits about the, the way that AI companies are training their LLMs and, and other generative AI models, and the output of that too. 
 

Um, and I think for most businesses, you know, that's, that's kind of an interesting question to watch, but we actually have seen some cases of, um, where, you know, some cases where, like, for example, you know, a, a photo agency's, uh, watermark didn't get erased [00:19:00] when it was, you know, integrated into a larger picture, right? And so you're looking at this image and you're going, oh, it's really cool. And oh, it's, it's from, it's from Getty. And so you're right. So that, that could create, uh, issues, right? You have to really be careful about what output you're using. 
 

Um, if you're going to use that for, for advertising or for any other purposes. Um, and then of course, there's a really fundamental question, too, about whether or not works that are created by AI are capable of protection. And this is a potentially really expensive question if you are someone, or are in a business, where ideas are your, your assets or your capital, um, and you're using AI as part of a process to create something new that you want to patent. 
 

Well, um, certain types of AI generation actually might invalidate, uh, a patent grant that you might have, because the patent [00:20:00] law really only gives patent rights to, to human inventors. That doesn't mean you can't use AI, uh, to create something or to invent something, but the patent office is starting to require disclosures, both here and in the EU, actually. 
 

Um, about how you used AI, if at all, to create something or to invent something. Um, likewise, the copyright office won't register a copyright if the work was basically, you know, substantially an AI work, because you need human authorship to, to create copyright under U.S. law. Uh, there's a case right now at the, the appellate court where the copyright office had denied someone a copyright registration because he put in some prompts and created an image. 
 

And they said, no, this is, this was AI. This is not a human author. And he's trying to say, no, you know, this, this is still capable of protection. So there's, there's open questions there, but I think [00:21:00] there's a lot of value that people could be creating or losing if they're not paying attention to those types of questions. 
 

Sean Martin: Yeah, I love it. And, uh, yeah, I mean, we can go, uh, I naturally want to go down to the creating code and using language models to help with that and whatnot, but, um, let's stay, let's stay true to the, to the core topic. How about, um, what was the other thing I was thinking about? The, the, the concept of likeness. 
 

You said it was vague purposefully. Um, I think we've talked about it in the context of a physical being, right, and a face primarily. Uh, we talked about it in terms of voice. Um, but I could see that those two things kind of just getting washed away and there's much, much more to a likeness that [00:22:00] could come as technology continues to advance. 
 

And I guess what I'm leading to is the data that makes our digital versions of ourselves really becoming our likeness in a digital world. Um, how we, so, I mean, not just the voice, but the words we use, our gestures, uh, medications we take, where we go on vacation, who our friends are. These shape us as humans, become our likeness, and the data from those experiences, the more we capture them on our phones and watches and, and rings, and other people's phones and rings catching on ours, catching our stuff on their own devices. 
 

Um, I guess my point is, the likeness seems to become much bigger, and I don't know, maybe I'm a little out there, but I can see a movie script where my [00:23:00] likeness and my story is created in that movie, right, based on information. I could see that's me, my family could see that's me being represented in that movie. Maybe others don't, who don't know me, but I don't know. 
 

It's an out there kind of thing. I mean, what, I guess, likeness, the definition of it, and where does that, where does that lead us?  
 

JC Heinbockel: That's a really interesting question. Um, and in the lead up to this conversation, I was thinking a lot about, uh, an article from the mid-'90s called, uh, Cyber Law and the Law of the Horse. 
 

And it was really, it was a speech given by a, a federal appellate judge. And the premise of this article was that we were starting to see in the nineties, and we still do today, a proliferation of these kind of subject [00:24:00] matter courses; cyber law was the big hot one at the time. And this judge was saying that, you know, yeah, it's fun that we have cyber law classes, but you don't see classes about the law of the horse. 
 

Right, but there's so much law that's built out of, you know, horses kicking people and causing accidents, people negligently riding horses, the sale and the license of horses. And at the end of the day, what we want to always remember is that the bedrock principles of the law adapt themselves really well to new technology. 
 

And so the idea is that when you're thinking about new technology, right, and the way that AI might affect our personal rights and the way that they're used, um, you know, whether it's the studio making a movie out of your life without you knowing or something like that, these general principles of privacy rights are in a lot of ways going to be just as applicable even [00:25:00] if there's not a specific law that says you can't do that. And I think we should, we should be mindful of that. 
 

One of the things that's interesting about a lot of these, the contrast between publicity rights and privacy rights, is that publicity rights are really meant for people who have proven that their likeness has a value, right? Those are movie stars, athletes, and other people who endorse things and who would sell their image. Whereas most of us in most places don't really have strong publicity rights, because you and I are not, you know, signing contracts to get photographed holding, you know, a cool new invention or a cool guitar or something like that, or wearing the neatest outfit. 
 

Um, but we do still have protection, and states are trying to address that with a lot of these new deepfake laws, um, that are geared towards preventing sexual exploitation, but they might have greater [00:26:00] applicability. And so I think you've, you've raised a really interesting hypothetical, and I don't think we've caught up to that yet, but I think that, as I said, those general principles, right? The, the idea that, you know, an accident caused by a car is an accident caused by a horse, right? As technology evolves, these general principles, they still stand for the same thing.  
 

Sean Martin: All right. I'm going to, I'm going to continue to, uh, pretend I'm on gummies like a  
 

JC Heinbockel: Clint Eastwood. 
 

Sean Martin: There we go. Um, here's another one just, just for fun. Let's have some fun with this. So, I, as an individual, clearly have zero value as a public figure. Um, 10,000 of us can, like, connect together as customers of our organization to create a common likeness [00:27:00] that the company's then exploiting to make money. 
 

So, presumably for advertising it. So, it's a thing. Advertisers, they cause all the trouble. And the pornographers, but anyway, so I'm just wondering, again, in the context or the definition of likeness, um, I think the ELVIS Act is an individual, right?  
 

JC Heinbockel: Yes.  
 

Sean Martin: And probably the others are as well. But is there, is there an idea, or should there be an idea, that a collective of attributes that create a likeness that's being exploited? 
 

I'm just thinking class action suit, right?  
 

JC Heinbockel: Yeah. Well, when you say a collective likeness, are you thinking of almost like, like an avatar, like a, an image that represents the full, uh, you know, the full community of, of listeners, or are you thinking of almost more like a, like a logo, a brand?  
 

Sean Martin: So I'm, I'm thinking, it's a [00:28:00] virtual representation, whether, whether it takes form of, of an avatar or some human-like thing, I don't know, but there's a collection of information that, that becomes valuable, driven by my data and others'. 
 

JC Heinbockel: Ah, yeah.  
 

Sean Martin: That's then being exploited. I don't want to be part of it.  
 

JC Heinbockel: Right. Well, that, that gets into sort of questions of, you know, the typical types of, of data use questions that we have. Right. And the questions, or rather the things that, that laws like the, the GDPR and California's data protection laws, and there's a number of other states that have them now are trying to solve, I suppose. 
 

Right. What, what kind of data can you use, and in what contexts, and how can you collect it? Um, and again, in the U.S., that's still a pretty open question to some extent, right? Each state is, is working through things while the, the federal government kind of lets, lets everyone go [00:29:00] ahead and, and see what works. Um, so, so I think that that's where that would come through, if I, if I had to, to put an answer to that question. 
 

But, uh, yeah, it's an interesting question.  
 

Sean Martin: So we're, we're thinking about it, just not maybe through my weird lens.  
 

JC Heinbockel: Yeah, to me, to me, that doesn't sound like a likeness question per se. I mean, I get the idea that you're, like, a community can have common attributes, but, uh, I suppose, you know, if, if there's a way that that community could be, could have some sort of a, a legal personage that could then represent you in court or something like that. 
 

Right. And it's, it's a tricky question.  
 

Sean Martin: All right. So I'll, I'll, uh, I'll get off the gummies now. So let's, let's bring the, thanks for, that's a really interesting question. I mean, that's, yeah, [00:30:00] I don't know if they entertained anybody else, but anyway, I'm thrilled with myself. But, uh, so back, back to the business end of things here. 
 

So people are litigious, um, even if they have no right to be. Um, what, uh, so I guess, is a company operating out of Tennessee bound to the ELVIS Act, or is it?  
 

JC Heinbockel: It's anyone who's engaging in that content or behavior within the state of Tennessee, right? So you don't have to be a Tennessee company. If I, here in New York, you know, do, make an Elvis deepfake, and I, you know, start buying billboards in, in Memphis, right, or I'm, start, uh, you know, putting up, uh, making robocalls using his voice, or anyone's voice, right? 
 

That's target, that's conduct, that's, yeah, exactly, that, that's conduct that's, that's entering into and [00:31:00] targeting Tennessee. That's going to be the problem, right? It doesn't matter where I'm doing it from.  
 

Sean Martin: Got it. Okay. All right. I'm trying to think of what, uh, what's some of the things organizations should be kind of preparing themselves for here. 
 

I think, I think the obvious one is using, using gen AI to create stuff that they don't know the source of it. And  
 

JC Heinbockel: Yeah, well, you know, the, the big thing for companies is that if you don't have a generative AI policy, it's time to start thinking about it, right? Um, put, you know, guardrails in place and guidelines in place for your employees on how they can use, uh, AI, right, and not just for what they're generating, but because a lot of times, um, if you're contributing information to a, uh, to an LLM, into an [00:32:00] AI platform in order to generate something for you, that platform is gonna then reuse your inputs. 
 

And so are you putting in confidential information that is maybe going to lose some of that confidentiality by, you know, creating prompts for an AI platform? Um, you know, that's another thing that people really have to start thinking about, um, if you're running a business, because, you know, those types of trade secret issues get complicated really quick. 
 

So how are you protecting information that you're putting in? Um, you know, what is coming out too? It's not just, is it going to be a violation of someone's trademark or copyright or personality, right? But all of those hallucinations that we hear about, right? We've seen it in the legal profession with people using AI to generate, uh, court filings only to find out that the cases cited in there were completely made up. 
 

Well, how does that affect you if you're writing a white paper and you're citing articles that are [00:33:00] made up or, um, you know, something like that? So there are a lot of, uh, new AI platforms that are targeted towards certain industries to avoid those types of hallucinations, right? They're using closed universes. 
 

So maybe you have to ensure that your employees are using the appropriate AI platform instead of just letting them do whatever they want. Um, it really covers so many different areas of the law. Um, you could go on for, for days and days and days about different risks. 
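
As a purely illustrative sketch of that last point about approved platforms, a simple allowlist check can gate which AI services outbound requests are allowed to reach; the host names below are hypothetical placeholders, not platforms mentioned in the conversation.

from urllib.parse import urlparse

# Hypothetical allowlist; a real one would name the platforms the
# organization has actually vetted and contracted with.
APPROVED_AI_HOSTS = {
    "ai.internal.example.com",    # assumed company-managed, closed platform
    "vetted-vendor.example.net",  # assumed vendor with acceptable data terms
}

def is_approved_ai_endpoint(url: str) -> bool:
    """Return True only if an outbound AI request targets an approved host."""
    host = urlparse(url).hostname or ""
    return host in APPROVED_AI_HOSTS

print(is_approved_ai_endpoint("https://ai.internal.example.com/v1/chat"))  # True
print(is_approved_ai_endpoint("https://random-genai.example.org/api"))     # False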
 

Sean Martin: And I'm going to go back on the gummies. I'm sorry. No, the, um, so I, one, one more thing is, uh, as I think about this, cause the, let's say you are an agency representing personalities, figures of, uh, in the public that have value, and, and you're using AI to use their likeness to create new stuff, because I, I want to, I don't want to [00:34:00] make, I can, I can easily go back to the IP, maybe have another one on that, but, um, cause I had a conversation with somebody talking about, uh, a digital signing service, where people were submitting contracts with detailed data that they didn't want in there and, and, and signing, uh, their, their trademark documents and their, uh, their patent documents. Right. So, and this company using all that content to train whatever their own systems for, presumably in a closed system, but potentially connected outside. 
 

Um, so I guess that leads me, so we'll leave that there. But in terms of representing public figures, and I guess, does the law extend to a company in that way? So they, so [00:35:00] they're kind of the middle person, middle, middle entity. So I don't know what I'm trying to say here, but I guess you use AI, and specifically to help you manage those brands. 
 

JC Heinbockel: Yeah. And.  
 

Sean Martin: Something happens that takes you into ELVIS Act territory that you didn't, didn't expect.  
 

JC Heinbockel: Yeah. So if, if you're, I think you're saying like a marketing agency or a, an ad agency. Yeah, you could absolutely, uh, be just as liable as your client, um, depending on your agreements, too, right, what, what types of indemnity provisions and waivers and releases you have. 
 

But, um, generally speaking, if you're the agency that's, that's involved in that, um, you know, your client asks you to do something, you, you create it and you give it to them and then they use it, you could both absolutely be, be liable. So, um, at, at all levels, you want to really be careful and [00:36:00] really be thinking about what you're creating, what rights and consents you have or don't have. 
 

Um, and, you know, what, what, what are you trying to really achieve? 
 

Sean Martin: That's a great, great point. Uh, well, I think we'll use that to close it. The, what is the outcome and where does the outcome lead you? And I think, I think to your point, uh, having, having an AI policy and understanding what you're trying to achieve, if you don't have that in, in, in view and a plan in motion, you're probably leaving yourself, uh, exposed somewhere. 
 

All right. Well, JC, I mean, uh, I, I took you on a journey you probably didn't expect to go on today.  
 

JC Heinbockel: It was, it was really a lot of fun, Sean. This is a really fun conversation.  
 

Sean Martin: I was thinking kind of, it's been a long day and I'm thinking out there a little bit, but, uh, I appreciate [00:37:00] you going on the ride. 
 

JC Heinbockel: That's all right. I'm going to be up all night thinking about, that's right, no, collective likenesses and  
 

Sean Martin: Collective likeness. There you go. And for, uh, for my audience, obviously I'm, I'm most often talking about operationalizing security controls. So this is going to be a little wacky for them as well. But, uh, I think there's something here. I think ultimately it comes down to technology and how technology is being used in the business, and understanding how it's used correctly, ethically, legally, all that stuff. And if you haven't thought about it, shame on you if you don't have a policy in place. You have time to do that now, and, uh, sadly, um, beyond that it becomes, uh, following all the laws that you have to, you have to, they have to monitor, uh, from a legal perspective. 
 

But if you do the first two, be aware and have a plan, you're in much better shape for, uh, following and [00:38:00] adhering to the laws.  
 

JC Heinbockel: It gets a lot easier to follow the law if you plan in advance.  
 

Sean Martin: Absolutely. Uh, and hopefully the laws, tapping into, uh, the technologists and the security and privacy folks, so it gets set in a way that they can be followed. 
 

JC Heinbockel: Yeah, I think so. There's a lot of people, a lot smarter than I am certainly, who are thinking deeply about these, these legal questions and are, are shaping the law. And it'll be really exciting to see how this continues to progress over the next, you know, even six months. But, but beyond that, I don't think this is technology that's going away. 
 

Sean Martin: Definitely not. Definitely not. Well, JC, an absolute pleasure meeting you and, uh, good to have some fun with you today, and I, I'm certain we got people to think, and, uh, hopefully there's a few nuggets in here as well that, uh, will give us something to actually chew on as they, as they think. Um, so thank you so much for that. 
 

And, uh, of [00:39:00] course I'll put a link to the article for the ELVIS AI Act, or, yeah, ELVIS Act, and, uh, your profile. People can, well, multiple people can come to you with their collective likeness. Yeah. With you. How's that?  
 

JC Heinbockel: We're, we're happy to help. Yeah. We can blaze new trails here. It was a delight, Sean. Thank you so much. 
 

Sean Martin: Right on. All right. Thanks everybody. Stay tuned, uh, for more on Redefining CyberSecurity and, uh, and beyond, obviously, and I'll catch you on the next episode. Thanks everybody.