The AI Update | January 26, 2024

#HelloWorld. January has not been especially frantic on the legal-developments-in-AI front. Yes, we know the anticipated final text of the EU AI Act was published unofficially, but the final vote hasn’t happened yet, so we’re biding time for now. Meanwhile, in this issue, we check in with state bar associations, SAG-AFTRA, and the FTC. They have things to say about AI policy too, so we’ll listen. Let’s stay smart together.  (Subscribe to the mailing list to receive future issues.)

Beyond the ChatGPT lawyer. State bar associations are increasingly accepting that generative AI is here to stay—and that “guiding principles” for usage are better than outright bans. Back in November, the California state bar published a Practical Guidance document that included “guiding principles” like (a) avoid uploading client confidences to inadequately secured AI tools; (b) don’t rely on AI outputs without close human review; and (c) don’t charge clients for the time you saved by using a generative AI product.

This month, the Florida state bar chimed in, publishing an Ethics Advisory Opinion adopting many of the same principles as California and adding three noteworthy morsels:

      • get a “client’s informed consent” before using third-party AI tools that will receive client confidential information,
      • treat the AI model’s output the same way you would treat a (human) paralegal’s or legal assistant’s work product—review, verify, and, as the lawyer on the file, accept ultimate responsibility, and
      • be very careful when using AI chatbots for advertising and client intake purposes—you’ll be held responsible for any untoward messaging the model may inadvertently produce.

A deal for SAG-AFTRA voice actors. Another month, another actors guild agreement. On January 10, SAG-AFTRA announced a deal with Replica Studios allowing the latter to use SAG-AFTRA voice performers to internally create and, separately, externally license “Digital Voice Replicas.” Most interestingly for our purposes, the guild published the full text of the two agreements—one for “development use” and the other for “external and licensing use”—along with a summary and FAQ.

There are plenty of interesting aspects in these materials for the AI aficionado. Under the development agreement, performers must provide “informed consent” and receive a “session fee” (four-hour minimum), with rates based on the existing SAG-AFTRA Interactive Media Agreement. The performer’s consent lasts only one year; to extend it for another, Replica Studios has to obtain “continued informed consent” and make another minimum payment.

Under the external license agreement, Replica Studios must separately bargain with voice actors for their consent to an “External Use” or “License” of the digital voice double. This is exactly what you’re thinking: use in a TV show, movie, video game, ad, etc. In general, Replica Studios must obtain consent on a use-by-use basis, rather than advance consent to a broad, general category of uses. And in its Addendum, the agreement sets out complicated schedules for calculating the performer’s compensation (based in part on the number of lines used) and other nuanced requirements.

In both agreements, Replica Studios has to take “commercially reasonable steps” to prevent the Digital Voice Replica from being hacked, stolen, or used without permission—and must indemnify the performer for any use that is defamatory or unlawful.

The FTC flexes its (investigatory) muscles. Speaking of voices, the Federal Trade Commission has been increasingly vocal about the possibility of regulating generative AI under the Commission’s Section 5 power to police against unfair or deceptive trade practices. For instance, this past December, the FTC published a Staff Report on “Generative Artificial Intelligence and the Creative Economy,” and for us, pages 5–6 were the most interesting part. There, the FTC itemizes its recent AI-related enforcement activity, including against Amazon and Ring for collecting voice and video recordings in privacy-problematic ways. The Commission also hypothesizes about situations where it might intervene in the future, to protect against undue concentrations of control over “computing power” and “large stores of training data”—and to defend authors’, musicians’, and creators’ “writing style, vocal or instrument performance, or likeness.”

As of this week, this discussion seems less academic. On January 25, the FTC announced it was launching an inquiry into several partnerships between traditional Big Tech companies and LLM start-ups. Specifically, the Commission is looking into OpenAI’s relationship with Microsoft and Anthropic’s relationships with Google and Amazon. The FTC’s orders also require the companies to report on their investments.

What we’re reading. If you’re a human, don’t eat the berries of the belladonna, because this toxic “nightshade” plant can be lethal. If you’re a generative AI system, watch out: a digital equivalent is here. Researchers at the University of Chicago just released Nightshade in electronic form. This software tool “poisons” the data in digital images to make them unsuitable for model training. While the images look ordinary to the human eye, the modified data purportedly induces AI models to generate unwanted results. According to the researchers, “a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.” Honestly though, that sounds like a cool picture too.

What should we be following? Have suggestions for legal topics to cover in future editions? Please send them to AI-Update@duanemorris.com. We’d love to hear from you and continue the conversation.

Editor-in-Chief: Alex Goranin

Deputy Editors: Matt Mousley and Tyler Marandola

If you were forwarded this newsletter, subscribe to the mailing list to receive future issues.

© 2009- Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.
