{"id":145,"date":"2023-09-18T12:13:29","date_gmt":"2023-09-18T16:13:29","guid":{"rendered":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/?p=145"},"modified":"2023-09-18T13:14:18","modified_gmt":"2023-09-18T17:14:18","slug":"the-ai-update-september-18-2023","status":"publish","type":"post","link":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/2023\/09\/18\/the-ai-update-september-18-2023\/","title":{"rendered":"The AI Update | September 18, 2023"},"content":{"rendered":"<p><img loading=\"lazy\" decoding=\"async\" class=\"alignleft wp-image-96 size-full\" src=\"http:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-content\/uploads\/sites\/63\/2023\/04\/DM-AI-Update-e1681141844877.png\" alt=\"\" width=\"150\" height=\"60\" \/><\/p>\n<p class=\"DMBdyTxt\"><i>#HelloWorld. In this issue, the Copyright Office asks all the right questions\u2014but will it do something interesting with the answers? Microsoft and Adobe offer clever ideas of their own. And, surprise (not really): Three new lawsuits against AI developers. Let\u2019s stay smart together. (<a href=\"mailto:AI-Update@duanemorris.com?subject=Subscribe%20to%20the%20mailing%20list%20&amp;body=Please%20add%20me%20to%20The%20AI%20Update%20list.\">Subscribe to the mailing list<\/a> to receive future issues.)<\/i><\/p>\n<p><strong>The Copyright Office has questions.<\/strong> Since the spring, the U.S. Copyright Office has devoted considerable effort to its AI Initiative, launching an AI webpage, holding four public listening sessions, and hosting educational webinars. In what it calls a \u201ccritical next step,\u201d the Office on August 30 published a <a href=\"https:\/\/www.copyright.gov\/newsnet\/2023\/1017.html\" target=\"_blank\" rel=\"noopener\">notice<\/a> of inquiry asking for written comments (due October 18) on around 66 wide-ranging AI-related questions. 
<!--more-->The inquiries are as comprehensive as a law school syllabus and include important subjects like:<\/p>\n<ul>\n<li style=\"list-style-type: none\">\n<ul>\n<li>Descriptions of the ways developers collect, curate, and store datasets used to train AI models;<\/li>\n<li>Potential licensing regimes for compensating creators whose works are used in AI model training;<\/li>\n<li>Whether it\u2019s technologically possible or economically feasible for an AI model to \u201cunlearn\u201d data it was trained on;<\/li>\n<li>Whether it\u2019s possible or feasible to determine the extent to which an AI output was influenced by a specific piece of training data;<\/li>\n<li>The level of specificity and transparency AI developers and deployers should provide about their training data;<\/li>\n<li>Whether human authorship should be required for copyright protection;<\/li>\n<li>Whether substantial similarity is the proper test for determining whether an AI output infringes; and<\/li>\n<li>Labeling requirements (if any) for AI-generated material.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p>According to the notice, this information will be used to help inform the Office\u2019s stance on AI-related legislation and regulation. That\u2019s not the clearest explanation of intent, but one thing is certain: The comments should be excellent sources for snapshots of the state of generative AI and IP as of fall 2023. Organizing industry and public comments into user-friendly reports and collections is one of the Copyright Office\u2019s superpowers.<\/p>\n<p><strong>Adobe and Microsoft have answers.<\/strong> While the Copyright Office went broad, two well-known tech companies tried something narrower. First up, Microsoft, which has invested billions in OpenAI and is implementing generative AI throughout its products under the \u201cCopilot\u201d brand. 
To help spur adoption, Microsoft <a href=\"https:\/\/blogs.microsoft.com\/on-the-issues\/2023\/09\/07\/copilot-copyright-commitment-ai-legal-concerns\/\" target=\"_blank\" rel=\"noopener\">announced<\/a> on September 7 an indemnity, alliteratively named the \u201cCopilot Copyright Commitment.\u201d In Microsoft\u2019s words: \u201cif a third party sues a commercial customer for copyright infringement for using Microsoft\u2019s Copilots or the output they generate, we will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit.\u201d Important conditions and carve-outs: Paid versions only, existing filters and guardrails cannot have been disabled, and the customer can\u2019t have tried to generate infringing output (e.g., by providing input that the customer does not have rights to use).<\/p>\n<p>Adobe\u2019s CEO meanwhile published a <a href=\"https:\/\/blog.adobe.com\/en\/publish\/2023\/09\/12\/fair-act-to-protect-artists-in-age-of-ai\" target=\"_blank\" rel=\"noopener\">blog post<\/a> promoting a new Federal Anti-Impersonation Right (FAIR) Act. This proposal \u201cwould provide a right of action\u201d\u2014under federal, not state, law\u2014\u201cto an artist against those that are intentionally and commercially impersonating their work or likeness through AI tools.\u201d Charitably, this proposal aims at filling a gap in the law, where users employ generative AI tools to create output \u201cin the style of\u201d a particular artist without that output necessarily copying enough literal expression to infringe. 
A smidge more cynically, the proposed act looks to pin principal liability on individual users rather than on model developers like Adobe.<\/p>\n<p><strong>More of the same on the litigation front.<\/strong> Back in late July, we recapped the eight major <a href=\"https:\/\/blogs.duanemorris.com\/artificialintelligence\/2023\/07\/27\/the-ai-update-july-27-2023\/\" target=\"_blank\" rel=\"noopener\">lawsuits<\/a> targeting generative AI, mainly filed in the Northern District of California, mainly proposed as class actions, and mainly alleging copyright and privacy violations. Here come three more attempted class actions, all filed in that same court in the first two weeks of September. A group of authors including Michael Chabon, of \u201cWonder Boys\u201d and \u201cThe Amazing Adventures of Kavalier and Clay\u201d fame, <a href=\"https:\/\/www.courtlistener.com\/docket\/67778017\/chabon-v-openai-inc\/?ref=campaignforaisafety.org\">sued<\/a> OpenAI and another major AI developer in separate suits for alleged copyright and DMCA offenses, while two anonymous plaintiffs charged OpenAI and Microsoft with various privacy-related violations. All three suits largely mirror the structure and theories of the previous California cases. (If you\u2019d like an updated litigation tracker, send us an <a href=\"mailto:Duane%20Morris%20AI%20Update%20%3cAI-Update@duanemorris.com%3e?subject=Please%20send%20me%20the%20updated%20AI%20Update%20litigation%20tracker\">email<\/a>.)<\/p>\n<p><strong>What we\u2019re reading:<\/strong> Sometimes all the talk of AI benefits and risks can feel disconnected from practical applications on the ground. 
Not so in this thoughtful Stanford Technology Law Review <a href=\"https:\/\/law.stanford.edu\/wp-content\/uploads\/2023\/08\/Publish_26-STLR-316-2023_The-Use-of-Artificial-Intelligence-in-International-Human-Rights-Law8655.pdf\" target=\"_blank\" rel=\"noopener\">student note<\/a>, which describes in detail the many ways in which states and NGOs are already using AI models to track and evaluate human rights issues\u2014for instance, by automating the process of reviewing and clustering global media news reports to identify problematic death penalty cases. The note reinforces one of The AI Update\u2019s chief mantras: Don\u2019t think of AI as a magic box. Start with a specific use case and then work backwards to see whether an AI tool can add efficiency to the process.<\/p>\n<p><strong>What <em>should <\/em>we be following?<\/strong> Have suggestions for legal topics to cover in future editions? Please send them to <a href=\"mailto:AI-Update@duanemorris.com\">AI-Update@duanemorris.com<\/a>. We\u2019d love to hear from you and continue the conversation.<\/p>\n<p><strong><em>Editor-in-Chief<\/em><\/strong><strong>: <\/strong><a href=\"mailto:agoranin@duanemorris.com\">Alex Goranin<\/a><\/p>\n<p><strong><em>Deputy Editors<\/em><\/strong><strong>:<\/strong> <a href=\"mailto:mcmousley@duanemorris.com\">Matt Mousley<\/a> and <a href=\"mailto:tmarandola@duanemorris.com\">Tyler Marandola<\/a><\/p>\n<p><em>If you were forwarded this newsletter, <\/em><a href=\"mailto:AI-Update@duanemorris.com?subject=Subscribe%20to%20the%20mailing%20list%20&amp;body=Please%20add%20me%20to%20The%20AI%20Update%20list.\"><em>subscribe to the mailing list<\/em><\/a><em> to receive future issues.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>#HelloWorld. In this issue, the Copyright Office asks all the right questions\u2014but will it do something interesting with the answers? Microsoft and Adobe offer clever ideas of their own. 
And, surprise (not really): Three new lawsuits against AI developers. Let\u2019s stay smart together. (Subscribe to the mailing list to receive future issues.) The Copyright Office &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/blogs.duanemorris.com\/artificialintelligence\/2023\/09\/18\/the-ai-update-september-18-2023\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;The AI Update | September 18, 2023&#8221;<\/span><\/a><\/p>\n","protected":false},"author":6,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[64,27,66,62,65,63,13],"ppma_author":[5],"class_list":["post-145","post","type-post","status-publish","format-standard","hentry","category-general","tag-adobe","tag-ai","tag-ai-developers","tag-copyright-office","tag-litigation","tag-microsoft","tag-theaiupdate"],"authors":[{"term_id":5,"user_id":6,"is_guest":0,"slug":"duanemorris3","display_name":"Duane 
Morris","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/843ff6e7a8fe5fc92109b47a45f34b6cf0ea499e6e788db23456c838b0ae6747?s=96&d=blank&r=g","0":null,"1":"","2":"","3":"","4":"","5":"","6":"","7":"","8":""}],"_links":{"self":[{"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/posts\/145","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/comments?post=145"}],"version-history":[{"count":0,"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/posts\/145\/revisions"}],"wp:attachment":[{"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/media?parent=145"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/categories?post=145"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/tags?post=145"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/artificialintelligence\/wp-json\/wp\/v2\/ppma_author?post=145"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}