Tennessee Federal Court Dismisses Class Action Under the Video Privacy Protection Act Because Plaintiff Failed to Allege He Accessed Video Content

By Brandon Spurlock and Jennifer A. Riley

Duane Morris Takeaways: On July 18, 2023, in Salazar v. Paramount Global d/b/a 247Sports, No. 3:22-CV-00756 (M.D. Tenn. July 18, 2023), Judge Eli Richardson of the U.S. District Court for the Middle District of Tennessee dismissed a class action lawsuit against Paramount Global because the Plaintiff failed to state a claim under the Video Privacy Protection Act (“VPPA”).  Plaintiff alleged that his subscription to an online newsletter made him a “subscriber” under the statute, but the Court found that allegation insufficient because he did not allege that he accessed audio visual content through the newsletter.  The VPPA is a law from the 1980s stemming from the failed Supreme Court nomination of Robert Bork, whose video rental history was published during the nomination process.  In the ensuing decades, companies have seen an increase in class action lawsuits under the VPPA and other consumer privacy statutes in which plaintiffs seek to levy heavy penalties against businesses with an online presence.  This ruling illustrates that some federal courts will closely examine such statutes to ensure that a plaintiff adequately states a claim based on the underlying statutory definitions before allowing a class action to proceed.

Case Background

Plaintiff filed a putative class action against Defendant Paramount Global d/b/a 247Sports alleging a violation of the VPPA.  Id. at 1.  According to Defendant, 247Sports.com is an industry leader in content for college sports, delivering team-specific news through online news feeds, social platforms, daily newsletters, podcasts, text alerts, and mobile apps.  Id. at 2.  Plaintiff alleged that Paramount installed a Facebook tracking pixel, which allows Facebook to collect data on digital subscribers to 247Sports.com who also have a Facebook account.  Id. at 3-4.  If a digital subscriber of 247Sports.com is logged in to his or her Facebook account while watching video content on 247Sports.com, then 247Sports.com sends to Facebook (via the Facebook pixel) the video content name, its URL, and, most notably, the digital subscriber’s Facebook ID.  Id. at 4.  Plaintiff claimed that Paramount violated the VPPA when it installed the Facebook pixel, which caused the disclosure of Plaintiff’s personally identifying information to Facebook.  Id. at 5.  Paramount moved to dismiss for lack of subject-matter jurisdiction under Federal Rule of Civil Procedure 12(b)(1), and for failure to state a claim for relief under Rule 12(b)(6).

The Court’s Decision That Plaintiff Had Standing Under The VPPA

First, Paramount argued that Plaintiff lacked standing because he failed to adequately allege either a concrete injury in fact or the traceability of any injury to Paramount’s conduct, contending that the alleged disclosure of Plaintiff’s information to Facebook did not constitute a concrete injury.  Id. at 9.  Rejecting Paramount’s standing argument, the Court noted that the VPPA created a “right to privacy of one’s video-watching history, the deprivation of which – through wrongful disclosure, or statutory violation alone – constitutes an injury sufficient to confer Article III standing.”  Id. at 11-12.  In other words, the VPPA created a statutory right to have personally identifiable information remain private by prohibiting disclosure to third parties.  Id. at 12.  Thus, the Court ruled that Plaintiff’s allegation that his personally identifiable information was transmitted to Facebook in violation of the VPPA identified a concrete harm for standing purposes.  Id. at 14.

Plaintiff Failed To State A Claim Under The VPPA

Paramount also asserted that Plaintiff had no claim under the VPPA because he was not a “consumer,” meaning “any renter, purchaser, or subscriber of goods or services from a video tape service provider.”  Id. at 17.  Because Plaintiff was not a “subscriber of goods or services from a video tape service provider,” Paramount argued, he was not a “consumer” within the meaning of the VPPA and thus failed to state a claim, since the statute only protects individuals who are “consumers.”  Id. at 18.

The Court noted that although the VPPA does not define “subscriber,” the dictionary definition indicates that a “subscriber” is a person who “imparts money and/or personal information in order to receive a future and recurrent benefit.”  Id. at 19.  Further interpreting the statute, the Court reasoned that a consumer is only a “subscriber” under the statute when he or she subscribes to audio visual materials.  Id. at 21.  Completing the analysis, the Court reasoned that because Plaintiff’s subscription to the newsletter was not sufficient to establish that he had subscribed to audio visual materials, his claim that the newsletter subscription rendered him a “subscriber” was unavailing.  Id. at 22.

The Court, therefore, dismissed Plaintiff’s VPPA class action lawsuit because Plaintiff failed to allege that he actually accessed audio visual content, which necessarily meant that Plaintiff was not a subscriber under the VPPA.  Id. at 22.

Implications For Businesses

This past year has seen an uptick in VPPA class action filings against businesses that operate websites offering online videos and using third-party tracking tools.  These lawsuits represent an ongoing pattern of increased consumer privacy class litigation throughout the country exposing companies to significant risk across a wide array of industries.  Corporate counsel should note this ruling is a positive indication that some courts will closely examine the plain language and legislative intent of a privacy statute to ensure that a plaintiff actually states a viable claim before allowing class litigation to proceed.

Utah Federal Court Sounds Off On Discovery Of Materials That Could Be Replicated By Artificial Intelligence

By Brandon Spurlock

Duane Morris Takeaways: Given the fast-growing use of generative AI tools such as ChatGPT, much has been written about the privacy issues surrounding these platforms.  Many employers have serious concerns about whether sensitive company data risks exposure vis-a-vis the use of chatbots and other artificial intelligence tools.  This potential harm was recently cited by parties seeking to protect the confidentiality of sensitive documents produced in federal litigation entitled M.A. v. United Behavioral Health, et al., Case No. 2:20-CV-000894 (D. Utah May 23, 2023). The Court’s decision to ultimately protect the confidentiality of the disputed documents is instructive for businesses that may need to seek similar protection during litigation.

Case Background

In M.A. v. United Behavioral Health, et al., Plaintiffs asserted causes of action against Defendants, a group of providers, for recovery of benefits under the Employee Retirement Income Security Act (“ERISA”) and violations of the Mental Health Parity and Addiction Equity Act (“MHPAEA”).  In response to requests for the production of documents, Defendants designated certain documents as confidential pursuant to the stipulated protective order entered in the case prior to production.  Id. at 2.  Some of the documents consisted of guidelines for subacute skilled care prepared by a third party and licensed to Defendants.  Id.  Plaintiffs challenged the confidential designation and Defendants moved for a protective order.  Id. at 3.

Plaintiffs argued that the documents were subject to mandatory disclosure under provisions of ERISA and the MHPAEA, and that those mandatory disclosure provisions precluded Defendants from designating the documents confidential.  Id.  In conducting its analysis, the Court determined that Rule 26(c)(1)(G) governed Defendants’ motion and applied the three-factor test applicable to motions for a protective order: (1) the party seeking protection must show that the information sought is a trade secret or other confidential research or commercial information; (2) that such disclosure might be harmful; and (3) that the harm from disclosure is outweighed by the need for access.  Id. at 3-4.  If the moving party satisfies all three prongs, the burden then shifts to the party seeking disclosure to establish that such disclosure is relevant and necessary.  Id. at 4.

The Court’s Ruling

Although Plaintiffs asserted that this balancing test was inapplicable because of the mandatory disclosure provisions contained in ERISA and the MHPAEA, the Court disagreed. It opined that even if the statutes’ disclosure provisions applied, the question before the Court was not about disclosure (because the documents had already been produced), but whether Defendants were allowed to designate the documents confidential.  Id. at 5.  Defendants submitted multiple affidavits to demonstrate that the disputed documents contained commercially sensitive, proprietary, and confidential information.  Id. at 8.  Regarding the subacute care guidelines specifically, the supporting affidavits established that the guidelines were proprietary and copyrighted, and asserted that harm would result not only because competitors could pirate or sell the guidelines if disclosed, but also because the proprietary guidelines could be used to “develop artificial intelligence and machine learning models, all of which would allow would-be competitors to use [the] guidelines to develop a broader suite of derivative products and services.”  Id. at 10 (footnote 41).

The Court agreed with Defendants and entered a protective order for the disputed documents.  It held that the declarations were “sufficient to establish that unrestricted disclosure of the [documents] would result in commercial and economic injury to Defendants and non-parties.”  Id. at 10.  Having established the potential harm, the Court further found that Plaintiffs failed to carry their burden of showing that the unrestricted disclosure of the documents was necessary.  Id. at 12.

Implications of The Decision

For businesses involved in large, complex class action litigation, the discovery process often involves the production of documents that contain commercially sensitive and proprietary information which merits confidential treatment during the course of the litigation.  This decision illustrates that when the opposing side challenges such confidential designations, the party seeking protection will need to demonstrate the potential harm that could result from disclosure.

With the availability and ubiquitous nature of generative AI tools, the potential harm is exacerbated given how AI can further exploit such disclosures.  Businesses should be mindful that the arguments that prevailed in this matter could help protect confidential information under similar circumstances.



The Implications Of ChatGPT In Class Action Litigation

By Brandon Spurlock

Duane Morris Takeaways: Daily news reports about ChatGPT are ubiquitous. Can it replace legal tasks undertaken by humans (with law degrees and state bar licenses)? Can lawyers use it to enhance their legal work? Quite naturally, this raises the question of whether ChatGPT will make its way into class action litigation – where the stakes are enormous, and the workloads of the lawyers involved are equally heavy.

ChatGPT And The Legal Profession

Launched in November 2022, OpenAI’s ChatGPT has garnered much attention across the globe for its human-like ability to engage in conversations and generate content that is often indistinguishable from what we would expect talented professionals such as journalists, authors, professors (and even lawyers) to produce. But how does this technology work, and what does it mean for the legal profession?

GPT stands for generative pre-trained transformer, a language model trained on a large corpus of data.  Through supervised and machine learning, the model assigns a score to each candidate next word, chooses a high-scoring word, and then moves on to the next one.  Every choice is driven by complex math and huge amounts of data, which allows the model to produce text that is both coherent and accurate (most of the time).  So not only can it tap into a vast vocabulary and store of information, but it also understands words in context, which helps it mimic speech patterns while displaying encyclopedic knowledge.
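For readers curious about the mechanics, the score-and-select process described above can be sketched in a few lines of Python. This is a deliberately simplified toy, not OpenAI's actual implementation: the score table below is invented for illustration, whereas a real model computes scores with a neural network trained on billions of words.

```python
# Toy sketch of next-word selection: score the candidates that can
# follow the current context, pick a high-scoring one, repeat.
# The score table is hypothetical and exists only for illustration.

def next_word(context, scores):
    """Return the highest-scoring candidate next word, or None."""
    # Key the lookup on the last two words of context (a real model
    # conditions on the entire context, not just two words).
    candidates = scores.get(tuple(context[-2:]), {})
    return max(candidates, key=candidates.get) if candidates else None

def generate(prompt, scores, max_words=5):
    """Greedily extend the prompt one high-scoring word at a time."""
    words = prompt.split()
    for _ in range(max_words):
        word = next_word(words, scores)
        if word is None:
            break  # no known continuation for this context
        words.append(word)
    return " ".join(words)

# Hand-made scores: each context maps candidate next words to scores.
toy_scores = {
    ("the", "court"): {"granted": 0.6, "denied": 0.3, "banana": 0.01},
    ("court", "granted"): {"the": 0.7, "a": 0.2},
    ("granted", "the"): {"motion": 0.8, "appeal": 0.1},
}

print(generate("the court", toy_scores))  # "the court granted the motion"
```

A real system also samples among high-scoring words rather than always taking the single best one, which is one reason ChatGPT can give different answers to the same prompt.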

How smart is ChatGPT? News articles report that it is smart enough to pass an MBA exam from a top business school, a Google coding interview, and several law school essay exams, albeit with less-than-stellar scores.

What Does This Mean For Class Action Litigation?

For the millions who have experimented with ChatGPT, it is easy to be mesmerized by its capabilities and to wonder what it means for those white collar professions that many believed were impervious to displacement by AI, particularly lawyers.  For instance, type in a simple question such as “how do you certify a class action lawsuit?” and ChatGPT will respond:

In order for a class action lawsuit to be certified, the court must determine that the class of plaintiffs is sufficiently numerous, that there are common questions of law or fact among the class, that the claims or defenses of the class are typical of the claims or defenses of the individual class members, and that the representative plaintiff(s) will fairly and adequately protect the interests of the class.

This is a far cry from a winning brief, but the possibilities are instantly apparent.  And if you talk to attorneys who have asked the program to prepare work product, such as an insurance reservation of rights letter, they will tell you that they received a surprisingly workable template from which to start the editing process.  The technology is so promising that some service providers have rolled out a generative AI tool built on a version of OpenAI’s latest models that is enhanced for legal work.  The tool allows efficient, cost-effective access to legal content.

What Are The Pitfalls?

Despite the obvious promise exhibited by ChatGPT’s current abilities, the technology is still in its infancy. In fact, results obtained from ChatGPT are often riddled with errors and, in some cases, outright falsehoods.  In one instance, it referenced a non-existent California ethics provision.  The tech industry has termed these episodes – in which generative AI simply makes things up, and does so with complete and utter confidence – “hallucinations.” With these risks in mind, professional liability carriers are issuing warnings to law firms on the professional responsibility and risk management implications of the technology.

What’s Next?

Given the promise of ChatGPT, tempered by the associated risks, corporate counsel are certain to ask themselves what comes next for this technology.  Right now, for things like contracts, policies, and other legal documents that tend to be normative, generative AI’s capabilities in gathering and synthesizing information can do a lot of heavy lifting. Therefore, the legal industry should be on the lookout for emerging technologies, like ChatGPT, that can tackle such low hanging fruit, with the immediate benefit being potential cost savings for corporate clients.

As law firms and corporate legal departments contemplate the future of using this tool, it is noteworthy that there is an intellectual property class action pending in federal court in California – J. Doe 1, et al. v. Github, Inc., et al., No. 3:22-CV-06823 (N.D. Cal.) – alleging that OpenAI profits from the work of open-source programmers by violating the conditions of their open-source licenses.  Some commentators believe that the future of AI may well hinge on the outcome of this lawsuit, and it will no doubt be monitored closely by those in the legal industry interested in this topic.

In the meantime, if attorneys are feeling uneasy about how AI technology is impacting the profession, try asking ChatGPT “Is being a lawyer a good profession?” and one can take solace in ChatGPT’s answer: “Yes, it can be a very rewarding profession. Lawyers can have a big impact in society and can have a great deal of job satisfaction.”  The industry will have to wait and see if this answer holds over time.

California Callout: New 2023 Privacy Regulations Coming Soon

By Gerald L. Maatman, Jr., Jennifer Riley, Brandon Spurlock, and Alex W. Karasik

Duane Morris Synopsis:  On the heels of the California Consumer Privacy Act (“CCPA”), which took effect in 2020, and after two legislative bills that proposed to continue the employer exemption failed, employers must now comply with all requirements of the California Privacy Rights Act (“CPRA”) effective January 1, 2023.  California-based employers now face these strict privacy requirements atop an existing minefield of nuanced employment laws.

Legislative Background

The CCPA is often considered the most stringent data privacy law in the United States.  This landmark law established privacy rights for California consumers, including:  (1) the right to know about the personal information a business collects about them and how it is used and shared; (2) the right to delete personal information collected from them (with some exceptions); (3) the right to opt-out of the sale of their personal information; and (4) the right to non-discrimination for exercising their CCPA rights. (See https://oag.ca.gov/privacy/ccpa.).

Currently, data collected from workers is exempt from all but two provisions of the CCPA: (i) employers must provide an initial disclosure to all employees at or prior to the point of collection, and (ii) employees still have a right to statutory damages in the event of a data breach. “Employees” is a term that casts a wide net. It includes job applicants, business owners, officers, directors, medical staff members, independent contractors, emergency contacts and beneficiaries.

Two separate California state bills sought to continue the employer exemption: (1) AB 2891, for an additional three years; and (2) AB 2871, for an indefinite period.  Neither bill passed the Legislature in its final 2022 session.  Accordingly, with the exemption expiring, employers must now fully comply with the CCPA’s requirements as the new CPRA takes effect.

Employer Obligations

Employees are now afforded various rights, including:  (1) the right to request access to their personal information and information about how automated decision technologies work; (2) the right to correct inaccurate personal information; (3) the right to request that an employer delete their personal information, including the obligation that employers must also notify third parties to whom they have sold or shared such personal information of the request to delete; and (4) the right to limit the use and disclosure of sensitive personal information to that which is necessary to perform the services or provide the goods reasonably expected by an average consumer who requests such goods and services.

Notice Obligations

Employers should be mindful of particular notice obligations under the CPRA. These include: (1) a notice at collection; and (2) a privacy policy.  Regarding the notice at collection, employers must give employees, applicants, and contractors notice at the time they collect the information if they plan to collect, use, or disclose that personal information, while also disclosing the categories of personal information.  The privacy policy is comprehensive and must disclose the categories of personal information collected over the 12 months before the policy’s effective date, the sources from which personal information is collected, the business purpose for the collection, the categories of third parties to whom personal information is disclosed, and the categories of personal information sold or shared.  Employers are also obligated to post the privacy policy online where it is accessible to employees, applicants, and contractors.

Data Governance

To ensure compliance with the CPRA, it is crucial that employers understand where personal information is located within their businesses. It behooves them to undertake a data inventory or data mapping exercise to assess how and where relevant information is stored and/or transferred.  Employers should also take stock of their records retention policies to ensure compliance, and also develop an internal framework to handle requests from employees for access and/or deletion.

Implications For Employers

Employers who have operations in California should immediately take heed of these new obligations. It is inevitable that the Plaintiff’s bar will be scrutinizing these practices come January 2023.  Accordingly, employers should determine whether they are covered by the CPRA, and prepare privacy policies that are fully compliant.

© 2009- Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.