Daily news reports about ChatGPT are ubiquitous. Can it perform legal tasks currently undertaken by humans (with law degrees and state bar licenses)? Can lawyers use it to enhance their legal work? Quite naturally, this raises the question of whether ChatGPT will make its way into class action litigation – where both the stakes and the workloads of the lawyers involved are enormous.
To read the full text of this post by Duane Morris attorney Brandon Spurlock, please visit the Duane Morris Class Action Defense Blog.
As companies take advantage of new technologies in their interactions with customers and employees, they need to be mindful of the risks associated with implementing those types of systems. This is especially true under federal and state privacy statutes, some of which were recently enacted to address emerging privacy concerns, while others are existing laws now being applied in new contexts.
Read the Law360 article on the Duane Morris LLP website.
A new wave of class action lawsuits filed in California, Pennsylvania and Florida targets companies that use technologies to track user activity on their websites, alleging that such practices, when done without obtaining a user’s consent, violate the electronic interception provisions of various state laws. The two technologies at issue are: 1) session replay software and 2) coding tools embedded in chat features. Session replay software tracks a user’s interactions with a website—clicking, scrolling, swiping, hovering and typing—and creates a stylized recording of those interactions and inputs. Coding tools create and store transcripts of the conversations users have in a website’s chat feature. The plaintiffs in this new string of class actions allege that recording their interactions with a website and sending that recording to a third party for analysis without their consent is an illegal invasion of their privacy.
Read the full Alert on the Duane Morris LLP website.