Webinar: Let’s Talk About Tech – Wearable Fitness and Health Tech

Duane Morris LLP will hold a webinar, The Data Privacy and Security Landscape: Let’s Talk About Tech ‒ Wearable Fitness and Health Tech on Monday, November 6, 2023, from 12:30 p.m. to 1:30 p.m. Eastern time.

About the Program

Wearable tech is everywhere—on your wrist, in your pocket, on your finger and even at work. With the ubiquity of fitness technology, what are the implications of these pervasive devices? How are the companies collecting this data storing and protecting consumer information? What laws and regulations are in place as device use continues to expand? Join our panelists for a discussion on the current state and future of wearable fitness and health tech, including:

    • FDA regulation of wearable devices: What is and isn’t a medical device?
    • FDA guidance on wireless technology and medical devices
    • Biometric laws, including the Illinois Biometric Information Privacy Act (BIPA), and the storage and protection of such data
    • Implications of HIPAA and wearable tech

Presenters

Frederick R. Ball, Partner

Neville M. Bilimoria, Partner

Sheila Raftery Wiggins, Partner

Guarding Your Digital Data Against AI Incursion

Digital data is becoming a hot commodity these days because it enables AI tools to do powerful things. Companies that offer content should keep up with the evolving technology and laws that can help them protect their online data.

As data becomes available online, it can be accessed in different ways, leading to various legal issues. In general, one basis for protecting online data lies in the creativity of the data under the Copyright Act of 1976. Another basis lies in the technological barrier of the computer system hosting the data under the Computer Fraud and Abuse Act (CFAA) and the Digital Millennium Copyright Act (DMCA). It is also possible to protect online data based on contractual obligations or tort principles under state common law. In terms of the data, a company would need to consider its proprietary data and user-generated data separately, but any creative content is invariably entitled to copyright protection.

To read the full text of this article, please visit the Duane Morris Artificial Intelligence Blog.

Autonomous AI and the Question of Creativity

On March 16, 2023, the United States Copyright Office (USCO) published Copyright Registration Guidance (Guidance) on generative AI[1]. In the Guidance, the USCO reminded us that it “will not register works produced by a machine or mere mechanical process that operates randomly or automatically without any creative input or intervention from a human author.” This statement curiously conjures the notion of a machine creating copyrightable works autonomously.

While the operation of a machine, or specifically the execution of the underlying AI technology, may be largely mechanical with little human involvement, the design of the AI technology can take significant human effort. Protecting the human works that power machines as intellectual property raises parallel questions across the broad contexts where AI has been applied: just as authorship has been an issue when an AI technology is used in creating copyrightable subject matter, inventorship has been an issue when an AI technology is used in generating an idea that may be eligible for patent protection.

To read the full text of this article, please visit the Duane Morris Artificial Intelligence Blog.

Privacy Laws + Banks + FinTech = New U.S. Guidance on Risk Management for Third-Party Relationships

Three federal agencies jointly issued guidance stating that banks are expected to monitor their financial technology partners to ensure compliance with privacy, fair lending, and anti-money laundering laws.

The “Interagency Guidance on Third-Party Relationships: Risk Management” was issued jointly by: (1) the Board of Governors of the Federal Reserve System [OP-1752], (2) the Department of the Treasury’s Office of the Comptroller of the Currency [OCC-2021-0011], and (3) the Federal Deposit Insurance Corporation [RIN 3064-ZA26], with a final guidance date of June 6, 2023 (“Guidance”). The Guidance offers the three U.S. agencies’ views on sound risk management principles for banking organizations when developing and implementing risk management practices for all stages in the life cycle of third-party relationships.

Prior guidance is rescinded and replaced by the Guidance

The Guidance rescinds and replaces the following previously issued guidance by the three federal agencies:

  • Board’s 2013 guidance: SR Letter 13-19/CA Letter 13-21, “Guidance on Managing Outsourcing Risk” (December 5, 2013, updated February 26, 2021)
  • FDIC’s 2008 guidance: FIL-44-2008, “Guidance for Managing Third-Party Risk” (June 6, 2008)
  • OCC’s 2013 guidance and its 2020 frequently asked questions: OCC Bulletin 2013-29, “Third-Party Relationships: Risk Management Guidance,” and OCC Bulletin 2020-10, “Third-Party Relationships: Frequently Asked Questions to Supplement OCC Bulletin 2013-29.” Additionally, the OCC issued foreign-based third-party guidance, OCC Bulletin 2002-16, “Bank Use of Foreign-Based Third-Party Service Providers: Risk Management Guidance,” which is not being rescinded but instead supplements the final guidance.

The Guidance seeks to establish a consistent approach that puts the onus on banks to obtain information from, and ensure compliance by, their third-party fintech partners. In other words, banks are responsible for knowing how their fintech partners: (1) are operating and (2) are complying with applicable federal law.

Obligations concerning privacy laws and cross-border flow of information 

The Guidance discusses factors to consider when evaluating whether to enter into a relationship with a third party, including compliance with privacy laws. Regarding contracts between a bank and a foreign-based third party, the Guidance notes the importance of:

  • privacy laws
  • cross-border flow of information
  • choice-of-law and jurisdictional provisions that provide for dispute adjudication

In sum, the 68-page Guidance sets forth a bank’s risk management obligations when contracting with third-party fintechs. As privacy laws proliferate and cross-border flows of information increase, the Guidance sets forth the criteria to analyze within these contracts.

FTC’s Proposed Click to Cancel Rule for Online Commerce

The Federal Trade Commission’s proposed “click to cancel” rule would require companies to provide more detailed information and notices about cancelling automatic renewals, subscriptions, and memberships, which are prevalent in online commerce. The proposed rule, titled the Negative Option Rule, is available at: https://www.ftc.gov/system/files/ftc_gov/pdf/p064202_negative_option_nprm.pdf

The goal of the proposed rule is to combat unfair or deceptive practices that include recurring charges for products or services consumers do not want and cannot cancel without undue difficulty.  The FTC is currently seeking comments on the proposed rule until April 19, 2023.

The proposed rule would require that canceling a negative-option program be easy and available through the same means as signing up. For example, if a company offers one-click membership sign-up through its website, then the company must also offer one-click cancellation through the same website.

Other substantive requirements of the proposed rule include annual reminders for customers of programs that do not involve the shipment of physical goods, pre-billing disclosure requirements, express consent for subscription terms separate from the rest of the transaction, and limits on the ability to offer special deals to customers attempting to cancel.

To help comply with this anticipated rule, companies should:

    • Catalog:  Catalog their negative-option marketing offerings under the broad definition provided by the FTC under the proposed rule
    • Representations:  Review the processes associated with these offerings, including representations they make concerning any aspect of a product or service involving negative-option marketing to ensure they are accurate
    • Pre-bill disclosures:  Review pre-billing disclosures to ensure all material terms of a deal are disclosed to consumers before they enter their billing information and that express consent to the subscription is obtained
    • Involve IT:  Communicate with their IT departments to develop a simple cancellation procedure which includes annual notifications for consumers.

Preservation of Ephemeral Messaging for Business Purposes

Ephemeral messaging is short-lived, yet the data preservation and regulatory obligations remain.

Ephemeral messaging apps – like WhatsApp and Snapchat – deliver digital communications that are available for a limited time and then deleted. The two key characteristics of ephemeral messaging are: (1) automated deletion of message content for both the sender and the receiver and (2) end-to-end encryption, which enhances privacy by making it more difficult for hackers and others to read the encrypted data while it is in transit between devices.

The three degrees of ephemerality in messaging apps are:

  1. Pure, which involves the permanent and automated deletion of messages;
  2. Quasi, which permits preservation of messages in certain circumstances; and
  3. Non-ephemeral, in which messages usually remain on a source (such as a server) and may not include end-to-end encryption.

The benefits of ephemeral messaging include:

  • Information governance: Data storage and records preservation/management are reduced by ephemeral messaging.
  • Legal compliance: Encryption and automatic deletion of personal data help reduce exposure if a data breach occurs.
  • Data security: Even if a mobile device is lost, the automatic deletion of data will likely protect against hackers.

The legal risks of ephemeral messaging include: (1) complying with subpoenas and (2) preserving data when litigation is “reasonably anticipated.”

Subpoenas often define documents and communications broadly to capture all communications, including ephemeral messaging. Thus, the failure to preserve documents may result in an inability to fully comply with a subpoena and/or criminal exposure, particularly if the subpoena was issued by the government.

Regarding the preservation of data, legal hold policies may need to be amended to address ephemeral messaging, including when a company is dealing with government regulators. See, e.g., Federal Trade Commission v. Noland, et al., Case No. CV-20-00047-PHX-DWL (D. Ariz. 2021) (sanctioning defendants for installing and using ephemeral messaging after learning they were investigation targets).

Some regulators caution against the use of ephemeral messaging.  For example:

  • The U.S. Securities and Exchange Commission (“SEC”) issued guidance in 2018 that prohibits business use of apps which permit automatic destruction of messages.
  • The U.S. Department of Justice (“DOJ”) updated its Evaluation of Corporate Compliance Programs in March 2023, which discusses the factors that prosecutors should consider in conducting an investigation of a corporation, including the adequacy and effectiveness of the corporation’s compliance program both at the time of the offense and at the time of the charging decision.

Accordingly, establishing an adequate and effective corporate compliance program is important, including:

  1. establishing a corporate compliance program which is monitored, updated, and works in practice, and
  2. reviewing the company’s document-retention policies and procedures, including whether they address ephemeral messaging and mobile device data.

In sum, although ephemeral messaging is short-lived, the consequences – of failing to comply with data preservation and regulatory obligations – may be long lasting.

ChatGPT in Class Action Litigation

Daily news reports about ChatGPT are ubiquitous. Can it replace legal tasks undertaken by humans (with law degrees and state bar licenses)? Can lawyers use it to enhance their legal work? Quite naturally, this raises the issue of whether ChatGPT will make its way into class action litigation – where the stakes, and the workloads of the lawyers involved, are enormous.

To read the full text of this post by Duane Morris attorney Brandon Spurlock, please visit the Duane Morris Class Action Defense Blog.

District Court Reaffirms Dismissal of Wiretapping Claims Under California Invasion of Privacy Act

On the heels of holding that defendants’ use of session replay software did not constitute a violation of the California Invasion of Privacy Act, Judge William Alsup in Williams v. What If Holdings LLC and ActiveProspect Inc. has now denied the plaintiff’s request for leave to amend. In doing so, the court reaffirmed its previous holding that the plaintiff’s allegations only established that ActiveProspect’s use of session replay software functioned as a tool that supported What If’s management of its own website data, and not as a means of eavesdropping and aggregating information for ActiveProspect’s own purposes.

Read the full Alert on the Duane Morris LLP website.

Will Website Chat Feature Wiretapping Lawsuits Rise?

Entering the conversation, the United States District Court for the Central District of California recently denied a motion to dismiss claims alleging that a website’s chat features and use of session replay software violate the California Invasion of Privacy Act (CIPA). Notably, this court rejected a forum selection clause in the website’s terms of use and went on to hold that allegations that the plaintiff shared “personal information” in the chat were sufficient to maintain a claim.

Read the full Alert on the Duane Morris LLP website.

Privacy Concerns for Health Apps

Free health apps – often funded by advertising revenue – may result in disclosure of private health information to third parties without permission from consumers.

A company that operates a health app or collects consumer health data should analyze how ad-tracking tools are used within its ecosystem. In 2021, the Federal Trade Commission (“FTC”) issued a policy statement clarifying mobile health app makers’ obligations to notify consumers if their data is exposed or shared without their permission, and the FTC stated that the policy was meant to fill a “gap” in regulations for health apps, which generally are not covered by the Health Insurance Portability and Accountability Act (“HIPAA”).

Failure to fulfill these obligations may result in a government action, such as an action by the FTC, which: (1) has authority over businesses that collect health information under the FTC Act and (2) may bring enforcement actions regarding deceptive claims about the use or disclosure of health data. Recent federal and state enforcement actions include:

  • FTC action: Flo Health Inc. settled FTC allegations that the company shared health information of its users with outside data analytics providers after promising such information would be kept private. The FTC filed a complaint against Flo Health asserting that Flo Health: (1) disclosed health data from millions of users of its Flo Period & Ovulation Tracker app to third parties that provided marketing and analytics services to the app, including Facebook’s analytics division and Google’s analytics division, (2) disclosed sensitive health information, such as the fact of a user’s pregnancy, to third parties in the form of “app events,” which is app data transferred to third parties for various reasons, and (3) did not limit how third parties could use this health data.
  • California AG action: Glow Inc. settled a probe by the California Attorney General regarding its fertility-tracking mobile app that stores personal and medical information. The Attorney General’s complaint alleged that the app: (1) failed to adequately safeguard health information, (2) allowed access to users’ information without their consent, and (3) had additional security problems with the app’s password change function that could have allowed third parties to reset user account passwords and access information in those accounts without user consent. Under the settlement, Glow was required to: (1) incorporate privacy and security design principles into its app and (2) obtain affirmative consent from users prior to sharing or disclosing personal, medical, or sensitive information and allow users to revoke previously granted consent.

In sum, a company that operates a health app or collects consumer health data should analyze how ad-tracking tools are used within its ecosystem.

© 2009- Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.