- Broad Washington “My Health My Data” Law Takes Effect in March
- Five Key Privacy and Data Security Considerations Heading into 2024
- Costs and the Recent Evolution of Healthcare Data Breaches
- FCC Rule Set to Require “One-To-One” Written Consent on Lead Generator Websites
- FTC Adds Data Breach Reporting Requirement to Its GLB Safeguards Rule Applicable to Nonbank Financial Institutions
- Lessons From Verizon's Cybersecurity FCA Self-Disclosure
- The Legal Issues Surrounding Deepfakes
- FTC and HHS Alert Parties in the Health Arena that Tracking Technologies Pose Privacy and Security Risks
- Safeguarding Your Online Marketplace Against Bad Actors
- Texas Enacts Data Privacy and Security Act
- Owen Agho
- Denise M. Barnes
- Danielle F. Bass
- Jewel Haji Boelstler
- Sara J. Brundage
- Brandy Bruyere, NCCO
- Daniel S. Elkus
- Angela I. Gamalski
- Emily E. Garrison
- Michael P. Hindelang, CIPP/US, CIPM
- Karl A. Hochkammer, CIPP/US
- Rachel M. Hofstatter
- Matthew Keuten
- Molly K. McGinley, CIPP/US
- Emory D. Moore Jr.
- Ahmad H. Sabbagh
- Jad Sheikali, CIPP/US
- Jenna R. Simon
- Katarina Vickovic
- Steven M. Wernikoff
- Mahja D. Zeon
Legal developments in data, privacy, cybersecurity, and other emerging technology issues
Washington state’s My Health My Data Act (“MHMD”) goes into effect on March 31, 2024. Entities should carefully evaluate whether MHMD applies to them in light of the law’s broad applicability, an expansive definition of consumer health data, strict consent requirements and a unique private right of action. This post answers questions about which entities are subject to MHMD, and what the law requires entities to do.
Privacy and data security laws and regulations continue to evolve quickly, and companies processing personal data have an increasing array of issues to manage. As we enter 2024, below are five key considerations for companies managing privacy and data security risks.
Data breaches in the healthcare industry are a costly and legally evolving issue. The sophistication of threat actors and their ability to navigate IT systems using constantly changing tactics has made it difficult to predict and, in some cases, respond to a breach. The recent aggressive enforcement by the Federal Trade Commission (the “FTC”) of its Health Breach Notification Rule (the “HBNR”), as well as its proposed changes to the HBNR, have expanded the factors companies must consider when analyzing and responding to potential breaches of health data.
On November 22, 2023, the Federal Communications Commission issued a proposed rule that likely will considerably alter the online lead generation industry, including the use of comparison shopping websites. The proposed rule addresses a number of areas, but, notably, the rule would require texters and callers using certain regulated technologies to obtain prior express written consent from a single seller at a time to comply with the Telephone Consumer Protection Act (“TCPA”). The FCC is expected to pass the rule during its December 13, 2023 meeting.
Last week, the FTC amended its Gramm-Leach-Bliley Safeguards Rule, supplementing the additions to the rule that it announced in 2021 and that have been in effect since June 2023. The recent amendment will require nonbank financial institutions to notify the FTC when there is an unauthorized acquisition of unencrypted customer information involving 500 or more consumers. This notification requirement, which is scheduled to go into effect in May 2024, adds to the growing list of notifications that a company must consider after a data incident, including the SEC’s recently enacted rules requiring registrants to disclose material cybersecurity incidents.
Last week, the FTC and HHS’ Office for Civil Rights (OCR) sent a joint letter to approximately 130 hospitals and telehealth providers concerning the privacy and security risks related to the use of online tracking technologies integrated into their websites or mobile apps. The agencies assert that these tracking technologies – such as the Meta/Facebook pixel and Google Analytics – gather identifiable information about users when they interact with a website or mobile app, often without users’ knowledge and in ways that are hard for users to avoid.
Last updated: January 17, 2024
To help privacy practitioners keep track of new state laws, below is a chart containing links to the major enacted state privacy laws and their respective regulations. Bookmark this page; we will update the chart periodically as new laws are enacted.
Since the arrival of AI programs like OpenAI’s ChatGPT, Google’s Bard, and other similar technologies (“Generative AI”) in late 2022, more programs have been introduced and several existing programs have been upgraded or enhanced, including ChatGPT’s upgrade to ChatGPT-4. Our previous posts have identified the features and functionality of Generative AI programs and outlined the emerging regulatory compliance requirements related to such programs. This post discusses how regulatory agencies worldwide have begun to address these issues.
Since late 2022, terms like “large language models,” “chat-bots,” and “natural language processing models” increasingly have been used to describe artificial intelligence (AI) programs that collect data and respond to questions in a human-like fashion, including Bard and ChatGPT. Large language models collect data from a wide range of online sources, including books, articles, social media accounts, blog posts, databases, websites, and other general online content. They then provide logical and organized feedback in response to questions or instructions posed by users. The technology is capable of improving its performance and otherwise building its knowledge base through its internal analysis of user interactions, including the questions that users ask and the responses provided. These AI programs have a variety of applications and benefits, but businesses should be aware of potential privacy and other risks when adopting the technology.
On February 17, 2023, the FTC brought its first civil enforcement action under the Telemarketing Sales Rule, 16 C.F.R. Part 310 (“TSR”), in nearly one year. In U.S. v. Stratics Networks Inc., et al., which was filed in the U.S. District Court for the Southern District of California, the FTC seeks to stop a group of companies and individuals that it claims are “responsible for delivering tens of millions of unwanted Voice Over Internet Protocol (VoIP) and ringless voicemail (RVM) phony debt service robocalls to consumers nationwide.” Because the FTC is seeking civil penalties, the Complaint was filed by the Department of Justice on behalf of the FTC.
As seen from the recent release of the ChatGPT artificial intelligence (“AI”) tool, AI technologies have a major potential to transform society rapidly. However, the technologies also pose potential unique risks. Because AI risk management is a key component of responsible development and use of AI systems, the National Institute of Standards and Technology last week released its voluntary AI Risk Management Framework, which will be a helpful resource to assist businesses to responsibly incorporate AI into their processes, products and services.
Because passwords alone are a relatively weak method of proving identity, enforcement agencies are ramping up pressure on companies to implement multi-factor authentication (MFA) both internally and for customers of online services. MFA makes it more difficult for cyber threat actors to gain access to networks and information systems if authentication information, such as passwords, is compromised through phishing attacks or other means. Below is information that may help you assess whether your company should implement MFA, and how to do so.
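The post above discusses MFA at a policy level. As an illustration of the mechanics, one common second factor is a time-based one-time password (TOTP), the six-digit code generated by authenticator apps. Below is a minimal sketch of the RFC 6238 algorithm using only the Python standard library; the secret and timestamp in the example are the RFC's published test vector, not anything drawn from the post.

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestamp=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA-1)."""
    if timestamp is None:
        timestamp = int(time.time())
    counter = timestamp // step               # number of elapsed time steps
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890", T = 59s
print(totp(b"12345678901234567890", timestamp=59))  # → 287082
```

A server verifying such a code typically also checks the codes for the adjacent time steps to tolerate clock drift.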
The DOJ recently published guidance regarding website accessibility under the Americans with Disabilities Act (ADA). This guidance reiterated the DOJ’s longstanding position that websites of businesses open to the public (defined as “places of public accommodations” under Title III of the ADA) are required to be accessible to people with disabilities and provided some non-binding indicators of what it means for a website to be accessible.
On August 11th, the Federal Trade Commission kicked off its long-awaited privacy rulemaking by releasing an Advance Notice of Proposed Rulemaking (ANPR). The ANPR is the beginning of what likely will be a lengthy process conducted pursuant to the FTC’s Magnuson-Moss rulemaking authority. The ANPR is extremely broad, raising 95 questions directed at nearly every type of data collection. Notably, in promulgating a rule, the FTC must demonstrate that each practice regulated is either deceptive or unfair and is prevalent in the market.
As 2023 approaches, organizations must again address new and modified laws governing Data Subject Requests (DSRs). Of course, the rollout of additional privacy regulations has become almost routine. But as a growing number of jurisdictions empower individuals to opt out of more types of processing and to access, rectify, or delete personal data, the legal and operational challenges of complying with these laws continue to grow. Organizations – especially those with lean privacy and legal ops functions – will need to be strategic in addressing the expanding regulatory burden.
With that in mind, we offer a few issues to address as you map out your next steps when it comes to DSRs.
The FTC issued a policy statement yesterday notifying education technology companies that the agency is committed to ensuring that ed tech tools comply with the Children’s Online Privacy Protection Act (“COPPA”). COPPA requires that websites or services covered under COPPA obtain a parent’s – or in some cases, a school’s – consent before collecting personal information from children under 13. COPPA also limits how long companies may keep children’s personal information and requires that companies properly safeguard information. The policy statement signals that the FTC will be scrutinizing COPPA compliance by providers of ed tech and other covered online services.
Last week, the New York Attorney General’s office offered guidance regarding credential stuffing, a common and costly attack on businesses and consumers, in which threat actors repeatedly attempt to log in to online accounts using usernames and passwords stolen from other online services. Credential stuffing takes advantage of three aspects of the online ecosystem: (1) most online accounts utilize usernames and passwords; (2) a steady flow of data breaches has resulted in billions of stolen credentials being leaked onto the dark web for other threat actors to exploit; and (3) consumers tend to reuse the same passwords across multiple online services.
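Defenses against credential stuffing typically include monitoring and throttling failed login attempts. As a purely illustrative sketch (this example is ours, not taken from the Attorney General's guidance), a sliding-window throttle keyed on username and source IP might look like the following; the class and parameter names are hypothetical.

```python
import time
from collections import defaultdict, deque

class LoginThrottle:
    """Track failed login attempts per (username, ip) in a sliding time window."""

    def __init__(self, max_failures=5, window_seconds=300):
        self.max_failures = max_failures
        self.window = window_seconds
        self._failures = defaultdict(deque)   # (username, ip) -> failure timestamps

    def record_failure(self, username, ip, now=None):
        """Record one failed login attempt."""
        ts = now if now is not None else time.time()
        self._failures[(username, ip)].append(ts)

    def is_blocked(self, username, ip, now=None):
        """Return True if this (username, ip) has too many recent failures."""
        ts = now if now is not None else time.time()
        attempts = self._failures[(username, ip)]
        while attempts and attempts[0] <= ts - self.window:
            attempts.popleft()                # discard failures outside the window
        return len(attempts) >= self.max_failures
```

In practice such throttling is combined with the other measures credential-stuffing guidance commonly recommends, such as MFA and monitoring for leaked credentials.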
Last week, the Federal Bureau of Investigation issued a private industry notification warning that “ransomware actors are very likely using significant financial events, such as mergers and acquisitions, to target and leverage victim companies for ransomware infections.” The FBI cautioned that ransomware attackers research publicly available information and target companies involved in significant, time-sensitive financial dealings such as M&A and other transactions. This initial reconnaissance, according to the FBI, is later followed by a ransomware attack and a subsequent threat that unless the victim pays the ransom, the attackers will disclose the information publicly, causing potential investor backlash and affecting the victim’s stock value.
The Federal Trade Commission recently announced a newly updated rule concerning the data security safeguards required for financial institutions to protect their customers’ financial information. The FTC’s updated Safeguards Rule, which originally was mandated by Congress under the 1999 Gramm-Leach-Bliley Act, requires non-banking financial institutions, such as mortgage brokers, motor vehicle dealers, and payday lenders, to develop, implement, and maintain a comprehensive security system to keep their customers’ information safe. The new rule more closely aligns with the NY Department of Financial Services Cybersecurity Regulation.
On September 21, 2021, the U.S. Department of Treasury’s Office of Foreign Assets Control (OFAC) issued an updated ransomware advisory (the “2021 Guidance”), which supersedes its 2020 ransomware guidance (the “2020 Guidance”), discussed in a previous post on this blog.
In the 2021 Guidance, OFAC notes that ransomware payment demands have escalated during the COVID-19 pandemic as U.S. businesses maintain significant online and internet-connected activities. OFAC identifies a 21 percent increase in ransomware attacks and a 225 percent increase in ransomware losses as reported by the Federal Bureau of Investigation (FBI). The pandemic has presented numerous opportunities for cyber actors to target system vulnerabilities, particularly smaller businesses and municipal entities with limited resources for cybersecurity investments as well as entities supporting critical infrastructure, such as hospitals, that are likely to make quick payments to avoid service disruptions to patients.
Today, the European Commission (“EC”) adopted new standard contractual clauses (“SCCs”) reflecting new requirements under the European Union’s General Data Protection Regulation (“GDPR”). The SCCs are intended to provide standardized templates for companies to utilize to comply with the GDPR’s data protection requirements.
In late 2020, a sophisticated adversary used the SolarWinds Orion Platform to plant covert backdoors in the networks of thousands of companies and government agencies. The attack confirms the importance of vigorous third-party risk management. Last month, the New York State Department of Financial Services (“NYDFS”) issued a report on the SolarWinds attack and provided the following steps that companies can take to reduce supply chain risk:
Yesterday, the U.S. Supreme Court, in AMG Capital Management, LLC v. FTC, sharply curtailed the ability of the Federal Trade Commission to obtain restitution and disgorgement in enforcement actions. In a 9-0 decision, the court found that Section 13(b) of the FTC Act, which authorizes the FTC to seek permanent injunctions in federal court, did not also authorize the Commission to obtain court-ordered monetary relief.
With the passage of the Cybersecurity Affirmative Defense Act, Utah became the second state – after Ohio, which enacted its Data Protection Act in 2018 – to create an affirmative defense to certain causes of action stemming from a data breach. The law provides an affirmative defense under Utah law and in Utah courts to certain tort claims arising out of a data breach if the company demonstrates that it created, maintained, and reasonably complied with a written cybersecurity program.
With Governor Ralph Northam’s signature yesterday, the Consumer Data Protection Act (“CDPA”) became law, making Virginia the second state after California to enact a comprehensive privacy law (with apologies to Nevada, which also has passed more modest privacy legislation). Although similar in many respects to the California Consumer Privacy Act (“CCPA”), which was recently updated by the Consumer Privacy Rights Act (“CPRA”), the law contains terminology more consistent with the European Union’s General Data Protection Regulation (“GDPR”).
In Tsao v. Captiva MVP Restaurant Partners, LLC, the Eleventh Circuit joined the federal appellate courts holding that a consumer’s exposure to a substantial risk of future identity theft, and efforts to mitigate the risk of future identity theft, are not sufficient to confer Article III standing. The decision highlights the federal courts’ struggle with standing requirements in data breach cases, and may increase the likelihood that the U.S. Supreme Court will address the issue.
Given the speculation and concern over ransomware attacks impacting the 2020 U.S. election, the recent spate of private companies falling victim to such attacks, and the October 1, 2020 advisory issued by the Department of Treasury (“Advisory”), it is no surprise that ransomware is trending in cybersecurity.
On September 23, 2020, Representatives Bob Latta (R-Ohio) and Greg Walden (R-Ore.) re-introduced the “Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution Act’’ or the ‘‘SELF DRIVE Act” to create a federal framework for autonomous vehicles (“AVs”). The measure lacks bipartisan support and is not expected to reach the floor of the House of Representatives during this session. But the continued effort demonstrates the importance that many lawmakers put on promoting a U.S.-led effort in the development of self-driving vehicles. The matter likely will be among the key transportation themes before the next session of Congress, which convenes in January. On the Senate side, policymakers have not advanced autonomous vehicle bills. In the previous congressional session, an autonomous vehicle policy measure advanced in the House but came up short in the Senate.
A number of U.S. federal agencies have authority to issue a type of administrative subpoena called a Civil Investigative Demand (“CID”) to obtain relevant information as part of an investigation. For example, both the Federal Trade Commission (“FTC”) and the Consumer Financial Protection Bureau (“CFPB”) have authority to issue CIDs to obtain documents and testimony in investigations related to privacy, data security, deceptive marketing, and financial fraud. This article identifies some items to consider when receiving a CID, based on my experience issuing and reviewing hundreds of CIDs as an enforcement attorney in the FTC’s Chicago office.
As schools increasingly are adjusting to remote learning and utilizing education technology (“ed tech”) services, both schools and their ed tech service providers need to consider the appropriate collection and usage of student personal information. Here are some tips for protecting students’ privacy and safeguarding personal data:
New York’s Stop Hacks and Improve Electronic Data Security Act (the “SHIELD Act”) took effect on March 21, 2020. The Act expands existing state breach notification requirements and imposes specific data security protections for covered businesses that own or license the private information of New York residents, regardless of whether those businesses are based in New York. The Act also broadens the definition of “private information” to include new types and combinations of data.
On March 31, 2020, Washington Senate Bill No. 6280 (the “Act”) became law, codifying one of the most detailed facial recognition regulations in the country. The Act regulates state and local government agencies in Washington using or intending to develop, procure, or use a facial recognition service but also includes important considerations for companies designing this technology.
Under extraordinary circumstances, businesses are quickly adapting to remote work on a large scale. In doing so, companies should promote best practices to protect sensitive data. Below are some techniques that your company can employ to help ensure that sensitive personal or company information stays safe: