
Legal developments in data, privacy, cybersecurity, and other emerging technology issues

Posts in Data Privacy

If your company transfers sensitive personal data of U.S. individuals to entities or persons associated with certain countries deemed foreign adversaries, two federal programs designed to address national security risks should be on your radar: the Department of Justice’s Data Security Program (DSP) and the Protecting Americans’ Data from Foreign Adversaries Act (PADFAA). Although the two frameworks differ, both address the risk of data exploitation by adversarial nations and carry significant potential penalties for non-compliance. PADFAA is a statute enacted in June 2024; the DSP is a DOJ-administered program established under an executive order, and the DOJ has announced that it will begin enforcing the framework on July 8, 2025.

Since the arrival of AI programs like OpenAI’s ChatGPT, Google’s Bard, and similar technologies (“Generative AI”) in late 2022, more programs have been introduced and several existing ones have been upgraded, including ChatGPT’s upgrade to the GPT-4 model. Our previous posts have identified the features and functionality of Generative AI programs and outlined the emerging regulatory compliance requirements related to them. This post discusses how regulatory agencies worldwide have begun to address these issues.

Since late 2022, terms like “large language models,” “chatbots,” and “natural language processing models” have increasingly been used to describe artificial intelligence (AI) programs, such as Bard and ChatGPT, that collect data and respond to questions in a human-like fashion. Large language models collect data from a wide range of online sources, including books, articles, social media accounts, blog posts, databases, websites, and other general online content. They then provide logical and organized responses to questions or instructions posed by users. The technology can improve its performance and build its knowledge base by analyzing user interactions, including the questions users ask and the responses provided. These AI programs have a variety of applications and benefits, but businesses should be aware of potential privacy and other risks when adopting the technology.

Last week, the Consumer Financial Protection Bureau (“CFPB”) took a significant step toward enhancing consumer control over private financial data when it launched a rulemaking process under Section 1033 of the Dodd–Frank Wall Street Reform and Consumer Protection Act (“Section 1033”). Section 1033 requires the CFPB to implement rules allowing consumers to access their financial information. Section 1033 imposes no duty to maintain or retain any information about a consumer, and the CFPB has yet to adopt a rule governing data access despite its authority to do so.
