
How Technology is Shaping Clinical Trial Agreements

With artificial intelligence (AI) advancing at lightning speed, cybersecurity threats multiplying, and electronic health records (EHRs) now the norm, it’s time to revisit your clinical trial agreements (CTAs). Emerging technologies are transforming the clinical research landscape, and your contracts need to keep up. A CTA refresh is a strategic move to safeguard sensitive data, manage cybersecurity risks, and ensure regulatory compliance.

This post explores emerging trends in clinical research and technology, highlighting the risks posed by rapid technological advancements. We examine how these developments affect key CTA provisions and provide practical, technology-inspired updates for your CTAs.

This is Part 1 of a two-part series. Part 1 starts with foundational topics including cybersecurity and EHR standards, and then delves into two emerging drivers of change:  secondary use of study data and AI. Part 2 will explore remote monitoring, eConsent, decentralized clinical trials (DCTs), and digital health technologies (DHTs), and take a deep look into three critical areas for risk mitigation:  indemnification, limitation of liability, and insurance. Whether you’re drafting new CTAs or revising existing ones, our insights will help you manage risks and keep your contracts aligned with today’s legal and technological landscape. Let’s get started.

I.  Cybersecurity

Background

Cybersecurity awareness exploded during the pandemic and continues to merit very close attention. As cyberattacks grow in sophistication and frequency, the healthcare industry has become one of the primary targets. In our post about cybersecurity for CTAs during the COVID-19 pandemic, we addressed cybersecurity issues that contracting parties need to consider when drafting and negotiating CTAs. Here, we focus on what has changed since then.

What’s New

Ransomware attacks on the healthcare sector rose sharply in 2023, nearly doubling from the previous year.[1] Despite these trends (and the urging of worried counsel), CTAs still lag in addressing cybersecurity, and many remain silent on the issue. The good news is that study sponsors and study sites are increasingly prioritizing cybersecurity, and more frequently addressing it in CTAs. Often, sites take the initiative in adding cybersecurity language, but sponsors are also starting to proactively integrate it into their CTA templates.  In addition, more sites and sponsors maintain cyber liability insurance today than five years ago.

Contracting Party Perspectives

  • Sites: As HIPAA covered entities, sites have historically been more attuned to security risks and often lead the push for cybersecurity provisions in the CTA.
  • Sponsors:  With the continued escalation of cyber incidents and the proliferation of state privacy and security laws, sponsors have become increasingly concerned about balancing cybersecurity obligations within the CTA.

Options for Addressing Cybersecurity

If addressed in the CTA, cybersecurity obligations typically appear as a standalone security clause or embedded within confidentiality provisions.

  • Standalone Security Clause. The study obligations, HIPAA, monitoring or recordkeeping sections of the CTA may include an independent cybersecurity provision that imposes one or more of the following obligations in the event of a security incident (or possibly a suspected incident):
    • Notice
    • Cooperation
    • Breach notification to participants
    • Mitigation
    • Remediation
    • Communications to the public
    • Coverage of costs
    • Encryption
    • Endpoint protection
    • Other IT requirements
  • Confidentiality Section. Cybersecurity obligations in the confidentiality section can be unintentional and indirect, or deliberately structured to address specific cyber risk issues:
    • Coverage by Default. Confidentiality provisions inherently have cybersecurity implications. If a recipient of confidential information under a CTA experiences a cyber event affecting that information, the recipient may be in breach of the confidentiality provision. This will depend on the CTA’s language and the event’s impact. While confidentiality obligations offer some cyber protection, they typically fall short of the safeguards found in a standalone security provision.
    • Intentional Coverage. Confidentiality sections can expressly address cybersecurity by adding or refining the notice requirements. Many CTAs do not mandate notice of unauthorized use or disclosure of confidential information, but if they do, then expanding the notice to include unauthorized access to, modification, or destruction of confidential information strengthens cyber protections. Some CTAs go a step further, incorporating system breach into the notice requirement.
    • Sweeping Scope. Whether or not the confidentiality section explicitly covers cyber events, a broad definition of confidential information by its nature increases the recipient’s cyber risk simply because there is more data to protect.
    • Low/No Fault Attacks. Advances in cybercrime technology expose recipients to a higher risk of confidentiality breaches. Before electronic data capture (EDC) systems, confidential information was largely paper-based, making access easier to control. Today, cyberattacks – such as ransomware targeting vendors or subcontractors – can lead to unauthorized access without negligence by the site or sponsor.
    • One-way or Mutual. Historically, confidentiality provisions in CTAs were one-sided, protecting only the sponsor’s confidential information.  Increasingly, they are mutual, requiring both parties to assess the cybersecurity implications.

Key Takeaways

  • Include Mutual Security Obligations. Although cybersecurity language is often one-sided, either party can experience a security incident that impacts the other. At a minimum, both parties need to receive notice of a security incident (or suspected incident) so they can take steps to mitigate the impact on their data and systems. The Standalone Security Clause section above outlines elements to include, some of which may prolong negotiations, especially regarding responsibility for costs. Making the provision mutual usually yields a more balanced result for both parties. All security obligations need to be vetted by each party’s IT department.
  • Careful Review of Confidentiality Obligations. Each party should thoroughly consider the confidentiality section to avoid unintended consequences. If cybersecurity risks are addressed through the confidentiality language, recognize that this approach is usually an incomplete solution.
  • Cyber Insurance. Maintaining adequate cyber liability insurance is a baseline requirement for any credible healthcare business or institution today, and the CTA should require both parties to carry it. If the other party does not have it, that is a red flag.  We will explore cyber insurance, including first- and third-party protections, in Part 2 of this series.
  • No Standard Language…Yet. The clinical research industry has not yet established standard cybersecurity terms for CTAs. Sponsors and sites should continue refining language to address the issues above and work to establish CTA language that is widely acceptable.
  • Key Risk Mitigation Issues.  As we elaborated on here, additional cybersecurity concerns include:
    • Coordinating cybersecurity terms with:
      • Indemnification
      • Limitation of liability
      • Vendor agreements
    • Implementing appropriate controls for remote monitoring and ensuring access is limited to only the data necessary for remote source data verification.
    • Clearly defining key terms and avoiding vague or overly broad language.
  • Artificial Intelligence.  Any AI usage agreed upon by the parties (see Section IV below) should be vetted for cybersecurity risks. Proactive assessment helps reduce vulnerabilities.

II.  EHR System Standards

Background

  • Role of EHRs. EHRs play a significant role in modern clinical studies. Sites manage EHRs, which house a wide range of patient medical information, including medical history, diagnoses, lab results, prescriptions, and treatment plans. Given the wealth of information they contain, EHRs serve as a key source of data in clinical trials. Investigators often rely heavily on EHR data to recruit patients, analyze and aggregate patient data, maintain study records, and facilitate post-study follow-up.
  • FDA Guidance. In July 2018, FDA issued guidance titled “The Use of Electronic Health Record Data in Clinical Investigations” (EHR Guidance) to advise sponsors, investigators, and other stakeholders on the use of EHR data in FDA-regulated clinical investigations. In the EHR Guidance, FDA states that “[s]ponsors and clinical investigators should ensure that policies and processes for the use of EHRs at the clinical investigation site are in place and that there are appropriate security measures employed to protect the confidentiality and integrity of the study data.” FDA then sets forth recommendations for sites and sponsors with respect to such security measures.[2]
  • EHR Growth. EHRs have become even more ubiquitous in the years since the EHR Guidance was released. As of 2021, 96% of non-federal acute care hospitals and 78% of office-based physicians had adopted a certified EHR (up from 28% and 34%, respectively, in 2011).[3] Today, except for a limited number of small or rural providers, it is very rare for a site not to have an EHR, making it important for sponsors and sites to understand what safeguards are in place to protect the confidentiality, security, and integrity of data contained in EHR systems.

Contracting Party Perspectives

  • Sites:
    • Control of EHR System.  The site owns and maintains the EHR system, which is often licensed from a third-party vendor. Accordingly, the site controls the EHR system and would consider it inappropriate for sponsors to impose extensive obligations on the site’s EHR.
    • Reasonable Reps and Certifications.  Due to the complexity of EHR systems, sites often seek to limit overly restrictive CTA language or broad representations or certifications regarding their EHR. Instead, they want to ensure that any representations or certifications they make in a CTA align with applicable legal requirements but do not go beyond what is necessary.
  • Sponsors:
    • Integrity of Study Data. The source data contained in an EHR system underpins much of the study data that sponsors use for regulatory submissions. Sponsors need assurance regarding the integrity of that source data.
    • Compliance with EHR Guidance.  To help ensure the reliability of source data and delivered study data (particularly if the EHR system communicates directly with sponsor’s EDC system), sponsors often focus on whether a site’s EHR system complies with the EHR Guidance and incorporates appropriate data integrity measures and cybersecurity controls. The sponsor’s CTA template may include a certification by the site and investigator(s) that their EHR systems meet the EHR Guidance standards.

 

Key Takeaways

  • EHR Controls.  EHRs need to have appropriate controls in place to protect the confidentiality, security, and integrity of study data.
  • EHR Guidance Standards.  Sponsors should include language in the CTA requiring EHR system compliance with the EHR Guidance. This could include requiring sites to confirm that their EHR systems are certified under the Health IT Certification Program of the Office of the National Coordinator for Health Information Technology (ONC)[4] or that the EHR system’s controls include data use policies and processes, data security protection measures, and other protections required under the EHR Guidance.[5]
  • Careful Review of Reps and Certifications.  Sites should understand what security standards their EHR systems meet and make sure that any EHR-related representations or certifications they make are narrowly tailored and in line with industry standards.

III.  Secondary Research and Use

Background

A large industry has developed around secondary research, which is the reuse of information or biological specimens collected during clinical research for an unrelated, or “secondary,” research activity. For example, sites may contribute source records or leftover specimens to a database or repository or make their EHR systems containing study-related data available internally or to third parties for research or other purposes. Over time, and particularly with AI, the concept of secondary research has expanded beyond traditional research activities, evolving into a broader application, or “secondary use,” that includes non-research purposes. CTAs often address secondary research and use of information, either head-on or indirectly through the NERF (defined below) or EHR provision.  This section focuses on data, not specimens.[6]

NERF to Study Data

In the CTA, the sponsor often grants the site a non-exclusive, royalty-free license (NERF) to use the study data for internal, non-commercial academic research, patient (or participant) care, and education purposes. This license may be subject to the CTA’s confidentiality and publication obligations, and may prohibit sublicensing, among other things. The NERF enables the site to use the data generated during the study for purposes unrelated to the study, such as to advance their academic mission. However, advances in technology and the increased sophistication of electronic databases are prompting sponsors to scrutinize this provision more carefully.

  • Contracting Party Perspectives
    • Sites:
      • Necessary for Academic Mission. Many sites require a NERF to advance their academic mission, including research, education, publication and participant or patient care.  Sites want to make sure the NERF terms do not unduly restrict them from carrying out their mission.
      • Realistic Compliance. Sites need NERF terms that align with their actual policies, procedures, and technological capabilities.
    • Sponsors:
      • Maintaining Competitive Edge. Sponsors need the NERF to be narrowly tailored to shield their study data from competitors and protect their intellectual property.
      • Ensuring “Internal” Truly Means Internal. Sponsors want clarity that study data will remain truly “internal” and “non-commercial.” Without precise wording and key restrictions, these two terms could be interpreted more loosely than intended. For example, can study data be used for research by the site (which is internal) that is funded by a competitor, non-profit, or the government (which are external)?
      • Access and Controls. The question of whether the site’s exercise of the NERF is “internal” or “non-commercial” may ultimately depend on who can exercise the NERF (is sublicensing permitted?), who has access to NERF-generated data (“secondary data” or “NERF results”), and database controls to prevent unintended access.
      • AI Considerations. Sponsors are starting to consider whether AI will have access to the study data and, if so, who benefits from the AI access:  The site? The AI vendor? Other third parties?  Does any data ingested by the AI refine the AI? The answers may turn “internal” into “external.”
      • Secondary Data Risks. The parties often do not thoroughly assess whether secondary data generated under the NERF remains internal, non-commercial, or confidential. In the past, sponsor concerns were largely theoretical, but with AI and evolving data-sharing practices, the risks are now very real.  For example, if secondary data is pooled with data from other sources, can sponsor-specific data be isolated (perhaps through AI)? How high is the risk of competitor access?
      • High Stakes for Clinical Stage Companies. For sponsors whose entire pipeline hinges on a single investigational product, competitive exposure can “make or break” the company.  These sponsors want to prevent the exercise of the NERF from crossing the line from internal use to competitor access or commercial use.

EHR Access

Health systems and universities are increasingly leveraging their EHR systems for research, AI development and operational improvements. As with NERFs, this access raises significant concerns regarding secondary research or data use.

  • Contracting Party Perspectives
    • Sites:
      • EHR Ownership. As mentioned above in Section II, the site owns or controls the EHR system and considers any restrictions on EHR use, including on study source documents, to be inappropriate.
      • Patient Record Requests. Patients have the right to request their records, and sites must honor these requests—regardless of sponsor concerns about secondary use.
      • AI and Healthcare Improvements. Many sites are actively collaborating with AI vendors to enhance healthcare delivery, and they do not want sponsors restricting their ability to license access to their EHR systems to AI vendors.
    • Sponsors:
      • Competitive Risk. EHR records may contain sponsor names, protocols, study products and study adverse event information. If competitors gain access to this information through secondary research or data use, the risks are substantial.
      • Risk Levels Vary. Sponsors with a single investigational product face the greatest exposure.
      • De-Identification Isn’t a Solution. When questioned about these risks, some sites assure sponsors that they de-identify the study source records under HIPAA prior to permitting third-party access to their EHR. But de-identifying records under HIPAA removes only patient identifiers; it does not remove non-PHI (protected health information) data like sponsor names and protocol titles, leaving sponsors vulnerable.
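
The gap described above can be made concrete with a short sketch. Assuming a site wanted to guard sponsor confidentiality before opening already de-identified records to a third party, a supplementary redaction pass might look like the following (the sponsor name, protocol number, and term list are all invented for illustration; this is not a described industry practice):

```python
import re

# Hypothetical illustration: HIPAA Safe Harbor de-identification removes the
# 18 patient identifiers (names, dates, MRNs, etc.) but leaves
# study-identifying terms untouched. A site concerned about sponsor
# confidentiality would need its own redaction pass over those terms.
STUDY_TERMS = [
    "Acme Pharma",                            # invented sponsor name
    "ACME-101",                               # invented protocol number
    "A Phase 2 Study of ACME-101 in NSCLC",   # invented protocol title
]

def redact_study_terms(note: str, terms=STUDY_TERMS) -> str:
    """Mask study-identifying terms that survive HIPAA de-identification."""
    # Longest terms first, so a protocol title is masked before the
    # protocol number it contains.
    for term in sorted(terms, key=len, reverse=True):
        note = re.sub(re.escape(term), "[REDACTED]", note, flags=re.IGNORECASE)
    return note

# A note already de-identified under HIPAA: no patient identifiers remain,
# but the sponsor and protocol are still plainly visible.
deidentified_note = (
    "Participant enrolled in protocol ACME-101 (sponsor: Acme Pharma). "
    "Grade 2 nausea reported; possibly related to study drug."
)

print(redact_study_terms(deidentified_note))
# The sponsor name and protocol number are masked; clinical content remains.
```

In practice, any such term list would have to be maintained per study (drawn from the CTA and protocol) and paired with manual review, since free-text records can reference a study product in ways no simple list anticipates.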

Key Takeaways

  • Understand each Party’s Perspective. Sponsors and sites each have legitimate concerns about secondary access to and use of study data, whether through the NERF or the EHR system. Taking the time to learn and understand the other party’s perspective will help the parties structure the CTA accordingly.
  • Use Cases and Downstream Data Access. Contracting parties should discuss possible use and access cases for study-related data held at the site. Who has access to study data? The NERF results? EHR records? For what purpose? Can they share the data with third parties? Addressing these questions early avoids conflicts later.
  • Account for AI’s Role. The parties should define AI’s role in accessing and handling the NERF data, NERF results, and EHR data relating to the study.   See Section IV below.
  • Technology Benefits vs. Competitive Risk. New technologies bring promise, but also risk. Sponsors and sites need to assess whether the CTA language enables exposure of sensitive data to unintended parties, including how attenuated or diluted that risk is (or isn’t).
  • Precise NERF. The NERF should be precisely tailored to ensure both parties understand – and are comfortable with – its scope. The parties should discuss what database controls and safeguards the site can realistically implement to ensure compliance with the NERF, keeping AI in mind.
  • Practical EHR Terms.  Sites should make sure they can implement any EHR terms (such as access restrictions or data removal) they agree to.
  • Audit Database Security.  Sites should evaluate their data access policies. Key questions include:
    • Who can access the study data under the NERF?
    • Where are the NERF results stored?
    • How is access monitored and restricted?
    • What is the institution’s policy on making data available (including EHR records) for secondary research or use?
    • Does AI have access?  If so, under what guardrails?

IV.  Artificial Intelligence

Background

We could write an entire blog post about AI[7] in clinical trials. While AI has been around for decades, the rise of ChatGPT and large language models has rapidly expanded AI’s role in clinical research.  AI now supports drug discovery, patient-trial matching, participant adherence, trial design optimization, synthetic control arms, and adverse event prediction and monitoring.

But this article focuses on a critical issue: What happens when AI accesses study data without all parties knowing? As AI integrates deeper into clinical research, concerns around unauthorized access, data sharing, and confidentiality are becoming impossible to ignore.

  • Tech Giant Access to Healthcare Institution Data. For over a decade, tech giants have been partnering with healthcare systems to access EHR data and develop machine learning tools for predicting medical events.[8] These activities, combined with the recent surge in AI applications, raise serious concerns for clinical trial stakeholders – particularly around data sharing, confidentiality, and competitive risks.
  • Tradeoffs.  More data means better AI models, leading to improved healthcare operations and analytics.  But AI access to study data comes with significant risks, particularly for sponsors.  The CTA parties risk having critical confidential data made accessible to third parties – though typically as part of a vast dataset that may make individual study data difficult to extract or trace back to its source. While sponsors face the greatest exposure, both sides need to be aware of the implications.
  • Confidentiality Concerns.
    • Lawsuits Over AI Ingesting Proprietary Content. A wave of lawsuits highlights the risks of AI ingesting proprietary content without consent. On February 13, 2025, major publishers, including The Atlantic, Forbes, the LA Times and Condé Nast, sued startup Cohere Inc. for copyright and trademark infringement. The lawsuit alleges Cohere used over 4,000 copyrighted works to train its AI, displayed large portions (or entire articles) to end users, and generated false articles misattributed to the publishers.[9]
    • A Growing Legal Trend. Similar lawsuits are piling up: The New York Times sued OpenAI and Microsoft in December 2023; News Corp sued Perplexity in October 2024; and Thomson Reuters won a copyright ruling against AI company Ross Intelligence in February 2025. These lawsuits underscore AI’s legal and ethical challenges in using proprietary data. [Id.]
    • The Risk for Clinical Trials. A key issue in these lawsuits is whether AI tools can refine themselves using ingested data. While most AI companies claim they do not train models with user data, fine-tuning remains common. For sponsors in clinical trials, confidential study data could be processed, repurposed and potentially exposed through AI models.

Contracting Party Perspectives

  • Both parties:
    • Unauthorized Use and Disclosure of Confidential Information and PHI. Using AI on study data or documents may unintentionally make confidential data accessible – perhaps in diluted fashion – by third parties in violation of the CTA confidentiality obligations or PHI restrictions.
    • Vendor-Based Risk. If parties are unaware of the AI tools their vendors use or if vendors lack proper AI controls, their vendors’ actions may place the parties in violation of  the same CTA sections.
    • State Law. Understanding how AI is being used is essential to complying with emerging state AI and privacy laws that may apply.
  • Sites:
    • Operational Freedom. Sites want to avoid sponsor-imposed restrictions on AI tools used in daily operations (e.g., EHR, document management systems, clinical trial management system (CTMS), NERF databases).
    • Confidentiality and PHI. AI use by sponsors or vendors could inadvertently breach CTA confidentiality or PHI obligations relating to site confidential information.
    • Competitive Risk Dilution. Study data absorbed into AI models may become part of vast datasets, reducing traceability to sponsors or study products.
    • Implementation Challenges. Completely banning AI tools is often impractical and difficult for sites to enforce.
  • Sponsors:
    • Loss of Confidentiality.
      • If AI ingests study data from site databases and the AI is accessible by third parties, competitors may be able to extract study-related data.
      • If sites use AI for document review, and the CTA itself is sponsor confidential information, they may violate the CTA’s confidentiality obligations.
    • Competitive Risks. AI access to study data may affect the sponsor’s:
      • Market reach and share
      • Intellectual property protection
      • Patentability
      • FDA review process
    • Data Gateways. AI operates within databases, posing risks through the NERF, EHR systems (see Section III above on secondary research and use), and site document management tools like CTMS.

Key Issues to Consider

AI in clinical trials is in its early stages, and most CTAs do not explicitly address AI…yet. Prudent stakeholders should understand AI’s role in studies to prevent unintended consequences. While AI provisions in CTAs are not standard, it’s critical for stakeholders to assess how AI may interact with study-related data.  Questions and considerations include:

  • AI Access to Sponsor Confidential Information
    • Does AI at the site interact with study data, the protocol, NERF data, other sponsor confidential information or EHR records? Consider AI in the site’s and its vendors’ systems as well as tools used by study personnel.
    • Does the site’s document management system have built-in AI tools (e.g., Microsoft Copilot)? Does the EHR system incorporate ONC-certified AI?
    • Does study data refine AI for broader use, benefiting other customers? If so, is the data sufficiently diluted to prevent it from being linked back to the sponsor?
  • AI Access to NERF Data
    • Where does the site store study data subject to NERF rights?
    • Are databases containing secondary NERF data accessible by AI?
    • Could AI-ingested data be traceable back to the sponsor or study product?
  • AI Access to Site Confidential Information
    • Sites should determine whether sponsor tools, CROs, database vendors or cloud service providers use AI.
    • Remote source data verification poses a lower risk due to site-controlled monitoring systems.
  • AI Access to Patient Data (PHI)
    • Both parties must assess whether AI use triggers state privacy law compliance obligations (and stay tuned for evolving AI regulations).
    • Sites should ensure AI vendors sign business associate agreements (BAAs) where required and that the BAAs address AI-specific risks.
    • Sponsors should evaluate whether their or their vendors’ AI tools risk unauthorized PHI disclosure in violation of the CTA.
    • Sponsors should consider whether EHR records containing study adverse event information require redaction before AI access.
  • Downstream Obligations.  Unregulated AI use may violate CTA requirements to bind employees, agents, and contractors to the confidentiality and PHI protections.
  • Current Industry Practices.  Approaches to AI in CTAs vary:
    • Silence: No mention of AI.
    • Strict Prohibitions: Ban on AI interacting with the CTA or study data.
    • Tailored NERF and/or EHR Language: AI-driven concerns addressed in these provisions.
    • Confidentiality Provisions: AI prohibition tucked into confidentiality sections.

Key Takeaways

  • Learn
    • Study sites and sponsors should discuss how each party – and its vendors – use AI in relation to study data and other confidential information.
    • Conduct AI audits to assess:
      • What AI tools are in use?
      • For what purpose?
      • Are they home-grown or third party?
      • What safeguards exist?
      • How is data processed and stored?
      • What privacy, confidentiality, and cybersecurity measures are in place?
      • Is a blanket AI ban practical or enforceable?
      • Has a data mapping exercise been conducted to track where data is received, generated or stored?
  • Be Transparent and Consider Guardrails
    • Adding an AI provision to the CTA will prompt discussions among the stakeholders and drive internal policy development.
    • Define clear AI usage restrictions in the CTA based on the outcome of these discussions.
    • AI is being adopted quickly, sometimes by employees or vendors not subject to formal policies or oversight. Sites and sponsors should investigate their employees’ and vendors’ usage of AI.
    • AI governance is evolving; stakeholders must update contracts and practices accordingly.
  • Vendor Due Diligence
    • Ensure vendor contracts for any study-related services, including clinical trial management, database hosting, and data processing, include:
      • Security certifications (ideally SOC 2 or ISO 27001)
      • Comprehensive documentation (including transparency on training data)
      • Audit rights
      • Indemnification provisions
      • Incident response policies
      • Policies for AI hallucinations
      • Legal compliance requirements
      • Ongoing monitoring and governance
  • Align AI Terms with Other CTA Provisions
    • Align AI-related clauses with the CTA’s confidentiality, publication, security, NERF, EHR, indemnification, and limitation of liability provisions.
    • Coordinate AI-related terms with known AI applications in the study, including those referenced in the protocol, incorporated into DHTs, or embedded in the study product.

V.  Conclusion

Technology is revolutionizing clinical trials, but with innovation comes complexity.  Clear governance, contract adaptations and proactive risk management are essential. AI is just one piece of the puzzle.  Protecting research investments means strengthening cybersecurity, keeping pace with evolving EHR system standards, setting clear policies for secondary data usage through NERF and EHR systems, and addressing AI.

Looking ahead, organizations must take a forward-thinking approach to technology-related risk management while strengthening data governance frameworks to stay compliant and competitive.

This first installment focused on data access, usage, and sharing. In Part 2 we will examine operational and risk mitigation aspects of clinical trials, including critical contractual safeguards like indemnification, limitation of liability, and insurance.

Stay with us as we break down the key strategies for navigating the evolving landscape of CTAs.

 

 

[1] Ransomware Attacks Surge in 2023; Attacks on Healthcare Sector Nearly Double, https://www.dni.gov/files/CTIIC/documents/products/Ransomware_Attacks_Surge_in_2023.pdf (Feb. 28, 2024).

[2] In the EHR Guidance, FDA makes clear that it does not intend to assess EHR systems for compliance with Part 11. Even though Part 11 does not apply, the EHR Guidance does include other important standards.

[3] Office of the National Coordinator for Health Information Technology, National Trends in Hospital and Physician Adoption of Electronic Health Records (last visited Jan. 20, 2025).

[4] In July 2024, the federal Department of Health and Human Services announced a reorganization of ONC, the federal office that is responsible for establishing and overseeing a national health information technology infrastructure. ONC has been renamed the Assistant Secretary for Technology Policy and Office of the National Coordinator for Health Information Technology (“ASTP/ONC”). ASTP/ONC oversees technology, data, and AI policy and strategy and has established several new roles within the office, including a Chief AI Officer.

[5] In December 2023, the U.S. Department of Health and Human Services issued a final rule titled “Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing,” which, among other things, establishes transparency requirements for AI and other predictive algorithms that are part of certified health IT. This rule provides additional considerations in connection with EHR systems that incorporate AI tools. https://www.healthit.gov/sites/default/files/page/2023-12/hti-1-final-rule.pdf.

[6] A discussion of biobanking and other secondary use of biological specimens is beyond the scope of this post.

[7] There are many important and often confusing terms used when referring to AI, including model, algorithm, tool, program, etc. For purposes of this post, we are going to refer to AI generally to mean any or all of these terms.

[8] IBM’s Watson (2015), Google’s DeepMind (2016) and Google’s partnerships with universities and health systems including the University of Chicago (which spawned a class action lawsuit in 2019 that was eventually dismissed), UCSF, and Ascension Providence (Project Nightingale, which sparked an HHS investigation in 2019 and Congressional attention in 2020) started this trend.

[9] Alexandra Bruell, “Publishers Sue AI Startup Over Content Use,” The Wall Street Journal, Feb. 14, 2025, page B1.

The contents of this alert should not be construed as legal advice or a legal opinion on any specific facts or circumstances. This content is not intended to and does not, by its receipt, create an attorney-client relationship. The contents are intended for general informational purposes only. We urge you to consult your attorney about your specific situation and any legal questions you may have. Attorney advertising in some jurisdictions. © 2025 Leibowitz Law. All rights reserved. “Leibowitz Law” is a trade name of Leibowitz LLC.
