This two-part series examines how artificial intelligence is being used in clinical trial operations and the contractual and operational risks that follow. Part 1 outlines where AI appears in clinical trial operations and supporting technologies, and the questions companies and organizations should ask when AI touches Data (defined below). Part 2 addresses the contract provisions that respond to those risks, including intellectual property, data rights, regulatory compliance, cybersecurity, monitoring and validation, and risk allocation.
As we discussed in our prior post on how technology is shaping clinical trial agreements, contracts have already been evolving to address increasingly complex data flows and technology-enabled services. The introduction of AI builds on those same pressures, but with additional considerations around data use, accountability, and oversight.
Many of the governance and contracting issues addressed here also arise in digital health platforms used in clinical care and healthcare operations. However, AI incorporated into regulated medical products themselves, such as Software as a Medical Device (SaMD), raises additional product and regulatory issues beyond the scope of this discussion.
For purposes of this post, “Data” refers broadly to data, documents, communications, and other information relating to the clinical trial, including outputs generated by AI systems using such information.
AI is increasingly embedded throughout clinical trial operations. It appears across several areas, including study design and recruitment, day-to-day operational support, functionality embedded within trial platforms, regulatory and reporting activities, and everyday productivity tools.
Not all AI use in clinical trial operations carries the same level of risk. The level of concern increases, for example, where AI used for operational efficiency affects patient safety, drug quality, or the reliability of clinical study results. This distinction is reflected in FDA's January 2025 draft guidance, which focuses on AI used to produce information or data intended to support regulatory decision-making.
Examples of AI in clinical trial operations include:
Sponsors, sites, CROs, and service providers must understand where AI is embedded in trial operations and who remains accountable for its use.
Even where AI is deployed by sites or vendors, sponsors remain responsible for the data and analyses submitted to regulators. This creates direct regulatory exposure for sponsors: FDA may scrutinize how AI is used in relation to trial data during inspections or in submission reviews, even where that AI is used by sites, CROs, or vendors.
In practice, AI can enter clinical trial operations without intentional deployment, through software updates, vendor platform features, or everyday productivity tools used by trial personnel.
Relationships to Review: Organizations need to review their relationships with entities across the trial ecosystem for AI usage and accountability, including:
Productivity Tools: Personnel may use everyday productivity tools to summarize protocols, draft narratives, or translate consent forms, which can result in Data being processed outside of controlled systems.
Downstream Risk: Accountability does not stop with the immediate contracting partner. Contracting parties may rely on downstream technology providers, creating third- and fourth-party AI risk that should be addressed through diligence and contractual controls.
Layered Relationships: AI functionality is often delivered through layered vendor relationships, with technology providers relying on downstream AI developers or cloud services.
Example: A sponsor may access an EDC platform through a CRO. The CRO may license the platform from an EDC vendor, which embeds AI functionality developed by another provider. In this structure, Data may pass through multiple organizations before the output is delivered, making it difficult to understand how Data is processed, used, or stored.
Limited Visibility: If AI functionality is embedded within vendor platforms, sponsors and sites may have limited visibility into how Data is processed or flows through the vendor’s technology stack, including downstream AI providers. This may limit their ability to conduct diligence or exercise oversight over how AI affects Data.
Lack of Awareness: A related risk arises when vendors deploy AI tools without the sponsor’s or site’s knowledge or approval. In these situations, the AI provider may not be subject to the sponsor’s or site’s security review, data governance policies, or contractual controls.
Contracts: For this reason, organizations should review the entire vendor stack for AI and ensure that contracts address downstream providers that may access or process Data.
To assess AI risk, organizations must understand what AI tools are being used—by themselves, their vendors, and their contracting partners—and how those tools interact with Data.
Key questions include:
AI use in clinical trial operations is expanding rapidly, and these examples represent only a subset of current use cases. As AI becomes more deeply embedded across trial operations and supporting technologies, organizations must understand where it is used and how it interacts with Data.
In Part 2, we examine how contracts address these risks.
The contents of this alert should not be construed as legal advice or a legal opinion on any specific facts or circumstances. This content is not intended to and does not, by its receipt, create an attorney-client relationship. The contents are intended for general informational purposes only. We urge you to consult your attorney about your specific situation and any legal questions you may have. Attorney advertising in some jurisdictions. © 2025 Leibowitz Law. All rights reserved. “Leibowitz Law” is a trade name of Leibowitz LLC.