ARTICLE
19 December 2023

AI Policies Needed To Avoid Liability In Government Contracting

Wiley Rein

Wiley's Nick Peterson and Craig Smith examine key considerations for contractors when developing, implementing, and enforcing AI use policies to minimize FCA exposure.

Generative AI's potential is vast: contractors could use generative AI tools to help draft proposals, set pricing, analyze procurement data, or even predict potential risks in projects.

Contractors must evaluate carefully when and how to use generative AI tools. A contractor that fails to put adequate policies and processes in place could expose itself to liability under the False Claims Act (FCA).

Generative AI is a subset of artificial intelligence that focuses on creating and generating content, such as text, images, or other data. It's generally understood that generative AI tools are imperfect. Generative AI's very nature, relying on machine learning algorithms and huge datasets, introduces complexities that may lead to errors or biases in the output.

It is easy to see the prospect that inaccurate information developed with the help of AI tools will ultimately be submitted to the government. That prospect naturally implicates the FCA.

The FCA is a federal law that imposes civil liability on anyone who knowingly submits or causes the submission of false or fraudulent claims to the government for payment or approval. The FCA imposes liability not only for intentional fraud but also for conduct exhibiting reckless disregard for the truth.

Reckless disregard encompasses a gross deviation from the standard of care that a reasonable person would exercise, or a deliberate indifference to the truth or falsity of the information provided to the government.

If potentially inaccurate information was developed with the help of generative AI tools, understanding whether such information was developed knowingly or with reckless disregard will be key to determining liability under the FCA. In these situations, scrutiny will likely turn to a contractor's generative AI use policies and practices. The focus will be on the amount and type of oversight the company requires when employees use generative AI tools.

To determine the appropriate level of oversight, contractors should focus on four primary considerations: what AI tools are permitted, who is permitted to use those tools, how those tools can be used, and how steps will be retraced.

What Are Permitted AI Tools?

There are already numerous generative AI tools available and many more are sure to come on the market soon. A company should vet such tools or have a procedure to vet them. This procedure could involve understanding the training data, assessing potential biases, and validating the accuracy of AI-generated content.
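As one illustration, a vetting procedure like this could feed a simple allowlist check. The following Python sketch is hypothetical: the tool name, vetting fields, and review window are assumptions for the example, not requirements drawn from any regulation or standard.

# Minimal sketch of an AI-tool allowlist check. The tool name, vetting
# fields, and 180-day review window below are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class VettedTool:
    name: str
    vetted_on: date              # when the tool last passed the vetting procedure
    training_data_reviewed: bool
    bias_assessment_done: bool
    accuracy_validated: bool

APPROVED_TOOLS = {
    "example-llm": VettedTool(
        name="example-llm",
        vetted_on=date(2023, 11, 1),
        training_data_reviewed=True,
        bias_assessment_done=True,
        accuracy_validated=True,
    ),
}

def is_tool_permitted(tool_name: str, review_window_days: int = 180) -> bool:
    """Return True only if the tool is on the allowlist, was vetted recently,
    and passed every check in the vetting procedure."""
    tool = APPROVED_TOOLS.get(tool_name)
    if tool is None:
        return False
    recently_vetted = (date.today() - tool.vetted_on).days <= review_window_days
    return (
        recently_vetted
        and tool.training_data_reviewed
        and tool.bias_assessment_done
        and tool.accuracy_validated
    )

A check like this makes the policy auditable: if a tool is not on the allowlist, or its vetting has gone stale, the answer is simply no until it is re-reviewed.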

Who Are Permitted AI Users?

A company should determine which employees are permitted to use AI tools in the performance of their job. Some job categories may be particularly ill-suited for AI use, such as positions that deal closely with confidential or classified information. A company may also determine it's too difficult to provide oversight if every employee starts using generative AI tools. A company should have procedures in place to grant approval to employees wanting to use generative AI tools.

What Are Permitted Uses?

Contractors may find it difficult to know which uses will be most valuable, as people are still learning what such tools can do. A contractor may find it easiest to simply require employees to identify each intended use and obtain permission for it beforehand.
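A per-use permission process of this kind might be tracked in a structure like the Python sketch below. The statuses, record fields, and approver role are invented for illustration; they are not a prescribed workflow.

# Minimal sketch of a per-use approval record. The statuses, fields,
# and approver role are hypothetical assumptions for the example.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Status(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    DENIED = "denied"

@dataclass
class UseRequest:
    employee: str
    tool: str
    intended_use: str            # e.g., "draft first-pass proposal text"
    status: Status = Status.PENDING
    approver: Optional[str] = None

def decide(request: UseRequest, approver: str, permitted: bool) -> UseRequest:
    """Record who approved or denied the request, so each use is
    authorized and documented before any AI output is produced."""
    request.status = Status.APPROVED if permitted else Status.DENIED
    request.approver = approver
    return request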

How Will Steps Be Retraced?

Consider a request to explain how a contractor team prepared a report submitted to an agency three years earlier. If the team used generative AI, it may need to explain which tool it used, the prompts it gave, and the outputs generated (before human-in-the-loop review) at the time. Capturing these data points in the moment may be important: even the most consistent and refined generative AI tool may yield markedly different outputs when prompted years apart, which could make it harder to show the team's work was reasonable and consistent with contract terms and expected standards of care.
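One way to capture these data points contemporaneously, sketched here as an assumption rather than a prescribed practice, is a simple append-only audit log. The record fields and JSON-lines format below are illustrative choices.

# Minimal sketch of contemporaneous logging of generative AI use.
# The record fields and storage format are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone

def log_ai_interaction(log_path: str, tool: str, tool_version: str,
                       prompt: str, raw_output: str, reviewer: str) -> None:
    """Append one audit record per AI interaction, captured at the time of use."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool": tool,
        "tool_version": tool_version,
        "prompt": prompt,
        "raw_output": raw_output,  # as generated, before human-in-the-loop review
        "output_sha256": hashlib.sha256(raw_output.encode("utf-8")).hexdigest(),
        "reviewer": reviewer,      # who performed the human review
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")  # one JSON record per line

Records like these let a contractor reconstruct, years later, exactly which tool and prompts produced a given output and who reviewed it, rather than relying on memory or re-running the tool.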

Addressing these considerations is an important and necessary step toward responsible AI use, but it is not the end. Contractors should ensure that the individuals making AI decisions understand generative AI.

Contractors should involve IT professionals to stay informed about advancements in AI technology and update their systems and procedures accordingly. At the same time, contractors shouldn't delegate these responsibilities to IT professionals who may not understand the underlying business.

Contractors should also collaborate with their legal department, which can provide valuable insights into potential pitfalls, compliance requirements, and evolving legal standards. Contractors who put in the effort could reap the full potential of AI while minimizing undue risk.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

