Decoding Colorado's Artificial Intelligence Act

Frost Brown Todd

Businesses developing or using artificial intelligence (AI) systems in Colorado may be required to assess and disclose the risks of Algorithmic Discrimination (as defined) under Colorado's newly passed AI Act (the “Act”). The Act creates a rebuttable presumption that reasonable care was used if businesses are in compliance with its specific requirements and any additional requirements promulgated by the Colorado attorney general. These requirements include comprehensive risk management and disclosure obligations that, in some cases, go beyond Algorithmic Discrimination protections. In addition, the Act imposes a transparency obligation to disclose the use of AI to Colorado residents. Violations of the Act constitute deceptive trade practices and are enforced exclusively by the Colorado attorney general. Colorado Governor Jared Polis signed the Act into law on May 17, 2024, and businesses are required to comply with the Act by February 1, 2026.

What Are High-Risk AI Systems, and Does My AI System Fall Under This Regulation?

The Act focuses its regulatory scope on high-risk AI systems (“High-Risk AI Systems”) that algorithmically discriminate against consumers. Algorithmic Discrimination occurs when an AI system unlawfully disfavors or differentially affects individuals or groups of individuals on the basis of a protected class. High-Risk AI Systems are those that make, or are a substantial factor in making, consequential decisions.

Consequential decisions are those that have a material legal or similarly significant effect on the provision or denial of, or the cost or terms of, consumer opportunities in education, employment, housing, and insurance, as well as financial or lending services, essential government services, health care services, and legal services. For example, an AI system that decides whom a company will hire or that makes a bank's lending decision for an individual customer may be considered a High-Risk AI System.

Additionally, the Act excludes a list of technologies from the definition of a High-Risk AI System, such as web caching, AI-enabled video games, cybersecurity, databases, spam- and robocall-filtering, and web hosting.

Is My Business Required to Comply with the AI Act?

The Act creates two categories of requirements depending on whether the business is a Developer or Deployer.

Developers

A Developer is any person doing business in Colorado that develops or intentionally and substantially modifies an AI system. Among other requirements, Developers are required to disclose reasonably foreseeable uses and known harmful or inappropriate risks of the High-Risk AI System to Deployers. In addition, Developers are required to assist the Deployer by providing necessary information for Deployers' compliance obligations, such as high-level summaries of the types of training data, known or reasonably foreseeable risks of Algorithmic Discrimination, and the purpose of the AI System.

Deployers

A Deployer is any person doing business in Colorado that deploys a High-Risk AI System. Deployers are required, among other things, to maintain a risk-management policy and program that identifies, documents, and mitigates known or reasonably foreseeable risks of Algorithmic Discrimination. Requirements for this policy and program include describing the principles, processes, and personnel that the Deployer uses to identify, document, and mitigate known or reasonably foreseeable risks of Algorithmic Discrimination. Deployers also are required to conduct an impact assessment in accordance with specific requirements set forth in the Act.

The Act creates an exception for small businesses. If a Deployer has fewer than 50 full-time employees and does not use its own data to train the High-Risk AI System, the Act exempts that Deployer from its most onerous requirements. Regardless of this exception, Deployers are required to notify the Colorado attorney general within 90 days if they discover Algorithmic Discrimination in their High-Risk AI Systems.
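
For internal planning, the two thresholds in this exception lend themselves to a simple check. The Python sketch below is purely illustrative: the class, function, and field names are our own, and the 50-employee and 90-day figures simply restate the summary above; it is not a substitute for legal analysis of whether the exemption applies.

from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative sketch only. The thresholds restate the small-business exception
# and the 90-day attorney general notification window summarized above; all
# names here are hypothetical.

@dataclass
class DeployerProfile:
    full_time_employees: int
    trains_system_with_own_data: bool

def qualifies_for_small_business_exemption(profile: DeployerProfile) -> bool:
    """Rough check of the exemption criteria described above."""
    return (
        profile.full_time_employees < 50
        and not profile.trains_system_with_own_data
    )

def attorney_general_notice_deadline(discovery_date: date) -> date:
    """Discovering Algorithmic Discrimination starts a 90-day notice window."""
    return discovery_date + timedelta(days=90)

if __name__ == "__main__":
    profile = DeployerProfile(full_time_employees=35, trains_system_with_own_data=False)
    print(qualifies_for_small_business_exemption(profile))     # True
    print(attorney_general_notice_deadline(date(2026, 3, 1)))  # 2026-05-30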

Deployers and Developers

The Act requires both Deployers and Developers to provide a notice to consumers if the AI system is intended to interact with the consumer, unless it is obvious that the interaction is with an AI system.

Deployers and Developers are also required to disclose how High-Risk AI Systems will impact the consumer and how consumers may challenge the use of AI. This includes providing consumers with disclosure statements covering the nature of the consequential decision, a plain-language description of the High-Risk AI System, its purpose and decision-making role, instructions on how to access the detailed risk-mitigation documentation, consumers' access rights, the right to challenge the consequential decision, and the Deployer's or Developer's contact information.
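
As a practical illustration, a Deployer or Developer might capture these disclosure elements in a structured record that is then rendered into the consumer-facing notice. The Python sketch below is hypothetical: the field names and example values are ours, not terms defined in the Act, and the content of any actual notice should be settled with counsel.

from dataclasses import dataclass, asdict
import json

# Hypothetical structure mirroring the disclosure elements listed above;
# field names and example values are illustrative, not statutory terms.

@dataclass
class ConsumerDisclosure:
    consequential_decision: str       # nature of the consequential decision
    system_description: str           # plain-language description of the system
    purpose_and_role: str             # the system's purpose and decision-making role
    documentation_instructions: str   # how to access the risk-mitigation documentation
    access_rights: str                # the consumer's access rights
    challenge_rights: str             # the right to challenge the consequential decision
    contact_information: str          # the Deployer's or Developer's contact information

disclosure = ConsumerDisclosure(
    consequential_decision="Automated screening of rental housing applications",
    system_description="A scoring model that weighs application data to rank applicants",
    purpose_and_role="Produces a score used as a substantial factor in approval decisions",
    documentation_instructions="Request the impact assessment summary from the address below",
    access_rights="You may request the personal data used in the decision",
    challenge_rights="You may appeal the decision and request human review",
    contact_information="ai-compliance@example.com",
)

# Render the record as JSON for review or publication in a notice.
print(json.dumps(asdict(disclosure), indent=2))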

Steps Businesses Can Take Now to Prepare for Upcoming Compliance

Businesses can begin to prepare for compliance by taking the following steps:

  • Determine if the business meets the definition of a Developer or Deployer under the Act.
  • Identify any AI systems that may be High-Risk AI Systems subject to the Act.
  • Watch for any forthcoming rules from the Colorado attorney general.
  • Work with the business's legal advisor to create a compliance checklist from the Act's requirements that apply to the business (a simple tracking sketch follows this list).
  • Begin to implement the items on the checklist.
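
One lightweight way to keep these preparation steps visible is a simple tracker. The Python sketch below is a hypothetical illustration of that idea; the wording of each item paraphrases the list above and is not a checklist mandated by the Act.

# Hypothetical preparation tracker; items paraphrase the steps listed above.
checklist = {
    "Classify the business's role (Developer, Deployer, or both)": False,
    "Inventory AI systems and flag potential High-Risk AI Systems": False,
    "Monitor rulemaking from the Colorado attorney general": False,
    "Draft a compliance checklist with legal counsel": False,
    "Implement the checklist items": False,
}

def outstanding_items(items: dict[str, bool]) -> list[str]:
    """Return the preparation steps not yet marked complete."""
    return [task for task, done in items.items() if not done]

for task in outstanding_items(checklist):
    print(f"TODO: {task}")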

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
