A Bill for an Act
Concerning implementing consumer protections in interactions with artificial intelligence systems before October 1, 2026.
Bill Summary
(Note: This summary applies to this bill as introduced and does not reflect any amendments that may be subsequently adopted. If this bill passes third reading in the house of introduction, a bill summary that applies to the reengrossed version of this bill will be available at http://leg.colorado.gov.)
The bill establishes that the use of artificial intelligence systems or required disclosure artificial intelligence systems (artificial intelligence systems) must comply with the "Colorado Consumer Protection Act". The attorney general may bring a claim against a developer or a deployer that uses an artificial intelligence system in a way that violates the "Colorado Consumer Protection Act". A developer or a deployer of an artificial intelligence system must disclose to a consumer when the consumer is interacting with the artificial intelligence system and not with a human in certain circumstances. The bill establishes certain requirements for claims brought by the attorney general and parameters for court orders resulting from those claims. The attorney general may adopt rules for the implementation and enforcement of this provision of the bill.
A developer of an artificial intelligence system is also subject to the provisions of the "Colorado Anti-discrimination Act" if the artificial intelligence system is deployed in a way that violates the "Colorado Anti-discrimination Act". An individual may file a complaint with the Colorado civil rights division against the developer if the developer's artificial intelligence system discriminates against the individual in certain circumstances.
The bill requires that contracts entered into by a Colorado public school, a state agency, or other public entity comply with the provisions of the "Colorado Consumer Protection Act" or the "Colorado Anti-discrimination Act" in relation to the use and deployment of artificial intelligence systems and that a contractor agrees to indemnify and hold harmless a state agency or public entity.
Be it enacted by the General Assembly of the State of Colorado:

SECTION 1. In Colorado Revised Statutes, 6-1-1702, amend (1), (2) introductory portion, (3)(a), (4)(a) introductory portion, (5) introductory portion, and (7) as follows:
6-1-1702. Developer duty to avoid algorithmic discrimination - required documentation. (1) On and after February 1, 2026 October 1, 2026, a developer of a high-risk artificial intelligence system shall use reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination arising from the intended and contracted uses of the high-risk artificial intelligence system. In any enforcement action brought on or after February 1, 2026 October 1, 2026, by the attorney general pursuant to section 6-1-1706, there is a rebuttable presumption that a developer used reasonable care as required under this section if the developer complied with this section and any additional requirements or obligations as set forth in rules promulgated adopted by the attorney general pursuant to section 6-1-1707.

(2) On and after February 1, 2026 October 1, 2026, and except as provided in subsection (6) of this section, a developer of a high-risk artificial intelligence system shall make available to the deployer or other developer of the high-risk artificial intelligence system:
(3) (a) Except as provided in subsection (6) of this section, a developer that offers, sells, leases, licenses, gives, or otherwise makes available to a deployer or other developer a high-risk artificial intelligence system on or after February 1, 2026 October 1, 2026, shall make available to the deployer or other developer, to the extent feasible, the documentation and information, through artifacts such as model cards, dataset cards, or other impact assessments, necessary for a deployer, or for a third party contracted by a deployer, to complete an impact assessment pursuant to section 6-1-1703 (3).
(4) (a) On and after February 1, 2026 October 1, 2026, a developer shall make available, in a manner that is clear and readily available on the developer's website or in a public use case inventory, a statement summarizing:
(5) On and after February 1, 2026 October 1, 2026, a developer of a high-risk artificial intelligence system shall disclose to the attorney general, in a form and manner prescribed by the attorney general, and to all known deployers or other developers of the high-risk artificial intelligence system, any known or reasonably foreseeable risks of algorithmic discrimination arising from the intended uses of the high-risk artificial intelligence system without unreasonable delay but no later than ninety days after the date on which:
(7) On and after February 1, 2026 October 1, 2026, the attorney general may require that a developer disclose to the attorney general, no later than ninety days after the request and in a form and manner prescribed by the attorney general, the statement or documentation described in subsection (2) of this section. The attorney general may evaluate such statement or documentation to ensure compliance with this part 17, and the statement or documentation is not subject to disclosure under the "Colorado Open Records Act", part 2 of article 72 of title 24. In a disclosure pursuant to this subsection (7), a developer may designate the statement or documentation as including proprietary information or a trade secret. To the extent that any information contained in the statement or documentation includes information subject to attorney-client privilege or work-product protection, the disclosure does not constitute a waiver of the privilege or protection.
SECTION 2. In Colorado Revised Statutes, 6-1-1703, amend (1), (2)(a) introductory portion, (3)(a), (3)(c), (3)(g), (4)(a) introductory portion, (4)(b) introductory portion, (5)(a) introductory portion, (7), and (9) as follows:
6-1-1703. Deployer duty to avoid algorithmic discrimination - risk management policy and program. (1) On and after February 1, 2026 October 1, 2026, a deployer of a high-risk artificial intelligence system shall use reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination. In any enforcement action brought on or after February 1, 2026 October 1, 2026, by the attorney general pursuant to section 6-1-1706, there is a rebuttable presumption that a deployer of a high-risk artificial intelligence system used reasonable care as required under this section if the deployer complied with this section and any additional requirements or obligations as set forth in rules promulgated adopted by the attorney general pursuant to section 6-1-1707.

(2) (a) On and after February 1, 2026 October 1, 2026, and except as provided in subsection (6) of this section, a deployer of a high-risk artificial intelligence system shall implement a risk management policy and program to govern the deployer's deployment of the high-risk artificial intelligence system. The risk management policy and program must specify and incorporate the principles, processes, and personnel that the deployer uses to identify, document, and mitigate known or reasonably foreseeable risks of algorithmic discrimination. The risk management policy and program must be an iterative process planned, implemented, and regularly and systematically reviewed and updated over the life cycle of a high-risk artificial intelligence system, requiring regular, systematic review and updates. A risk management policy and program implemented and maintained pursuant to this subsection (2) must be reasonable considering:
(3) (a) Except as provided in subsections (3)(d), (3)(e), and (6) of this section:

(I) A deployer, or a third party contracted by the deployer, that deploys a high-risk artificial intelligence system on or after February 1, 2026 October 1, 2026, shall complete an impact assessment for the high-risk artificial intelligence system; and

(II) On and after February 1, 2026 October 1, 2026, a deployer, or a third party contracted by the deployer, shall complete an impact assessment for a deployed high-risk artificial intelligence system at least annually and within ninety days after any intentional and substantial modification to the high-risk artificial intelligence system is made available.
(c) In addition to the information required under subsection (3)(b) of this section, an impact assessment completed pursuant to this subsection (3) following an intentional and substantial modification to a high-risk artificial intelligence system on or after February 1, 2026 October 1, 2026, must include a statement disclosing the extent to which the high-risk artificial intelligence system was used in a manner that was consistent with, or varied from, the developer's intended uses of the high-risk artificial intelligence system.
(g) On or before February 1, 2026 October 1, 2026, and at least annually thereafter, a deployer, or a third party contracted by the deployer, must review the deployment of each high-risk artificial intelligence system deployed by the deployer to ensure that the high-risk artificial intelligence system is not causing algorithmic discrimination.
(4) (a) On and after February 1, 2026 October 1, 2026, and no later than the time that a deployer deploys a high-risk artificial intelligence system to make, or be a substantial factor in making, a consequential decision concerning a consumer, the deployer shall:
(b) On and after February 1, 2026 October 1, 2026, a deployer that has deployed a high-risk artificial intelligence system to make, or be a substantial factor in making, a consequential decision concerning a consumer shall, if the consequential decision is adverse to the consumer, provide to the consumer:
(5) (a) On and after February 1, 2026 October 1, 2026, and except as provided in subsection (6) of this section, a deployer shall make available, in a manner that is clear and readily available on the deployer's website, a statement summarizing:
(7) If a deployer deploys a high-risk artificial intelligence system on or after February 1, 2026 October 1, 2026, and subsequently discovers that the high-risk artificial intelligence system has caused algorithmic discrimination, the deployer, without unreasonable delay, but no later than ninety days after the date of the discovery, shall send to the attorney general, in a form and manner prescribed by the attorney general, a notice disclosing the discovery.
(9) On and after February 1, 2026 October 1, 2026, the attorney general may require that a deployer, or a third party contracted by the deployer, disclose to the attorney general, no later than ninety days after the request and in a form and manner prescribed by the attorney general, the risk management policy implemented pursuant to subsection (2) of this section, the impact assessment completed pursuant to subsection (3) of this section, or the records maintained pursuant to subsection (3)(f) of this section. The attorney general may evaluate the risk management policy, impact assessment, or records to ensure compliance with this part 17, and the risk management policy, impact assessment, and records are not subject to disclosure under the "Colorado Open Records Act", part 2 of article 72 of title 24. In a disclosure pursuant to this subsection (9), a deployer may designate the statement or documentation as including proprietary information or a trade secret. To the extent that any information contained in the risk management policy, impact assessment, or records includes information subject to attorney-client privilege or work-product protection, the disclosure does not constitute a waiver of the privilege or protection.
SECTION 3. In Colorado Revised Statutes, 6-1-1704, amend (1) as follows:

6-1-1704. Disclosure of an artificial intelligence system to consumer. (1) On and after February 1, 2026 October 1, 2026, and except as provided in subsection (2) of this section, a deployer or other developer that deploys, offers, sells, leases, licenses, gives, or otherwise makes available an artificial intelligence system that is intended to interact with consumers shall ensure the disclosure to each consumer who interacts with the artificial intelligence system that the consumer is interacting with an artificial intelligence system.
SECTION 4. Act subject to petition - effective date. This act takes effect at 12:01 a.m. on the day following the expiration of the ninety-day period after final adjournment of the general assembly; except that, if a referendum petition is filed pursuant to section 1 (3) of article V of the state constitution against this act or an item, section, or part of this act within such period, then the act, item, section, or part will not take effect unless approved by the people at the general election to be held in November 2026 and, in such case, will take effect on the date of the official declaration of the vote thereon by the governor.