A Bill for an Act
Concerning measures effective no later than June 30, 2026, to increase transparency for algorithmic systems.
Bill Summary
(Note: This summary applies to this bill as introduced and does not reflect any amendments that may be subsequently adopted. If this bill passes third reading in the house of introduction, a bill summary that applies to the reengrossed version of this bill will be available at http://leg.colorado.gov.)
In 2024, the general assembly enacted Senate Bill 24-205, which created consumer protections in interactions with artificial intelligence systems (provisions). The bill eliminates these provisions and:
- Defines "algorithmic decision system" (system) to mean any machine-based system or computational process that uses statistical modeling, data analytics, artificial intelligence, or machine learning to generate a simplified output or is capable, for a given set of human-defined objectives, of making predictions or recommendations and is used to assist, inform, or replace human decision-making;
- Requires a developer of a system to, on and after February 1, 2026, provide certain disclosures to a deployer of the system;
- Requires a deployer of a system to, on and after February 1, 2026, provide certain disclosures to an individual who is or will be affected by a decision made, informed, or influenced by a system and provide the individual with a procedure to correct inaccurate data that was used by the system;
- Provides that a developer and deployer of a system are jointly and severally liable for a violation of any law that results from the deployer's use of the developer's system;
- Requires a person that makes available a generative artificial intelligence system to disclose to an individual interacting with the generative artificial intelligence system that the individual is interacting with a generative artificial intelligence system;
- Clarifies that a violation of the bill's requirements is an unfair or deceptive trade practice under the "Colorado Consumer Protection Act"; and
- Permits the attorney general to adopt rules implementing the provisions of the bill.
Be it enacted by the General Assembly of the State of Colorado:
SECTION 1. In Colorado Revised Statutes, 6-1-1702, amend (1), (2) introductory portion, (3)(a), (4)(a) introductory portion, (5) introductory portion, and (7) as follows:

6-1-1702. Developer duty to avoid algorithmic discrimination - required documentation. (1) On and after February 1, 2026 JUNE 30, 2026, a developer of a high-risk artificial intelligence system shall use reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination arising from the intended and contracted uses of the high-risk artificial intelligence system. In any enforcement action brought on or after February 1, 2026 JUNE 30, 2026, by the attorney general pursuant to section 6-1-1706, there is a rebuttable presumption that a developer used reasonable care as required under this section if the developer complied with this section and any additional requirements or obligations as set forth in rules promulgated ADOPTED by the attorney general pursuant to section 6-1-1707.

(2) On and after February 1, 2026 JUNE 30, 2026, and except as provided in subsection (6) of this section, a developer of a high-risk artificial intelligence system shall make available to the deployer or other developer of the high-risk artificial intelligence system:

(3) (a) Except as provided in subsection (6) of this section, a developer that offers, sells, leases, licenses, gives, or otherwise makes available to a deployer or other developer a high-risk artificial intelligence system on or after February 1, 2026 JUNE 30, 2026, shall make available to the deployer or other developer, to the extent feasible, the documentation and information, through artifacts such as model cards, dataset cards, or other impact assessments, necessary for a deployer, or for a third party contracted by a deployer, to complete an impact assessment pursuant to section 6-1-1703 (3).

(4) (a) On and after February 1, 2026 JUNE 30, 2026, a developer shall make available, in a manner that is clear and readily available on the developer's website or in a public use case inventory, a statement summarizing:

(5) On and after February 1, 2026 JUNE 30, 2026, a developer of a high-risk artificial intelligence system shall disclose to the attorney general, in a form and manner prescribed by the attorney general, and to all known deployers or other developers of the high-risk artificial intelligence system, any known or reasonably foreseeable risks of algorithmic discrimination arising from the intended uses of the high-risk artificial intelligence system without unreasonable delay but no later than ninety days after the date on which:

(7) On and after February 1, 2026 JUNE 30, 2026, the attorney general may require that a developer disclose to the attorney general, no later than ninety days after the request and in a form and manner prescribed by the attorney general, the statement or documentation described in subsection (2) of this section. The attorney general may evaluate such statement or documentation to ensure compliance with this part 17, and the statement or documentation is not subject to disclosure under the "Colorado Open Records Act", part 2 of article 72 of title 24. In a disclosure made pursuant to this subsection (7), a developer may designate the statement or documentation as including proprietary information or a trade secret. To the extent that any information contained in the statement or documentation includes information subject to attorney-client privilege or work-product protection, the disclosure does not constitute a waiver of the privilege or protection.
SECTION 2. In Colorado Revised Statutes, 6-1-1703, amend (1), (2)(a) introductory portion, (3)(a), (3)(c), (3)(g), (4)(a) introductory portion, (4)(b) introductory portion, (5)(a) introductory portion, (7), and (9) as follows:

6-1-1703. Deployer duty to avoid algorithmic discrimination - risk management policy and program. (1) On and after February 1, 2026 JUNE 30, 2026, a deployer of a high-risk artificial intelligence system shall use reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination. In any enforcement action brought on or after February 1, 2026 JUNE 30, 2026, by the attorney general pursuant to section 6-1-1706, there is a rebuttable presumption that a deployer of a high-risk artificial intelligence system used reasonable care as required under this section if the deployer complied with this section and any additional requirements or obligations as set forth in rules promulgated ADOPTED by the attorney general pursuant to section 6-1-1707.

(2) (a) On and after February 1, 2026 JUNE 30, 2026, and except as provided in subsection (6) of this section, a deployer of a high-risk artificial intelligence system shall implement a risk management policy and program to govern the deployer's deployment of the high-risk artificial intelligence system. The risk management policy and program must specify and incorporate the principles, processes, and personnel that the deployer uses to identify, document, and mitigate known or reasonably foreseeable risks of algorithmic discrimination. The risk management policy and program must be an iterative process planned, implemented, and regularly and systematically reviewed and updated over the life cycle of a high-risk artificial intelligence system, requiring regular, systematic review and updates. A risk management policy and program implemented and maintained pursuant to this subsection (2) must be reasonable considering:

(3) (a) Except as provided in subsections (3)(d), (3)(e), and (6) of this section:

(I) A deployer, or a third party contracted by the deployer, that deploys a high-risk artificial intelligence system on or after February 1, 2026 JUNE 30, 2026, shall complete an impact assessment for the high-risk artificial intelligence system; and

(II) On and after February 1, 2026 JUNE 30, 2026, a deployer, or a third party contracted by the deployer, shall complete an impact assessment for a deployed high-risk artificial intelligence system at least annually and within ninety days after any intentional and substantial modification to the high-risk artificial intelligence system is made available.

(c) In addition to the information required under subsection (3)(b) of this section, an impact assessment completed pursuant to this subsection (3) following an intentional and substantial modification to a high-risk artificial intelligence system on or after February 1, 2026 JUNE 30, 2026, must include a statement disclosing the extent to which the high-risk artificial intelligence system was used in a manner that was consistent with, or varied from, the developer's intended uses of the high-risk artificial intelligence system.

(g) On or before February 1, 2026 JUNE 30, 2026, and at least annually thereafter, a deployer, or a third party contracted by the deployer, must review the deployment of each high-risk artificial intelligence system deployed by the deployer to ensure that the high-risk artificial intelligence system is not causing algorithmic discrimination.

(4) (a) On and after February 1, 2026 JUNE 30, 2026, and no later than the time that a deployer deploys a high-risk artificial intelligence system to make, or be a substantial factor in making, a consequential decision concerning a consumer, the deployer shall:

(b) On and after February 1, 2026 JUNE 30, 2026, a deployer that has deployed a high-risk artificial intelligence system to make, or be a substantial factor in making, a consequential decision concerning a consumer shall, if the consequential decision is adverse to the consumer, provide to the consumer:

(5) (a) On and after February 1, 2026 JUNE 30, 2026, and except as provided in subsection (6) of this section, a deployer shall make available, in a manner that is clear and readily available on the deployer's website, a statement summarizing:

(7) If a deployer deploys a high-risk artificial intelligence system on or after February 1, 2026 JUNE 30, 2026, and subsequently discovers that the high-risk artificial intelligence system has caused algorithmic discrimination, the deployer, without unreasonable delay, but no later than ninety days after the date of the discovery, shall send to the attorney general, in a form and manner prescribed by the attorney general, a notice disclosing the discovery.

(9) On and after February 1, 2026 JUNE 30, 2026, the attorney general may require that a deployer, or a third party contracted by the deployer, disclose to the attorney general, no later than ninety days after the request and in a form and manner prescribed by the attorney general, the risk management policy implemented pursuant to subsection (2) of this section, the impact assessment completed pursuant to subsection (3) of this section, or the records maintained pursuant to subsection (3)(f) of this section. The attorney general may evaluate the risk management policy, impact assessment, or records to ensure compliance with this part 17, and the risk management policy, impact assessment, and records are not subject to disclosure under the "Colorado Open Records Act", part 2 of article 72 of title 24. In a disclosure made pursuant to this subsection (9), a deployer may designate the statement or documentation as including proprietary information or a trade secret. To the extent that any information contained in the risk management policy, impact assessment, or records includes information subject to attorney-client privilege or work-product protection, the disclosure does not constitute a waiver of the privilege or protection.
SECTION 3. In Colorado Revised Statutes, 6-1-1704, amend (1) as follows:

6-1-1704. Disclosure of an artificial intelligence system to consumer. (1) On and after February 1, 2026 JUNE 30, 2026, and except as provided in subsection (2) of this section, a deployer or other developer that deploys, offers, sells, leases, licenses, gives, or otherwise makes available an artificial intelligence system that is intended to interact with consumers shall ensure the disclosure to each consumer who interacts with the artificial intelligence system that the consumer is interacting with an artificial intelligence system.
SECTION 4. Act subject to petition - effective date. This act takes effect at 12:01 a.m. on the day following the expiration of the ninety-day period after final adjournment of the general assembly; except that, if a referendum petition is filed pursuant to section 1 (3) of article V of the state constitution against this act or an item, section, or part of this act within such period, then the act, item, section, or part will not take effect unless approved by the people at the general election to be held in November 2026 and, in such case, will take effect on the date of the official declaration of the vote thereon by the governor.