A Bill for an Act
Concerning measures effective no later than June 30, 2026, to increase transparency for algorithmic systems.
Bill Summary
(Note: This summary applies to this bill as introduced and does not reflect any amendments that may be subsequently adopted. If this bill passes third reading in the house of introduction, a bill summary that applies to the reengrossed version of this bill will be available at http://leg.colorado.gov.)
In 2024, the general assembly enacted Senate Bill 24-205, which created consumer protections in interactions with artificial intelligence systems (provisions). The bill eliminates these provisions and:
- Defines "algorithmic decision system" (system) to mean any machine-based system or computational process that uses statistical modeling, data analytics, artificial intelligence, or machine learning to generate a simplified output or is capable, for a given set of human-defined objectives, of making predictions or recommendations and is used to assist, inform, or replace human decision-making;
- Requires a developer of a system to, on and after February 1, 2026, provide certain disclosures to a deployer of the system;
- Requires a deployer of a system to, on and after February 1, 2026, provide certain disclosures to an individual who is or will be affected by a decision made, informed, or influenced by a system and provide the individual with a procedure to correct the accuracy of data that was used by the system;
- Provides that a developer and deployer of a system are jointly and severally liable for a violation of any law that results from the deployer's use of the developer's system;
- Requires a person that makes available a generative artificial intelligence system to disclose to an individual interacting with the generative artificial intelligence system that the individual is interacting with a generative artificial intelligence system;
- Clarifies that a violation of the bill's requirements is an unfair or deceptive trade practice under the "Colorado Consumer Protection Act"; and
- Permits the attorney general to adopt rules implementing the provisions of the bill.
Be it enacted by the General Assembly of the State of Colorado:
SECTION 1. In Colorado Revised Statutes, repeal and reenact, with amendments, part 17 of article 1 of title 6 as follows:
PART 17
ALGORITHMIC SYSTEMS
6-1-1701. Short title. The short title of this part 17 is the "Colorado Artificial Intelligence Sunshine Act".
6-1-1702. Definitions - rules. As used in this part 17, unless the context otherwise requires:
(1) (a) "Algorithmic decision system" means any machine-based system or computational process that uses statistical modeling, data analytics, artificial intelligence, or machine learning to generate a simplified output, including scores, classifications, or recommendations, or is capable, for a given set of human-defined objectives, of making predictions or recommendations and is used to assist, inform, or replace human decision-making.
(b) "Algorithmic decision system" does not include the following:
(I) Databases, spreadsheets, or other tools that merely organize data already in the possession of the human user of the system;
(II) Junk email filters;
(III) Firewalls;
(IV) Anti-virus software;
(V) Calculators;
(VI) Spell-checking;
(VII) Anti-malware;
(VIII) Artificial-intelligence-enabled video games;
(IX) Cybersecurity;
(X) Data storage;
(XI) Internet domain registration;
(XII) Internet website loading;
(XIII) Networking;
(XIV) Spam call and robocall filtering;
(XV) Web caching;
(XVI) Web hosting or similar technology; or
(XVII) Technologies that are solely used to order office supplies, schedule meetings, automate inventory tracking, or perform, assist, or administer similar ministerial administrative support functions.
(2) (a) "Biometric identifier" means data generated by the technological processing, measurement, or analysis of an individual's biological, physical, or behavioral characteristics, which data can be processed for the purpose of uniquely identifying the individual.
(b) "Biometric identifier" includes:
(I) A fingerprint;
(II) A voiceprint;
(III) A scan or record of an eye retina or iris;
(IV) A facial map, facial geometry, or facial template; or
(V) Other unique biological, physical, or behavioral patterns or characteristics.
(3) "Deploy" means to use an algorithmic decision system.
(4) "Deployer" means a person doing business in this state that deploys an algorithmic decision system.
(5) "Develop" means to design, build, or train an algorithmic decision system or to knowingly and materially modify, adapt, or combine an existing machine-based system or computational process for use as an algorithmic decision system.
(6) "Developer" means a person or the person's agent doing business in this state that:
(a) Develops an algorithmic decision system; or
(b) Sells, leases, distributes, or otherwise makes available an algorithmic decision system to a deployer.
(7) "Personal characteristics" include:
(a) Personal data, as defined in section 6-1-1303 (17);
(b) Sensitive data, as defined in section 6-1-1303 (24);
(c) Genetic information, as defined in section 10-3-1104.6 (2)(c);
(d) A biometric identifier;
(e) An individual's economic situation, health, personal preferences, affiliations, interests, reliability, behavior, location, or movements; and
(f) Inferences associated with a group, band, class, or tier of individuals to which the individual belongs.
(8) "Plain language" means communication that is:
(a) Clear, concise, and easy to understand for the intended audience, including people with disabilities, people with limited education, and English language learners; and
(b) Available in English, Spanish, and any other relevant languages required by the attorney general by rule.
6-1-1703. Disclosure requirements for developers of algorithmic decision systems. (1) On and after February 1, 2026, a developer shall, consistent with any form and manner prescribed by the attorney general, provide to each deployer of the developer's algorithmic decision system:
(a) An analysis of whether and how any intended uses, or reasonably foreseeable uses or misuses, of the algorithmic decision system pose a known or reasonably foreseeable risk of violating this article 1 or parts 3 to 8 of article 34 of title 24;
(b) A description of any steps taken by the developer to mitigate any identified risks of violations of this article 1 or parts 3 to 8 of article 34 of title 24;
(c) A statement describing the intended uses and reasonably foreseeable misuses of the algorithmic decision system; and
(d) All other information necessary to allow the deployer to comply with the deployer's obligations under this part 17.
6-1-1704. Disclosure requirements for deployers of algorithmic decision systems. (1) On and after February 1, 2026, a deployer shall, either directly or through a developer or other third party, provide the disclosures required by subsection (2) of this section directly to an individual who is or will be affected by a decision made, informed, or influenced by an algorithmic decision system, which decision has a material legal or similarly significant effect on the provision or denial to the individual of, or the cost or terms of:
(a) Education enrollment or an education opportunity;
(b) Employment or an employment opportunity;
(c) A financial or lending service;
(d) An essential government service;
(e) A health-care service;
(f) Housing;
(g) Insurance; or
(h) A legal service.
(2) (a) Before a deployer deploys an algorithmic decision system to make, inform, or influence a decision affecting an individual as described in subsection (1) of this section, the deployer shall provide the individual with a notice, in plain language and consistent with any form and manner prescribed by the attorney general, that the deployer will be using an algorithmic decision system to make, inform, or influence a decision concerning the individual, which notice must include:
(I) The name of the developer or developers of the algorithmic decision system;
(II) The trade name and version number of the algorithmic decision system;
(III) The nature of the decision and the stage in the decision-making process during which the algorithmic decision system will be used; and
(IV) The contact information for the deployer.
(b) As soon as practicable, and no later than thirty days after the deployment of an algorithmic decision system to make, inform, or influence a decision as described in subsection (1) of this section, a deployer shall provide an affected individual, in plain language and consistent with any form and manner prescribed by the attorney general, with:
(I) A list of the types, categories, and sources of personal characteristics associated with the individual that were analyzed, predicted, input into, inferred, or collected by the algorithmic decision system;
(II) A list of the twenty personal characteristics of the individual that most substantially influenced the output of the algorithmic decision system or, if the algorithmic decision system's output was influenced by fewer than twenty personal characteristics, a list of all personal characteristics that influenced the output; and
(III) Information on how the individual can exercise their rights pursuant to section 6-1-1705.
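[Illustrative only; not part of the bill.] The following is a minimal sketch, in Python, of how a deployer might assemble the pre-decision notice described in section 6-1-1704 (2)(a) and the list of the twenty most influential personal characteristics described in section 6-1-1704 (2)(b)(II). The record layout, field names, and the use of per-characteristic attribution scores are assumptions; the bill prescribes the content of the disclosures, not their format or any method for measuring influence.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical pre-decision notice per 6-1-1704 (2)(a); field names are illustrative.
@dataclass
class PreDecisionNotice:
    developer_names: List[str]   # (I) name of the developer or developers
    trade_name: str              # (II) trade name of the system
    version_number: str          # (II) version number of the system
    decision_nature: str         # (III) nature of the decision
    decision_stage: str          # (III) stage at which the system is used
    deployer_contact: str        # (IV) contact information for the deployer

def most_influential_characteristics(
    attributions: Dict[str, float], limit: int = 20
) -> List[str]:
    """Return up to `limit` personal characteristics ranked by the magnitude
    of their influence on the system's output, per 6-1-1704 (2)(b)(II).
    How `attributions` is computed (e.g., a feature-attribution method) is
    left to the deployer; the bill does not prescribe a method."""
    ranked = sorted(attributions, key=lambda k: abs(attributions[k]), reverse=True)
    return ranked[:limit]

# Example usage with made-up attribution scores:
scores = {"credit_utilization": 0.42, "income": 0.31, "zip_code_inference": -0.12}
print(most_influential_characteristics(scores))  # fewer than 20 -> list all of them
```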
6-1-1705. Individual right to access and correct data used by an algorithmic decision system - procedures. (1) An individual affected by a decision made, informed, or influenced by an algorithmic decision system, as described in section 6-1-1704 (1), has a right to:
(a) Access any personal characteristics of the individual that were analyzed by, predicted by, input into, inferred by, or collected by an algorithmic decision system; and
(b) Challenge and correct any inaccurate data.
(2) A deployer or developer that has access to an individual's data shall create reasonable, accessible, and concise procedures in plain language to allow the individual to exercise the individual's rights pursuant to subsection (1) of this section.
6-1-1706. Disclosure requirements - generative artificial intelligence systems - definition. (1) Pursuant to any requirements established by the attorney general, a person that deploys, offers, sells, leases, licenses, gives, or otherwise makes available a generative artificial intelligence system that is intended to interact with an individual shall disclose to each individual who interacts with the generative artificial intelligence system the fact that the individual is interacting with a generative artificial intelligence system.
(2) As used in this section, "generative artificial intelligence system" means an artificial intelligence system that:
(a) Is trained on data;
(b) Interacts with an individual using text, audio, or visual communication; and
(c) Generates unscripted outputs similar to outputs created by a human, with limited or no human oversight.
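[Illustrative only; not part of the bill.] The following is a minimal sketch of the disclosure contemplated by section 6-1-1706 (1): a person making a generative artificial intelligence system available surfaces, before the interaction, a statement that the individual is interacting with such a system. The wrapper class, the `generate` callback, and the disclosure wording are assumptions; the attorney general may establish different requirements.

```python
from typing import Callable, List

DISCLOSURE = (
    "Notice: you are interacting with a generative artificial intelligence system."
)

class DisclosedChatSession:
    """Hypothetical wrapper that shows the 6-1-1706 disclosure to the
    individual before any generated output is returned."""

    def __init__(self, generate: Callable[[str], str]):
        self.generate = generate                   # underlying generative call (assumed)
        self.transcript: List[str] = [DISCLOSURE]  # disclosure precedes the interaction

    def send(self, user_message: str) -> str:
        reply = self.generate(user_message)
        self.transcript.extend([f"user: {user_message}", f"system: {reply}"])
        return reply

# Example usage with a stand-in generator:
session = DisclosedChatSession(generate=lambda prompt: f"(generated reply to: {prompt})")
print(session.transcript[0])   # the disclosure shown to the individual
print(session.send("Hello"))
```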
6-1-1707. Joint and several liability for a developer and deployer of an algorithmic decision system. (1) Notwithstanding the requirements regarding liability in section 13-21-111.5, on and after the effective date of this part 17, as amended, the developer and deployer of an algorithmic decision system are jointly and severally liable for a violation of law facilitated by the deployer's use of the algorithmic decision system.
(2) Notwithstanding subsection (1) of this section, a developer is not jointly and severally liable if the developer can demonstrate that the violation of law resulted from a misuse of the algorithmic decision system by the deployer, the developer took all reasonable steps available, contractual or otherwise, to prevent the misuse, and the developer:
(a) Did not intend and could not have reasonably foreseen the misuse; or
(b) Specifically disallowed the misuse in disclosures pursuant to section 6-1-1703 (1).
(3) Nothing in this section limits, displaces, or otherwise affects any liability that a developer may have in the developer's own right, separate and apart from liability under this section, for a violation of state or federal law. Compliance with the requirements of this part 17 is not a defense to, and does not otherwise excuse, noncompliance with any applicable law.
6-1-1708. Enforcement. (1) A violation of this part 17 constitutes an unfair or deceptive trade practice pursuant to section 6-1-105 (1)(hhhh).
(2) This part 17 does not provide the basis for a private right of action.
(3) Nothing in this part 17 preempts or otherwise affects any other right, claim, remedy, presumption, or defense available at law or in equity, including any right available pursuant to laws governing anti-discrimination, competition, privacy, or unfair and deceptive acts and practices.
6-1-1709. Rules. The attorney general may adopt rules as necessary to implement and enforce this part 17.
SECTION 2. Safety clause. The general assembly finds, determines, and declares that this act is necessary for the immediate preservation of the public peace, health, or safety or for appropriations for the support and maintenance of the departments of the state and state institutions.