Senate Committee of Reference Report

Committee on Judiciary

Strikethrough: removed from existing law.
Screen reader only: all text indicated as strikethrough will begin with 'deleted from existing statute' and finish with 'end deletion'.
All-caps or bold and italic: added to existing law.
Screen reader only: all text indicated as all-caps or bold and italic will begin with 'added to existing law' and finish with 'end insertion'.

April 21, 2025

After consideration on the merits, the Committee recommends the following:

SB25-288     be amended as follows, and as so amended, be referred to the Committee of the Whole with favorable recommendation:

Amend printed bill, page 3, after line 11 insert:

"(1)  "Broadcaster" means an entity that operates a licensed AM, FM, or television broadcast facility under the jurisdiction of the federal communications commission, including a digital platform owned and operated by the entity.".

Renumber succeeding subsections accordingly.

Page 8, line 15, strike "By law enforcement;" and substitute "To law enforcement; or".

Page 8, line 16, strike "or".

Page 8, strike line 17.

Page 10, after line 2 insert:

"(6) (a)  Notwithstanding any other provision of this part 15, a broadcaster is not liable pursuant to this part 15 solely for the broadcast, rebroadcast, or publication of third-party content that contains or is alleged to contain an intimate digital depiction if the broadcaster:

(I)  Did not create, alter, or materially contribute to the development of the intimate digital depiction;

(II)  Lacked actual knowledge that the content was an intimate digital depiction that the depicted individual did not consent to have disclosed; and

(III)  Upon obtaining the actual knowledge described in subsection (6)(a)(II) of this section, acted promptly and in good faith to remove, cease further dissemination of, or otherwise limit access to the content, when reasonably feasible.

(b)  This subsection (6) does not limit liability if a broadcaster knowingly or recklessly broadcasts, publishes, or distributes content in violation of this part 15, or fails to respond to a valid request to remove the material.".

Renumber succeeding subsection accordingly.

Page 11, line 10, after "add" insert "(1.7),".

Page 11, strike lines 11 through 13 and substitute:

Page 2, Line 10"18-6-403.  Sexual exploitation of a child - legislative

Page 2, Line 11declaration - definitions. (1.7)  The general assembly further finds

Page 2, Line 12and declares that:

Page 2, Line 13(a)  Due to advances in technology and artificial

Page 2, Line 14intelligence, perpetrators can generate depictions of children

Page 2, Line 15via computer programming that are indistinguishable from

Page 2, Line 16depictions of real children; use partial images of real children

Page 2, Line 17to create a composite image that is unidentifiable as a

Page 2, Line 18particular child and that prevents even experts from

Page 2, Line 19concluding that partial images of real children were used; and

Page 2, Line 20disguise pictures of real children being abused by making the

Page 2, Line 21images appear computer-generated, thereby avoiding detection

Page 2, Line 22and prosecution under previous statutes; and

Page 2, Line 23(b)  Sexually exploitative material results from the abuse

Page 2, Line 24of real children, whether or not the artificial generation or

Page 2, Line 25modification involves an identifiable child. Artificially

Page 2, Line 26generated child sexual abuse material re-victimizes actual

Page 2, Line 27child victims, as their images are collected from technological

Page 2, Line 28sources, including the internet, and studied by artificial

Page 2, Line 29intelligence. The danger facing Colorado's children who are

Page 2, Line 30abused with the aid of sexually exploitative material is just as

Page 2, Line 31great when the abuser uses material produced in whole or in

Page 2, Line 32part by computer programming or artificial intelligence as when

Page 2, Line 33the material consists of images of real children.

Page 2, Line 34(c)  Without legislative action, the difficulties that

Page 2, Line 35members of law enforcement who specialize in investigating

Page 2, Line 36internet crimes against children face will continue to intensify

Page 2, Line 37and threaten to render unenforceable our laws that protect

Page 2, Line 38real children. It is contrary to the values of the people of

Page 2, Line 39Colorado to tolerate the possession, creation, or dissemination

Page 2, Line 40of sexually abusive content containing images that are

Page 3, Line 1virtually indistinguishable from those of real children.

Page 3, Line 2(2)  As used in this section, unless the context otherwise requires:".