Alaska State Legislature (2023 - 2024), GRUENBERG 120
04/01/2024 01:00 PM, House JUDICIARY
Note: The audio and video recordings are distinct records obtained from different sources, so there may be key differences between the two. The audio recordings, captured by the records offices as the official record of the meeting, have more accurate timestamps.
| Audio | Topic |
|---|---|
| Start | |
| HB358 | |
| Adjourn | |
* first hearing in first committee of referral
+ teleconferenced
= bill was previously heard/scheduled
| += | HB 358 | TELECONFERENCED |
HB 358-PROHIBIT AI-ALTERED REPRESENTATIONS

2:37:03 PM

CHAIR VANCE announced that the first order of business would be HOUSE BILL NO. 358, "An Act relating to use of artificial intelligence to create or alter a representation of the voice or likeness of an individual." [Before the committee, adopted as the working document on 3/22/24 and amended on 3/27/24, was the proposed committee substitute (CS) for HB 358, Version 33-LS1272\U, Walsh, 3/21/24, ("Version U").]

2:37:33 PM

REPRESENTATIVE ALLARD moved to adopt Amendment 9 to Version U, labeled 33-LS1272\U.13, Walsh, 4/1/24, which read:

Page 2, following line 6:
Insert a new bill section to read:
"* Sec. 3. AS 11.61.127(b) is amended to read:
(b) This section does not apply to
(1) persons providing plethysmograph assessments in the course of a sex offender treatment program that meets the minimum standards under AS 33.30.011(a)(5); or
(2) an employee of an interactive computer service, Internet service provider, cloud service provider, or telecommunications network who, while acting in the scope of employment, possesses or accesses the material described in (a) of this section solely to prevent, detect, report, or otherwise respond to the production, generation, manipulation, or modification of the material; in this paragraph, "interactive computer service" has the meaning given in AS 15.80.009."
Renumber the following bill sections accordingly.
Page 4, line 15, following "Act,":
Insert "AS 11.61.127(b), as amended by sec. 3 of this Act,"
Page 4, line 16:
Delete "sec. 3"
Insert "sec. 4"

REPRESENTATIVE CARPENTER objected for the purpose of discussion.

2:37:40 PM

BOB BALLINGER, Staff, Representative Sarah Vance, Alaska State Legislature, on behalf of Representative Vance, noted that Amendment 9 had come from TechNet. He indicated that the proposed amendment would allow social media platforms to search for [child pornography] and remove it from their websites.

2:39:21 PM

REPRESENTATIVE C. JOHNSON directed attention to line 5 of Amendment 9 and asked for the definition of "plethysmograph." MR. BALLINGER deferred to the Department of Law (DOL).

2:40:04 PM

KACI SCHROEDER, Assistant Attorney General, Legal Services Section, Criminal Division, Department of Law (DOL), defined "plethysmograph" as a device that is physically attached to a person to try to gauge his/her sexual response to stimuli. CHAIR VANCE asked what section of law Amendment 9 belongs to and why it would be necessary.

2:41:04 PM

MS. SCHROEDER said the section of law relates to the possession of child pornography. She explained that [with regard to plethysmograph assessments], a probation officer may gauge, through the device, a person's sexual response to various images. CHAIR VANCE asked why [paragraph] (2) is necessary in this area of statute. MS. SCHROEDER explained that the section would allow internet platforms to search for child pornography for the purpose of taking it down without the risk of prosecution.

2:42:24 PM

REPRESENTATIVE CARPENTER removed his objection. There being no further objection, Amendment 9 was adopted.
2:42:33 PM

REPRESENTATIVE ALLARD moved to adopt Amendment 10 to Version U, labeled 33-LS1272\U.15, Walsh, 4/1/24, which read:

Page 2, lines 16 - 17:
Delete "use a deepfake in an electioneering communication made with the intent to influence an election"
Insert "create an electioneering communication with the intent to influence an election knowing that the electioneering communication includes a deepfake"
Page 2, line 19:
Delete "in violation of"
Insert "included in an electioneering communication that violates"
Page 2, line 21:
Delete "deepfake"
Insert "electioneering communication"
Page 2, line 22:
Delete "deepfake"
Insert "electioneering communication"
Page 2, line 24, following "communication":
Insert "with the intent to influence an election and knowing that the electioneering communication includes a deepfake"
Page 2, line 26:
Delete "in violation of"
Insert "included in an electioneering communication that violates"
Page 2, line 27:
Delete "deepfake"
Insert "electioneering communication"

REPRESENTATIVE CARPENTER objected for the purpose of discussion.

2:42:40 PM

MR. BALLINGER explained that Amendment 10 was drafted at the request of the Motion Picture Association to clarify that the association would not be held liable for unknowingly presenting a deepfake in electioneering communications.

2:44:12 PM

REPRESENTATIVE SUMNER said he opposed the proposed amendment because those held liable for creating [deepfakes] may be beyond [the state's] jurisdiction. He opined that the provision may make the bill substantially less effective. REPRESENTATIVE GRAY agreed with Representative Sumner and emphasized the need to put the onus on the individual who is posting it. MR. BALLINGER directed attention to page 2, line 19 [of Version U], indicating that the person who created the deepfake or retained the services of another to create the deepfake [would be held liable].

2:47:00 PM

The committee took a brief at-ease at 2:47 p.m.
2:47:52 PM

REPRESENTATIVE SUMNER opined that the bill should discourage a candidate from boosting or spreading deepfake material [even if he/she was not the creator]. He stated his belief that the legislation would be moot if that loophole were allowed.

2:49:08 PM

REPRESENTATIVE SUMNER moved to adopt Conceptual Amendment 1 to Amendment 10 to strike "create" and insert "use" on line 4. REPRESENTATIVE GRAY objected.

2:49:58 PM

REPRESENTATIVE ALLARD asked for Mr. Ballinger's opinion on Conceptual Amendment 1 to Amendment 10. MR. BALLINGER opined that the change would "[fill] a loophole."

2:51:01 PM

The committee took a brief at-ease.

2:52:21 PM

REPRESENTATIVE SUMNER withdrew Conceptual Amendment 1 to Amendment 10 in light of the conceptual amendment not achieving its intended purposes. REPRESENTATIVE ALLARD withdrew Amendment 10. REPRESENTATIVE GRAY objected. He suggested that the committee vote the amendment down and rewrite it. CHAIR VANCE explained that withdrawing the amendment would have the same impact. REPRESENTATIVE GRAY removed his objection. There being no further objection, Amendment 10 was withdrawn.

2:55:21 PM

CHAIR VANCE sought further questions on Version U, as amended.

2:56:16 PM

REPRESENTATIVE GRAY asked whether a person who unknowingly used deepfake material in electioneering [communications] could be held in any way responsible if he/she didn't create it.

2:56:59 PM

IAN WALSH, Attorney, Legislative Legal Services, Legislative Affairs Agency (LAA), said the prohibition in subsection (a) would appear to prohibit that hypothetical circumstance; however, subsection (b) only permits an individual to bring action to recover damages from someone who created the deepfake, retained the services of another to create it, or removed the disclosure statement. He opined that [Amendment 10] was one way of addressing the discrepancy between subsections (a) and (b).
He shared his understanding that the committee would instead prefer to broaden subsection (b) to include more than just the person who creates the deepfake.

2:58:15 PM

CHAIR VANCE announced that CSHB 358, Version U, as amended, would be held over. She said her will was to create a foundational piece [of legislation] that would establish protections.
| Document Name | Date/Time | Subjects |
|---|---|---|
| HB 358 - Amendment #9 (U.13) by Rep. Vance.pdf | HJUD 4/1/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #10 (U.15) by Rep. Vance.pdf | HJUD 4/1/2024 1:00:00 PM | HB 358 |