Alaska State Legislature (2023 - 2024)
03/25/2024 01:00 PM, House JUDICIARY, Gruenberg 120
Note: The audio and video recordings are distinct records obtained from different sources, so there may be key differences between the two. The audio recordings are captured by our records offices as the official record of the meeting and have more accurate timestamps.
| Audio | Topic |
|---|---|
| Start | |
| HB358 | |
| Adjourn | |
* first hearing in first committee of referral
+ teleconferenced
= bill was previously heard/scheduled
| += | HB 338 | TELECONFERENCED | |
| + | | TELECONFERENCED | |
| += | HB 358 | TELECONFERENCED | |
HB 358-PROHIBIT AI-ALTERED REPRESENTATIONS
1:05:44 PM
CHAIR VANCE announced that the first order of business would be
HOUSE BILL NO. 358, "An Act relating to use of artificial
intelligence to create or alter a representation of the voice or
likeness of an individual." [Before the committee, adopted as
the working document on 3/22/24, was the proposed committee
substitute (CS) for HB 358, Version 33-LS1272\U, Walsh, 3/21/24
("Version U").]
1:06:43 PM
REPRESENTATIVE MIKE CRONK, Alaska State Legislature, prime
sponsor of CSHB 358, Version U, introduced himself and deferred
to his staff.
1:08:15 PM
DAVE STANCLIFF, Staff, Alaska State Legislature, on behalf of
Representative Cronk, prime sponsor of CSHB 358, Version U,
shared that he had been working closely with a "big tech"
representative on language that would make the bill better. He
characterized the bill as landmark legislation and emphasized
the need to do things right because [other legislation] would be
built on it rapidly.
1:09:22 PM
REPRESENTATIVE CARPENTER asked Mr. Stancliff to disclose which
company he had been working with.
MR. STANCLIFF identified his contact as Ben Moore, who had
introduced himself as having worked with every major technology
[company] except Microsoft and TikTok.
1:10:14 PM
BOB BALLINGER, Staff, Representative Sarah Vance, Alaska State
Legislature, on behalf of Representative Vance, noted that Ben
Moore is affiliated with an organization called TechNet, an
association of technical providers and technology companies.
CHAIR VANCE said she had not drafted any amendments because she
wanted to allow the committee to determine which direction to go
in light of the conversations [with TechNet].
1:11:11 PM
MR. BALLINGER offered the following recommended changes based on
the conversation with TechNet: make the depiction of child
sexual abuse material apply to "viewing, production, and
distribution," in addition to possession; on page 2, line 3,
include "it is not a defense to this section that an individual
depicted is no longer a child"; and grant entities like TechNet
exemption from civil or criminal penalties for violations of
this act for actions taken to prevent, detect, protect against,
report, or respond to the production, generation, incorporation,
or synthetization of child sexual abuse material.
1:14:49 PM
REPRESENTATIVE CARPENTER sought to confirm that Mr. Ballinger
was referring to platforms that may want to rid their systems
[of pornographic material], which would require conducting a
search that would violate the proposed legislation.
MR. BALLINGER confirmed that Representative Carpenter's
understanding was accurate, adding that a similar exemption is
provided to law enforcement for the purposes of investigation.
1:15:45 PM
MR. BALLINGER resumed his explanation of the following
recommendations: specify that unless the person or entity
removes the disclosure set forth in AS 15.80.009, liability
would not be applied to the following entities: interactive
computer service, internet service provider, telecommunications
network, or radio/television broadcaster, including a cable or
satellite television operator, programmer, or producer. In
addition, TechNet recommended using the term "materially
deceptive media" in place of "deepfake."
1:17:53 PM
REPRESENTATIVE GRAY asked why the use of a disclaimer was being
allowed, as opposed to outlawing the use of AI in electioneering
communications entirely.
MR. BALLINGER said he did not know the answer. He surmised that
a disclaimer would make the bill more "manageable and
defensible" on First Amendment grounds.
REPRESENTATIVE SUMNER agreed that prohibiting a form of
political satire could be a substantial First Amendment issue.
CHAIR VANCE said the point of this discussion was to have an
open, transparent conversation about the use of AI.
1:21:14 PM
REPRESENTATIVE CARPENTER asked whether the committee should
consider criminal violations in response to egregious conduct,
as opposed to civil liability. He pointed out that once a
deepfake is posted, the damage is already done.
CHAIR VANCE added that part of it is considering what authority
to give to the Alaska Public Offices Commission (APOC).
MR. STANCLIFF pondered the best way to define "deepfake" and
said it would be up to the committee to decide.
1:24:30 PM
MR. BALLINGER noted that TechNet had suggested updating the
definition of "artificial intelligence" to "a machine-based
system that for explicit or implicit objectives, infers from the
input it receives how to generate outputs, such as predictions,
content recommendations, or decisions that can influence
physical or virtual environments. Different AI systems vary in
their levels of autonomy and adaptiveness after deployment." In
regard to the disclosure statement, TechNet suggested that the
candidate or sponsor responsible for the advertisement or
electioneering communication include a clear and conspicuous
disclosure that states, "This (image/audio/video) includes
materially deceptive media." Furthermore, he suggested
including a subsection under the definition of "deepfake" or
"materially deceptive media" which states, "is intended to cause
harm to the individual whose appearance, action, or speech has
been manipulated."
1:28:03 PM
REPRESENTATIVE CARPENTER pointed out that AS 11.46.565 [criminal
impersonation in the first degree] and AS 11.46.570 [criminal
impersonation in the second degree] could apply to the conduct
in question.
CHAIR VANCE stated that the committee needs to be clear and
deliberative on the specifics of deepfakes and the use of AI, as
programs, such as spell check, are considered AI.
1:33:23 PM
REPRESENTATIVE SUMNER asked whether Chair Vance was implying
that using spell check to write a statement that misrepresents
an opponent could fall under the proposed legislation. He
opined that such a proposal could be problematic.
CHAIR VANCE said she was saying that the use of AI needs to be
clearly defined because it's already being used in many ways.
She explained that if the legislation is too broad, it could
inadvertently encompass things like spell check and Photoshop,
so the committee needs to be intentional in narrowly defining
its appropriate use and that which would need a disclaimer.
REPRESENTATIVE GROH said he was struck by three things [with
regard to AI]: how little is known, the rapidly changing field,
and the enormous potential for both good and ill.
1:38:31 PM
MR. STANCLIFF noted that the bill sponsor would like the bill to
stay focused on individual protections. He encouraged the
committee to start small and stay limited in scope; in addition,
he pointed out that there was a more comprehensive bill in the
House State Affairs Standing Committee.
1:41:05 PM
REPRESENTATIVE GRAY directed attention to page 2, lines 3-6 of
Version U, and asked whether AI-created pornography depicting
[legislators] is legal under current law.
1:41:56 PM
KACI SCHROEDER, Assistant Attorney General, Legal Services
Section, Criminal Division, Department of Law (DOL), confirmed
that it is not a crime. She noted that she had posed the
question to [her colleagues at DOL] when the incident involving
[explicit deepfake images of Taylor Swift] occurred, and it was
determined that it would possibly be a civil matter, not a
crime.
REPRESENTATIVE GRAY asked whether [the legislature] could make
it a crime.
MS. SCHROEDER indicated that it would be a policy call. She
noted that there may be First Amendment issues.
1:42:50 PM
REPRESENTATIVE GRAY expressed his interest in outlawing the use
of AI to make child sexual abuse material [even if the identity
of the child could not be proven].
MS. SCHROEDER noted that AS 11.61.127, as it's currently
written, provides that the identity of the child does not have
to be proven; however, she was unsure how it would work with the
proposed language.
1:45:25 PM
REPRESENTATIVE GRAY considered a hypothetical scenario in which
AI was used to alter real child sexual abuse material by making
the face unrecognizable. He asked whether that would still be
considered child sexual abuse material.
MS. SCHROEDER directed attention to page 1, line 14 [of Version
U], and explained that it doesn't matter if the face has been
blurred, as long as part of the individual can be identified as
an actual child under the age of 18.
REPRESENTATIVE GRAY asked whether the bill would run into
constitutional problems if the language "and the depiction is
recognizable as an identifiable, actual child from the child's
face, likeness, or other distinguishing characteristics" was
deleted from page 2, line 5.
MS. SCHROEDER responded, "That's an open question." She
explained that [DOL] had considered limiting it to an
identifiable child, adding that there is an analysis already in
play for morphed child pornography that would also apply to
images that are identifiable as an actual child.
1:48:07 PM
REPRESENTATIVE CARPENTER asked how [the state] would be able to
prosecute if there's no actual individual to point to.
MS. SCHROEDER stated that there is caselaw surrounding morphed
child pornography with reference to its emotional and
reputational harm. She shared her understanding that the same
argument could extend to the circumstance of a wholly
artificially created image.
1:49:37 PM
CHAIR VANCE referred to Version U and asked whether AI-generated
pornography of an adult would fall under the civil liability for
defamation.
MS. SCHROEDER said it could potentially fall under that.
CHAIR VANCE asked how often AI had been used as evidence during
criminal trials.
MS. SCHROEDER stated that the department had not seen AI used
very much, if at all, in the context of criminal law. Of the
cases involving pornography, she explained that some morphed
child pornography was encountered; however, it involved an
actual image of a child, not something that had been wholly
created by AI.
1:52:20 PM
REPRESENTATIVE GRAY asked, if an amendment were passed [in
another bill] that defines AI as people, whether wholly
AI-generated depictions of unrecognizable people could be outlawed
on the basis of defaming the AI as a person.
MS. SCHROEDER did not know the answer, adding that the question
would require further research.
1:53:50 PM
CHAIR VANCE announced that CSHB 358, Version U, would be held
over.
| Document Name | Date/Time | Subjects |
|---|---|---|
| HB 358 - Sponsor Statement.pdf | HFSH 3/25/2024 1:00:00 PM; HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
| HB 358 - Proposed CS v.U.pdf | HJUD 3/22/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
| HB 358 - Sectional Analysis.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
| HB 358 - Statement of Zero Fiscal Impact.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
| HB 358 - Alaska Broadcasters Association - Support of Policy.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |
| HB 358 - Backup Document Articles & Research.pdf | HJUD 3/13/2024 1:00:00 PM; HJUD 3/15/2024 1:00:00 PM; HJUD 3/25/2024 1:00:00 PM | HB 358 |