03/27/2024 01:00 PM House JUDICIARY
+ teleconferenced
= bill was previously heard/scheduled
| += | HB 338 | TELECONFERENCED | |
| + | | TELECONFERENCED | |
| += | HB 107 | TELECONFERENCED | |
| += | HB 227 | TELECONFERENCED | |
| += | HB 358 | TELECONFERENCED | |
ALASKA STATE LEGISLATURE
HOUSE JUDICIARY STANDING COMMITTEE
March 27, 2024
1:04 p.m.
MEMBERS PRESENT
Representative Sarah Vance, Chair
Representative Jamie Allard, Vice Chair
Representative Ben Carpenter
Representative Craig Johnson
Representative Jesse Sumner
Representative Andrew Gray
Representative Cliff Groh
MEMBERS ABSENT
All members present
COMMITTEE CALENDAR
HOUSE BILL NO. 227
"An Act relating to liability of an electric utility for contact
between vegetation and the utility's facilities."
- MOVED CSHB 227(JUD) OUT OF COMMITTEE
HOUSE BILL NO. 358
"An Act relating to use of artificial intelligence to create or
alter a representation of the voice or likeness of an
individual."
- HEARD & HELD
HOUSE BILL NO. 338
"An Act relating to physician liability for gender transition
procedures performed on minors; and providing for an effective
date."
- BILL HEARING CANCELED
HOUSE BILL NO. 107
"An Act relating to criminal law definitions."
- BILL HEARING CANCELED
PREVIOUS COMMITTEE ACTION
BILL: HB 227
SHORT TITLE: ELECTRIC UTILITY LIABILITY
SPONSOR(s): REPRESENTATIVE(s) RAUSCHER
01/16/24 (H) PREFILE RELEASED 1/8/24
01/16/24 (H) READ THE FIRST TIME - REFERRALS
01/16/24 (H) ENE, JUD
01/23/24 (H) ENE AT 10:15 AM BARNES 124
01/23/24 (H) Heard & Held
01/23/24 (H) MINUTE(ENE)
01/25/24 (H) ENE AT 10:15 AM BARNES 124
01/25/24 (H) Moved HB 227 Out of Committee
01/25/24 (H) MINUTE(ENE)
01/26/24 (H) ENE RPT 4DP 3AM
01/26/24 (H) DP: BAKER, MCKAY, WRIGHT, RAUSCHER
01/26/24 (H) AM: SCHRAGE, ARMSTRONG, PRAX
03/06/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/06/24 (H) Heard & Held
03/06/24 (H) MINUTE(JUD)
03/08/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/08/24 (H) Heard & Held
03/08/24 (H) MINUTE(JUD)
03/11/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/11/24 (H) Heard & Held
03/11/24 (H) MINUTE(JUD)
03/27/24 (H) JUD AT 1:00 PM GRUENBERG 120
BILL: HB 358
SHORT TITLE: PROHIBIT AI-ALTERED REPRESENTATIONS
SPONSOR(s): REPRESENTATIVE(s) CRONK
02/20/24 (H) READ THE FIRST TIME - REFERRALS
02/20/24 (H) JUD
03/13/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/13/24 (H) Heard & Held
03/13/24 (H) MINUTE(JUD)
03/15/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/15/24 (H) Heard & Held
03/15/24 (H) MINUTE(JUD)
03/20/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/20/24 (H) <Bill Hearing Canceled>
03/22/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/22/24 (H) Heard & Held
03/22/24 (H) MINUTE(JUD)
03/25/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/25/24 (H) Heard & Held
03/25/24 (H) MINUTE(JUD)
03/27/24 (H) JUD AT 1:00 PM GRUENBERG 120
WITNESS REGISTER
REPRESENTATIVE GEORGE RAUSCHER
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Gave final comment on HB 227, as amended,
as the prime sponsor.
BOB BALLINGER, Staff
Representative Sarah Vance
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Explained amendments to the proposed CS for
HB 358, Version U, on behalf of Representative Vance.
IAN WALSH, Attorney
Legislative Legal Services
Legislative Affairs Agency
Juneau, Alaska
POSITION STATEMENT: Answered questions during the hearing on
the proposed CS for HB 358, Version U.
KACI SCHROEDER, Assistant Attorney General
Legal Services Section
Criminal Division
Department of Law
Juneau, Alaska
POSITION STATEMENT: Answered questions during the hearing on
the proposed CS for HB 358, Version U.
ACTION NARRATIVE
1:04:45 PM
CHAIR VANCE called the House Judiciary Standing Committee
meeting to order at 1:04 p.m. Representatives Allard,
Carpenter, Johnson, Sumner, Gray, Groh, and Vance were present
at the call to order.
HB 227-ELECTRIC UTILITY LIABILITY
1:05:32 PM
CHAIR VANCE announced that the first order of business would be
HOUSE BILL NO. 227, "An Act relating to liability of an electric
utility for contact between vegetation and the utility's
facilities."
CHAIR VANCE noted that Amendment 3 had been tabled [on 3/11/24].
1:06:04 PM
The committee took an at-ease from 1:06 p.m. to 1:08 p.m.
1:08:13 PM
REPRESENTATIVE GRAY moved Amendment 4 to HB 227, labeled 33-
LS0969\B.6, Walsh/A. Radford, 3/13/24, which read:
Page 1, line 14:
Delete "if a part of the trunk of the vegetation
is"
Page 2, line 1, following "right-of-way":
Insert ", even if the vegetation is rooted
outside the boundaries of the utility's real property,
lease, permit, easement, or right-of-way"
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
1:08:27 PM
REPRESENTATIVE GRAY explained that Amendment 4 would make the
language starting on page 1, line 13, read "A utility is not
liable for cutting, girdling, or otherwise injuring or removing
vegetation inside the boundaries of the utility's real property,
lease, permit, easement, or right-of-way, even if the vegetation
is rooted outside the boundaries of the utility's real property,
lease, permit, easement, or right-of-way." He indicated that
the proposed amendment would allow utilities to cut down
anything [inside the right-of-way] without being held liable.
1:09:19 PM
REPRESENTATIVE CARPENTER removed his objection. There being no
further objection, Amendment 4 was adopted.
CHAIR VANCE sought final comment on HB 227, as amended.
1:09:56 PM
REPRESENTATIVE GEORGE RAUSCHER, Alaska State Legislature, said
he appreciated Amendment 4 and directed members to a legal
memorandum ("memo"), dated 3/27/24 [hard copy included in the
committee packet].
1:10:37 PM
REPRESENTATIVE ALLARD moved to report HB 227, as amended, out of
committee with individual recommendations and the accompanying
fiscal notes. There being no objection, CSHB 227(JUD) was
reported from the House Judiciary Standing Committee.
1:11:02 PM
The committee took a brief at-ease.
1:11:14 PM
REPRESENTATIVE ALLARD gave Legislative Legal Services permission
to make all technical and conforming changes as necessary.
1:11:23 PM
The committee took an at-ease from 1:11 p.m. to 1:13 p.m.
HB 358-PROHIBIT AI-ALTERED REPRESENTATIONS
1:13:55 PM
CHAIR VANCE announced that the final order of business would be
HOUSE BILL NO. 358, "An Act relating to use of artificial
intelligence to create or alter a representation of the voice or
likeness of an individual." [Before the committee, adopted as
the working document on 3/22/24, was the proposed committee
substitute (CS) for HB 358, Version 33-LS1272\U, Walsh, 3/21/24
("Version U").]
1:14:52 PM
The committee took a brief at-ease.
1:15:03 PM
REPRESENTATIVE ALLARD moved to adopt Amendment 1 to Version U,
labeled 33-LS1272\U.10, A. Radford/Walsh, 3/26/24, which read:
Page 3, lines 23 - 30:
Delete all material and insert:
"(2) "deepfake" means any visual or audio
media that is created, altered, or otherwise
manipulated by artificial intelligence in a manner
that
(A) to a reasonable observer, appears to be
an authentic record of an individual's actual speech,
conduct, or likeness;
(B) conveys a fundamentally different
understanding or impression of the individual's
appearance, action, or speech than a reasonable person
would have from the unaltered, original version of the
individual's appearance, action, or speech; and
(C) is intended to harm the individual
whose appearance, action, or speech has been altered
or manipulated;"
REPRESENTATIVE CARPENTER objected.
1:15:16 PM
BOB BALLINGER, Staff, Representative Sarah Vance, Alaska State
Legislature, on behalf of Representative Vance, explained that
Amendment 1 would incorporate subparagraph (C) into the
definition of "deepfake" in an attempt to further qualify the
intent of the bill. Subparagraph (C) would specify that the
deepfake must be intended to cause harm to the individual whose
appearance, action, or speech has been altered or manipulated.
CHAIR VANCE sought questions from members of the committee.
1:17:16 PM
REPRESENTATIVE GRAY asked why the term "deepfake" had not been
replaced by ["materially deceptive media"].
MR. BALLINGER indicated that the disclaimer language "felt
awkward" [with the use of "materially deceptive media"].
REPRESENTATIVE GRAY pointed out that different terminology was
used in different settings.  He opined that "deepfake" sounds
like slang, whereas "materially deceptive media" sounds more
official.
CHAIR VANCE agreed that "deepfake" sounds like slang. However,
she pointed out that using the term "materially deceptive media"
in the disclaimer for electioneering communications could be
harder to understand.
REPRESENTATIVE GRAY pointed out that the proposed disclaimer
language does not contain the term "deepfake" or "materially
deceptive media" at this time.
MR. BALLINGER acknowledged that the disclaimer would work with
either definition.
1:22:41 PM
REPRESENTATIVE CARPENTER asked why it was necessary to specify
the intent to cause harm.
MR. BALLINGER explained that the new subparagraph was included
to ensure that an individual who used Photoshop, for example, to
make himself/herself look better would not be criminalized.
REPRESENTATIVE SUMNER said, to Representative Carpenter's point,
it's important to realize that a campaign may include
independent expenditure groups that do not intend to harm a
candidate; however, allowing them to put forward materially
deceptive media in support of their own principles or policy may
still be harmful.
1:27:11 PM
REPRESENTATIVE CARPENTER opined that materially deceptive
material is deceptive whether or not the intent is to cause
harm.
MR. BALLINGER said the same thing could be said about deepfakes,
which is why the qualifiers are necessary.
REPRESENTATIVE CARPENTER shared his belief that the public would
like [elected officials] to err on the side of transparency and
to eliminate the fakeness.
REPRESENTATIVE ALLARD asserted that comments are often taken out
of context by the press.
1:29:29 PM
REPRESENTATIVE GRAY said his problem with subparagraph (C) was
the word "individual." He shared a hypothetical example to
suggest that a video in which Nancy Pelosi appears to endorse a
conservative Republican would be a deepfake, despite not being
encompassed in the current definition.
MR. BALLINGER agreed with Representative Gray and questioned
where to draw the line. He claimed that everyone uses AI to
make themselves look better and, therefore, advised against
criminalizing that conduct.
1:33:57 PM
REPRESENTATIVE CARPENTER said he was not in favor of
subparagraph (C) being in statute and suggested changing "the
individual" in subparagraph (B) to "an individual".
REPRESENTATIVE GRAY shared his understanding that the term "the
individual" in subparagraph (B) is a reference back to the
language "an individual" in subparagraph (A) whose appearance,
action, or speech has been altered or manipulated.
REPRESENTATIVE CARPENTER asked whether "the individual" in
subparagraph (B) of Amendment 1 applied to the individual
identified in subparagraph (A) and whether changing it to "an
individual" would apply to more than one person.
1:39:26 PM
IAN WALSH, Attorney, Legislative Legal Services, Legislative
Affairs Agency (LAA), confirmed that "the individual" in
subparagraph (B) refers to the individual in subparagraph (A).
He directed attention to [page 2], lines 18-27 of Version U and
explained that both subsections (b) and (c) permit causes of
action by an individual whose likeness has been manipulated.
Consequently, in the Nancy Pelosi hypothetical, it would not
allow someone other than Ms. Pelosi to bring a cause of action
if her likeness had been manipulated.
1:40:44 PM
REPRESENTATIVE CARPENTER maintained his objection.
A roll call vote was taken. Representatives Allard, C. Johnson,
and Vance voted in favor of Amendment 1. Representatives
Sumner, Gray, Groh, and Carpenter voted against it. Therefore,
Amendment 1 failed by a vote of 3-4.
1:41:34 PM
The committee took a brief at-ease.
1:41:58 PM
REPRESENTATIVE ALLARD moved to adopt Amendment 2 to Version U,
labeled 33-LS1272\U.4, Walsh, 3/26/24, which read:
Page 1, line 13, through page 2, line 6:
Delete all material and insert:
"(1) production of the material involved
the use of a child under 18 years of age who engaged
in the conduct; or
(2) the material depicts [A DEPICTION OF] a
part of an actual child under 18 years of age, or is a
representation that is indistinguishable from an
identifiable child under 18 years of age, who, by
manipulation, creation, or modification, including by
use of artificial intelligence, appears to be engaged
in the conduct."
Page 2, line 13, following "AS 11.46.990":
Insert ";
(3) "identifiable child" means an
individual who is recognizable as an actual child by
the child's face, likeness, or other distinguishing
characteristics, regardless of whether the individual
depicted is no longer under 18 years of age"
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
1:42:11 PM
MR. BALLINGER stated that Amendment 2 came from the Department
of Law (DOL). He described the proposed amendment as "a better
way at accomplishing what we were ... looking to do." He added
that Amendment 2 would clarify that [as it pertains to child
sexual abuse material], an image of a child who is recognizable
as an actual child but is no longer under 18 years of age would be
covered by the bill.
1:42:59 PM
REPRESENTATIVE GRAY inquired about the language on line 8 of
Amendment 2.
MR. BALLINGER explained that the language on line 8 would
specify that an image of a child under the age of 18 that was
wholly created or manipulated with the use of artificial
intelligence (AI) would still fall into this category.
1:45:32 PM
REPRESENTATIVE CARPENTER asked whether Amendment 2 would empower
investigators or law enforcement [with regard to prosecutions
involving child sexual abuse material].
1:46:21 PM
KACI SCHROEDER, Assistant Attorney General, Legal Services
Section, Criminal Division, Department of Law (DOL), said
current law does not require prosecutors to prove the identity
of a child; however, because Amendment 2 specifically calls out
"an identifiable child," the department would be required to
prove the identity of the child if the proposed amendment were
to pass.
1:48:07 PM
REPRESENTATIVE CARPENTER asked how the department would
determine whether the child is under 18 years of age.
MS. SCHROEDER stated that under current law, the age of the
child must be proven, which can be difficult to ascertain and
sometimes requires expert testimony.
REPRESENTATIVE CARPENTER asked whether Amendment 2, as currently
written, would be enforceable.
MS. SCHROEDER said [DOL] could attempt to enforce it. She
reiterated that even under current law, identification is rare,
so the number of cases would be small.
1:49:44 PM
REPRESENTATIVE CARPENTER removed his objection. There being no
further objection, Amendment 2 was adopted.
1:50:04 PM
REPRESENTATIVE ALLARD moved to adopt Amendment 3 to Version U,
labeled 33-LS1272\U.6, Walsh, 3/26/24, which read:
Page 3, lines 13 - 17:
Delete all material and insert:
"(e) An interactive computer service, Internet
service provider, cloud service provider,
telecommunications network, or radio or television
broadcaster, including a cable or satellite television
operator, programmer, or producer, is not liable under
this section for hosting, publishing, or distributing
an electioneering communication provided by another
person. This subsection does not prevent an individual
from bringing an action under (b)(2) of this section
for removing a disclosure statement."
Page 4, lines 11 - 12:
Delete all material and insert:
"(4) "interactive computer service" has the
meaning given in 47 U.S.C. 230, as that section read
on January 1, 2024."
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
1:50:12 PM
MR. BALLINGER noted that Amendment 3 came from TechNet. He
explained that the proposed amendment would broaden the
exemption language that protects certain service providers from
liability unless they were to remove a disclaimer from the
media. In addition, it would add the date of January 1, 2024,
to the federal citation 47 U.S.C. 230 to ensure that if the
federal code were to change, state law would not change with it.
1:51:28 PM
REPRESENTATIVE SUMNER asked why, [instead of referencing federal
statute], the language of 47 U.S.C. 230 wasn't inserted into the
bill.
MR. BALLINGER said the purpose was to ensure that state statute
reflects whatever was codified on January 1, 2024, in federal
code. He added that the language from 47 U.S.C. 230 could be
added at the committee's discretion.
REPRESENTATIVE SUMNER said his preference was to avoid making
references to federal code in state statute and, instead, to
insert the necessary language.  He expressed concern that if the
federal government were to change that code retroactively, it
could "make a mess of things."
REPRESENTATIVE ALLARD pointed out that other bills had been
passed with reference to federal statute. She asked how this
would be any different.
1:54:53 PM
The committee took an at-ease from 1:54 p.m. to 1:57 p.m.
1:57:19 PM
REPRESENTATIVE SUMNER moved to adopt conceptual Amendment 1 to
Amendment 3, which would delete lines 12-13 and insert the
following language:
or interactive computer service. The term "interactive
computer service" means any information service,
system, or access software provider that provides or
enables computer access by multiple users to a
computer server, including specifically a service or
system that provides access to the Internet and such
systems operated or services offered by libraries or
educational institutions.
CHAIR VANCE announced that there being no objection, Conceptual
Amendment 1 to Amendment 3 was adopted.
1:58:21 PM
REPRESENTATIVE GRAY sought to confirm that platforms, such as
Instagram, X, and Facebook, could still be held liable for
spreading AI-generated [electioneering communications].
MR. BALLINGER clarified that those platforms are included in
Amendment 3 and, therefore, would be exempt from liability
unless they were to specifically remove a disclaimer.
REPRESENTATIVE GRAY asked whether [the committee] would be
laying the groundwork for deepfake material to flourish by
granting social media platforms immunity.
MR. BALLINGER shared his belief that immunity must be granted
from this type of prosecution unless these platforms were to
proactively and knowingly perpetuate the problem.
REPRESENTATIVE GRAY expressed concern that there would be no
incentive to stop something from going viral.
MR. BALLINGER stated that the bill would provide the ability to
request an injunction that requires the publisher to remove the
material.  He shared his belief that declining to grant immunity
to social media platforms could prevent the legislation from
moving forward and fundamentally change how people use social
media.
2:03:09 PM
REPRESENTATIVE CARPENTER urged social media websites to do a
better job of policing their content; nonetheless, he
acknowledged their First Amendment rights. He opined that
keeping responsibility with the person [who published the
content] is the right thing to do.
REPRESENTATIVE ALLARD shared her belief that there are
consequences and repercussions for speaking freely in this
country.
2:05:01 PM
REPRESENTATIVE GRAY asked where social media platforms are
included in Amendment 3.
MR. BALLINGER stated that they are encompassed in the definition
of "interactive computer service."
REPRESENTATIVE GRAY moved to adopt Conceptual Amendment 2 to
Amendment 3 to delete "an interactive computer service" on page
1, line 3.
REPRESENTATIVE ALLARD objected.
REPRESENTATIVE SUMNER objected.
2:06:33 PM
REPRESENTATIVE SUMNER said he objected to the proposed amendment
because [the committee] should be practical about what can be
passed.
CHAIR VANCE questioned the impact of removing "interactive
computer services" from the exemption in Amendment 3.
2:08:20 PM
MR. WALSH said that, ultimately, it may have no impact because
the courts have interpreted 47 U.S.C. 230 to provide broad
immunity to social media platforms for moderation decisions and
for content provided by third parties.  He explained that if
"interactive computer service" were removed, federal law would
still preempt state law and still provide that immunity.  He
noted that the definition of "interactive computer service"
adopted by the committee includes terms such as "information
service" and "access software provider" that are defined in
federal law.  Consequently, Amendment 3, as amended, might still
require a reference to federal code to make sense.
2:10:21 PM
REPRESENTATIVE ALLARD maintained her objection.
2:10:25 PM
A roll call vote was taken. Representatives Gray and Groh voted
in favor of Conceptual Amendment 2 to Amendment 3.
Representatives Allard, Carpenter, C. Johnson, Sumner, and Vance
voted against it. Therefore, Conceptual Amendment 2 to
Amendment 3 failed by a vote of 2-5.
2:11:11 PM
REPRESENTATIVE CARPENTER removed his objection to Amendment 3,
as amended.
REPRESENTATIVE GRAY objected.
2:11:22 PM
A roll call vote was taken. Representatives C. Johnson, Sumner,
Groh, Allard, Carpenter, and Vance voted in favor of Amendment
3, as amended. Representative Gray voted against it.
Therefore, Amendment 3, as amended, passed by a vote of 6-1.
2:12:01 PM
REPRESENTATIVE ALLARD moved to adopt Amendment 4 to Version U,
labeled 33-LS1272\U.8, Klein/Walsh, 3/26/24, which read:
Page 2, lines 9 - 12:
Delete all material and insert:
"(1) "artificial intelligence" means a
machine-based system that, for explicit or implicit
objectives, infers, from the input the system
receives, how to generate outputs, including
predictions, content, recommendations, and decisions
that can influence physical or virtual environments,
with different artificial intelligence systems varying
in their levels of autonomy and adaptiveness after
deployment;"
Page 3, lines 19 - 22:
Delete all material and insert:
"(1) "artificial intelligence" means a
machine-based system that, for explicit or implicit
objectives, infers, from the input the system
receives, how to generate outputs, including
predictions, content, recommendations, and decisions
that can influence physical or virtual environments,
with different artificial intelligence systems varying
in their levels of autonomy and adaptiveness after
deployment;"
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
2:12:03 PM
MR. BALLINGER stated that Amendment 4 would update the
definition of "artificial intelligence" to the language provided
by TechNet.
REPRESENTATIVE SUMNER said he supported the proposed amendment
because it would address previously raised issues.
REPRESENTATIVE GROH asked Mr. Ballinger to explain what
Amendment 4 would do in terms of sanctions.
MR. BALLINGER reiterated that Amendment 4 would insert the most
"up to date" definition of AI into the bill.
REPRESENTATIVE CARPENTER removed his objection. There being no
further objection, Amendment 4 was adopted.
2:13:41 PM
REPRESENTATIVE ALLARD moved to adopt Amendment 5 to Version U,
labeled 33-LS1272\U.3, Walsh, 3/26/24, which read:
Page 2, line 2:
Delete "or"
Page 2, line 6, following "characteristics":
Insert "; or
(3) material has been manipulated, created,
or modified using artificial intelligence to appear to
depict a child under 10 years of age engaging in the
conduct"
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
2:14:03 PM
REPRESENTATIVE GRAY explained that Amendment 5 would prohibit AI
depictions of a child under the age of 10 engaging in conduct
outlined in AS 11.41.455(a). He acknowledged that the provision
might run afoul of the First Amendment; however, he noted that
the Miller Test was established in Miller v. California to
determine whether [expression constitutes obscenity] and
reasoned that the depiction of a child under 10 years of age
being sexually abused would meet that standard.
2:15:43 PM
REPRESENTATIVE GROH noted that it is sometimes hard to tell
whether a person is 14 or 19 due to varying rates of development
and other factors; by contrast, he opined, it is easy to tell
whether a person has gone through puberty.  Consequently, he
shared his belief that Amendment 5 makes sense.
2:17:12 PM
The committee took a brief at-ease.
2:18:23 PM
REPRESENTATIVE CARPENTER asked why it was necessary to identify
a child under 10 years of age, as opposed to a child under 18
years of age.
REPRESENTATIVE GRAY said he chose the age of 10 because, if AI
is being used to wholly generate unidentifiable children, he
wanted an age at which the depicted child would be clearly
identifiable as a minor.
2:21:14 PM
CHAIR VANCE asked whether there was any risk to adopting
Amendment 5.
MR. WALSH deferred to DOL.  He reported that before AI became
what it is today, the U.S. Supreme Court held that the
government's interest in prohibiting virtual child pornography
was not sufficient to justify the prohibition under the First
Amendment.
2:22:27 PM
REPRESENTATIVE GROH said he would be interested in Ms.
Schroeder's perspective on Amendment 5.
MS. SCHROEDER explained that the bill would extend the
identifiable harm associated with child pornography and morphed
child pornography to identifiable children, which is untested.
Amendment 5 is one step away from that, she said.  She
highlighted case law addressing child pornography developed
without the use of an actual child.  She acknowledged that the
Miller Test helps define something as obscene; however, the
department was unable to "suss all that out" to give a
definitive answer on Amendment 5.
MR. BALLINGER commented on the Miller Test and referenced
arguments made in the Ashcroft case.
2:24:59 PM
CHAIR VANCE questioned how Amendment 5 would be enforced and
whether there would be a risk to adopting it.
MS. SCHROEDER said the risk would be that the department could
prosecute under the statute and the statute could later be
declared unconstitutional, which would [nullify] the criminal
case.
2:27:36 PM
REPRESENTATIVE CARPENTER asked how the department would discern
whether the child in an AI generated image was under the age of
10 if there is no actual child to ask.
MS. SCHROEDER said [the child in] the image would have to appear
to a reasonable person to be under the age of 10.  She pointed
out that the language "appears to depict" would soften the
requirement of having to definitively prove the child's age.
2:29:04 PM
REPRESENTATIVE GRAY, in wrap-up, read the following language
from a legal memo from Mr. Walsh: "The First Amendment does
allow the government to prohibit material that meets the
obscenity standard in Miller v. California because it is not
protected speech." He opined that it would be worth trying to
prohibit "the most disgusting material imaginable."
2:29:48 PM
REPRESENTATIVE CARPENTER removed his objection.
CHAIR VANCE objected. She explained that she was not yet
comfortable with the proposed language and that it required
further development.
2:31:08 PM
A roll call vote was taken. Representatives Sumner, Gray, and
Groh voted in favor of Amendment 5. Representatives Carpenter,
C. Johnson, and Vance voted against it. Therefore, Amendment 5
failed by a vote of 3-3.
2:31:48 PM
REPRESENTATIVE GROH moved to adopt Amendment 6 to Version U,
labeled 33-LS1272\U.1, Gunther/Walsh, 3/26/24, which read:
Page 1, line 2:
Delete "and"
Following "communications":
Insert "; relating to the Alaska Artificial
Intelligence Task Force; and providing for an
effective date"
Page 4, following line 12:
Insert a new bill section to read:
"* Sec. 5. The uncodified law of the State of
Alaska is amended by adding a new section to read:
ALASKA ARTIFICIAL INTELLIGENCE TASK FORCE. (a)
The Alaska Artificial Intelligence Task Force is
created as a joint task force of the Alaska State
Legislature. The task force consists of seven voting
members appointed as follows:
(1) a member of the house of
representatives, appointed by the speaker of the house
of representatives, who shall serve as co-chair of the
task force;
(2) a member of the senate, appointed by
the president of the senate, who shall serve as co-
chair of the task force;
(3) a member who is an expert on law
enforcement and, if possible, has experience in the
usage of artificial intelligence systems, appointed by
the governor;
(4) a member who is an expert in
constitutional and legal rights, appointed by the
governor;
(5) three members who are academic faculty
members of the University of Alaska, appointed by the
Board of Regents; in appointing members under this
paragraph, the Board of Regents shall ensure that
(A) one member specializes in ethics and,
if possible, has experience in the ethics of
technology;
(B) one member specializes in computer
systems and, if possible, has experience in artificial
intelligence; and
(C) one member specializes in the economic
or social effects of new technology.
(b) A member appointed under (a) of this section
serves at the pleasure of the appointing authority.
(c) In calendar years 2025 and 2026, the task
force shall meet at least once each calendar quarter
at the call of the co-chairs. The co-chairs shall
determine whether the task force will meet in calendar
years 2027 and 2028 and notify the members. If the co-
chairs notify members that the task force will meet in
calendar years 2027 and 2028, the task force shall
meet at least once each calendar quarter at the call
of the co-chairs. A majority of the members of the
task force constitutes a quorum for the transaction of
business. A member of the task force participating in
a meeting by remote communication is present for the
purposes of establishing a quorum. Meetings of the
task force are subject to AS 44.62.310 - 44.62.319
(Open Meetings Act).
(d) The task force may adopt procedures for the
management and governance of the task force.
(e) Not later than February 1, 2027, and, if the
co-chairs of the task force determine that the task
force will meet in calendar years 2027 and 2028 under
(c) of this section, not later than February 1, 2029,
the task force shall submit a report to the chief
clerk of the house of representatives and the senate
secretary and notify the legislature that the report
is available. The report must
(1) contain a detailed review of the effect
of artificial intelligence technology on the state and
residents, businesses, and local governments in the
state; and
(2) provide recommendations on changes in
policy, including policies related to criminal and
civil liability for violations of law resulting from
the use of artificial intelligence by an individual,
an organization, a local government, or the state.
(f) In preparing a report required under (e) of
this section, the task force shall
(1) investigate how potential problems with
artificial intelligence may be addressed under state
law;
(2) determine how the application of state
law may be affected by artificial intelligence;
(3) review how other states have regulated
artificial intelligence;
(4) investigate the potential benefits and
harms of artificial intelligence on economic and
community development, including
(A) education, workforce development, and
employment in the state;
(B) the acquisition and disclosure of
confidential information;
(C) crime, public safety, and weaponry; and
(D) discrimination resulting from the use
of automated decision systems;
(5) determine the feasibility of using
artificial intelligence in the public sector,
including
(A) assessing the need for a state code of
ethics on the use of artificial intelligence systems
in state government;
(B) the effect of automated decision
systems on the constitutional and legal rights,
duties, and privileges of state residents; and
(C) the potential benefits available to and
liability of the state that may result from
implementing automated decision systems;
(6) investigate the effects of deepfakes on
the government, elections, and cybersecurity of the
state; and
(7) research the potential effects on the
private sector of any recommendation the task force
intends to make.
(g) The task force may use the research services
of the Legislative Affairs Agency.
(h) Members serve without compensation but are
entitled to per diem and travel expenses authorized
for members of boards and commissions under
AS 39.20.180.
(i) In this section,
(1) "algorithm" includes a procedure
incorporating machine learning or other artificial
intelligence techniques;
(2) "artificial intelligence" means systems
capable of
(A) perceiving an environment through data
acquisition and processing and interpreting the
derived information to take an action or to imitate
intelligent behavior given a specific goal; and
(B) learning and adapting behavior by
analyzing how the environment is affected by past
actions;
(3) "automated decision system" means an
algorithm that uses data-based analytics to make or
support governmental decisions, judgments, or
conclusions;
(4) "deepfake" means audio or visual
content generated or manipulated by artificial
intelligence that falsely appears to be authentic or
truthful and that features a depiction of an
individual appearing to say or do things the
individual did not say or do, without the individual's
consent."
Renumber the following bill section accordingly.
Page 4, line 17, following "of":
Insert "secs. 2 and 3 of"
Page 4, following line 17:
Insert new bill sections to read:
"* Sec. 7. Section 5 of this Act is repealed
February 2, 2029.
* Sec. 8. Section 5 of this Act takes effect
January 1, 2025."
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
2:31:55 PM
REPRESENTATIVE GROH explained that Amendment 6 would create a state
task force composed of members of the legislature; experts in
law enforcement and constitutional and legal rights; and
academic faculty members of the University of Alaska. The task
force would be a way to understand this giant and multi-faceted
topic. He noted that the definitions included in the proposed
amendment differed from those adopted by the committee in
today's hearing.
2:35:22 PM
REPRESENTATIVE CARPENTER questioned the costs associated with
Amendment 6, noting that the bill was not scheduled to go to the
House Finance Committee at this time.
CHAIR VANCE asked whether Representative Groh had goals in mind
for the four-year task force.
REPRESENTATIVE GROH directed attention to page 3 of Amendment 6,
which outlined the proposed goals. He reiterated that AI could
both help and hurt [the state].
REPRESENTATIVE CARPENTER asked whether similar efforts were
being conducted by better-funded and better-equipped
organizations. He asked whether the committee could piggyback
on those efforts instead of spending the state's own money.
REPRESENTATIVE GROH stressed the nuance of Alaska-specific
applications.
2:39:19 PM
REPRESENTATIVE C. JOHNSON stated his belief that in four years,
AI won't look the same as it does today; consequently, the task
force could last forever. He added that he was reluctant to
invest money in something that's changing so fast.
REPRESENTATIVE GROH said he would be happy to shorten the
timeframe because the issue is critical to understand.
2:41:38 PM
REPRESENTATIVE SUMNER wondered whether the task force needs to
be established in statute or whether legislative committees can
do their own investigative work and report back to the
legislature.
REPRESENTATIVE CARPENTER recommended leaning on the computer
experts within the executive branch and creating intent language
to require a legislative report on technology security, as well
as recommendations on AI. He added that law enforcement could
be engaged as well to create a whole-of-government approach.
2:43:17 PM
A roll call vote was taken. Representatives Sumner, Gray, and
Groh voted in favor of Amendment 6. Representatives Allard,
Carpenter, C. Johnson, and Vance voted against it. Therefore,
Amendment 6 failed by a vote of 3-4.
CHAIR VANCE noted that Amendment 7 would not be offered.
2:44:01 PM
REPRESENTATIVE GRAY moved to adopt Amendment 8 to Version U,
labeled 33-LS1272\U.12, Walsh, 3/27/24, which read:
Page 1, line 2:
Delete "and"
Following "communications":
Insert "; and relating to reproductions of voice
or likeness using artificial intelligence"
Page 4, following line 12:
Insert a new bill section to read:
"* Sec. 5. AS 45.50 is amended by adding a new
section to read:
Sec. 45.50.905. Reproduction of voice or likeness
using artificial intelligence. (a) A person may not
knowingly, without the authorization of an individual
or, for an individual who is an unemancipated child
under 18 years of age, without the authorization of
the individual's parent or guardian,
(1) use artificial intelligence to create
or alter the voice or likeness of the individual; and
(2) make that voice or likeness
commercially available to the public.
(b) An individual whose voice or likeness is
depicted in violation of this section may bring an
action in the superior court for an injunction to
prohibit dissemination of the voice or likeness and to
recover damages.
(c) In this section,
(1) "artificial intelligence" has the
meaning given in AS 15.80.009;
(2) "likeness" means an image or video that
is readily identifiable as a particular individual;
(3) "voice" means a sound in a medium that
is readily identifiable as and attributable to a
particular individual, regardless of whether the sound
contains the actual voice or a simulation of the voice
of the individual."
Renumber the following bill section accordingly.
REPRESENTATIVE CARPENTER objected for the purpose of discussion.
2:44:06 PM
REPRESENTATIVE GRAY explained that Amendment 8 would create a
section similar to the Ensuring Likeness Voice and Image
Security (ELVIS) Act that was passed in Tennessee. The proposed
amendment would protect artists from having their voice or
likeness used for commercial purposes without their prior
authorization and allow them to sue for damages if the law is
violated.
2:45:32 PM
REPRESENTATIVE ALLARD sought to clarify whether [this type of
intellectual property] was already protected in existing state
or federal law.
MR. WALSH shared his understanding that there is no statutory
protection in current state law for the kind of media that
Amendment 8 would apply to. He noted that there is federal
copyright law that protects owners from unauthorized use of
their copyrighted works, in addition to a common law right of
publicity that is fairly undeveloped in Alaska courts but could
be applied to the media covered by Amendment 8.
2:47:43 PM
REPRESENTATIVE C. JOHNSON asked whether Amendment 8 would extend
existing copyright law.
MR. WALSH said the proposed amendment would not modify existing
copyright law, nor would it grant property rights under tenancy
law.
2:52:13 PM
CHAIR VANCE asked whether Amendment 8 would prohibit the use of
an AI generated image of President Trump or his voice for
commercial purposes.
MR. WALSH answered yes, if the use met the elements of the
proposed statute and there was no prior authorization from
President Trump.
REPRESENTATIVE ALLARD expressed concern that she could be sued
for creating a meme of, for example, Representative Gray.
MR. WALSH said under that hypothetical, Representative Gray
could only sue if his voice or likeness was made commercially
available to the public. He added that the inclusion of
"commercially available" means that there must be monetary gain.
2:54:21 PM
CHAIR VANCE posed a hypothetical scenario involving talk radio,
which uses the voices of different politicians to build an
introduction, and asked whether that would be considered a
violation of the proposed amendment.
MR. WALSH said it's unclear whether the voice or likeness in the
example was being made commercially available. He stated that
it would be open to interpretation and up to the courts to
decide.
2:55:47 PM
REPRESENTATIVE CARPENTER posed a hypothetical scenario in which
a deceased artist's voice was being reconstituted and used for
economic gain. He asked whether Amendment 8 would cover a
deceased individual.
MR. WALSH said, unlike the ELVIS Act, the proposed amendment
would not create a property right that survives the death of the
individual.
2:58:59 PM
REPRESENTATIVE GRAY explained that Amendment 8 was intentionally
crafted to be narrower than the ELVIS Act to protect living
artists from this type of use.
intent was to protect people from having their likeness used to
make money without their permission.
REPRESENTATIVE CARPENTER maintained his objection.
3:00:09 PM
A roll call vote was taken. Representatives Groh, Allard, and
Gray voted in favor of Amendment 8. Representatives Carpenter,
C. Johnson, Sumner, and Vance voted against it. Therefore,
Amendment 8 failed by a vote of 3-4.
CHAIR VANCE announced that CSHB 358, Version U, would be held
over.
3:02:24 PM
ADJOURNMENT
There being no further business before the committee, the House
Judiciary Standing Committee meeting was adjourned at 3:02 p.m.
| Document Name | Date/Time | Subjects |
|---|---|---|
| HB 227 - Amendment #3 (B.6) by Rep. Gray.pdf | HJUD 3/27/2024 1:00:00 PM | HB 227 |
| HB 227 - 03-27 Leg. Legal Memo.pdf | HJUD 3/27/2024 1:00:00 PM | HB 227 |
| HB 358 - Amendment #1 (U.10) by Rep. Vance.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #2 (U.4) by Rep. Vance.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #3 (U.6) by Rep. Vance.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #4 (U.8) by Rep. Vance.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #5 (U.3) by Rep. Gray.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #6 (U.1) by Rep. Groh.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #7 (U.5) by Rep. Vance.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 358 - Amendment #8 (U.12) by Rep. Gray.pdf | HJUD 3/27/2024 1:00:00 PM | HB 358 |
| HB 227 - Conn Letter of Opposition.pdf | HJUD 3/27/2024 1:00:00 PM | HB 227 |