04/29/2025 03:30 PM Senate STATE AFFAIRS

| Audio | Topic |
|---|---|
| Start | |
| SB 26 | |
| SB 102 | |
| SB 37 | |
| SB 33 | |
| SB 2 | |
| Adjourn | |

+ teleconferenced
= bill was previously heard/scheduled

| += | SB 37 | TELECONFERENCED |
| *+ | SB 2 | TELECONFERENCED |
| *+ | SB 33 | TELECONFERENCED |
| += | SB 26 | TELECONFERENCED |
| += | SB 102 | TELECONFERENCED |
ALASKA STATE LEGISLATURE
SENATE STATE AFFAIRS STANDING COMMITTEE
April 29, 2025
3:31 p.m.
MEMBERS PRESENT
Senator Scott Kawasaki, Chair
Senator Jesse Bjorkman, Vice Chair
Senator Elvi Gray-Jackson
MEMBERS ABSENT
Senator Bill Wielechowski
Senator Robert Yundt
COMMITTEE CALENDAR
SENATE BILL NO. 26
"An Act petitioning the United States Department of
Transportation to change the time zones of Alaska; exempting the
state from daylight saving time; and providing for an effective
date."
- MOVED SB 26 OUT OF COMMITTEE
SENATE BILL NO. 102
"An Act exempting the state from daylight saving time; and
providing for an effective date."
- MOVED SB 102 OUT OF COMMITTEE
SENATE BILL NO. 37
"An Act relating to the Executive Budget Act; relating to
strategic plans, mission statements, performance plans, and
financial plans for executive branch agencies; and providing for
an effective date."
- MOVED SB 37 OUT OF COMMITTEE
SENATE BILL NO. 33
"An Act relating to defamation claims based on the use of
synthetic media; relating to the use of synthetic media in
electioneering communications; and providing for an effective
date."
- HEARD & HELD
SENATE BILL NO. 2
"An Act relating to disclosure of election-related deepfakes;
relating to use of artificial intelligence by state agencies;
and relating to transfer of data about individuals between state
agencies."
- HEARD & HELD
PREVIOUS COMMITTEE ACTION
BILL: SB 26
SHORT TITLE: ELIMINATE DAYLIGHT SAVING TIME
SPONSOR(s): SENATOR(s) MERRICK
01/22/25 (S) PREFILE RELEASED 1/10/25
01/22/25 (S) READ THE FIRST TIME - REFERRALS
01/22/25 (S) CRA, STA
03/11/25 (S) CRA AT 1:30 PM BELTZ 105 (TSBldg)
03/11/25 (S) <Bill Hearing Canceled>
03/18/25 (S) CRA AT 1:30 PM BELTZ 105 (TSBldg)
03/18/25 (S) Heard & Held
03/18/25 (S) MINUTE(CRA)
03/25/25 (S) CRA AT 1:30 PM BELTZ 105 (TSBldg)
03/25/25 (S) Moved SB 26 Out of Committee
03/25/25 (S) MINUTE(CRA)
03/26/25 (S) CRA RPT 3DP 1NR
03/26/25 (S) DP: MERRICK, DUNBAR, GRAY-JACKSON
03/26/25 (S) NR: YUNDT
04/22/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
04/22/25 (S) Heard & Held
04/22/25 (S) MINUTE(STA)
04/29/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
BILL: SB 102
SHORT TITLE: ELIMINATE DAYLIGHT SAVING TIME
SPONSOR(s): STATE AFFAIRS
02/19/25 (S) READ THE FIRST TIME - REFERRALS
02/19/25 (S) CRA, STA
03/11/25 (S) CRA AT 1:30 PM BELTZ 105 (TSBldg)
03/11/25 (S) Heard & Held
03/11/25 (S) MINUTE(CRA)
03/18/25 (S) CRA AT 1:30 PM BELTZ 105 (TSBldg)
03/18/25 (S) Heard & Held
03/18/25 (S) MINUTE(CRA)
03/25/25 (S) CRA AT 1:30 PM BELTZ 105 (TSBldg)
03/25/25 (S) Moved SB 102 Out of Committee
03/25/25 (S) MINUTE(CRA)
03/26/25 (S) CRA RPT 3DP 1NR
03/26/25 (S) DP: MERRICK, DUNBAR, GRAY-JACKSON
03/26/25 (S) NR: YUNDT
04/22/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
04/22/25 (S) Heard & Held
04/22/25 (S) MINUTE(STA)
04/29/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
BILL: SB 37
SHORT TITLE: STRATEGIC PLANS FOR STATE AGENCIES
SPONSOR(s): SENATOR(s) KAUFMAN
01/22/25 (S) PREFILE RELEASED 1/10/25
01/22/25 (S) READ THE FIRST TIME - REFERRALS
01/22/25 (S) STA, FIN
03/18/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
03/18/25 (S) Heard & Held
03/18/25 (S) MINUTE(STA)
04/29/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
BILL: SB 33
SHORT TITLE: SYNTHETIC MEDIA: LIABILITY; ELECTIONS
SPONSOR(s): SENATOR(s) CRONK
01/22/25 (S) PREFILE RELEASED 1/10/25
01/22/25 (S) READ THE FIRST TIME - REFERRALS
01/22/25 (S) STA, JUD
04/29/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
BILL: SB 2
SHORT TITLE: AI, DEEPFAKES, CYBERSECURITY, DATA XFERS
SPONSOR(s): SENATOR(s) HUGHES
01/22/25 (S) PREFILE RELEASED 1/10/25
01/22/25 (S) READ THE FIRST TIME - REFERRALS
01/22/25 (S) STA, JUD
04/29/25 (S) STA AT 3:30 PM BELTZ 105 (TSBldg)
WITNESS REGISTER
SENATOR KELLY MERRICK, District L
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Sponsor of SB 26.
KERRY CROCKER, Staff
Senator Kelly Merrick
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Presented a brief recap of SB 26.
JOE HAYES, Staff
Senator Scott Kawasaki
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Presented a brief recap of SB 102 on behalf
of the sponsor.
SENATOR JAMES KAUFMAN
District F
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Sponsor of SB 37.
MIKE COONS, representing self
Wasilla, Alaska
POSITION STATEMENT: Testified in support of SB 37.
SENATOR MIKE CRONK
District R
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Sponsor of SB 33.
PAUL MENKE, Staff
Senator Mike Cronk
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Provided the sectional analysis for SB 33.
SENATOR SHELLEY HUGHES
District M
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Sponsor of SB 2.
EIELIA PRESTON, Staff
Senator Shelley Hughes
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Co-presented the slideshow for SB 2.
SPENCE PURNELL, Resident Senior Fellow
Technology and Innovation
R Street Institute
Tampa, Florida
POSITION STATEMENT: Testified by invitation on SB 2.
DANIEL CASTRO, Vice President
Information Technology and Innovation Foundation
Washington, D.C.
POSITION STATEMENT: Testified by invitation on SB 2.
NATE PERSILY, Professor
Stanford Law School
Stanford, California
POSITION STATEMENT: Testified by invitation on SB 2.
MIKE COONS, representing self
Wasilla, Alaska
POSITION STATEMENT: Testified in support of SB 2.
ACTION NARRATIVE
3:31:52 PM
CHAIR KAWASAKI called the Senate State Affairs Standing
Committee meeting to order at 3:31 p.m. Present at the call to
order were Senators Bjorkman, Gray-Jackson, and Chair Kawasaki.
SB 26-ELIMINATE DAYLIGHT SAVING TIME
3:33:23 PM
CHAIR KAWASAKI announced the consideration of SENATE BILL NO. 26
"An Act petitioning the United States Department of
Transportation to change the time zones of Alaska; exempting the
state from daylight saving time; and providing for an effective
date."
3:33:41 PM
CHAIR KAWASAKI opened public testimony on SB 26; finding none,
he closed public testimony.
3:34:21 PM
KERRY CROCKER, Staff, Senator Kelly Merrick, Alaska State
Legislature, Juneau, Alaska, presented a brief recap of SB 26.
He said the bill would permanently exempt Alaska from daylight
saving time, keeping the state on standard time year-round and
eliminating clock changes. SB 26 also seeks federal approval to
move Alaska to Pacific Standard Time, aligning the state with
Seattle.
3:35:16 PM
SENATOR BJORKMAN asked whether the exemption from daylight
saving time and the shift to Pacific Time would occur
simultaneously and how the two changes are connected.
3:35:31 PM
MR. CROCKER replied that if SB 26 passed, the state would
petition the U.S. Department of Transportation (DOT) to review
the time zone change request, hold public meetings, and reach a
decision within a year based on the convenience of commerce.
3:36:25 PM
SENATOR BJORKMAN asked whether Alaska would change to standard
time while waiting for DOT to decide.
3:36:35 PM
MR. CROCKER responded no, the State of Alaska would stay on
current time until DOT decided.
3:37:03 PM
CHAIR KAWASAKI solicited the will of the committee.
3:37:04 PM
SENATOR GRAY-JACKSON moved to report SB 26, work order 34-
LS0267\A, from committee with individual recommendations and
attached fiscal note(s).
3:37:24 PM
CHAIR KAWASAKI found no objection and SB 26 was reported from
the Senate State Affairs Standing Committee.
SB 102-ELIMINATE DAYLIGHT SAVING TIME
3:37:50 PM
CHAIR KAWASAKI announced the consideration of SENATE BILL NO.
102 "An Act exempting the state from daylight saving time; and
providing for an effective date."
3:38:11 PM
CHAIR KAWASAKI opened public testimony on SB 102; finding none,
he closed public testimony.
3:38:39 PM
JOE HAYES, Staff, Senator Scott Kawasaki, Alaska State
Legislature, Juneau, Alaska, presented a brief recap of SB 102
on behalf of the sponsor. He said the bill would keep Alaska on
standard time year-round, like Hawaii and Arizona. He said the
sponsor believes SB 102 is a good move for Alaska.
3:39:23 PM
SENATOR BJORKMAN stated that he opposes ending daylight saving
time, arguing it provides valuable evening daylight for outdoor
activities, sports, and cultural events. He stressed that losing
an hour of light after school or work would significantly limit
opportunities, especially in spring and fall. He also warned of
unintended consequences, such as financial staff in Alaska
having to start work even earlier to align with New York
markets. He said for these reasons he cannot support eliminating
daylight saving time.
3:42:48 PM
SENATOR GRAY-JACKSON stated that she supports SB 102 because her
constituents support the bill.
3:43:35 PM
CHAIR KAWASAKI solicited the will of the committee.
3:43:39 PM
SENATOR GRAY-JACKSON moved to report SB 102, work order 34-
LS0625\A, from committee with individual recommendations and
attached fiscal note(s).
3:43:58 PM
CHAIR KAWASAKI found no objection and SB 102 was reported from
the Senate State Affairs Standing Committee.
SB 37-STRATEGIC PLANS FOR STATE AGENCIES
3:44:17 PM
CHAIR KAWASAKI announced the consideration of SENATE BILL NO. 37
"An Act relating to the Executive Budget Act; relating to
strategic plans, mission statements, performance plans, and
financial plans for executive branch agencies; and providing for
an effective date."
3:44:39 PM
SENATOR JAMES KAUFMAN, District F, Alaska State Legislature,
Juneau, Alaska, sponsor of SB 37 presented a brief recap and
stated that SB 37 would place the state on a four-year strategic
operating plan, updated biennially, with annual funding plans
linked to execution, reforming the Executive Budget Act to
strengthen management and budgeting.
3:45:37 PM
CHAIR KAWASAKI opened public testimony on SB 37.
3:45:59 PM
MIKE COONS, representing self, Wasilla, Alaska, testified in
support of SB 37 and stated that the bill would make Alaska's
budget a performance- and merit-based system. Departments would
set measurable goals, with funding tied to results. Those who
meet or exceed goals would be recognized, while those who fall
short would face oversight. This ensures accountability, reduces
waste, and makes state spending more effective for Alaskans.
3:47:36 PM
CHAIR KAWASAKI closed public testimony on SB 37.
3:48:05 PM
CHAIR KAWASAKI solicited the will of the committee.
3:48:08 PM
SENATOR BJORKMAN moved to report SB 37, work order 34-LS0346\A,
from committee with individual recommendations and attached
fiscal note(s).
3:48:28 PM
CHAIR KAWASAKI found no objection and SB 37 was reported from
the Senate State Affairs Standing Committee.
3:48:42 PM
At ease.
SB 33-SYNTHETIC MEDIA: LIABILITY; ELECTIONS
3:49:52 PM
CHAIR KAWASAKI reconvened the meeting and announced the
consideration of SENATE BILL NO. 33 "An Act relating to
defamation claims based on the use of synthetic media; relating
to the use of synthetic media in electioneering communications;
and providing for an effective date."
3:50:23 PM
SENATOR MIKE CRONK, District R, Alaska State Legislature,
Juneau, Alaska, sponsor of SB 33 said the bill is a
reintroduction of last year's House Bill 58. He read the sponsor
statement:
[Original punctuation provided.]
"An Act relating to defamation claims based on the use
of synthetic media; relating to the use of synthetic
media in electioneering communications; and providing
for an effective date." The introduction of Artificial
Intelligence and synthetic media into modern mass
communication systems is a new topic that is ripe for
debate. Synthetic media production software is
becoming more advanced by the day and is reaching
exciting, but dangerous capabilities. It is now
possible for the voices and images of public figures
to be manipulated to depict a real person with uncanny
resemblance. Without a discerning eye, manipulated
images and audios can often be mistaken as a genuine
source. SB 33 is written as simply as possible to
address the use of synthetic media to create false
identities and cause harm. Without statutory
protections, individuals and organizations are
susceptible to wrongful harm and reputation damage. SB
33 establishes those safeguards and ensures that your
voice and image will only be yours and safe from harm.
3:51:46 PM
PAUL MENKE, Staff, Senator Mike Cronk, Alaska State Legislature,
Juneau, Alaska, provided the sectional analysis for SB 33:
[Original punctuation provided.]
Sectional Analysis for SB 33
"An Act relating to defamation claims based on the use
of synthetic media; relating to the use of synthetic
media in electioneering communications; and providing
for an effective date."
Section 1
Amends AS 09.65 by adding a new section, AS 09.65.360,
which establishes that defamation based on the use of
a deepfake is a claim for defamation per se, meaning
it is presumed to be damaging to a person's reputation
without any additional proof of harm.
3:52:26 PM
MR. MENKE continued with the sectional analysis for SB 33:
Section 2:
Amends AS 15.80 to include a new section, AS 15.80.009
(Synthetic media in electioneering communications) to
prohibit a person from knowingly using synthetic media
in campaign material. It provides that an individual
who is harmed by such behavior may bring an action
recover damages, attorney fees, costs, or an
injunction against the person who created,
disseminated, or removed a disclosure. It does allow
the use of altered material if it is properly
disclosed as material that has been manipulated.
Section 3:
Provides for an immediate effective date under AS
01.10.070(c)
3:53:34 PM
CHAIR KAWASAKI said he is familiar with last year's bill, noting
that the Broadcasters Association requested protection from
liability for publishing information they believed was
legitimate though later found false. He asked for clarification
on Section 2(e).
3:54:16 PM
MR. MENKE stated that Section 2(e) on page 3, lines 1-6 reads:
[Original punctuation provided.]
An interactive computer service, Internet service
provider, cloud service provider, telecommunications
network, or radio or television broadcaster, including
a cable or satellite television operator, programmer,
or producer, is not liable under this section for
hosting, publishing, or distributing an electioneering
communication provided by another person. This
subsection does not prevent an individual from
bringing an action under (b)(3) of this section for
removing a disclosure statement.
MR. MENKE replied that the only instance in which one of these
entities would be liable is if the entity physically or
electronically removed the disclosure themselves.
3:55:48 PM
CHAIR KAWASAKI opened public testimony on SB 33; he found none
and kept public testimony open.
3:56:29 PM
CHAIR KAWASAKI held SB 33 in committee.
SB 2-AI, DEEPFAKES, CYBERSECURITY, DATA XFERS
3:56:38 PM
CHAIR KAWASAKI announced the consideration of SENATE BILL NO. 2
"An Act relating to disclosure of election-related deepfakes;
relating to use of artificial intelligence by state agencies;
and relating to transfer of data about individuals between state
agencies."
3:57:17 PM
SENATOR SHELLEY HUGHES, District M, Alaska State Legislature,
Juneau, Alaska, sponsor of SB 2 said state agencies need to use
AI responsibly, protect Alaskans' data and personal liberties,
and ensure fairness and transparency. She said while AI can help
address workforce and budget challenges by streamlining tasks,
AI use must balance innovation with safeguards against harm. She
shared an experience serving on the National Conference of State
Legislators Task Force on AI. She stressed the responsibility of
state agencies to apply AI appropriately and transparently
without hindering private sector innovation.
4:00:11 PM
SENATOR HUGHES moved to slide 2, and defined the different types
of A.I.:
[Original punctuation provided.]
Defining A.I.
ARTIFICIAL INTELLIGENCE: falls into two primary
categories:
GENERATIVE: Machine-based system designed to operate
with varying levels of autonomy that may exhibit
adaptiveness after deployment and that, for explicit
or implicit objectives, infers how to generate outputs
from input the system receives.
RULES-BASED: Computational program or algorithm
designed to process information in a logical way that
does not produce inferential output beyond its
original programming and query parameters.
4:00:33 PM
SENATOR HUGHES moved to slide 3, Why Now Why Here, and discussed
the following points:
[Original punctuation provided.]
WHY NOW? A.I. is here. It is evolving at lightning
speed. We cannot stop it. We cannot ignore it.
"A.I. is a tool and in itself is not inherently evil.
Our job is to protect against bad actors and harness
A.I. for good the very best we can."-Senator Shelley
Hughes
WHY HERE? Congress is unlikely to unite on parameters
and best practices anytime soon. State legislatures
are more nimble and ready to mitigate the harm and
bridle the benefits of A.I.
4:01:11 PM
SENATOR HUGHES moved to slide 4, Why this Focus, and discussed
the following points:
[Original punctuation provided.]
1. State Agency Use of A.I.
a) Targeting private sector development and
deployment would stifle innovation and be a fool's
errand for a state with a small population.
b) Setting the parameters for state agency use is
necessary
i. to safeguard the public
ii. to ensure appropriate deployment that will
offer efficiencies and solutions for the
workplace
2. Political Deepfakes
a) No time to waste. Elections occur every year.
b) In general, lack of trust chaos.
SENATOR HUGHES said when SB 2 was first drafted, it was the only
legislation addressing political deepfakes. Since other
legislation now covers that issue, the committee may want to
remove the political deepfake section and allow it to be handled
separately to ensure proper disclosure and accurate public
information.
4:01:53 PM
SENATOR HUGHES moved to slide 5, A Good Starting Point, and
discussed the following points:
[Original punctuation provided.]
AGREEING ON AI PRINCIPLES
• Differentiate between tool and actor
-Protect against bad actors
-Support innovation for beneficial uses
• Aim for tech neutrality
• Assign human oversight and responsibility
• Maintain transparency
• Avoid harm/injury
• Respect sensitive personal data privacy and
security
• Embrace data hygiene
• Avoid creating/reinforcing unfair bias
• Uphold laws and protect individual rights
4:03:19 PM
EIELIA PRESTON, Staff, Senator Shelley Hughes, Alaska State
Legislature, Juneau, Alaska, co-presented the slideshow for SB 2
and moved to slide 6, What it Does-High Level:
[Original punctuation provided.]
1. Adds disclosure statement requirements for
political deepfake communications.
2. Adds new sections regarding state agency use of
artificial intelligence and individuals' data.
3. Adds section to allow persons who suffers harm to
bring civil action to superior court.
4:04:03 PM
MS. PRESTON moved to slide 7, What it Does-a bit in the weeds:
[Original punctuation provided.]
Requires biennial inventory and report of AI systems
being used by state agencies published on DOA website.
1.Name and vendor of system
2.General capabilities and uses
3.Most recent impact assessment completed date
Requires biennial impact assessments to determine
efficacy and continued use of systems.
4:04:35 PM
MS. PRESTON moved to slide 8, What it Does-a bit in the weeds:
[Original punctuation provided.]
Impact Assessment
1.System efficacy
2.Human oversight
3.Accountability mechanisms
4.Decision appeals process
5.Benefits, liability, and risks to state
6.Effects on liberty, finances, livelihood, and
privacy interests of individuals, including effects
from geolocation data use.
7.Unlawful discrimination or disparate impact on
individual or group
8.Policies and procedures governing process of A.I.
system use for consequential decision-making.
4:05:07 PM
MS. PRESTON moved to slide 9, What it Does-a bit in the weeds:
[Original punctuation provided.]
Requires state agencies to
1.Notify individuals who may be legally or
significantly affected
2.Obtain individual's consent before soliciting or
acquiring sensitive personal data or sharing data with
another state agency*
3.Provide appeals process including manual human
review
4.Inform and acquire consent if AI used in hiring
interview video
5.When outsourced, multi-factor authentication must
secure system and stored data
MS. PRESTON said these matters require transparency, such as the
Department of Public Safety sharing legally required information
with the court system.
4:05:49 PM
SENATOR HUGHES commented that the asterisk on the slides
explains there is an exemption for the Department of Public
Safety.
4:05:55 PM
MS. PRESTON moved to slide 10, What it Does-a bit in the weeds:
[Original punctuation provided.]
Prohibits* state agencies from using.
1.Biometric identification e.g., facial recognition
2.Emotion recognition
3.Cognitive behavioral manipulation of individuals or
groups
4.Social scoring
5.AI systems that use data hosted in hostile nations
*With provisional exceptions for Department of Safety
4:06:24 PM
SENATOR HUGHES recommended an amendment to reference the U.S.
Code for defining foreign adversary nations. She said this would
avoid updates and provide clarity, since views on hostile
nations may differ.
4:06:53 PM
MS. PRESTON moved to slide 11 and showed examples of other
countries with issues from deepfakes during an election.
4:07:17 PM
MS. PRESTON moved to slide 12, and read the following quote:
[Original punctuation provided.]
"The fact-checkers trying to hold the line against
disinformation on social media in Slovakia say their
experience shows AI is already advanced enough to
disrupt elections, while they lack the tools to fight
back." (Morgan Meaker, The Wired, 2023)
4:07:40 PM
SENATOR HUGHES noted that while deepfakes disrupted elections
abroad in 2024, U.S. research found deepfakes spread
misinformation yet did not change outcomes. Still, 52 percent of
Americans struggle to distinguish fact from fiction in election
news, and studies show 25 to 50 percent of deepfakes aim to
mislead. She said growing awareness has helped people spot
fakes, but disclosure, enforcement, penalties, and injunctive
relief remain important parts of the proposal.
4:09:49 PM
CHAIR KAWASAKI opined that 52 percent is a low number of people
that struggle to identify misinformation. He referenced slide 13
stating that with AI filters everything would need a content
disclosure requirement. He asked for her views on how disclosure
laws should apply to deepfakes.
4:10:57 PM
SENATOR HUGHES reiterated the definition of a deepfake:
It would have to be something that creates something
false that would appear to a reasonable person to
depict a real individual saying or doing something
that did not actually occur and provides a
fundamentally different understanding or impression of
an individual's appearance, conduct, or spoken words.
SENATOR HUGHES replied that AI was also used positively in
the last election, such as translating candidate speeches into
other languages. She wanted to keep SB 2 narrowly focused on
deceptive uses, like making someone appear to say or do
something that never happened.
4:12:11 PM
CHAIR KAWASAKI announced invited testimony on SB 2.
4:13:17 PM
SPENCE PURNELL, Resident Senior Fellow, Technology and
Innovation, R Street Institute, Tampa, Florida, testified by
invitation on SB 2. He agreed that deepfakes are a real problem
and supports a narrow definition to avoid overreach, favoring
disclosure over bans. He stressed government roles beyond
regulation, such as education and awareness. He endorsed SB 2 as
a well-written bill that sets responsible boundaries without
discouraging beneficial AI use. He noted the importance of
careful regulation given the technology's early stage.
4:15:51 PM
CHAIR KAWASAKI stated his belief that disclosure is effective,
though it must be done carefully. He said if everything requires
a disclosure, people may start ignoring them altogether. He
asked for an explanation on how other states have set guidelines
for the use of artificial intelligence, particularly around
disclosure.
4:16:22 PM
MR. PURNELL warned that to avoid liability, many will add
disclosure statements to political communications, which could
lessen the impact. While not a bad outcome, he stressed that the
need is for digital literacy and civic education, enabling
citizens to critically evaluate information. He noted that AI is
just the first of many emerging technologies, and long-term
resilience depends on fostering cultural change and critical
thinking rather than relying solely on policy or technology.
4:19:06 PM
DANIEL CASTRO, Vice President, Information Technology and
Innovation Foundation, Washington, D.C., testified by invitation
on SB 2 and emphasized that generative AI offers significant
benefits while posing risks, particularly with deepfakes in
elections. He highlighted the need for narrowly tailored state
policies that focus on harmful manipulation rather than
legitimate AI use. Key principles include meaningful disclosure,
timely enforcement, accountability for bad actors, and
preserving beneficial uses like translation and accessibility.
He stressed that government use of AI should be transparent and
accountable, and that policies should protect election integrity
without stifling innovation.
4:22:56 PM
CHAIR KAWASAKI shared an example of Alaska's overly broad cell
phone law that unintentionally restricted common screen devices
and had to be corrected the next year. He asked whether other
states have
similarly overregulated technology and later had to roll back or
amend the laws.
4:24:02 PM
MR. CASTRO answered yes and said some states passed AI laws with
poor definitions that overreached, creating ineffective labeling
requirements. He said over-labeling can dilute trust signals,
and such rules only bind legitimate actors, not foreign bad
actors spreading misinformation. He cautioned against imbalance
and urged for technology-neutral policies focused on deceptive
media in elections rather than AI specifically.
4:26:21 PM
NATE PERSILY, Professor, Stanford Law School, Stanford,
California, testified by invitation on SB 2 and stated that AI
amplifies the abilities of all actors, whether election
officials, candidates, or foreign adversaries, to pursue their
goals. While
Americans are especially pessimistic about AI's effect on
democracy, evidence from recent elections shows little actual
use of deepfakes to sway voters. He said the greater danger is
eroding trust in authentic media, as people become better at
spotting falsehoods and worse at recognizing truth. This
distrust could harm democracy more than the deepfakes
themselves. He stated that some states have banned deepfakes,
while many others, including bills like SB 2, are under
consideration and focus on disclosure. Disclosure is viewed as a
modest yet important first step, giving voters tools to
understand what content is AI-generated without overregulating
rapidly evolving technology.
4:31:23 PM
CHAIR KAWASAKI asked how the public can be educated to better
discern truth from misinformation, especially when many people
no longer trust what they see in the news or online and can
easily be misled.
4:33:05 PM
MR. PERSILY responded that social media has replaced
authoritative news sources, creating an environment where
misinformation spreads easily. While empowering users with tools
to identify synthetic content is a step forward, lasting
solutions require building widespread critical thinking skills.
He said, however, that repeatedly warning people not to trust
online content risks leading people to distrust everything, even
information that is accurate.
4:35:37 PM
CHAIR KAWASAKI asked from a legal perspective, whether penalties
for misinformation or AI misuse can serve as an effective
deterrent.
4:36:03 PM
MR. PERSILY replied that a blanket ban on AI in communications
would be unconstitutional as overly broad under the First
Amendment. However, disclosure requirements are a recognized
constitutional safe harbor. Courts, including in the Citizens
United case, have upheld strong disclosure rules. SB 2 follows
that model, treating failure to disclose AI use, especially when
intended to manipulate images, similarly to other regulatory
contexts where nondisclosure can trigger enforcement.
4:37:55 PM
CHAIR KAWASAKI requested an explanation on whether libel laws
have been used to address cases where AI makes it appear that
someone said something they did not.
4:38:21 PM
MR. PERSILY replied that libel laws have seen limited
application in AI contexts, primarily with non-consensual
intimate imagery, which poses significant risks, especially for
young people. For public figures, libel requires proving actual
malice, making it harder to pursue cases involving AI deepfakes
of officials. He said while libel shows some promise, disclosure
requirements are often a more practical regulatory tool for
election-related AI content.
4:40:16 PM
CHAIR KAWASAKI opened public testimony on SB 2.
4:40:30 PM
MIKE COONS, representing self, Wasilla, Alaska, testified in
support of SB 2 and shared personal challenges adapting to
technology and noted that AI is far more advanced today. SB 2
provides initial protection for responsible government use of
AI. He said while AI can accelerate information processing and
improve accuracy, the final product must rely on human judgment
and innovation. He named concerns including overreliance
on AI, the potential for deepfakes to mislead the public, and
the risk that students may lose critical skills to discern truth
from misinformation. He opined that human oversight and
responsibility are essential to ensure AI supports rather than
undermines decision-making and public trust.
4:43:12 PM
SENATOR HUGHES stated that SB 2 is technology-neutral, covering
AI and other forms of manipulation like Photoshop for deepfakes,
with disclosure required for any altered content. She said state
agencies using AI, especially in consequential decisions
affecting individuals, should follow clear parameters, obtain
consent, and ensure transparency. She said the fiscal note's
high costs are unnecessary, as responsible AI use should
streamline work rather than require additional staff.
emphasized that AI use must remain transparent, fair, and
practical, with common-sense guidelines rather than excessive
regulation. Properly implemented, AI offers long-term benefits
and potential savings for state operations.
4:46:57 PM
CHAIR KAWASAKI stated that the fiscal note includes staffing and
resources totaling $5.6 million for operations and $2.5 million
for contractual services. He said the Finance Committee would
need to review SB 2 and then it would continue to Judiciary as
the second committee of referral. He said he would work with the
bill sponsor on accelerating SB 2.
4:47:52 PM
SENATOR GRAY-JACKSON stressed the urgency of addressing AI,
noting it moves too quickly for a task force approach, and
expressed willingness to help reduce the fiscal note to move the
bill forward.
4:48:27 PM
CHAIR KAWASAKI kept public testimony open for SB 2.
4:49:05 PM
CHAIR KAWASAKI held SB 2 in committee.
4:50:19 PM
There being no further business to come before the committee,
Chair Kawasaki adjourned the Senate State Affairs Standing
Committee meeting at 4:50 p.m.
| Document Name | Date/Time | Subjects |
|---|---|---|
| SB0002A.pdf | SSTA 4/29/2025 3:30:00 PM | SB 2 |
| SB 2 AI Sponsor Statement.pdf | SSTA 4/29/2025 3:30:00 PM | SB 2 |
| SB 2 Sectional Analysis.pdf | SSTA 4/29/2025 3:30:00 PM | SB 2 |
| SB 33 version A.pdf | SSTA 4/29/2025 3:30:00 PM | SB 33 |
| SB 33 Sponsor Statement version A.pdf | SSTA 4/29/2025 3:30:00 PM | SB 33 |
| SB 33 Sectional Analysis version A.pdf | SSTA 4/29/2025 3:30:00 PM | SB 33 |
| 2025 - Testimony of Daniel Castro - AK AI Deepfakes.pdf | SSTA 4/29/2025 3:30:00 PM | SB 2 |
| SB 2 AI Presentation S STA.pdf | SSTA 4/29/2025 3:30:00 PM | SB 2 |