Legislature (2023 - 2024) BELTZ 105 (TSBldg)
04/04/2024 03:30 PM Senate STATE AFFAIRS
| Topic |
|---|
| Start |
| Confirmation: Personnel Board |
| SB 201 |
| SB 177 |
| SB 262 |
| Adjourn |
* first hearing in first committee of referral
+ teleconferenced
= bill was previously heard/scheduled
| += | SB 201 | TELECONFERENCED |
| += | SB 177 | TELECONFERENCED |
| *+ | SB 262 | TELECONFERENCED |
| + | | TELECONFERENCED |
SB 177-AI, DEEPFAKES, CYBERSECURITY, DATA XFERS
3:59:10 PM
CHAIR KAWASAKI reconvened the meeting and announced the
consideration of SENATE BILL NO. 177 "An Act relating to
artificial intelligence; requiring disclosure of deepfakes in
campaign communications; relating to cybersecurity; and relating
to data privacy."
3:59:52 PM
CHAIR KAWASAKI solicited a motion.
3:59:57 PM
SENATOR MERRICK moved to adopt the committee substitute (CS) for
SB 177, work draft 33-LS1061\S, as the working document.
4:00:09 PM
CHAIR KAWASAKI objected for purposes of discussion.
4:00:23 PM
STEPHEN KNOUSE, Staff, Senator Hughes, Alaska State Legislature,
Juneau, Alaska, presented the summary of changes for SB 177.
[Original punctuation provided.]
Summary of Changes in State Affairs
Committee Substitute to SB 177
Version B to Version S
Page 1, lines 1-3: Change of title from "An Act
relating to artificial intelligence; requiring
disclosure of deepfakes in campaign communications;
relating to cybersecurity; and relating to data
privacy." to "An Act relating to disclosure of
election-related deepfakes; relating to use of
artificial intelligence by state agencies; and
relating to transfer of data about individuals between
state agencies." Cybersecurity fits under the category
of use of AI by state agencies.
Page 1, line 5: Section 15.13 is now recoded to
Section 15.80.
Page 1, line 6: Recodes section 15.13.093 to be
Section 15.80.009
Page 1, lines 6-7: Includes contracted content creator
of election-related deepfakes as being required to
include a disclosure.
Page 1, line 8-9: Election-related deepfakes which
require disclosure statements expand to include
propositions, and removes political parties.
Page 1, lines 10-11: Adds "or by another means" to
deepfake disclosure to cover content created by any
means, not just artificial intelligence.
Page 2, lines 1-6: Sentence referring to communication
"that includes audio component" changes to "that
consists only of audio", and modifies deepfake
disclosure requirements to include placement intervals
of disclosure.
Inserts clarifying term "election-related" to
references of "communication" pertaining to deepfakes
in the following places:
Page 1, lines 7, 10, 11, and 13 broadcast, cable,
satellite, Internet, or other digital communication
Page 2, line 1 requires the disclosure to remain
onscreen throughout the entirety of the communication
Page 2, line 2 requires the disclosure be read in
audio communications at the beginning, end, and at
least once every two minutes if the audio
communication is longer than two minutes
Page 2, line 8- prohibits a person from removing the
disclosure statement from known deepfake materials
Page 2, lines 14, 15, and 17- allows a candidate or
proposition group suffering damages to seek injunctive
relief
Page 2, line 25- injunctive relief does not apply to
paid election-related communication broadcast by a
radio, television, cable, or satellite provider if the
provider has made a good faith effort that the
communication does not contain a deepfake
Page 3, line 15- defines "election-related
communication" as communication that directly or
indirectly identifies a candidate or proposition and
is disseminated to an audience that includes voters
who have the opportunity to vote on the candidate or
proposition.
Page 2, lines 7-9: Prohibits entities from omitting or
removing required deepfake disclosures.
Page 2, lines 10-12: Makes entities violating required
disclosures liable to candidate or proposition group
for damages suffered by omission of deepfake
disclosure, full attorney fees, and costs.
Page 2, lines 13-17: Includes injunctive relief to
prohibit dissemination of deepfakes with omitted or
removed disclosures.
Page 2, line 19: Makes liability and disclosure
exceptions for satire, parody.
Page 2, line 20 - 3, line 2: Makes liability
exceptions for traditional and electronic
broadcasting, and publications that adhere to the
statement requirements applicable to the media form.
In the case of paid election-related communications
without disclosures, due diligence to confirm
communication did not include deepfakes.
Page 2: Re-lettering subsections to accommodate newly
inserted changes to Section 1.
Page 3, lines 15-24: Insert definition for "election-
related communications," "proposition," and
"proposition group."
Inserts term "generative" to specify type of AI
(generative vs. rules-based) or data being addressed
in the following places:
Page 3, line 28 to exclude rules-based AI from
inventory.
Page 4, line 9 to exclude rules-based AI from impact
assessments.
Page 5, lines 3, 11, and 12 to exclude rules-based AI
in state agency use requirements for consequential
decisions and prospective employees hiring videos.
Page 6, line 2 to exclude rules-based AI from
regulations where development, procurement,
implementation, use and system assessments are
concerned regarding consequential decisions.
Page 5, lines 6-7: Expands data collection consent to
"from or about" an individual.
Page 5, lines 19-21: Removes specific list of
adversarial countries to the United States and permits
the department of administration to designate foreign
adversaries (as determined by US Department of State
see Page 6, lines 12-13).
Page 5, lines 22-27: Removes use of "multi-factor
authentication" and inserts current security and
privacy controls as specified by the National
Institute of Standards and Technology.
Page 5, lines 28-30: Removes seeking "the individual's
consent" for inter-agency data transfers and replaces
with "giving notice to the individual".
Page 6, line 24 - Page 7, line 2: Establishes new
section AS 44.99.760 for exemptions to the Department
of Public Safety in cases of criminal offenses,
missing persons, and exigent circumstances as they
pertain to inventories, impact assessments, AI use
requirements for state agencies, and data transfers
between state agencies.
Page 7, line 3: Recodes definition section 44.99.760
to section 44.99.770.
Page 7, lines 4-5: Removes current definition of
"artificial intelligence" and inserts new definitions
for "artificial intelligence", "generative artificial
intelligence", and "rules-based artificial
intelligence".
Page 8, lines 1-4: Adds additional types of
information that qualify as "sensitive personal data"
to include an individual's bank account, social
security number, or other personal identifier issued
to an individual by a government or institution.
Page 8, lines 12-15: Insert new section to restrict
applicability of the AS 44.99.750 enacted by sec. 2 of
the bill to acts or omissions occurring on or after
the effective date.
There are no other changes to the bill.
4:06:55 PM
CHAIR KAWASAKI referred to CSSB 177, Section 2, page 5, line 3
and lines 11-12, which discuss utilizing rules-based AI in state
agencies. He asked for clarification on whether that provision
is exempting rules-based AI from certain requirements.
4:07:31 PM
SENATOR HUGHES explained that rules-based AI, which can be as
simple as a spreadsheet, has been in use for some time without
raising public concern or issues. The focus of SB 177 and
similar efforts nationwide is to enhance efficiency and cost-
effectiveness and to reduce the burden of mundane tasks for
state workers while assuring the public of responsible AI use. She
noted that while rules-based AI has been in use for some time,
they are focused here on generative AI, which is the new
emergent technology. Historically, there have been no
significant concerns regarding harm to individuals from rules-
based AI, which is why it was not a focus of the legislation.
However, she noted that the committee could choose to reconsider
this aspect if desired.
4:09:10 PM
CHAIR KAWASAKI removed his objection. He found no further
objection and CSSB 177 was adopted as the working document.
4:10:00 PM
SENATOR HUGHES highlighted her involvement with the National
Conference of State Legislatures Working Group on AI,
emphasizing the importance of responsible implementation in
state agencies. The compilation of ideas was gathered from
various organizations, including the Reason Foundation, Stanford
Law School, and Alaska's Department of Information Technology.
She expressed the need to address political deep fakes,
especially in the context of the 2024 elections, advocating for
transparency without infringing on freedom of speech. The
initial draft of SB 177 was a starting point, with room for
adjustments based on feedback, and she listed a number of
organizations and groups whose input had been heard and
incorporated, as well as related activity in the House. The
distinction between rules-based and generative AI was
underscored, along with the necessity to address ballot
propositions within the legislation. She acknowledged ongoing
work to ensure technological neutrality, noting the difference
between an AI deep fake and work done simply by someone skilled
in Photoshop, the protection of individuals' information, and
the right to sue the state for consequential harm. She noted
that further work is needed on the issues of potential
litigation, which would suit the Judiciary Committee.
4:17:03 PM
SENATOR MERRICK mentioned the importance of SB 177. She asked
how the determination of satire or parody would be made and
inquired about the enforcement mechanisms for the proposed
legislation.
4:17:18 PM
SENATOR HUGHES noted that issues regarding satire or parody
would likely be brought to the attention of the Alaska Public
Offices Commission, as the proposal falls under a section of law
related to them. She mentioned that a definition for satire and
parody is not included in the bill because courts typically rely
on a general understanding. She expressed openness to including
a definition if necessary, emphasizing that it usually hinges on
what a reasonable person would perceive as satire or comedy.
4:18:05 PM
CHAIR KAWASAKI inquired about the new draft regarding deep
fakes, specifically addressing their use in advertisements
across various media platforms, including the internet. He
raised concerns about the ability to create entirely artificial
personas for promotional purposes, suggesting that these could
be designed to convey positive messages about individuals or
products. He questioned whether the legislation would require
disclosures indicating that such representations are not based
on real people, similar to disclaimers often seen in
advertisements, such as "five out of six doctors prefer this
medicine." He sought clarification on whether this kind of
disclosure was part of the envisioned framework in the bill.
4:19:32 PM
SENATOR HUGHES clarified that, as currently written, the
definition would only apply if a deep fake made a real person
appear to say or do something they did not actually say or do,
or if it gives a misleading impression of an individual. She
stated that the example Chair Kawasaki provided regarding
entirely artificial personas would not be covered under this
definition, indicating the need for an addition to the bill. SB
177 was originally designed to be neutral. However, she
recognized that candidates could create deep fakes that could
either harm their opponents or enhance their own images, such as
falsely claiming awards. While the previous version of the bill
focused solely on injurious deep fakes directed at opponents,
the new version allows for both positive and negative
portrayals. However, she noted that if a completely fabricated
persona were given a name, it might then fall under the existing
definition. Conversely, a generic deep fake featuring a group of
manufactured individuals expressing support would likely not be
included under the bill.
4:21:06 PM
SENATOR BJORKMAN inquired about the concerns raised by the
Alaska Broadcasters Association regarding potential liability
for broadcasters airing commercials or stories containing deep
fakes. He requested clarification on the aspects of the bill
that provide protection to broadcasters, news agencies, and
others disseminating information to the public, ensuring they
are safeguarded against lawsuits or misleading information from
bad actors.
4:21:56 PM
SENATOR HUGHES highlighted Sec. 15.80.009(e) of CSSB 177, which
addresses concerns raised by the Alaska Broadcasters Association
regarding potential liability when airing commercials or stories
containing deep fakes. The intent is to hold the creator of the
deep fakes responsible, not the broadcasters or other platforms.
Newscasts may report on a deep fake and show it, but they must
include a disclosure regarding the authenticity of the content.
The responsibility ultimately lies with the creator of the deep
fake. For example, if a candidate hires a marketing company to
produce campaign materials, the candidate is responsible for
ensuring the proper disclosures are made, such as the "paid for
by" information. However, if a candidate specifically requests a
marketing company to create a deep fake, knowing it will mislead
voters, both the candidate and the marketing company would be
held accountable. The marketing company would also need to
include a disclosure, which would have legal implications. In
summary, the bill aims to ensure that the responsibility for
creating misleading deep fakes falls on the individuals or
entities that produce them, protecting broadcasters and
platforms from liability.
4:24:18 PM
CHAIR KAWASAKI held SB 177 in committee.
4:24:52 PM
At ease
| Document Name | Date/Time | Subjects |
|---|---|---|
| CS SB 201.pdf | SSTA 4/4/2024 3:30:00 PM | SB 201 |
| CS SB 177.pdf | SSTA 4/4/2024 3:30:00 PM | SB 177 |
| SB0262A.pdf | SSTA 4/4/2024 3:30:00 PM | SB 262 |
| SB 262 Sponsor Statement.pdf | SSTA 4/4/2024 3:30:00 PM | SB 262 |
| SB 262 Sectional Analysis Version A.pdf | SSTA 4/4/2024 3:30:00 PM | SB 262 |