ALASKA STATE LEGISLATURE (2023 - 2024)
BUTROVICH 205
05/11/2024 10:00 AM SENATE JUDICIARY
* first hearing in first committee of referral
+ teleconferenced
= bill was previously heard/scheduled

*+ HB 358 TELECONFERENCED
ALASKA STATE LEGISLATURE
SENATE JUDICIARY STANDING COMMITTEE
May 11, 2024
10:02 a.m.
MEMBERS PRESENT
Senator Matt Claman, Chair
Senator Jesse Kiehl, Vice Chair
Senator Löki Tobin
MEMBERS ABSENT
Senator James Kaufman
Senator Cathy Giessel
COMMITTEE CALENDAR
COMMITTEE SUBSTITUTE FOR HOUSE BILL NO. 358(2D JUD)
"An Act relating to defamation claims based on the use of
deepfakes; and relating to the use of deepfakes in
electioneering communications."
- HEARD & HELD
PREVIOUS COMMITTEE ACTION
BILL: HB 358
SHORT TITLE: DEEPFAKES: LIABILITY; ELECTIONS
SPONSOR(s): REPRESENTATIVE(s) CRONK
02/20/24 (H) READ THE FIRST TIME - REFERRALS
02/20/24 (H) JUD
03/13/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/13/24 (H) Heard & Held
03/13/24 (H) MINUTE(JUD)
03/15/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/15/24 (H) Heard & Held
03/15/24 (H) MINUTE(JUD)
03/20/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/20/24 (H) <Bill Hearing Canceled>
03/22/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/22/24 (H) Heard & Held
03/22/24 (H) MINUTE(JUD)
03/25/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/25/24 (H) Heard & Held
03/25/24 (H) MINUTE(JUD)
03/27/24 (H) JUD AT 1:00 PM GRUENBERG 120
03/27/24 (H) Heard & Held
03/27/24 (H) MINUTE(JUD)
04/01/24 (H) JUD AT 1:00 PM GRUENBERG 120
04/01/24 (H) Heard & Held
04/01/24 (H) MINUTE(JUD)
04/03/24 (H) JUD AT 1:00 PM GRUENBERG 120
04/03/24 (H) Moved CSHB 358(JUD) Out of Committee
04/03/24 (H) MINUTE(JUD)
04/08/24 (H) JUD RPT CS(JUD) NEW TITLE 6DP
04/08/24 (H) DP: GRAY, CARPENTER, GROH, SUMNER,
ALLARD, VANCE
04/29/24 (H) RETURNED TO JUD COMMITTEE
05/01/24 (H) JUD AT 1:00 PM GRUENBERG 120
05/01/24 (H) Moved CSHB 358(2D JUD) Out of Committee
05/01/24 (H) MINUTE(JUD)
05/02/24 (H) JUD RPT CS(2D JUD) NEW TITLE 4DP 1AM
05/02/24 (H) DP: GRAY, CARPENTER, ALLARD, SUMNER
05/02/24 (H) AM: VANCE
05/02/24 (H) RETURNED TO RLS COMMITTEE
05/09/24 (H) TRANSMITTED TO (S)
05/09/24 (H) VERSION: CSHB 358(2D JUD)
05/10/24 (S) READ THE FIRST TIME - REFERRALS
05/10/24 (S) JUD
05/11/24 (S) JUD AT 10:00 AM BUTROVICH 205
WITNESS REGISTER
DAVE STANCLIFF, Staff
Representative Mike Cronk
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Introduced HB 358 on behalf of the sponsor
and delivered the sectional analysis.
ROBERT BALLINGER, Staff
Representative Sarah Vance
Alaska State Legislature
Juneau, Alaska
POSITION STATEMENT: Answered questions during the discussion of
HB 358.
ACTION NARRATIVE
10:02:49 AM
CHAIR MATT CLAMAN called the Senate Judiciary Standing Committee
meeting to order at 10:02 a.m. Present at the call to order were
Senators Kiehl, Tobin, and Chair Claman.
HB 358-DEEPFAKES: LIABILITY; ELECTIONS
10:03:26 AM
CHAIR CLAMAN announced the consideration of CS FOR HOUSE BILL
NO. 358(2d JUD) "An Act relating to defamation claims based on
the use of deepfakes; and relating to the use of deepfakes in
electioneering communications."
CHAIR CLAMAN said this is the first hearing of HB 358 in the
Senate Judiciary Committee. He invited Mr. Stancliff to put
himself on the record and begin his presentation.
10:04:01 AM
DAVE STANCLIFF, Staff, Representative Mike Cronk, Alaska State
Legislature, Juneau, Alaska, introduced HB 358 on behalf of the
sponsor as paraphrased below:
The term commonly understood and currently used to
describe artificial intelligence (AI) identity abuse
is deepfake. HB 358 is drafted to protect an
individual's voice or image identity from being
changed or manipulated without permission through the
use of AI-generated deepfakes. The bill defines what a
deepfake is and sets prohibitions on certain uses,
including electioneering communications.
HB 358 establishes a civil threshold for cause of
action when a deepfake causes harm. It exempts certain
parties from liability under specific circumstances.
For instance, if a social media company searches for
and identifies deepfake content, it is not held
responsible for merely possessing the material.
HB 358 and its definitions set forth the current
language needed to support this new area of law in
state statutes. This language was adopted from federal
and state sources. Deepfake technology is evolving so
fast, it is difficult to keep up with the terms used
to describe both the use and misuse of AI.
10:05:46 AM
MR. STANCLIFF presented the sectional analysis for HB 358
version D, CSHB 358(2d JUD):
[Original punctuation provided.]
Sectional for HB 358 (33-LS1272\D)
Section 1.
Amends AS 09.65 by adding a new section AS 09.65.360
which establishes that defamation based on the use of
a deepfake is a claim for defamation per se, meaning
it is presumed to be damaging to a person's reputation
without any additional proof of harm.
Section 2.
Amends AS 15.80 to include a new section AS 15.80.009
(Deepfakes in electioneering communications) to
prohibit a person from knowingly using a "deepfake" in
a campaign material. It provides that an individual
who is harmed by such behavior may bring an action to
recover damages, attorney fees, costs, or an
injunction against the person who created,
disseminated, or removed a disclosure. It does allow
the use of altered material if it is disclosed as
material that has been manipulated.
10:06:53 AM
MR. STANCLIFF said an advertisement recently aired showing a
reel of President Biden appearing to speak on a Fairbanks tax
issue, but it was not actually him. He emphasized how easily and
cheaply artificial intelligence (AI) can alter video, noting
that a voice can be changed in 20 minutes for under $10. He
stated that Representative Cronk introduced HB 358 as a basic
bill to begin addressing this emerging area of law in Alaska,
with the intent that it be built upon. He noted that two other
complex measures are also before the legislature. HB 358
received detailed debate in the House Judiciary Committee. It is
a nonpartisan bill with majority and minority support.
10:08:30 AM
SENATOR TOBIN commented that there appears to be a grammatical
error on page 2, line 13, where a period is followed by a
lowercase "and," despite the apparent intent to introduce a list
of numbered items 1, 2, and 3. She flagged the potential error
for correction if the committee prepares a substitute version of
the bill.
MR. STANCLIFF said he would contact Legislative Legal Services
so they can correct it if necessary.
10:09:32 AM
SENATOR TOBIN said HB 358 does not include an effective date.
She asked whether the bill would take effect this year if
passed. She noted that without an effective date, it would
become law 90 days after passage.
MR. STANCLIFF replied that the intent of HB 358 is to protect
against and end, as quickly as possible, the temptation to
distort images in this year's upcoming election.
10:10:07 AM
CHAIR CLAMAN said Senator Tobin's question is whether the bill
sponsor prefers to amend HB 358 to include an effective date.
MR. STANCLIFF replied that if it were the will of the committee,
his office would support any effective date that puts HB 358
into effect as soon as possible.
10:10:43 AM
SENATOR TOBIN asked how HB 358 addresses the use of AI for
political satire. She expressed concern about potential
liability for individuals whose material might be used for
nefarious purposes.
MR. STANCLIFF deferred the question to Mr. Ballinger, an
attorney who did most of the drafting of HB 358.
10:11:50 AM
CHAIR CLAMAN confirmed that Mr. Ballinger is staff to
Representative Vance.
MR. STANCLIFF replied yes.
10:11:54 AM
CHAIR CLAMAN directed the question to Mr. Ballinger.
10:12:06 AM
ROBERT BALLINGER, Staff, Representative Sarah Vance, Alaska
State Legislature, Juneau, Alaska, answered questions during the
discussion of HB 358. He replied that the original language from
the bill in Washington, D.C. prohibited the use of deepfakes in
elections or communications when done knowingly with the intent
to influence an election. He explained that if satire is used
with the intent to influence an election, it would fall under
the prohibition. However, if satire is used purely for humor, it
would not be prohibited. He added that if someone is concerned
their satire might be interpreted as election-related, they can
include a disclosure and still proceed without liability.
10:13:28 AM
SENATOR TOBIN gave the example of a popular political blogger
who operates in the space between news and satire, noting that
some may assume the blogger is trying to influence an election,
even if that is not the intent. She asked how courts interpret
the definition of intent in such cases.
10:13:57 AM
MR. BALLINGER replied that determining intent would depend on
the facts presented, including what was said, what actions were
taken, when they occurred, and the content of the image. He
stated that this evidence would be considered by a judge or jury
to assess intent. He added that if he were representing the
blogger, he would advise including a disclosure on the video or
image.
10:14:38 AM
CHAIR CLAMAN said satire presents a complex issue. He pointed to
the New York Times, which often lists what comedians said the
night before, and noted that at least five times a week there is
commentary about candidates, frequently former President Trump
and current President Biden. He stated that while the primary
purpose of satire is to be humorous and offer commentary, it is
difficult to argue that such content does not influence
elections. He questioned how to protect satire as a form of
expression when someone might sue a comedian for influencing an
election by discussing topics like former President Trump's
trial, even if the intent was not to sway voters.
10:16:04 AM
MR. BALLINGER replied that even if there were an attempt to
prohibit that type of speech, it would be protected under the
First Amendment right to free speech. He clarified that HB 358
specifically targets the use of deepfakes. He cited the
definition of "deepfake" in HB 358, page 3, line 14, as any
visual or audio media that is created, altered, or otherwise
manipulated by artificial intelligence in a manner that to a
reasonable observer, appears to be an authentic record of an
individual's actual speech, conduct, or likeness; and conveys a
fundamentally different understanding or impression [of the
individual's appearance, action, or speech than a reasonable
person would have from the unaltered, original version of the
individual's appearance, action, or speech.]
10:16:42 AM
CHAIR CLAMAN asked what would happen if a comedian created a
deepfake and included it in a comedy show.
10:16:55 AM
MR. BALLINGER replied that if a deepfake is shown on a comedy
show, most people would not interpret it as an attempt to
influence politics. However, if the same content appears in a
different context, particularly with a skilled impersonator, the
question becomes whether artificial intelligence (AI) was used
to create it, which is a requirement under HB 358. He stated
that if the content appears realistic enough that someone could
reasonably believe it was intended to influence an election, the
recommendation is to include a disclosure. He said that although
the bill protects candidates, it is primarily about preventing
the public from being deceived.
10:18:24 AM
SENATOR TOBIN said she agreed that the public should have
accurate information to make informed voting decisions. She
asked about the disclosure requirements for deepfakes, noting
that HB 358 does not include prescriptive measures such as font
size or notice placement. She expressed concern that without
such details, certain groups, like the visually impaired, may have
difficulty noticing the disclosure.
MR. BALLINGER responded that the concern is legitimate. He
explained that the disclosure font size must match the largest
font size used in the image. If no font is used in the content,
the disclosure must be in a reasonable size that viewers can
see.
10:20:14 AM
SENATOR TOBIN raised concern about the duration a disclosure
statement must remain visible on an image and reiterated the
importance of including guardrails to ensure effective
communication of the disclosure.
10:20:38 AM
SENATOR KIEHL asked whether the language regarding the
disclosure requirements was borrowed from another source. He
asked if the language used in HB 358 is the same language
that governs ads such as car commercials.
MR. BALLINGER replied that the disclosure language originated
from Washington state legislation and was later modified by the
House Judiciary Committee. He stated that most of the language
remains the same as Washington State's, including the wording of
the disclosure statement.
10:21:22 AM
SENATOR KIEHL said it would be helpful to know whether the
disclosure statement requirements align with those seen in
national television ads, which are often difficult to read
without recording and pausing the ad. He expressed a desire for
assurance that the disclosure language in HB 358 is reasonable.
He then asked whether the legislation differentiates between a
person manipulating an image for their own benefit versus
someone else manipulating an image to that person's detriment,
or if both scenarios are treated the same under the bill's
requirements.
MR. BALLINGER replied that both scenarios are treated the same.
There was a
variation of the legislation that applied only if someone
manipulated someone else's image, but the bill now covers
manipulating any image with the intent to cause harm, which
would be considered a deepfake.
10:22:36 AM
SENATOR KIEHL said that the focus on protecting the public
rather than the candidates is a positive aspect.
SENATOR KIEHL asked about the provision in the legislation that
provides immunity to those who post or broadcast deepfake
content, except in cases where disclaimers are removed. He noted
that the standard used is "knowingly" and questioned why
television broadcasters, radio stations, and internet service
providers are not held to the same standard. He suggested they
should not be subject to liability if they are unknowingly
duped, but should be held accountable if they knowingly
distribute a deepfake without the required disclaimer.
MR. BALLINGER stated his belief that there is no reason such a
change could not be made. He noted that much of the language
came from TechNet and that similar language was mentioned. He
said the bill sponsor's opinion would be necessary but adding
that language would not alter the structure or intent of the
legislation.
10:24:20 AM
SENATOR TOBIN asked about the reference to private communication
in HB 358, page 3, line 25. She inquired how "audience" and
"internet" are defined, using the example of creating a deepfake
and sending it to a group of friends. She questioned whether she
would still be covered under the private communication exemption
if the deepfake was shared beyond her control.
MR. BALLINGER replied that if the creation and distribution of
the image were done with the intent to affect an election, it
would qualify as the use of a deepfake and could lead to
liability and potential damages. However, if the image was
shared privately among friends with a clear statement that it
was fake, it would not be reasonable to assume intent to
influence an election. He added that if a person intended for
their friends to believe the image was real, there could be
liability.
10:25:39 AM
SENATOR TOBIN said that with the discussion of liability and
damages, she found it notable that HB 358 has a zero fiscal
note. She questioned how the bill accounts for the creation of a
new section of law, potential training for investigators, and
funding to conduct investigations, expressing curiosity about
how those needs are addressed.
MR. BALLINGER stated that no new investigator would be needed
because the actions outlined in the legislation are civil in
nature and do not involve any criminal enforcement.
10:26:26 AM
SENATOR KIEHL said he had a technical question regarding
definitions in HB 358, referring to page 3, lines 17 and 19, and
asked about the terms "conduct" and "action." He stated he
suspects the intent is to refer to representations of physical
activity or behavior. He then asked whether the language could
apply to a political ad that misrepresents a legislator's vote.
For example, if Representative X voted differently on a food
benefit for children and an ad with a digital element claimed
that Representative X "stole food from the mouths of needy
children," he questioned whether that could be considered a
misrepresentation of conduct or action.
MR. BALLINGER replied that the scenario is a bit of a stretch
because, under HB 358, a person would first need to manipulate
an image in a way that makes it appear fundamentally different
from what it originally was. He said the bill is unlikely to
apply to the expression of ideas alone. However, if an image of
a legislator stating, "There is nothing more important to me
than taking care of kids," is manipulated to say, "Feeding kids
means nothing to me," the legislation would clearly apply. He
added that if the committee believes clarification is needed,
additional language could be included.
10:28:53 AM
SENATOR KIEHL said that as he reviews HB 358, page 3, lines
14-21, the definition of deepfake includes any audio media created
by artificial intelligence that appears to a reasonable observer
to be an authentic record of an individual's conduct and conveys
a fundamentally different understanding of the individual's
action. He opined that if the intent is to target physical
conduct or action, the bill may need to account for scenarios
such as using AI software to display a fake vote board or a real
image of a person in an ad. He acknowledged that while such
actions are deplorable, they are still protected speech.
MR. BALLINGER quoted lines 19-21 of HB 358: "...a reasonable
person would have from the unaltered, original version of the
individual's appearance, action, or speech," and stated that a
deepfake, by definition, requires manipulation of appearance,
action, or speech. He suggested that "action" in this context
could possibly be interpreted to include votes but acknowledged
uncertainty about whether a court would agree with that
interpretation.
10:30:38 AM
CHAIR CLAMAN asked for confirmation that HB 358 creates a
private right of action, allowing an individual to sue another
person over the use of a deepfake.
MR. BALLINGER replied yes.
CHAIR CLAMAN said that by passing HB 358, whether amended or not,
the legislation establishes a private right of action, meaning
individuals can file lawsuits over deepfakes without state
involvement unless the law is significantly changed to include
criminal enforcement. He stated that a person with a claim would
need to find a lawyer willing to pursue a case for alleged
damages caused by the publication of a deepfake. He asked
whether, during the House hearing, there was any discussion
about what types of damages could be demonstrated in such cases.
He noted the practical concern that plaintiffs would likely need
to present a strong damage claim to attract legal representation
on a contingency fee basis, unless they had the financial means
to pay an attorney by the hour.
10:31:56 AM
MR. BALLINGER replied that legal billings for such cases would
likely be hourly. He said he does not expect many attorneys to
take these cases on a contingency basis unless the case is
significant, such as one involving a gubernatorial race with
higher potential damages. For a state representative race, while
damages could be demonstrated, they would likely be minimal. As
a result, individuals would probably need to pay an attorney by
the hour.
10:32:39 AM
CHAIR CLAMAN asked whether, if the legislation is passed, it
would effectively mean that only individuals with substantial
resources could meaningfully pursue claims. He stated
that the average person would likely be unable to bring a case
due to the high cost, potentially requiring tens of thousands of
dollars to pursue a claim.
MR. BALLINGER replied yes, it would likely be individuals with a
vested interest who are willing to invest in bringing a claim.
He noted that successful plaintiffs could recover attorney's
fees and damages. If the case is clear, it may be worth
pursuing, particularly for candidates. However, he added that a
private citizen who feels wronged by a deepfake could also file
a claim. He concluded that unless the case involves clear and
significant damages, it would be similar to other cases where
people must decide whether pursuing justice is worth the
financial investment.
10:33:49 AM
SENATOR TOBIN said she is thinking along the same lines,
expressing concern for Alaska's citizen legislature, school
board members, city council members, and others who may be
targeted by deepfakes without meaningful recourse. She noted
that those individuals are not paid a high enough salary to
afford pursuing civil penalties. She asked what the penalty
structure looks like in other states, such as Washington.
10:34:37 AM
MR. BALLINGER recollected that the language in Washington is
very similar, granting the right to file a civil lawsuit. He
said there are surely other versions of bills that establish
specific penalties. He opined that if a candidate engaged in
creating or distributing a deepfake, it could also result in an
ethics complaint. Ultimately, he stated that the goal is for the
threat of being sued over a deepfake to serve as a deterrent
similar to how defamation laws function. He added that HB 358
creates a legal framework where deepfakes are treated as a form
of defamation for which civil action can be taken.
10:35:32 AM
CHAIR CLAMAN stated his belief that what motivates people to be
truthful is not the threat of a defamation lawsuit, but rather
their inherent goodwill and sense of justice.
MR. BALLINGER replied he hopes that is true.
CHAIR CLAMAN said the question of damages is important, noting
that two members of the committee previously worked as
legislative staff before running for office. He raised the
scenario in which HB 358 is in effect, and Candidate Kiehl loses
an election after a deepfake appears late in the campaign. If
Candidate Kiehl files a lawsuit against the publisher, and the
person acknowledges the deepfake but argues that he suffered no
damages because he can return to a better-paying legislative
staff position, it raises the issue of how damages would be
proven in such a case.
MR. BALLINGER replied that is a legitimate concern.
10:36:54 AM
MR. STANCLIFF commented that these types of questions arise
during a good legislative process. He expressed hope that, like
all laws, once enacted, this legislation would serve as a
preventative barrier. He also addressed the issue of free
speech, referencing a House amendment where the sponsor
indicated a willingness to eliminate the disclaimer provision
altogether, stating that no disclaimer would protect someone
from liability. He stated his belief that the amendment failed
due to concerns over free speech. He explained that the intent
was to allow individuals creating deepfakes to take
responsibility by disclosing their identity. He acknowledged the
fine line involved in balancing these concerns and noted that if
HB 358 becomes law, it would be just a beginning: a new section
of statute that can be amended and expanded. He concluded by
saying that the bill's deterrent effect could grow through
public awareness, ongoing discussion, media attention, and the
increasing presence of deepfakes in news reels.
10:38:46 AM
SENATOR TOBIN stated that she agreed and noted that multiple
states are currently considering similar legislation. She
highlighted Florida's approach, which includes both civil and
criminal provisions. Florida's law makes failure to include the
required disclaimer a first-degree misdemeanor, which she
believes offers stronger protection for Alaskans considering a
run for office in the upcoming election cycle. She added that
Florida assigns the Division of Administrative Hearings to
adjudicate violations.
10:39:27 AM
SENATOR TOBIN emphasized that her concern extends beyond
citizens who might create deepfakes to include outside actors
and agitators, whose influence on local elections has been
evident for nearly two decades. She expressed concern that
relying solely on civil penalties and leaving enforcement to the
individual harmed by a deepfake removes an important tool for
protecting candidates. She pointed out that, as a candidate, she
would not have the resources to pursue legal action against
foreign entities, such as the Russian government, or platforms
like TikTok if they disseminated deepfakes aimed at influencing
the election.
SENATOR TOBIN urged legislators to consider the full
implications of the issue, advocating for a broader, more
inclusive approach rather than a single step forward. She
concluded by stating her belief that deepfakes and artificial
intelligence pose an existential threat to free and fair
elections.
10:40:42 AM
MR. BALLINGER stated he agreed one hundred percent. He expressed
concern about what is realistically possible and whether
legitimate concerns can be incorporated now or will have to wait
until the next legislative session. He noted that the sponsor
has repeatedly said HB 358 is a starting point. He acknowledged
that lawmakers do not yet fully understand all the possibilities
or issues involved, but emphasized the importance of getting
something on the record to build upon.
MR. BALLINGER added that if the committee chooses to make an
amendment and believes it can get HB 358 to the Senate floor and
back to the House for concurrence, the sponsor is likely open to
that. He concluded by saying it is not an issue of fundamental
disagreement; most people are on the same page, and the focus is
on how best to achieve the legislation.
10:41:59 AM
SENATOR TOBIN said she is reviewing the legislation currently
being considered in 27 other states, many of which have passed
or introduced similar measures. She stated she will continue
examining those efforts to identify ways to provide relief and
protection to all of Alaska's potential candidates as quickly as
possible during the upcoming election cycle.
10:42:19 AM
CHAIR CLAMAN opened public testimony on HB 358; finding none, he
closed public testimony.
10:42:38 AM
CHAIR CLAMAN held HB 358 in committee.
10:43:00 AM
There being no further business to come before the committee,
Chair Claman adjourned the Senate Judiciary Standing Committee
meeting at 10:43 a.m.
| Document Name | Date/Time | Subjects |
|---|---|---|
| HB 358 version D 5.2.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |
| HB 358 Sponsor Statement 5.10.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |
| HB 358 Sectional Analysis 5.10.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |
| HB 358 Changes in Versions 5.10.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |
| HB 358 Supporting Document- AI Art 5.10.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |
| HB 358- Supporting Document-AI Art Part 2 5.10.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |
| HB 358 Statement of Zero Fiscal Impact 4.8.2024.pdf | SJUD 5/11/2024 10:00:00 AM | HB 358 |