US Lawmakers Introduce Algorithmic Accountability Act
Bill Would Require Impact Assessments, Create Public Repository, Empower FTC
U.S. lawmakers have introduced a bill that they say would bring "new transparency and oversight" to software, algorithms and other automated systems making "critical decisions" in Americans' lives. The bill, an updated version of a 2019 proposal, would also combat bias in the use of such technologies, its sponsors say.
The Algorithmic Accountability Act of 2022, introduced by Sens. Cory Booker, D-N.J., and Ron Wyden, D-Ore., and Rep. Yvette Clarke, D-N.Y., would require companies to conduct impact assessments for bias, effectiveness and other factors when using automated decision systems. It would also create the first public repository of such systems at the Federal Trade Commission and would add 75 staff members to the FTC to enforce the law.
"We have a responsibility to ensure that [automated decision systems] are adequately assessed for biases that may disadvantage minority or marginalized communities," says Booker in a statement. "I am proud to reintroduce this legislation and create the transparency needed to prevent unwanted disparities and to hold bad actors accountable."
The bill would also require the FTC, the nation's consumer protection agency, to publish an annual anonymized aggregate report on trends, high-level metrics, and steps on how to contest certain automated decisions, according to a summary of the proposal.
'Pulling Back the Curtain'
In a statement, Wyden says: "Our bill will pull back the curtain on the secret algorithms that can decide whether Americans get to see a doctor, rent a house or get into a school. Transparency and accountability are essential to give consumers choice and provide policymakers with the information needed to set the rules of the road."
And Clarke says of the reintroduced bill: "[These impactful automated decisions] are subject to a wide range of flaws, from programming bias to faulty datasets, that can reinforce broader societal discrimination, particularly against women and people of color. It is long past time Congress act to hold companies and software developers accountable for their discrimination by automation.
"We must ensure that our 21st-century technologies become tools of empowerment rather than marginalization and seclusion."
The bill summary also says there are "insufficient safeguards" on this issue and that "the American public and government need more information to understand where and why automation is being used."
The measure is co-sponsored by Sens. Brian Schatz, D-Hawaii; Mazie Hirono, D-Hawaii; Ben Ray Luján, D-N.M.; Tammy Baldwin, D-Wis.; Bob Casey, D-Pa.; and Martin Heinrich, D-N.M.
It has earned support from experts and civil society organizations, including the Center for Democracy and Technology, Color of Change, Consumer Reports, Fight for the Future, the Electronic Privacy Information Center, the Montreal AI Ethics Institute and the U.S. Public Interest Research Group, among others.
Industry groups also call the measure an important one, though they hope to "fine-tune" its language.
Craig Albright, vice president of legislative strategy at trade group BSA - The Software Alliance, says in a statement shared with ISMG: "We agree with the core proposition of [the bill] that all organizations that develop and use high-risk AI systems should be required to perform impact assessments."
Albright adds, however: "While [it] is an important recognition of the utility of impact assessments, the mechanics included in the legislation will require fine tuning. We look forward to working with the sponsors and committees of jurisdiction to improve on the legislation to ensure that any new law clearly targets high-risk systems and sensibly allocates responsibilities between organizations that develop and deploy them."
Updates and Feedback
Booker, Wyden and Clarke say they updated the 2019 version of the bill - which did not advance beyond committee - after speaking with experts, advocacy groups and other stakeholders. They claim it includes "technical improvements" around the types of algorithms in question, the scope of assessments and the structure of related reports.
Arisha Hatch, vice president of the progressive civil rights advocacy nonprofit Color of Change, says of the bill: "We can no longer allow for these issues to go unregulated. … Big Tech and their operations must proactively detect and address discrimination."

Hatch says the bill would equip the FTC "with the resources necessary to enforce these protections and create a more equitable digital space."
DOD AI Office
Elsewhere in AI and emerging technology this week, the U.S. Department of Defense says its new Chief Digital and Artificial Intelligence Office, which will work to align the DOD's AI-related initiatives, has reached operational capacity, meeting a key early deadline.
Recently confirmed DOD CIO John Sherman will serve as acting chief digital and AI officer until a permanent candidate is found, officials say.
Initial plans to form the office, which is intended to streamline the DOD's AI-focused projects and provide cohesion across the data life cycle, were announced in December 2021. The department met a Feb. 1 deadline to reach operational capacity and is working toward a June 1 deadline for full capacity.
In October, Sens. Gary Peters, D-Mich., chairman of the Homeland Security and Governmental Affairs Committee, and Rob Portman, R-Ohio, the committee's ranking member, introduced legislation that would form a working group charged with monitoring the security of AI data obtained by federal contractors. The lawmakers said at the time that this body would also ensure that the data adequately protects national security and recognizes privacy rights - including safeguarding biometric data from facial recognition scans (see: New Bill Would Secure Government Contractors' Use of AI).
That bill, which was reviewed and advanced by Peters' committee, would require the director of the Office of Management and Budget to establish, and consult with, an Artificial Intelligence Hygiene Working Group.