
March 2023 U.S. Tech Policy Roundup

Kennedy Patlan, Rachel Lau, Carly Cramer / Apr 1, 2023

Kennedy Patlan, Rachel Lau, and Carly Cramer work with leading public interest foundations and nonprofits on technology policy issues at Freedman Consulting, LLC. Alondra Solis and Sofia Rhodes, Phillip Bevington policy & research interns at Freedman Consulting, also contributed to this article.

U.S. Secretary of State Antony Blinken speaks during the Summit for Democracy on March 30, 2023, in Washington, D.C. State Department photo by Chuck Kennedy / Public Domain

In March, TikTok and Congress faced off on the Hill as an increasing number of lawmakers from both sides of the aisle called for a ban on the platform. Notably, the Biden administration publicly expressed support for a bill that could force TikTok’s separation from its Beijing-headquartered owner, ByteDance, the administration’s first public foray into legislation related to the company’s fate. The month concluded with TikTok CEO Shou Chew testifying before the House Energy and Commerce Committee.

Additionally, ahead of the Summit for Democracy, the White House issued an executive order banning U.S. federal government entities from operationally deploying commercial spyware “that pose[s] significant counterintelligence or security risks to the U.S. Government or significant risks of improper use by a foreign government or foreign person.” Also on the surveillance front, the House Intelligence Committee established the Foreign Intelligence Surveillance Act (FISA) Working Group this month to take on the arduous task of reauthorizing Section 702 of FISA.

March was also a busy month for the Federal Trade Commission (FTC), which reached a $7.8 million settlement with BetterHelp after the online mental health company was found to have shared customer data with third parties for advertising purposes. The agreement also bars BetterHelp from sharing consumers’ health data for advertising in the future. The agency also led investigations into Twitter’s data, privacy, and management practices following Elon Musk’s purchase of the company; the investigation was met with criticism from Republican members of the House Judiciary Select Subcommittee on the Weaponization of the Federal Government during a hearing and in a related interim staff report. Separately, Politico reported on the FTC's intent to bring federal antitrust suits against Amazon, the likely outcome of active investigations into potential privacy, antitrust, and advertising violations by the company. These activities and other needs prompted the FTC to ask Congress to increase its appropriations by more than 400 percent, which would boost its total budget by about 37 percent. (Much of the agency’s budget is covered by merger fees, which were boosted last year to support enforcement.)

Meanwhile, the Federal Communications Commission (FCC) faced another delay in filling its fifth seat after Gigi Sohn withdrew from consideration following an arduous 16-month nomination process.

The below analysis is based on techpolicytracker.org, where we maintain a comprehensive database of legislation and other public policy proposals related to platforms, artificial intelligence, and relevant tech policy issues.

TikTok Faces Congress Amid Push for a Total Ban

  • The RESTRICT Act Gains Support: The rapid growth and extensive influence of the popular social media platform TikTok have raised privacy and security concerns and attracted significant congressional scrutiny in recent months. Government officials have claimed that TikTok’s Chinese-headquartered parent company, ByteDance, could access Americans’ user data or spread misinformation to users. To address these concerns, Sens. Mark Warner (D-VA) and John Thune (R-SD) introduced the RESTRICT Act this month, a bipartisan bill (S.686) that has the support of more than 25 senators. The bill would give Department of Commerce officials the authority to identify and block technology deals and products involving six foreign adversary countries, including China, that pose a national security risk. Despite the inclusion of other countries, the senators aimed most of their criticism at the Chinese government. While the bill is not an outright ban of TikTok, its “comprehensive approach,” according to Sen. Warner, could be the most feasible legislative route to TikTok privacy protections, especially given support from the White House, whose National Security Advisor Jake Sullivan urged Congress to “act quickly to send the bill to the President’s desk.” Sullivan’s statement marks the first time a TikTok-related bill has received the explicit backing of the Biden administration. The Biden administration also warned the company that it may be subject to a ban if ByteDance does not sell its stake in the app's U.S. version; China’s government, however, announced it would oppose a sale. While the RESTRICT Act does not yet have a companion version in the House, Sen. Warner claimed he had “lots of interest” from both Democrats and Republicans in the House and was in discussions with House Speaker Kevin McCarthy (R-CA). However, Politico reported that the bill still faces significant headwinds.

Other Relevant Legislation

  • The DATA Act (H.R. 1153): The month began with the House Foreign Affairs Committee reporting out the DATA Act, which would functionally ban TikTok. The bill has no Democratic co-sponsors, prompting Committee Chairman Michael McCaul (R-TX) to call out the panel’s Democrats for opposing the measure. Rep. Gregory Meeks (D-NY) responded by saying that the DATA Act was a “hastily drafted measure” that could infringe on users' free speech rights before a review was conducted.
  • The ANTI-SOCIAL CCP Act (S.347/H.R.1081): Introduced by Reps. Mike Gallagher (R-WI), Raja Krishnamoorthi (D-IL), and Sen. Marco Rubio (R-FL), the ANTI-SOCIAL CCP Act would force ByteDance to sell TikTok or risk the app being banned altogether.
  • The No Funds for Enablers of Adversarial Propaganda Act (S. 941): Introduced by Sen. Marco Rubio (R-FL), the bill would prohibit federal funding from being granted to individuals or organizations that have any formal relationships with entities headquartered in or under the jurisdiction of countries of concern. Rubio has been a staunch supporter of banning TikTok, and has said that it’s time to ban TikTok “for good.”
  • Finally, House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-WA) announced that she will propose a bill of her own focused on banning TikTok specifically, rather than targeting technologies across multiple countries like the RESTRICT Act.
  • Opposition to Proposed Bans: While congressional views on TikTok are largely hostile, the company has some defenders. Rep. Jamaal Bowman (D-NY) said that the issue has been taken over by “Washington groupthink” and expressed a desire to see Congress focus on a more holistic approach to social media policy; Sen. Rand Paul (R-KY) raised First Amendment concerns in blocking an effort to pass a ban by unanimous consent. TikTok spokesperson Brooke Oberwetter also relayed her concerns over any proposed bans. Some advocacy groups, like the Electronic Frontier Foundation and Engine, which represents start-ups, suggested that the government use this moment to focus on creating a federal data privacy law rather than banning the platform altogether. Sixteen other public interest advocacy groups agreed, penning a letter asking Congress to reconsider a full ban of TikTok, arguing that it would violate First Amendment rights and set a poor global precedent that undermines American leadership in opposing digital repression.
  • TikTok Tries to Reassure Lawmakers: Amid the back-and-forth, TikTok has been working to deploy Project Texas, a security plan under which Oracle Corp. would house the company's U.S. user data and oversee strict data security procedures. On March 23, TikTok CEO Shou Chew testified before the House Energy and Commerce Committee, reporting that Project Texas will introduce new safeguards intended to prevent data misuse and keep Americans’ data domestic. He also reiterated that the Chinese government had not asked for user data, nor had the company provided any.
  • What We’re Reading: Justin Hendrix, Editor of Tech Policy Press, explores the underlying dynamics that have led to this moment in technology policy. The Washington Post published an op-ed by British journalist Chris Stokel-Walker, who wrote about the political pressure that the app is facing, despite the issues at hand being reflective of social media platforms at large. Taylor Lorenz at The Washington Post also critiqued lawmakers’ approach to the TikTok hearing, which was riddled with inaccurate questioning. The New York Times featured an op-ed from the Knight First Amendment Institute’s Jameel Jaffer that applied First Amendment considerations to the ongoing debate about whether or not to ban TikTok and another from former Markup editor Julia Angwin that emphasized the importance of privacy laws. Bloomberg reported that TikTok is considering a divestiture from its China-based parent company, ByteDance Ltd.

Federal Agencies Act on AI Harms; Congress Studies AI Issues

  • Summary: In March, federal agencies took steps to protect consumers and workers from AI-driven harms. Regulators at the National Labor Relations Board (NLRB) and the Consumer Financial Protection Bureau unveiled a new information sharing agreement that aims to protect American consumers and workers from illegal employer surveillance and employer-driven debt. NLRB General Counsel Jennifer Abruzzo argued that some uses of employee surveillance and artificial intelligence tools could prevent workers from exercising their labor rights, and that a "whole-of-government approach" would be critical in ensuring that "workers are able to fully and freely exercise their rights without interference or adverse consequence." Meanwhile, the FTC warned of the dangers of artificial intelligence and synthetic media, including deepfakes and voice cloning, and cautioned companies against unfair or deceptive practices involving these tools. The Commission indicated that fraudsters are already employing generative AI tools to create hyper-realistic fake content rapidly and at low cost. A week later, the Commission reiterated the danger, issuing a consumer alert about scammers using voice-cloning technology to extort individuals who believe they are speaking with a loved one. And in late March, FTC Chair Lina Khan also spoke about the agency’s intent to focus more on AI developments.

Meanwhile, both the House and Senate held hearings on AI. Suresh Venkatasubramanian, a professor at Brown University and a former advisor to the White House Office of Science and Technology Policy (OSTP) on its Blueprint for an AI Bill of Rights, testified before the Senate Committee on Homeland Security and Governmental Affairs on March 8 alongside RAND Corp. President and CEO Jason Matheny and Center for Democracy and Technology President and CEO Alexandra Reeve Givens, urging Congress to take action to protect civil rights while allowing for the responsible deployment of AI strategies. The House Oversight Committee’s hearing featured former Google CEO Eric Schmidt, IBM’s Scott Crowder, and Merve Hickok of the Center for AI and Digital Policy.

Additionally, the U.S. Chamber of Commerce's AI Commission on Competitiveness, Inclusion, and Innovation released its long-awaited report on the promise of AI, calling for a risk-based regulatory framework that balances potential harms with global economic competitiveness. While the report indicates some of the same concerns reflected by Venkatasubramanian's testimony, such as potential impacts on individual rights, it places a much heavier emphasis on the benefits of such technologies, pointing to uses such as patient monitoring in hospital settings, mapping wildfire paths, and creating new avenues for credit.

  • Stakeholder Response: Republicans boycotted the Senate hearing after Democrats blocked conservative commentator Jordan Peterson from testifying due to his unwillingness to appear in person. The Center for Democracy and Technology highlighted Alexandra Reeve Givens' testimony, which centered on how the use of AI risks harm to economic opportunity through its impact on employment, housing, lending, and the administration of public benefits. Adam Thierer of R Street, who served as a commissioner for the Chamber of Commerce effort, asserted that the Chamber report supports and complements the National Institute of Standards and Technology's new AI Risk Management Framework in its ability to respond to new risks as they emerge. David Hirschmann, President and CEO of the Chamber's Technology Engagement Center, argued that "at the core of this debate is a simple premise – for Americans to reap the benefits of AI, people must trust it.”
  • What We’re Reading: Luke Hogg, Director of Outreach at the Lincoln Network, argued that AI tools have the power to either bolster or harm American democracy, and that the critical differentiator is the prevalence of open-source technologies. The Tech Policy Press podcast examined questions around the ethical, legal, and economic risks of generative AI and synthetic media. The New York Times chronicled how a hands-off approach has allowed AI to outgrow Congress's ability to regulate it. A Stat News investigation found Medicare Advantage plans using algorithmic management to limit patient care, while ProPublica documented Cigna’s efforts to automate the denial of insurance claims.

The Winding Path to a Potential 702 Extension

  • Summary: This month, the House Intelligence Committee formed the Foreign Intelligence Surveillance Act (FISA) Working Group, a six-member group seeking a path forward to reauthorize Section 702 of FISA, enacted in the FISA Amendments Act of 2008, which will expire at the end of the year without congressional action. Section 702, which has been reauthorized twice since its initial passage, allows the U.S. government to surveil and collect the emails, text messages, and phone calls of foreigners overseas without a warrant, regardless of whether Americans are involved in the communications. Rep. Mike Turner (R-OH), chair of the House Intelligence Committee, pushed for reform, stating that “there have been and there continue to be many abuses of FISA.” The working group will likely also propose reforms to FISA beyond Section 702, educate Congress on FISA’s purview, and engage the intelligence community on FISA oversight.
  • Stakeholder Response: The Section 702 reauthorization deadline has sparked input from many stakeholders. Federal officials argued that the provision is a crucial source of intelligence on threats to national security, but increased attention on the FBI’s documented violations of data privacy has strengthened skepticism about the free rein Section 702 allows. FBI Director Christopher Wray testified before the Senate Intelligence Committee that FBI searches of a warrantless surveillance database decreased by over 93 percent in 2022. In response to Wray’s testimony, Liza Goitein, senior director for liberty and national security at the Brennan Center for Justice, tweeted that “the FBI is conducting up to 559 warrantless searches for Americans' phone calls, texts, and emails *every day*” even with the sharp decrease. Princeton professor Jonathan Mayer announced a novel method to calculate the amount of data foreign surveillance programs gather on Americans, undermining the argument that Section 702 only targets foreign actors. The method allows for the analysis of sensitive data sets without decryption, potentially enabling analysis while avoiding privacy violations. Additionally, National Security Advisor Jake Sullivan called for the reauthorization of Section 702, indicating the White House’s support. Industry actors also weighed in through the Reform Government Surveillance coalition, with Alphabet, Meta, and Apple pushing for limitations on U.S. intelligence agencies’ ability to collect and view digital communications. The Electronic Frontier Foundation also pushed for reform of Section 702.
  • What We’re Reading: The New York Times wrote about the FBI’s use of Section 702 as well as the connections between conversations on Section 702 and Trump and Biden’s classified files. The Associated Press reported on claims that the FBI wrongfully surveilled lawmakers.

White House Limits Government Use of Commercial Spyware

  • Summary: In late March, President Biden signed an executive order prohibiting the U.S. federal government from making operational use of many commercial spyware products. The executive order cited the national security and counterintelligence risks posed by commercial spyware and noted the increased use of commercial spyware by foreign governments to enable human rights abuses in both authoritarian and democratic regimes. The executive order prohibits federal agencies and departments from “operationally using commercial spyware tools that pose significant counterintelligence or security risks to the U.S. Government or significant risks of improper use by a foreign government or foreign person, including to target Americans or enable human rights abuses.” It also creates a framework for evaluating these risks, establishes reporting requirements within the executive branch, and names remedial steps for commercial spyware vendors to reduce the risk associated with their products. In addition to the executive order, the White House also announced its participation in the Joint Statement on Efforts to Counter the Proliferation and Misuse of Commercial Spyware with nine other countries, the publication of Guiding Principles on Government Use of Surveillance Technology, and participation in the Export Controls and Human Rights Initiative. Additionally, over 150 private sector companies signed onto a set of principles to reduce the risks and harms of commercial spyware in connection with the Summit for Democracy.
  • Stakeholder Response: The executive order was announced in advance of the White House’s second Summit for Democracy and follows increased concern around spyware programs that can gather data from mobile phones without any user interaction. A senior administration official reported that at least 50 U.S. government officials “are suspected or confirmed to have been targeted by invasive commercial spyware” through their mobile phones. The Center for Democracy and Technology tweeted in support of the executive order, but noted that it does not limit non-commercial spyware, provide transparency about which spyware will be banned, or establish the “equal treatment of the privacy interests of all persons.” Additionally, Front Line Defenders and over 40 other organizations published a letter to the governments participating in the Summit for Democracy calling for a ban on spyware sales until safeguards are established, as well as human rights due diligence requirements for venture capital firms involved with spyware and for government agencies contracting with technology companies.
  • What We’re Reading: Forbes wrote about the Drug Enforcement Administration’s use of Apple’s AirTag to investigate the transportation of illegal narcotics. Steven Feldstein and Brian (Chun Hey) Kot at the Carnegie Endowment for International Peace analyzed the greater context of the global spyware industry.

Other Legislation and Policy Updates

The following bills progressed in the Senate in March:

  • The Moving Americans Privacy Protection Act (S. 758, sponsored by Sen. Steve Daines (R-MT)): This bill would protect the personally identifiable information of people moving to the U.S. by amending the manifest disclosure process of the Tariff Act of 1930. Currently, U.S. Customs and Border Protection publicly releases manifest sheets documenting household goods imported to the U.S. The act would ensure that the publicly disclosed information no longer includes personally identifiable information like names, addresses, social security numbers, and passport numbers. The bill passed the Senate in March by unanimous consent.
  • Preventing Child Sex Abuse Act of 2023 (S.724, sponsored by Sen. Chuck Grassley (R-IA)): This bill would expand the definition of child sexual abuse beyond physical contact to include intent of harm through internet platforms, online chat rooms, and webcams. It would cover individuals with any intent to engage in illicit sexual conduct with another person, as opposed to only individuals with a motivating purpose to engage in these behaviors. The bill passed the Senate in early March by unanimous consent. The House companion bill, H.R. 454, was introduced in January by Rep. Tim Burchett (R-TN) and has not yet progressed further.
  • Informing Consumers about Smart Devices Act (S. 90, sponsored by Sen. Ted Cruz (R-TX) and Sen. Maria Cantwell (D-WA)): This act, introduced in the Senate in January, would require manufacturers to disclose when internet-connected devices contain cameras or microphones. In March, the Senate Commerce Committee advanced the bill for consideration by the full Senate. The House passed the companion bill, H.R. 538, in February by a vote of 406-12.

The following bill passed the House in March:

  • Protecting Speech from Government Interference Act (H.R. 140, sponsored by Rep. James Comer (R-KY)): This bill would prohibit federal employees from using their authority to advocate for censorship of speech, including speech on third-party platforms or private entities. The bill defines “censorship” as “influencing or coercing, or directing another to influence or coerce, for the removal of lawful speech, the addition of disclaimers, or the restriction of access with respect to any interactive computer service (e.g., social media).” The measure applies to executive branch employees acting in an official capacity, with exceptions for legitimate law enforcement activity related to child pornography, human trafficking, controlled substances, or classified national security information. H.R. 140 passed the House on a party-line vote (219-206), with all Republicans voting for the bill, but it is unlikely to make progress in the Senate.

The following bills were introduced in March:

  • Facial Recognition and Biometric Technology Moratorium Act of 2023 (S. 681, sponsored by Sens. Edward J. Markey (D-MA), Elizabeth Warren (D-MA), Bernie Sanders (I-VT), and Ron Wyden (D-OR)): This bill would prohibit any federal agency or official from using biometric surveillance. Individuals harmed by the use of such surveillance would have the right to sue the entity believed to be at fault.
  • Upholding Protections for Health and Online Location Data (UPHOLD) Privacy Act of 2023 (S.631, sponsored by Sens. Amy Klobuchar (D-MN), Elizabeth Warren (D-MA), and Mazie Hirono (D-HI)): The UPHOLD Privacy Act would prohibit health data collected from any source from being used in commercial advertising without the user’s consent. Entities collecting data would also be required to publish a privacy policy outlining the purpose of the data collected, how it is used, and a specific list of third parties to which the entity discloses data. The act would also strengthen data minimization and disclosure restrictions and eliminate data brokers’ sale of precise location data.
  • Data and Algorithm Transparency Agreement Act (DATA) Act (S.688, sponsored by Sen. Rick Scott (R-FL)): This bill would require internet platforms to inform users of data being collected regarding their habits, traits, preferences, beliefs, or location and require explicit consent to data collection. The bill would create a private right of action for individuals to bring civil action for violations of the act, establish a right to delete for users to request the deletion of their personal data, and prohibit the selling or sharing of data to third parties without the user’s explicit consent. This bill is not related to H.R. 1153, the Deterring America’s Technological Adversaries (DATA) Act.
  • Combating Cartels on Social Media Act of 2023 (sponsored by Reps. Abigail Spanberger (D-VA), Michael Burgess (R-TX), and Juan Ciscomani (R-AZ)): This bill would require the Secretary of Homeland Security to develop and implement a strategy to improve coordination and collaboration between the appropriate agencies to prevent criminal organizations from using social media platforms to recruit people in the United States to engage in illicit activities. It is a companion to Senate legislation, S. 61, introduced by Sens. Kyrsten Sinema (I-AZ), Mark Kelly (D-AZ), and Bill Hagerty (R-TN) in January.
  • Advertising Middlemen Endangering Rigorous Internet Competition Accountability (AMERICA) Act (sponsored by Sens. Mike Lee (R-UT), Amy Klobuchar (D-MN), Ted Cruz (R-TX), Richard Blumenthal (D-CT), Marco Rubio (R-FL), Elizabeth Warren (D-MA), Eric Schmitt (R-MO), Josh Hawley (R-MO), Lindsey Graham (R-SC), JD Vance (R-OH), and John Kennedy (R-LA)): The AMERICA Act would amend the Clayton Act “to prevent conflicts of interest and promote competition in the sale and purchase of digital advertising” by creating transparency and fair access requirements for digital ad companies that process more than $5 billion in digital ad transactions and prohibiting digital ad companies that process more than $20 billion in transactions from “owning more than one part of the digital ad ecosystem.”
  • To provide authorities to prohibit the provision of services by social media platforms to individuals and entities on the Specially Designated Nationals List and certain officials and other individuals and entities of the People's Republic of China, and for other purposes (H.R. 1714, sponsored by Rep. Brian Mast (R-FL)): No bill text was publicly available at the time of publication.

Public Opinion Spotlight

A Washington Post poll of 1,027 U.S. adults, conducted March 17-18, 2023, focused on public opinion of TikTok. It found that:

  • 41 percent of Americans support a federal ban on TikTok, while 25 percent oppose a ban
  • Those who do not use TikTok are more likely to support the ban (54 percent) while those who use TikTok every day or have used the app in the past month are more likely to oppose the ban (54 percent and 45 percent respectively).
  • 72 percent of Americans believe it is likely that the platform hurts teens’ mental health
  • 65 percent of Americans believe that TikTok likely collects personal data on Americans for the Chinese government
  • 56 percent of Americans believe it is likely that China controls what American users see on TikTok
  • 51 percent of Republicans and 33 percent of Democrats support the TikTok ban
  • 43 percent of Americans believe that TikTok collects “about the same amount” of user data as other social media platforms

Global Strategy Group administered a public opinion survey of 1,000 registered voters from February 23-27, 2023, regarding the state of misinformation. They found that:

  • 78 percent of voters say they encounter misinformation somewhat or very often
  • 53 percent of voters are very worried about misinformation
  • “Americans are unlikely to say they see misinformation on the cable news networks they frequently watch: just 23 percent of frequent Fox News viewers say they see misinformation on the channel, as do just 21 percent of CNN viewers on CNN and 14 percent of MSNBC viewers on MSNBC. However, frequent social media users are most likely to say they see misinformation on Facebook (52 percent).”
  • 48 percent of voters are very worried about the impact of misinformation on climate change, 47 percent are worried about impact on crime, 44 percent on issues of gender and sexuality, and 42 percent on the impact on science and the strength of American democracy

An Ipsos poll on AI, conducted from February 28-March 1, 2023, among 1,105 U.S. adults, found that:

  • 62 percent agree that using AI in the workplace can save time and resources
  • 56 percent say AI generated written work can produce bias or inaccuracies
  • 46 percent think it’s likely AI will change their jobs in the next five years
  • 42 percent do not believe that AI will create new jobs and opportunities to make up for the jobs that are lost
  • 64 percent agree that the government should take action to prevent the potential loss of jobs due to AI
  • 50 percent believe that increased use of AI will lead to more income inequality and a more polarized society

Morning Consult conducted a poll from February 17-March 19, 2023, among 2,200 U.S. adults to understand awareness of and interest in AI. They found that:

  • “57 percent of consumers said they have heard of AI chatbots in the news, up from 50 percent just a month ago”
  • “Nearly 2 in 3 (65 percent) consumers said companies that develop AI models bear at least some responsibility for doing so ethically”
    • 28 percent of consumers say that the Federal Trade Commission or other regulators are very responsible for ethical development
    • 24 percent of consumers say that state governments are very responsible for ethical development
    • 24 percent of consumers say that lawmakers in Congress are very responsible for ethical development
    • 22 percent of consumers say that the U.S. president and his administration are very responsible for ethical development
  • “More than a third (35 percent) of consumers completely or mostly trust AI search to provide unbiased results, up from 27 percent a month ago, and trust in companies to develop AI responsibly is also up 8 points.”

Advocacy organizations Accountable Tech and LOG OFF conducted a poll with 912 American teenagers to understand social media usage habits. They found that:

  • “66 percent feel they are losing track of time on social media, with Black and Hispanic teenagers being disproportionately affected
  • 50 percent lose sleep because they feel ‘stuck’ on social media
  • 43 percent of teens don't do as much homework or school work as they wanted because they got stuck on social media
  • 50 percent bought things they didn’t really want due to targeted ads
  • 75 percent received ads for things they just talked about”

A Morning Consult poll of 2,205 U.S. adults on bias and misinformation associated with AI, conducted February 17-19, 2023, found that:

  • 89 percent of respondents believe data privacy is important when selecting a search engine
  • 36 percent of respondents would not trust a search engine that uses AI to use unbiased algorithms
  • 39 percent of respondents would not trust a search engine that uses AI to respect their data privacy

- - -

We welcome feedback on how this roundup and the underlying tracker could be most helpful in your work – please contact Alex Hart and Kennedy Patlan with your thoughts.

Authors

Kennedy Patlan
Kennedy Patlan is a Project Manager at Freedman Consulting, LLC, where she assists with strategic development, project management, and research. Her work covers technology policy, health advocacy, and public-private partnerships.
Rachel Lau
Rachel Lau is a Senior Associate at Freedman Consulting, LLC, where she assists project teams with research, strategic planning, and communications efforts. Her projects cover a range of issue areas, including technology policy, criminal justice reform, economic development, and diversity and equity...
Carly Cramer
Carly Cramer is an Associate at Freedman Consulting, LLC, where she assists project teams with communications, policy research, and coalition support. Her work covers public health, artificial intelligence policy, and public-private partnerships.
