Encode AI

Civic and Social Organizations

We help policymakers and the public navigate AI.

About us

Encode helps policymakers and the public navigate AI.

Website
http://encodeai.org
Industry
Civic and Social Organizations
Company size
2-10 employees
Type
Nonprofit
Founded
2020


Updates

  • We're hiring for a policy advisor! You can view the job application and direct Airtable application at the link below. This role will be focused on the campaigns and coalitions that have been key to securing our wins on state bills like SB 53 and the RAISE Act, federal legislation like the TAKE IT DOWN Act, and twice blocking efforts to place a blanket moratorium on state AI regulation. https://lnkd.in/gyy5Uv-R

  • We’re excited to welcome Ben Snyder, Severiano Christian, and Claire Larkin to our small but mighty team at Encode! 🎉

    • Severiano Christian (based in Sacramento) is joining as our Director of California Policy. At Encode, they’ll lead all of our California state-level advocacy and advise on political operations. Previously, Seve worked in California’s state legislature for 7 years, including as the lead legislative staffer for Senate Bill 53.

    • Ben Snyder (based in D.C.) is joining as a Policy Advisor. At Encode, he’ll lead our biosecurity initiatives and support other ongoing campaigns. Previously, he worked on biosecurity policy as a researcher at Texas A&M University.

    • Claire Larkin (based in D.C.) is joining as a Policy Advisor. At Encode, she’ll lead strategic operations and support our external advocacy and partnerships. Previously, she served as Chief of Staff at the Institute for Progress.

  • Encode AI reposted this

    Exciting news: I’ve collaborated on two proposals for SXSW 2026, and your vote can help bring them to the stage! Voting is open now through South by Southwest’s PanelPicker until August 24. You’ll need to create a free user account (one per email), and then you can vote on the sessions I’ve developed with leading advocates and researchers in AI and tech policy.

    The first, Midterm Report: Unpacking the Trump AI Action Plan, is a collaboration with Asad Ramzanali and Kate Brennan, where I will join as a co-panelist. The presentation examines the impacts of the Trump Administration's AI Action Plan and discusses how U.S. AI policy can be reoriented to serve the public good. (https://lnkd.in/eyUjKNhX)

    The second, Not So Fast Senator: Beating the AI Moratorium, is a behind-the-scenes discussion I will moderate with Cody Venzke, Ilana Beller, and Adam Billen, where we will discuss how a bipartisan grassroots campaign stopped a sweeping AI bill. We'll share lessons on coalition building, countering Big Tech narratives, the current state of AI preemption (some things may have moved or been implemented by then), and what Trump's AI plan means for state and federal policy moving forward. (https://lnkd.in/e_Wzwd4v)

    How you can help: Create a free SXSW PanelPicker account here: http://bit.ly/41Lo4ML and vote for one (or both!) of these sessions. Your support means a lot. It helps ensure that critical conversations about AI, democracy, and accountability make it onto the SXSW stage.

    #SXSW #SXSW2026 #AI #TechPolicy #AIAccountability #PublicInterestTech

  • Industry has been pushing a "moratorium" on state AI legislation for the last month and a half. Last night, Senator Blackburn released new compromise text with Senator Cruz. The problem: it's just as bad as -- if not worse than -- the last version. It has no meaningful exceptions. It does not protect kids. It does not protect BEAD. It's the same bribe industry has been pushing for a month and a half. We organized a letter with 130+ organizations that agree. https://lnkd.in/g7sWhB_k

  • The U.S. House of Representatives just voted to pass the TAKE IT DOWN Act 409-2! Having already passed the Senate in February, the bill now heads to the President’s desk for signature. With bipartisan support and endorsements from over 120 civil society organizations, companies, trade groups, and unions, the bill is primed to become law.

    “Just a few years ago it was unthinkable that everyday people would have the technical expertise to generate realistic intimate deepfake content. But today anyone can open an app or website and create realistic nude images of anyone—an ex, a classmate, a coworker—in minutes. The resulting wave of abuse, compounded with the existing crisis of image-based sexual exploitation, has already robbed thousands of victims of their personhood and sense of self,” said Adam Billen, Vice President of Public Policy at Encode. “Congress has taken a critical step in addressing AI harms by passing the TAKE IT DOWN Act, which will not only hold perpetrators accountable but empower millions of victims to reclaim control of their images in the aftermath of abuse.”

    Encode organized multiple coalition letters in support of the bill, published op-eds in Tech Policy Press and the Seattle Times, convened victims, students, and lawmakers (including sponsors Senators Cruz and Klobuchar) to prevent deepfake porn in schools, and built a website tracking deepfake porn incidents in schools around the country. This morning, Encode’s Adam Billen spoke at a virtual press conference addressing the legislation’s constitutional grounding. https://lnkd.in/gkd37FXm

  • We filed an amicus brief with Young People's Alliance and Design It For Us in Garcia v. Character Technologies, the case involving the tragic death of Sewell Setzer by suicide last year. We argue that prematurely dismissing Garcia's lawsuit on the basis of the First Amendment is a mistake given the novelty of the case and importance of factual discovery to 1A analysis. Courts have historically declined to dismiss claims at this early stage in other cases involving novel technologies, including in litigation over harmful design practices on social media. Cementing complete protection against liability for potentially serious, harmful conduct without guardrails could leave millions of young people vulnerable. https://lnkd.in/gZWPAWzy


  • At Encode, we created an anonymous platform for victims to report non-consensual deepfake explicit images of children—a crisis spreading at an alarming scale. While millions of high schoolers are aware of explicit AI-generated content circulating at their schools, only a handful of cases have been reported in the news.

    Inspired by a South Korean initiative that uncovered over 500 cases of abuse, our incident map offers a way for victims to share their experiences while preserving their privacy. By aggregating these reports, we hope to illuminate the scale of the issue as we continue our legislative advocacy efforts.

    Out today in Tech Policy Press: https://lnkd.in/g6nMKTsY
    Link to the site: https://lnkd.in/gnJ2VtB6
