GPTgreen

Our Team: The Gersons

  • Brian Becker - Website
  • Benson Jiang - Website
  • Isaac Saland - Website
  • Bryan Cho - Chrome Extension
  • Seth Lupo - Chrome Extension
  • Yuvit Batra - Chrome Extension

Description

Did you know that some scientists predict AI will cause CO2 emissions in certain American cities to double over the next 10 years?

Large language models are a powerful technology; however, they are extremely power-intensive. This resource-hungry technology drains electricity, guzzles water, and spews CO2. As a group, we aimed to:

  • Spread awareness about one's own environmental impact
  • Reduce one's own environmental impact

Awareness

We've created multiple tools to quantify the environmental impact of generating text. They are:

  1. Live Resource Consumption Chrome Extension We've created a Chrome extension that predicts and tracks live resource consumption. It displays live estimates of the energy, water, and CO2 used in a given chat, updating as new text is produced. Additionally, our extension supports predicting the resource consumption of generating text from multiple file formats (.txt, .pdf, .docx). This allows individuals to quantify the day-to-day environmental impact of their AI usage.

  2. Long-Term Resource Consumption Analytics We've created a website that, in a few short steps, shows your long-term resource consumption. This allows individuals to quantify the long-term environmental impact of their AI usage.
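The live estimates above boil down to multiplying a token count by per-token resource factors. Here is a minimal sketch of that idea; the token heuristic (~4 characters per token) and the per-token factors are illustrative assumptions, not the project's actual peer-reviewed figures:

```javascript
// Hypothetical per-token footprint factors -- placeholders for
// illustration, not the sourced values used by the extension.
const PER_TOKEN = {
  energyWh: 0.003, // watt-hours per generated token (assumed)
  waterMl: 0.05,   // millilitres of cooling water per token (assumed)
  co2G: 0.0015,    // grams of CO2-equivalent per token (assumed)
};

// Rough token count: roughly 4 characters per token for English text.
// (The real extension uses GPT-Tokenizer for an exact count.)
function approxTokens(text) {
  return Math.ceil(text.length / 4);
}

// Estimate the cumulative footprint of a chat transcript or file.
function estimateFootprint(text) {
  const tokens = approxTokens(text);
  return {
    tokens,
    energyWh: tokens * PER_TOKEN.energyWh,
    waterMl: tokens * PER_TOKEN.waterMl,
    co2G: tokens * PER_TOKEN.co2G,
  };
}
```

Because the estimate is a pure function of the text seen so far, it can be recomputed cheaply every time new tokens stream into the page, which is what makes a live-updating display feasible.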

Each of our estimates is backed by peer-reviewed research; our sources are linked on our website.

Reduction

While this is a complex issue, there is one relatively simple way to reduce the footprint of each query: ask ChatGPT to produce less text. In our extension, we created an "Energy Saver" setting that automatically prompt-engineers each query to reduce output size.
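One way such a setting could work is to prepend a brevity instruction to each outgoing prompt. This is a sketch under assumptions; the prefix wording and the function name `applyEnergySaver` are hypothetical, not the extension's actual implementation:

```javascript
// Hypothetical "Energy Saver" rewrite: prepend a brevity instruction
// so the model generates fewer output tokens.
const SAVER_PREFIX =
  "Answer as concisely as possible; omit preamble and repetition. ";

function applyEnergySaver(prompt, enabled = true) {
  if (!enabled) return prompt;
  // Avoid stacking the prefix if the prompt already carries it.
  return prompt.startsWith(SAVER_PREFIX) ? prompt : SAVER_PREFIX + prompt;
}
```

Since generation cost scales with output length, trimming even a few sentences per response compounds into a meaningful reduction over many queries.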

Acknowledgements

  • ChatGPT itself was used just about everywhere in this codebase.
  • We used GPT-Tokenizer to count the tokens in text. This was instrumental to our model.
  • Big thanks to Ethan Siskel for this idea!