Inspiration

The BullyBlock team was inspired by the countless cyberbullying victims who deserve proper protection from the more harmful side of the internet.

What it does

BullyBlock scans webpages for harmful words and phrases, then replaces them with asterisks.
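A minimal sketch of that scan-and-mask idea (the function and names here are illustrative, not BullyBlock's actual code): each phrase is matched case-insensitively and replaced with asterisks of the same length.

```javascript
// Replace every occurrence of each phrase with asterisks of equal length.
// In the extension, this logic would run over the page's text nodes, e.g.
// via document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT).
function censor(text, phrases) {
  let result = text;
  for (const phrase of phrases) {
    // Escape regex metacharacters so phrases are matched literally.
    const escaped = phrase.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
    result = result.replace(new RegExp(escaped, "gi"), (m) => "*".repeat(m.length));
  }
  return result;
}

console.log(censor("You are a JERK", ["jerk"])); // "You are a ****"
```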

How we built it

We built BullyBlock as a Google Chrome extension using HTML, CSS, and JavaScript for real-time use when browsing online.
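As a rough sketch of how such an extension is wired together, a minimal Manifest V3 manifest might look like the following (the file names phrases.js, content.js, and popup.html are illustrative assumptions, not necessarily BullyBlock's actual files):

```json
{
  "manifest_version": 3,
  "name": "BullyBlock",
  "version": "1.0",
  "action": { "default_popup": "popup.html" },
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["phrases.js", "content.js"],
      "run_at": "document_idle"
    }
  ],
  "permissions": ["storage"]
}
```

Listing phrases.js before content.js ensures the phrase database is loaded before the scanner runs.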

Challenges we ran into

Our first big challenge was importing our custom database of harmful words and phrases into the extension. We eventually resolved this with a separate JavaScript file containing the "harmful phrases" database, which BullyBlock scans through whenever it is active.
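As a sketch of that approach (the entries and file name below are placeholders, not the real database), the separate file can simply define a global array that the content script reads; in a Chrome extension, content scripts listed earlier in the manifest run first in the same isolated world, so later scripts see the array.

```javascript
// phrases.js (hypothetical) — the "harmful phrases" database, kept in its
// own file so the list can grow without touching the scanner code.
const HARMFUL_PHRASES = [
  "placeholder insult",
  "placeholder slur",
  // ...real entries would go here
];

console.log(`Loaded ${HARMFUL_PHRASES.length} phrases`);
```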

Near the end of development, we had trouble with the toggle switch: it would appear "off" after the extension popup was closed and reopened, even though the extension remained active. We patched this by saving the switch's state and reapplying it to the switch whenever the popup reopens.
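The save-and-restore pattern can be sketched as follows. The names are illustrative, and a real extension popup would persist the flag with chrome.storage.local or localStorage; an in-memory Map stands in here so the logic is self-contained.

```javascript
// Stand-in for chrome.storage.local / localStorage.
const store = new Map();

// Called whenever the user flips the toggle.
function saveSwitchState(isOn) {
  store.set("bullyblockEnabled", isOn);
}

// Called when the popup reopens, so the switch matches the real state.
// Defaults to "on" if no state was ever saved.
function restoreSwitchState() {
  return store.get("bullyblockEnabled") ?? true;
}

// Simulate: user enables BullyBlock, closes the popup, reopens it.
saveSwitchState(true);
console.log(restoreSwitchState()); // true — the switch no longer resets to "off"
```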

Accomplishments that we're proud of

We are most proud of developing a functioning webpage scanner that can detect practically anything we choose to add to our "harmful phrases" database. This capability holds potential for future expansion toward detecting images, videos, audio, and other forms of online content.

Our tips and tricks section at the bottom of the extension menu was also a nice added touch. Every time users open BullyBlock, they see a randomly chosen encouraging message, which can lift their mood precisely at the moment they feel compelled to activate the extension.
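The random-tip mechanism amounts to picking one entry from a message list each time the popup opens; a small sketch (the messages here are hypothetical, not BullyBlock's actual copy):

```javascript
// Pool of encouraging messages shown in the tips and tricks section.
const TIPS = [
  "You are not alone. Talk to someone you trust.",
  "Blocking works better than replying.",
  "Screenshots are useful evidence if you need to report abuse.",
];

// Pick one message uniformly at random on each popup open.
function randomTip() {
  return TIPS[Math.floor(Math.random() * TIPS.length)];
}

console.log(randomTip());
```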

What we learned

Online, the only way negativity spreads and festers is if users add fuel to the fire. Ignoring it is truly the best way to stomp it out, but when it gets to be too much for some to bear, BullyBlock is here to help block the hate.

What's next for BullyBlock

With the potential to expand across multimedia, we plan for BullyBlock to scan images, videos, and audio as these media become the norm on modern websites. With a large enough user base, work can begin on a machine learning model that automatically detects harmful content without the need for a database.
