Our announcement that GoGuardian and Securly failed to block more than half of the pornographic websites we tested has clearly struck a nerve.
We think blocking access to pornography is the most fundamental thing a web filter should do, and we stand by our claims and concerns. We’ve also heard from many schools that share those concerns; they tested their filters and were shocked at the results.
This is about more than some dirty images. It’s about violence, illicit pornography, radicalism and child trafficking. It’s bad. This isn’t marketing; it’s us speaking out about the things we value and work hard for every day.
Securly claims that PageScan means explicit sites will be allowed only once, then categorized and blocked. We retested our lists days later and the sites were still allowed, so PageScan does not appear to be working as claimed.
GoGuardian pairs a minimal database with Smart Alerts, which failed to detect and block most of the sites we tested; even when the alerts did work, they were delayed, leaving content visible for up to a minute.
Yes, we tested with our own list. We believe it is the most complete list available, and there is no comparable alternative we could have used. As soon as a new inappropriate site is identified on the web, it gets added to our database. We use advanced AI, human review, and an automated backend where we can monitor, test, and review pornographic content ourselves — rather than relying on students stumbling across porn as the first line of defense.
And of course we agree that filtering is about more than blocking explicit content — but it must at least do that! Our solutions provide far more: granular categories, reporting, alerts, and policies to ensure students are safe, productive, and engaged on their school devices.
Some questions to ask as you evaluate filters:
• The web is huge and new sites are constantly added. How does the solution handle uncategorized/unknown sites?
• Does the AI identify obscene terms in foreign languages? Does it process videos and images?
• Is the company compliant with student data privacy regulations? Who reviews sites, and who reviews flags and tips? Where are those reviewers located, and where is the data stored?
• How many URLs are in the database? In the pornography category?
• Does the company work with the FBI, the UK Home Office, and the Internet Watch Foundation to block and remove child pornography?
• How many explicit sites could get through (even just once or twice, even for just a minute), and how would your parents and community respond when that happened?
We get it. There are a lot of different ways to do things and to approach filtering. But we believe this isn’t just a different approach; it’s a fundamental failure to block explicit content.
This isn’t about money. We understand that our decision to share this information will lead some schools to choose another filter, and that’s fine. We’ve been filtering school devices for 20 years, so we also know many schools wouldn’t accept such a flawed approach to content filtering.
Regardless of CIPA guidelines, we are parents, and we don’t think our students’ and children’s access to extremely obscene content is something to take lightly. We had to speak up, and we believe it was the right thing to do.
If you’d like to test your filter and see for yourself, email us and we’ll send you sample sites. Hundreds of thousands of inappropriate sites are allowed through by other filters, and we don’t think that’s acceptable.