What would the impact be to your business or organization if your web server were running and serving pages, but the very people you needed to see them couldn’t access them?
Recently, I’ve come across a new form of Denial of Service (DoS) against Internet sites: your customers are blocked from your web site without your knowledge. This technique doesn’t use bots, takes little to no effort, and requires no network bandwidth. It prevents users from accessing your web sites, reducing your web presence and potentially your revenue.
How is it happening? Who is behind this evil plot? The culprit is something as innocuous as the Internet Filter (aka content-control software).
Many organizations and homes use Internet Filters to protect their users from the seven deadly sins of the Internet (e.g., adult content, gambling, sexuality, and malicious sites). Companies use them to enforce employee policies on the use of the Internet. They help keep honest people honest, which is often my main goal in security.
They can also be used to keep honest people from honest sites. It’s happened to Security Catalyst. A few months ago I tried to access this site from work, but was blocked by our Internet Filter. It was labeled an MP3 site, which my company disallows as part of its policy. (We won’t debate that here.) You and I know that this isn’t an MP3 site, but because Michael uses MP3 for his podcasts, our vendor mislabeled it. I notified the vendor and they corrected it. However, we don’t know how long it was blocked and who may have been impacted.
This isn’t an isolated occurrence. I’ve seen it happen time and time again: DoS by Internet Filter.
It occurs in one of two ways: either the automated software that evaluates and categorizes sites mislabels them, or someone requests that a particular site be placed in a certain category. Either way, we are seeing innocent websites placed in categories that many homes and organizations block. It could even be used maliciously to block a competitor’s web site.
I’ve spoken with Internet Filter companies and they don’t see this as a problem. They are in denial that their software can cause a DOS against websites. It seems they don’t want to take the time and energy to find a better solution.
Granted, there is no easy answer to this problem. It’s impossible to have a human look at every new or changed Internet site across the globe. Automation is necessary. However, it must be intelligent enough to properly assess the true nature of a web site and categorize it appropriately. (Does one game on a site make it a Gaming site?) Also, the vendor must have a process that allows an unbiased review of web sites and can quickly re-categorize a mislabeled site.
Other ideas I have: (1) Create a voting process that lets users categorize each website, along the same lines as a wiki. (2) Establish an independent review board to categorize sites or to mediate disputes over a site’s category.
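To make idea (1) concrete, here is a minimal sketch of how a vote-based categorizer could work, with idea (2) as the fallback: a site is auto-categorized only when enough voters clearly agree, and anything disputed is escalated to a human review board. The function name, vote counts, and thresholds are all my own illustrative assumptions, not any vendor’s actual system.

```python
from collections import Counter

MIN_VOTES = 5        # assumed: don't trust a category until enough users vote
CONSENSUS = 0.8      # assumed: fraction of votes needed to auto-assign

def categorize(votes):
    """Return (category, needs_review) for a list of user-submitted labels.

    A site is auto-categorized only when a clear majority agrees;
    otherwise it is flagged for an independent review board.
    """
    if len(votes) < MIN_VOTES:
        return None, True  # too few votes: hold for human review
    label, count = Counter(votes).most_common(1)[0]
    if count / len(votes) >= CONSENSUS:
        return label, False  # clear consensus: auto-assign the category
    return None, True  # disputed: escalate to the review board

# Example: a podcast site that a few voters wrongly tagged "MP3"
votes = ["Podcasts"] * 8 + ["MP3"] * 2
print(categorize(votes))  # ('Podcasts', False)
```

The key design point is the escalation path: instead of letting a handful of mistaken (or malicious) votes silently block a site, anything short of consensus lands in front of an unbiased reviewer.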
What do you think? Am I alone in thinking this is a problem? What are other ways to improve Internet site categorization?