Understanding Content Filtering: An FAQ for Nonprofits
Fri, 01/25/2008 - 13:45 — TechMission
What to know before you restrict access to online materials
June 4, 2007
This article is the first in TechSoup's two-part series on content filters. To learn more about individual content-filtering tools, please see TechSoup's article Content Filtering Tools: An FAQ for Nonprofits.
If you're in charge of the public computer lab at your school or library, chances are you're familiar with the Children's Internet Protection Act (CIPA), a U.S. law that aims to restrict children's access to online content that has been deemed inappropriate or harmful to minors. Under the CIPA, schools and libraries that wish to receive federal funds to purchase technology equipment must implement protection measures to help shield minors from objectionable online material.
To comply with the CIPA, many libraries and schools are turning to content filters, hardware or software that restricts what sites and pages users can access. While schools and libraries should consult the full text of the CIPA (PDF) to learn more about what kind of a content-filtering system will help them comply with this law, other organizations that operate public computer labs (such as community centers) have more flexibility and can make their own content-filtering rules.
Regardless of whether or not you must comply with the CIPA, having a solid grasp on the fundamentals of content filtering is the first step toward successfully introducing it to your computer lab and to the people you serve. To help you quickly understand how content filters work and gain insight into issues surrounding this technology, we've compiled the answers to a handful of frequently asked questions.
1. What are content filters and how do they work?
A content filter is a piece of hardware or software that acts as a shield between the Internet and a user's computer, blocking access to potentially objectionable or offensive material. Most content filter manufacturers compile a list of sites they deem objectionable and classify them under different profiles, which often pertain to the end user's age.
For instance, a content filter's most aggressive blocking profile might be designed for children under 10 and would therefore restrict all access to a large range of materials, such as pornography; pages about illegal drugs; sites that deal with sex education; and sometimes even social-networking sites such as MySpace. On the other hand, profiles for adult users might allow most types of content forbidden to younger users yet still block the majority of sites that are known to install malware. If one of the filter's built-in profiles is too restrictive or lax for your audience's needs, you will often be able to create a custom profile or alter one of the presets to your liking.
In addition, content filters generally let you block any Web pages or search results that contain single or multiple instances of user-specified keywords. Many content filters also allow you to blacklist (always block) specific sites by entering their URLs. Note that content-filter manufacturers often provide automatic updates to their product's list of objectionable sites in order to account for sites that have recently appeared on the Internet.
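The two blocking techniques described above — URL blacklisting and keyword matching — can be sketched in a few lines of code. The following Python snippet is a simplified illustration only (the host names and keywords are hypothetical, and no real filtering product works exactly this way), but it shows the basic decision a filter makes for each request.

```python
# Hypothetical, minimal sketch of two common filtering techniques:
# a URL blacklist and user-specified keyword blocking.

BLACKLIST = {"badsite.example", "malware.example"}   # hosts that are always blocked
BLOCKED_KEYWORDS = {"casino", "warez"}               # user-specified keywords

def is_blocked(url: str, page_text: str) -> bool:
    """Return True if the filter should refuse this page."""
    # Extract the host portion of the URL (e.g. "badsite.example").
    host = url.split("//")[-1].split("/")[0].lower()
    if host in BLACKLIST:
        return True
    # Block the page if any forbidden keyword appears in its text.
    words = set(page_text.lower().split())
    return any(keyword in words for keyword in BLOCKED_KEYWORDS)
```

In practice, a real filter would apply such checks inside a proxy or browser plug-in and would consult a vendor-maintained, regularly updated site list rather than a hard-coded one.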
2. What are the potential pros and cons of content filtering?
Content-filtering technologies have sparked much controversy and debate. While advocates claim that this technology protects minors from harmful material and online predators, opponents (who often refer to it as "censorware") believe that content filtering is inherently error-prone and can restrict access to educational or other important information.
Libraries and schools may find that implementing a content-filtering solution on all of their computers can benefit them financially, as this will help them comply with the CIPA and qualify for federal funding for technology-related purposes. Other types of organizations that offer public-access computers to children or youth groups might also find content filters beneficial because they can reduce liability and help cut down on phone calls or visits from distressed parents.
Another potential benefit of installing a content-filtering system is that it can help decrease the amount of malware that your patrons inadvertently install on your machines. Many content filters keep lists of sites known to install malware and prevent users from accessing them; also, if your content filter has a blacklist feature, you can manually block sites that you know install malware. Note that a content filter should never be considered a substitute for dedicated anti-malware and antivirus programs.
Despite the aforementioned benefits, content filters are not without their downsides. Several predecessors to the CIPA were struck down by the Supreme Court, partly on the grounds that they violated individuals' right to free speech. The American Library Association (ALA) has also challenged the CIPA in court, though the Supreme Court eventually upheld the law as constitutional.
Other critics of content filtering — including the Electronic Frontier Foundation (EFF) and the American Civil Liberties Union (ACLU) — contend that the technology is also flawed because it often accidentally blocks useful materials that could be used for educational purposes. For instance, an aggressive content filter might block the term "breast," which would prevent patrons from conducting legitimate research into topics such as cancer or anatomy. Peacefire.org — an organization that advocates for youth freedom of speech online — has tested a number of popular content-filtering programs and has compiled a list of legitimate sites that were accidentally blocked.
Content-filter opponents also object to the technology because sites deemed objectionable are subjectively chosen by hardware and software manufacturers and not by a central, impartial organization.
3. What can we do to help ensure that content filters do not undermine our constituents' trust?
Another potential downside of adding a content-filtering system to your public-access computers is that you may receive backlash from your patrons. Not only can content filters make your constituents feel as if you don't trust them, but a particularly aggressive filter can impede their work by blocking legitimate sites that they need to perform research.
While your patrons may not like having their Internet access restricted, educating them about safe surfing habits and the dangers of cyberspace can at least help them understand why your organization chooses to filter content. For example, explaining how phishing scams can steal personal information or how sexual predators have been known to use MySpace to find victims can help illustrate online dangers to your patrons and lend credence to your decision to install content filters.
You might also find it useful to implement a more formal method of educating your patrons. Barry Martin, CEO of the State YMCA of Pennsylvania, recalls a successful education program he was involved with when working at a teen center's computer lab.
"Before a teen was given their user/pass combo," said Martin, "they had to attend an orientation, which included safe-surfing concepts, described our core values, explained our watchdog policies, and ended with the signing of a safe-surfing pledge."
Martin notes that because the teen center relied solely on education and did not use a content filter, the staff received proper training in counseling techniques and also occasionally had to deal with disgruntled parents. Overall, though, Martin feels that the educational policy was mostly positive, adding that "the time spent one-on-one reviewing surfing habits with the teens was great for our mentoring relationships with the kids."
If you need guidance and resources to help your organization create an effective Internet safety policy, you might visit NetSmartz.org, which offers educational aids — such as videos, activity cards, and games — aimed at kindergarten through high-school students. WiredSafety.org also provides potentially useful materials for educators and librarians, including a database of youth-friendly Web sites and multimedia lessons and tutorials.
4. Is it possible for users to circumvent the content filter?
Not only can content filters sometimes block access to helpful materials, but they can also be bypassed entirely by tech-savvy individuals. The Internet contains many instances of tutorials and advice for circumventing content filters, such as those found on Peacefire.org's homepage; other tips for bypassing content filters can be found by using a search engine and the appropriate keywords.
Although some content filters may actually block pages that offer advice for bypassing them, there is currently no way to guarantee that users won't eventually be able to find a way around the filter. Again, actively educating your patrons about online hazards and clearly explaining your content-filtering policies may help to discourage users from attempting to circumvent the filter.
5. What is an Internet use policy and what are some best practices for creating one?
Organizations that choose to implement a content-filtering system will likely also want to draft a formal Internet use policy, a document that outlines rules regarding the use of its public computers. For instance, an Internet use policy might specify how long a user can access the Web each day and whether or not patrons can download and install software.
Internet use policies should also disclose the presence of any content filters and may explain what types of online material users are forbidden to access. You might also choose to include details about what sort of penalties patrons will face if they fail to adhere to the Internet use policy. No matter what specific information your Internet use policy contains, you should make it readily available to your patrons and be prepared to enforce it at all times.
If you are in the process of drafting an Internet use policy, you may want to visit the American Library Association's (ALA) checklist for creating an Internet use policy, which provides several general tips. You might also use another library or organization's Internet use policy as a guideline for your own; the ALA's site provides several examples of Internet use policies, including those from the Multnomah County Library and the San Antonio Public Library. The New Jersey State Library has also created an Internet use policy template that includes language required by the CIPA.
While the aforementioned resources can help you get started drafting an Internet use policy, the ALA recommends that libraries and organizations using content filters speak with an attorney before finalizing the document.
6. How can I ensure that our Internet use policy protects patrons' privacy?
To help ensure that your data-collection practices do not compromise your patrons' privacy, you may find it useful to periodically review them. In a process known as a privacy audit, organizations assess what kind of data they collect, as well as how they store, share, and delete this information. A privacy audit can help you not only determine whether your data-collection policies could ever face legal challenges, but also whether they are detrimental to your organization's mission or public perception.
Organizations seeking advice about how to perform a privacy audit should check out the ALA's Conducting a Privacy Audit page, which contains tips and resources dealing with the topic.
If your organization eventually decides that the pros of implementing a content-filtering system outweigh the cons, you will next want to research a hardware or software solution that fits your budget and needs. To get a clearer picture of what type of content-filtering system is best for you and what features to look for, read TechSoup's article Content Filtering Tools: An FAQ for Nonprofits.
About the Author:
Brian Satterfield is Staff Writer at TechSoup.
Copyright © 2007 CompuMentor. This work is published under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License.