The Problems of Filtering

Yesterday, we at the OpenNet Initiative (a collaboration of Berkman/HLS with the Advanced Network Research Group of the Cambridge Security Programme at the University of Cambridge, where Rafal Rohozinski is the principal, and the Citizen Lab at the University of Toronto, where Ron Deibert is the principal) released a report on Internet filtering in Myanmar (Burma).  The NYT’s Tom Zeller wrote about it, and Xeni Jardin kindly linked to it from Boing Boing.

I’ve been thinking about what the overall story is with filtering, circa Fall 2005.  Here are some draft thoughts:

Filtering regimes are becoming more sophisticated and more commonplace around the world as the Internet assumes greater importance as a means of communication, as a forum for doing business, and as a hotbed of political activism.  There’s a cat-and-mouse game being played between states that seek to control the information environment and citizens who seek to speak and read and interact freely online.  Filtering technologies, and the way they are implemented, grow more sophisticated with each passing year.  At the same time, we are seeing a corresponding rise in “soft controls”: the legal and normative mechanisms of control and surveillance in many places.  We also fear a growing correlation between states with lousy human rights records and those that filter the Internet extensively.

The paradigmatic example is China.  The narrative of China’s filtering regime tells this story of increasing technological and legal sophistication most clearly.  In China, the ONI’s research from 2002 through 2005 has shown a state that continues to find more, and more effective, ways to control speech online and to track the movements of its citizens, even as technologies to evade these controls become more sophisticated as well.

The country formerly known as Burma, now Myanmar, has been controlled by the army for many years.  As our research has shown, the state appears to have moved from an open-source filtering technology (DansGuardian) to a proprietary one (Fortinet) in the past year or so, and most indicators suggest that the filtering regime, along with the legal regime, is growing more restrictive rather than less so.

A third example is Saudi Arabia, which for years has had a relatively transparent filtering regime, narrowly tailored mostly to pornography, but which is recently reported to have shut down access to the entire set of blogs hosted on Blogger, a popular weblog hosting service.  That step plainly leads to over-inclusive blocking of speech that meets none of the state’s criteria for censorship.  We’ll be writing more in the next few months about various other states that reveal some of the same trends.

Burma may be a canary in the coal mine.  Burma shows how a developing country with a repressive ideology overall can extend that viewpoint to the information environment as its citizens come online.  If the Internet in Burma is introduced as a restrictive environment in which one’s actions are blocked and tracked by the state right off the bat, the state has a much better chance of keeping a lid, at least for a while, on the Internet’s democratizing potential.  Contrast Burma with Saudi Arabia, for instance, where the state did not introduce the Internet until it had developed the ISU, which announced to the Saudi people that it would censor the net but would also take suggestions for which sites to block and unblock.

Here’s one version of the “hard question” of filtering.  As we go from an Internet with about a billion people online to one that is closer to six billion, the question is whether the gatekeepers who run states will enable relatively open environments to flourish or will instead seek to shape what citizens do and say online by creating a culture of fear and control.

There’s a second important problem, which seems to be capturing news interest of late: the role of American for-profit companies in filtering regimes.  Our research has shown that technology developed by at least three US companies is being used by states to block Internet access and potentially to listen in on their citizens’ Internet activity.  (No one really wants to admit to these facts.  In most cases, as with Myanmar, when we contact the United States-based company, such as Fortinet or its competitors, it refuses to confirm or deny its involvement, often referring to “resellers” who may or may not be distributing its products.  We find other ways to demonstrate the likelihood that these companies’ technologies play a role: finding that the Myanmar state has put out a web page discussing it, reading local news reports about someone said to be a company vice-president celebrating a sale, procuring the “block page” that citizens see when they seek to access a forbidden destination on the Internet and noting that it differs from the ones used before, and hearing from people on the ground.  There’s almost always a variety of reports, as here, of who is selling what to whom.  Judge for yourself, based upon what Nart Villeneuve has written about it, including two relevant photographs.)
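The “block page” evidence mentioned above lends itself to a simple illustration.  A minimal sketch, in Python, of how a block page might be fingerprinted by matching the response body against known product markers; the signature strings below are hypothetical assumptions for illustration, not actual vendor markup, and real testing compares full responses collected inside and outside the filtered network:

```python
# Hypothetical marker strings for each filtering product; real
# block pages would need to be collected and studied first.
BLOCK_PAGE_SIGNATURES = {
    "dansguardian": ["Access Denied", "DansGuardian"],
    "fortinet": ["FortiGuard", "Web Page Blocked"],
}


def classify_block_page(html):
    """Return the name of the filtering product whose markers all
    appear in the response body, or None if no product matches."""
    lowered = html.lower()
    for product, markers in BLOCK_PAGE_SIGNATURES.items():
        if all(marker.lower() in lowered for marker in markers):
            return product
    return None
```

Comparing the classification of pages fetched from inside the country against the same URLs fetched from outside is what lets one argue, without vendor confirmation, that a particular product sits in the path.
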

Set aside the particulars of which company’s technologies are used in which regime.  It is plain that there is a lurking “Oppenheimer” problem, with a complex set of ethical quandaries buried within it.  This problem is related to, but certainly distinct from, the problems that Yahoo! (the Hong Kong journalist case) and Microsoft (the MSN Spaces titles field case) and others are facing in China; here the issue is that these security technology companies are directly profiting from the censorship regime itself (perhaps putting the problem closest to the debate over the use of Cisco routers in China).  At a minimum, it is troubling that United States companies would have anything to do with their technologies being put to work in places with state-driven censorship and surveillance regimes.  Why are we interested in this topic?  Further enumeration, we are fairly certain, will help inform companies, investors, and policy-makers so that they can come to understand the problem lurking in Internet filtering.  We’d like to be part of the solution in shaping an environment where incentives are properly aligned to support human rights and freedom of expression.

2 thoughts on “The Problems of Filtering”

  1. […] The issue is that under the Burmese Freedom and Democracy Act of 2003 imports from Burma are placed under an embargo, but exports are not. Additionally, the U.S. Department of the Treasury states that “there is no prohibition on the exportation of goods and services other than financial services to Burma.” Such a policy is problematic because it creates an Oppenheimer problem: US-based Fortinet is directly profiting from Myanmar’s state-driven censorship and surveillance regime. Such a problem is a direct result of investors, policy makers, and corporate executives who do not understand the dangers of Internet filtering, and I am in full support of such initiatives as OpenNet which, in the words of John Palfrey, strive to shape “an environment where incentives are properly aligned to support human rights and freedom of expression.” […]

  2. […] Earlier, I described the state of the Internet in Myanmar and described the issue of US-based companies profiting from state-driven censorship and surveillance regimes. The OpenNet Initiative is a group whose mission is to investigate and challenge state filtration and surveillance practices. They have a weblog, and one of their members, John Palfrey, has written up some good Initiative-related posts on his own weblog. In fact, in Palfrey’s October 2005 entry “The Problems of Filtering”, he offers an excellent observation on the “hard question” of filtering: As we go from an Internet with about a billion people online to one that is closer to six billion, the question is whether the gatekeepers who run states will enable relatively open environments to flourish or will instead seek to shape what citizens do and say online by creating a culture of fear and control. […]
