Research Reports

A legal model for government intervention to combat online hate


Published as: Andre Oboler, A legal model for government intervention to combat online hate, Internet Law Bulletin 14(2), May 2011
  • Racial hate propaganda is unlawful in Australia, and this extends to non-private online communications. This may create liabilities for technology companies.
  • International discussions have highlighted the need for both national and international engagement on the problem of online racism. More active government involvement is inevitable in the future and poses a manageable risk to technology companies.
  • The Copyright Act 1968 (Cth) provides a model for technology-based remedies to unlawful acts that take place online. This could serve as a template for remedies to other types of unlawful acts, including the spread of online hate propaganda.
  • The Attorney-General’s announcement of a possible extension of “safe harbour” provisions in the Copyright Act to a larger range of service providers raises the question of similar provisions for other unlawful activity facilitated by these providers.
  • Lawyers advising clients who provide non-private online spaces should consider a range of legal developments in other areas, and should consider how similar provisions in the area of online hate may affect their clients. Engineering solutions to limit risk are possible and could be integrated into future development if considered pre-emptively.


Online hate is the use of the internet to harass, defame, discriminate or incite against a person or group. It is a significant problem within the online world.1 Hate propaganda forms a more limited class of content: it includes content “aimed at, or with the effect of, inciting hatred or contempt for individuals or groups of individuals identifiable on the basis of personal characteristics such as race, religion, ethnicity, gender, family status, marital status, and sexual identity that have historically formed the basis of socially imposed disadvantage”. 2 Some, but not all, aspects of hate propaganda are unlawful in Australia as a result of Commonwealth and state anti-discrimination legislation.

One form of hate that is unlawful at both Commonwealth and state level is racial discrimination. The Racial Discrimination Act 1975 (Cth) and the Racial and Religious Tolerance Act 2001 (Vic) are examples of legislative provisions broad enough to directly tackle hate propaganda. There are, however, serious difficulties in the practical application of such laws to hate propaganda that occurs online. This is particularly true when third party platforms are involved.

This paper begins with a look at the existing law and its adaptability to meet growing demands that government tackle online hate. It then examines the disempowerment of governments in the online world. Finally, it discusses the opportunity for companies to re-empower government and side-step the difficulties associated with policing their online spaces to prevent hate propaganda.

Online hate and the law

The internet is a powerful medium. Revolutions, enabled by online tools, have recently toppled governments, and comparisons have been made to the role of mass printing in the 1848 revolutions in Europe. 3 That much power, if used for hate propaganda, presents a real threat to society.

The danger of online hate propaganda was recently recognised in the Inter-parliamentary Coalition for Combating Antisemitism’s Ottawa Protocol, which called for more research, expert advice and international cooperation into online hate. 4

Within Australia, racially-based hate propaganda is unlawful. Section 18C of the Racial Discrimination Act makes unlawful acts that “offend, insult, humiliate or intimidate” on the basis of a person’s race. This section was applied to internet material in Jones v Toben 5 and resulted in orders for hate propaganda to be removed from the internet, as well as orders restraining republication.

The Victorian Racial and Religious Tolerance Act provides two standards of racial vilification, noting in both cases that the sections apply to “use of the internet or e-mail to publish or transmit statements or other material”. 6 This approach stands in stark contrast to efforts that address the specific nature of the online world in areas such as online copyright reform.

Government’s active engagement with the online world

Attorney-General Robert McClelland recently noted that copyright reform “is challenging because of the speed of technological developments” and that “legislative solutions can lag behind reality”. 7 He championed government engagement and the need to “continually examine the areas of copyright that are ripe for reform”. 8 He explained the challenge, saying “governments are being asked to try to find a national solution to a global problem — and to do this without stymieing growth in new technology and market solutions that deliver content to the community”. This challenge exists in all interactions between government and the online world, including combating online hate.

In tackling digital copyright, new concepts such as the “safe harbour” provisions were created. These provisions give internet access providers a way to limit their liability for specific cases of copyright breach by taking active measures to facilitate general compliance. The measures access providers need to take are given in s 116AH of the Copyright Act. They include having a policy allowing termination of the accounts of repeat infringers, and compliance with industry codes aimed at protecting copyright material. 9 Specific requirements are made for four types of activity a provider may engage in, each requirement closely tied to the way technology is used for that activity. 10

Specific technical remedies can be written into law

The Copyright Act also provides enumerated remedies. Where the carrier acts as a conduit for information a user requested, the remedies are an order to terminate the user’s account, 11 or to limit access to material hosted overseas. 12 In the case of automatic caching, providing a user with storage capacity, or facilitating connections, the remedies include an order to terminate the user’s account, 13 to remove or disable access to the offending material, 14 or any other less burdensome non-monetary order necessary. 15 The Attorney-General has said that the “purpose of the scheme is to provide legal incentives for ISPs [Internet Service Providers] to cooperate with copyright owners in deterring infringement of copyright”. 16

The Attorney-General also suggested the “safe harbour” provisions be extended beyond access providers. 17 This would require the law to gain an understanding of the nature of these services, as it has done with access providers. Many of these providers will be publishers of users’ content, and laws setting standards for copyright may provide a model for handling other forms of unlawful conduct including the promotion of hate propaganda.

As the technology paradigm changes, so must the law

Access providers connect physically to the customer, so they must have a presence in Australia. Other service providers, by contrast, may be located entirely outside Australia even when mediating communications within Australia. International mechanisms are needed to address the issues that arise; these exist for copyright but not for the prevention of online hate propaganda. For now, major service providers operate with such a large degree of autonomy over their online spaces that it begins to look like sovereignty, except for their care over copyright.

In reality, the rights of internet service providers are based on property and contract law. It is their property rights over servers, networks and data storage devices, as well as intellectual property over source code, that give technology companies their authority. Participation in the virtual community is conditional on a licence to access the company’s property. The terms of this licence, literally the “terms of service”, give the company power to regulate users’ activity.

The legal concept of property refers not to objects but to the rights people have in them. 18 In the digital world, these rights, or the closest thing we have to them, are created by a company’s terms of service. These rights can be abrogated or altered by statute, but the law will need to enter the digital world and regulate the activity rather than the technology.

A foundation for further engagement

In entering the digital world, governments need to reassert their rights. The power of internet companies may be legally based on property and contract, but “property” in a resource “stops where the infringement of more basic human rights and freedoms begins”. 19 In some jurisdictions, issues over privacy are now causing governments to assert themselves. 20 In Australia, the protection of human dignity is said to provide a basis for equity’s intervention. 21 As the Supreme Court of New Jersey observed:

[P]roperty rights serve human values. They are recognised to that end, and are limited by it. 22

Today, private companies like Facebook seem able to ignore complaints from governments, 23 even over content calling for genocide and war crimes. 24 Instead, they are swayed by the media and online public opinion. 25 I have previously discussed a penalty model that could hold technology companies responsible when they fail to respond in reasonable time. 26 Another approach is for government to intervene in the online world itself. Technology companies, like Facebook, would need to provide the tools, either voluntarily or in response to legislation. Similar requirements already exist in phone systems to enable wiretaps. 27

Powers governments may request, or legislate to require, include:

  • the ability to delete public groups/pages;
  • the ability to suspend accounts; and
  • the ability to trace users and stored communications to an IP address.

In each case, this power could be limited to content controlled by users from within the country’s territory. Checks and balances, including judicial oversight, could be included. Judges could give time-limited authorisation, and all activities done using the authorisation could be logged. By empowering government, technology companies may be able to side-step the problems and potential liabilities of online hate.
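The safeguards described above (a time-limited judicial authorisation whose every use is logged for later review) can be sketched in code. The following Python sketch is purely illustrative: the class, its fields and the named powers are assumptions invented for this example, not an existing system or API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class Authorisation:
    """A hypothetical, judicially issued, time-limited warrant."""
    warrant_id: str
    power: str                # e.g. "delete_group", "suspend_account", "trace_user"
    target: str               # the group, page or account the warrant covers
    issued_at: datetime
    duration: timedelta
    audit_log: list = field(default_factory=list)

    def is_valid(self, now=None):
        """The warrant can only be exercised within its authorised window."""
        now = now or datetime.now(timezone.utc)
        return now < self.issued_at + self.duration

    def exercise(self, detail, now=None):
        """Use the power; every use is recorded for later judicial review."""
        now = now or datetime.now(timezone.utc)
        if not self.is_valid(now):
            raise PermissionError(f"warrant {self.warrant_id} has expired")
        self.audit_log.append((now.isoformat(), self.power, self.target, detail))
```

Under this sketch, any attempt to use an expired warrant fails, and each successful use leaves an audit entry that an oversight body could later inspect.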


The current law in Australia makes race-based hate propaganda unlawful, but does not effectively tackle the online problem. Law reform may create greater liabilities for companies, or cases may establish existing liability. The development of copyright law provides a template for more technology-specific remedies, and discussions on extending “safe harbour” provisions may provide an opportunity to consider, more generally, new considerations and remedies for unlawful acts online.

Those advising clients in the technology sector should be aware of the potential for increased government intervention. In particular, the mechanisms of the Copyright Act and the Telecommunications (Interception and Access) Act may suggest approaches government may consider for ensuring compliance with the Racial Discrimination Act in the future. Building such capabilities into platforms now may limit future risk and disruption from legal reform.

Governments have a responsibility to take an active role in the online world; if they don’t, they cannot meet their wider obligations to the people they serve. The powers, rights and limitations that apply to governments and private citizens in the real world need to be reflected online. The discussion over updates to the Copyright Act provides an opportunity to consider a wider picture of government involvement online.

Dr Andre Oboler,
Director, Community Internet Engagement Project
Zionist Federation of Australia.

1 Digital Journal Staff, “Online hate” (2003) Digital Journal, available at

2 J Bailey, “Private regulation and public policy: towards effective restriction of Internet hate propaganda” (2003) 49 McGill Law Journal 59, fn 6, pp 63–64.

3 F Zakaria, “Why it’s different this time” (2011) Time Magazine (New York) 30–31.

4 A Oboler, “The ICCA tackles online hate” (2011) 13 Internet Law Bulletin 178.

5 Jones v Toben (2002) 71 ALD 629; (2002) EOC 93-247; [2002] FCA 1150; pp 655–656 at [113].

6 See, eg, Racial and Religious Tolerance Act 2001 (Vic), ss 7 and 24.

7 R McClelland, “Address to the Blue Sky Conference on Future Directions in Copyright Law”, speech delivered in Sydney, 25 February 2011.

8 Above note 7.

9 Copyright Act 1968 (Cth), s 116AH(1).

10 Copyright Act 1968 (Cth), s 116AH(1).

11 Copyright Act 1968 (Cth), s 116AG(3)(b).

12 Copyright Act 1968 (Cth), s 116AG(3)(a).

13 Copyright Act 1968 (Cth), s 116AG(4)(b).

14 Copyright Act 1968 (Cth), s 116AG(4)(a).

15 Copyright Act 1968 (Cth), s 116AG(4)(c).

16 Above note 7.

17 Above note 7.

18 R Chambers, An Introduction to Property Law in Australia, 2nd edition, Lawbook Co, 2008, p 5.

19 K Gray, “Property in thin air” (1991) 50 The Cambridge Law Journal 252, 297.

20 Letter from Jennifer Stoddart, Alex Türk, Peter Schaar, et al, to Eric Schmidt, accessed 19 April 2010, available at

21 Above note 21, p 226 at [43].

22 State v Shack 277 A 2d 369, 372 (NJ, 1971).

23 E Levy, “Israel tells Facebook: remove intifada page”, on Ynet News, 23 March 2011, available at

24 A Oboler, “Facebook and the third intifada: the aftermath”, on Jerusalem Post, 30 March 2011, available at

25 A Oboler, “The rise and fall of a Facebook hate group”, (2008) 13 First Monday, available at

26 A Oboler, “Time to regulate internet hate with a new approach” (2010) 13 Internet Law Bulletin 102.

27 Telecommunications (Interception and Access) Act 1979 (Cth), s 189.



Dr Oboler’s report on the ICCA and online hate


Published as: Andre Oboler, The ICCA tackles online hate, Internet Law Bulletin, February/March 2011


In November 2010, the Inter-Parliamentary Coalition for Combating Antisemitism (ICCA) held its second conference; parliamentarians and experts from over 40 countries attended.

The conference, held at the Canadian Parliament, was hosted in partnership with the Canadian Ministry of Citizenship and Immigration. Australian involvement included Michael Danby MP, Senator Scott Ryan and four Australian experts.

The Ottawa conference ran working groups in parallel tracks for the Experts Forum and the parliamentarians. The conclusions of each pair of working groups were delivered to a combined plenum and informed the drafting of the Ottawa Protocol that was unanimously adopted by the parliamentarians.

The Online Antisemitism Working Group had a panel of five speakers. Christopher Wolf, a US technology lawyer, discussed Anwar al-Awlaki, whose YouTube videos incite racial hatred and terrorism. Wolf called on the technology companies to deny their services to this virtual hate rally, as they would to a real-world hate rally.

Rabbi Cooper, of the Simon Wiesenthal Center, questioned the American approach of more speech in response to hate speech. He showed the link between online hate and terrorism. Marc Saltzman, a technology journalist, spoke on smartphones, which allow updates on the go with less forethought. He argued we need the right combination of law, education and activism to address online hatred.

Cathy Wing, of the Media Awareness Network, focused on children who are now constantly exposed to hateful content. She expressed hope that online education against racism may have an impact. I examined the question of regulation and argued that the social contract gives government an ultimate and irrevocable responsibility. The overall impression was that online hate is a fast-moving field with a need for rapid access to both technical knowledge and government consideration. This was reflected in the final protocol.

The Ottawa Protocol notes that the gathered parliamentarians are “alarmed by the explosion of antisemitism and hate on the internet, a medium crucial for the promotion and protection of freedom of expression, freedom of information, and the participation of civil society”. The statement encapsulated a number of concerns expressed at the conference. Most notable was the concern that, left unregulated, the online world may be far less free than idealists believe. Racism and intimidation can dampen participation by minority groups and damage democracy.

The Ottawa Protocol commits the gathered parliamentarians to “establishing an international task force of internet specialists comprised of parliamentarians and experts to create common indicators to identify and monitor anti-Semitism and other manifestations of hate online and to develop policy recommendations for governments and international frameworks to address these problems”.

The establishment of a task force that contains both members of different parliaments and leading international experts is an opportunity. It creates a resource of international technical expertise for members of parliament and a dialogue for sharing best practice.

Most significantly, it provides a multilateral foundation from which companies can be addressed, monitored and held to account.

Dr Andre Oboler,
Director, Community Internet Engagement Project
Zionist Federation of Australia.



JPPI report on Global Antisemitism Cites CIE’s Director


A new JPPI report (October 12, 2010) provides a survey of prominent research on the phenomenon of antisemitism around the world. In its discussion of online antisemitism, the report refers to three articles by Dr Andre Oboler:

The report notes that “Few organizations are targeting and combating the online anti-Semitism”. It makes no mention of the Online Antisemitism Working Group of the Global Forum, nor does it note the creation of the Community Internet Engagement Project as the first mainstream group focused exclusively on internet antisemitism. The report was released just before the ICCA meeting, which added further commitment to tackling online antisemitism.

One significant disagreement I have with the report is the suggestion, based on information from the ADL, that “anti-Semitism in cyberspace is virtually impossible to quantify, both because of the high dynamic of the medium, and because the information on the net is infinite, and it is almost impossible to reach it all”. This is clearly untrue: the same reasoning would imply that search engines are impossible, and that the entertainment industry should give up on efforts to prevent online piracy. As a technical premise, the argument is deeply flawed. Both the ADL and the Simon Wiesenthal Center have been using this argument as a smokescreen to put the problem of monitoring online hate into the “too hard” basket.

The problem is not too hard; it just requires a new approach and more specialised expertise. This is exactly the problem CIE was created to solve, and we are working on it. Unfortunately, we don’t have even a fraction of the budget of organisations like the ADL and the Simon Wiesenthal Center. Without sufficient resources, progress is slower than it needs to be. In 2007 I asked Isaac Herzog (then the Minister responsible for combating antisemitism) who was going to pay for the work that needs to be done online. Despite raising that question in the Jerusalem Post in 2008, with the exception of a very small pool of donors, we are still waiting for an answer. What is needed right now is funding, not hand-wringing.

- Andre Oboler



Wikipedia Manipulation: Anti-Israel Activists using Criticism Elimination


A new paper by Andre Oboler (CIE’s Director), along with Prof Gerald Steinberg (head of NGO Monitor) and Rephael Stern has been published by the Journal of Information Technology and Politics. The full length academic article, “The Framing of Political NGOs in Wikipedia through Criticism Elimination”, introduces the concept of criticism elimination, a type of information removal that has been used by anti-Israel activists to control the message and frame issues in Wikipedia articles.

Criticism elimination facilitates a new form of gatekeeping, and the article demonstrates how this was systematically done to remove criticism of NGOs’ actions in relation to the Arab-Israeli conflict. The article also categorizes the editors responsible for the behavior into four types. Mitigation approaches to criticism elimination are also suggested.


  • Wikipedia’s approach has, however, raised concerns (Lichtenstein, 2008) that are traditionally reserved for the mass media. For instance, the media has long acted as a gatekeeper, selecting and framing issues in what was perceived to be the public interest (Williams & Delli Carpini, 2004). The management of public discourse through framing raises significant political implications…
  • The presence of politically motivated framing (rather than the expected NPOV), as well as gatekeepers, sanctioned or de facto, has serious implications for the understanding of content production in Wikipedia.
  • The problem in subjective areas is that Wikipedia articles can be dominated. Sunstein (2006) notes that the last editor “can appoint himself as sovereign” (p. 158) destroying, rather than aggregating, content. Stacy Schiff (2006), writing in The New Yorker, noted that more frequent editors generally get their way. Articles or entire topic areas can be framed with a particular view by users with knowledge, determination, and power within the system.
  • By dominating articles and topic areas, Wikipedia can be used as a platform for political propaganda. Paul Murphy (2008) called Wikipedia “an early and illustrative warning of the collapse from informed social networking to propaganda.” He explained that “sub-groups of the general community … are now using Wikipedia as a marketing tool for their viewpoints.” He called it “fundamentally inappropriate in a site nominally dedicated to the provision of objective information.” He raises a concern that those with an agenda will be more dedicated to getting their point across than casual users, thereby allowing them to dominate.
  • Framing can occur through gatekeeping (Lewin, 1947), a theory of how items are “selected” or “rejected.” … [it] is “the process by which selections are made in media work, especially decisions whether or not to admit a particular news story to pass through the ‘gates’ of a news medium” (McQuail, 1994, p. 213). Social responsibility theory (Peterson, 1956) saw the public as passive and easily manipulated and the media as “information gatekeepers who represented the public’s interest” (Williams & Delli Carpini, 2004, p. 63).
  • Wikipedia’s dominance raises concerns about its own effect, or that of dominant editors, in framing information and acting as gatekeepers….  In Wikipedia, a culture (Schiff, 2006) with power structures, guidelines, and policies has developed to prevent this. These policies include NPOV, which states that articles should be “written from a neutral point of view, representing significant views fairly, proportionately, and without bias” (Wikipedia, 2008c)… In practice, however, the top 1 percent of posters jointly contribute about half of Wikipedia’s edits (Wilson, 2008). The power of the elite gives them a default gatekeeping role. Their strength in authority, time commitment, and knowledge of Wikipedia can easily overwhelm, and thus eliminate, the contributions of others.

The experiment

We use an in vivo experiment in the form of an observational study with predefined variables and multiple “sites” (articles in this case), making this a field study as per Basili’s (1996) classification scheme for experimentation in software engineering. As Wikipedia records all interactions within the system, we use content analysis on stored data as a form of observation.


  • 16 NGO articles were used in this study; all edits to these articles were reviewed.
  • 627 edits relating to criticism were extracted.
  • Of the 16 NGO articles, nine were included in WikiProject Palestine (and their criticism sections were heavily revised to eliminate criticism).


We reviewed how the edits changed the nature of the article, and specifically whether they removed relevant sourced information. We reviewed the overall impact on the article, and the overall editing behavior of the editors found doing such criticism elimination. We did this both for numeric results and for a more in-depth qualitative review.

Summary of Results

Four of the NGO entries examined (including the UK-based charity Christian Aid and the Israeli NGO Hamoked) had sourced criticism sections completely or almost entirely deleted. In both cases, all discussion on the Israeli-Palestinian conflict was removed (twice in the case of Christian Aid).

In total, 89 editors removed criticism, and 61 of these used registered user names. There are four prominent users removing criticism from multiple NGOs; 16 users removing criticism multiple times from one or more NGOs, in addition to making revisions in other NGO entries; and 26 users with low-edit counts focused on NGOs.

Qualitative analysis revealed four types of users who removed criticism:

A campaigner is a Wikipedia editor working towards a larger goal. He or she edits across a range of NGO articles and other articles. In the NGO articles examined here, campaigners usually removed sourced criticism. Some campaigners are members of WikiProject Palestine. Others appear to edit articles in the project without being members.

An advocate editor is concerned almost entirely with one page or a very limited topic. In the case of our research, the focus would be a particular NGO. One hypothesis is that advocates may be members, supporters, or staff of the NGO. These editors are using Wikipedia for a purpose unrelated to the advancement of the encyclopedia, and instead they remove criticism in order to frame their targets in the best possible light.

The lobbyists are editors who work within a broad scope of articles across Wikipedia, yet focus on only one NGO. They differ from advocates, because they contribute in other places, and from campaigners, because their actions do not appear to be part of a general campaign. These editors may attempt to remove or reduce criticism or set very high standards for the inclusions of criticism. As they become more involved in Wikipedia, their use of Wikipedia’s internal policies and guidelines to achieve their goals becomes more sophisticated.

Casual editors are visitors to Wikipedia who only edit articles on occasion. Spread across many topics, their edits show no unified agenda. Their attention is divided and, very often, thinly spread. These users may remove information that conflicts with their conceptual model on the justification that it is out of place.
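The four types above are distinguished by the breadth and focus of an editor’s activity, so they could in principle be approximated mechanically from edit counts. The Python heuristic below is a sketch of that idea only: the function name and thresholds are invented for illustration and are not taken from the paper’s methodology.

```python
def classify_editor(ngo_articles_edited, other_articles_edited, total_edits):
    """Rough heuristic mapping an editor's activity profile to the four
    types described above. Thresholds are illustrative assumptions."""
    if total_edits < 10:
        return "casual"       # occasional edits, spread thinly, no unified agenda
    if ngo_articles_edited > 1:
        return "campaigner"   # works across a range of NGO articles
    if other_articles_edited > 0:
        return "lobbyist"     # broad scope on Wikipedia, but only one NGO
    return "advocate"         # concentrated almost entirely on a single article
```

For example, under this sketch an editor who removed criticism from three NGO articles while also editing forty unrelated pages would be flagged as a campaigner, while one whose entire history sits on a single NGO entry would be flagged as an advocate.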

Examples of each type of editor and the changes they made can be seen in the full paper.

The paper may be cited as: Andre Oboler, Gerald Steinberg and Rephael Stern, “The Framing of Political NGOs in Wikipedia through Criticism Elimination”, Journal of Information Technology & Politics, Volume 7, Issue 4, October 2010, pages 284–299.

The publisher, Routledge (Taylor & Francis), has made the article a sample for the journal. As a result, the article is online and may be downloaded or read online for free.



Under attack in a virtual world


Andre Oboler, PresenTense Magazine, February 2010.

The Jewish people are losing the war. When it comes to the online world, we are for the most part disorganized, under-resourced and lacking leadership. Battles may be determined by short-term objectives, but wars are won by strategy and determination. In the Jewish world today, few have realized that we are in an online virtual war. This is a war against the Jewish state, and against the Jewish people.

The virtual world is a battleground of competing ideas. In a world with no absolute truth or commonly accepted values, racism and intolerance are becoming widely accepted in society. Discrimination, rather than freedom from discrimination, becomes a right. As these poisonous ideas spill over from the virtual world into the real world, there is a potential reversal of all the progress that has been made in the name of civil rights.

Online, as in the real world, there is an extreme fringe. These are the classic antisemites and racists, often sporting swastikas and calling for death to the Jews. In the real world, such racism is opposed and attracts social penalties. In the virtual world, however, such expressions of hate usually pass without comment. Modern online values can even legitimize such views, giving them equal weight to any other “opinion”. Online anonymity further exacerbates the problem. The largest challenge we face is not the racists; it is the online culture that accepts them and their message. This acceptance allows others, particularly the young, to be drawn to prejudice through their ignorance. It encourages good people to stand idly by, or risk the ire of the community for attempting to limit another’s “free expression”.

In May 2007, Facebook added a code of conduct to support its terms of service. The code stated, “While we believe users should be able to express themselves and their point of view, certain kinds of speech simply do not belong in a community like Facebook.”

The code of conduct did not seek to define what was illegal; instead, it sought to define shared values for the Facebook community. The code sought to exclude “graphic or gratuitous violence,” “threats of any kind,” material that “intimidates, harasses, or bullies anyone” and “derogatory, demeaning, malicious, defamatory, abusive, offensive or hateful” material. The code of conduct lasted almost two years before it was quietly dropped.

Commenting on Holocaust denial on Facebook after the code of conduct was removed, Facebook spokesperson Barry Schnitt said, “The bottom line is that, of course, we abhor Nazi ideals and find Holocaust denial repulsive and ignorant. However, we believe people have a right to discuss these ideas and we want Facebook to be a place where ideas – even controversial ideas – can be discussed.”

When hate-inspired conspiracy is considered as legitimate as historical fact, we have entered a dangerous post-modern stage of society. When those wishing to excuse or deny the Holocaust are said to have nothing worse than a controversial idea, it’s time to step back and wonder how far online society has regressed.

Since the beginning of 2010, Facebook, responding to a public outcry, has started to remove the classic Nazi and Holocaust-denial groups such as “For the followers of Hitler” and “6,000,000 for the TRUTH about the Holocaust.” This change has happened without an announcement, press release or change in written policy.

While this is a step in the right direction, a significant amount of hateful content continues to proliferate on Facebook. Without a doubt, antisemitism abounds. More than 100 “Gaza Holocaust” groups, both large and small, still exist. Many of the groups label Israelis as Nazis and demonize Jews. Messages that attack Israel as a Nazi, apartheid, evil state pervade both Facebook and the Internet in general. Moreover, derogatory comments about the disabled, gays and various non-whites are increasingly common on other social-media sites, such as YouTube, Flickr and Blogger. This is not just a Jewish issue.

In the war of ideas, we must look for something to spark a change in online social values. Public leadership on social values is needed. We must hope such leadership eventually will emerge from the corporate world, from the likes of Facebook, Google, Yahoo and Microsoft. If it does not, that role falls to governments and the general public. Change is starting to happen. “David Appletree,” working under a pseudonym due to concern for his safety, founded the Jewish Internet Defense Force in 2008. The organization’s campaigns have led to the removal of hundreds of antisemitic groups on Facebook, as well as hundreds of racist YouTube videos.

“The problem is overwhelming,” Appletree said. “More people need to get involved to fight anti-Semitism online. Only then can we, together, start to get on top of this problem.”

Recently, though, Appletree’s own Facebook account was disabled by the social-media site because he does not use his real name. Yet more than 50 accounts purporting to belong to Santa Claus have not been given the same treatment.

Last December, the Zionist Federation of Australia launched the Community Internet Engagement Project to provide research, training and support to the Australian Jewish community to respond to online hate. That same month, the Global Forum to Combat Anti-Semitism met in Jerusalem, where experts and government representatives from around the world discussed antisemitism, including online antisemitism. The forum produced 17 pages of recommendations to combat online antisemitism.

In an Internet culture where hate of Jews and Israel is seen as just another equally legitimate viewpoint, the Jewish people are set for disaster. Historically, we have been persecuted not just because we had persecutors, but because those who could have stopped it stood idly by. The online world is creating a culture where people will – once again – stand idly by.

As Jews we must stand up and challenge those who use technology to promote racism and hate. We must use the tools provided by sites such as Facebook and YouTube to report the hate we see online. In the name of our shared humanity, we must ask others to join us, to turn their backs on those who hate and to exclude them from our online communities. We must create a culture where people refuse to participate in communities that lack basic social values. This starts when we take a stand ourselves, as individuals, against the hate, racism and bullying we see online. We must work for an online world that remembers the lessons of the past and incorporates the strides made for human rights over the last 60 years.

The clash of cultures that is taking place around the globe is reflected online, but so is the rejection of Western values. We are once again in a brave new world, a world of rapid change where anything can happen. In this new world the Jewish people are once again the canary in the coal mine. The online war over the values of society is a war that we must win – and not only for the sake of the Jewish people. This is a war over universal values. It is a war that civil society can’t afford to lose.

Dr. Andre Oboler is a social-media expert and commentator. He is the director of the Community Internet Engagement Project at the Zionist Federation of Australia, co-chair of the working group on online antisemitism for the Global Forum to Combat Antisemitism, and CEO of Zionism On The Web. Dr. Oboler’s research into technology issues affecting Israel and the Jewish people has covered antisemitism 2.0, replacement geography in Google Earth, Facebook hate and the JIDF, Wikipedia warfare, Facebook’s stance on Holocaust denial and other issues.

© 2010 Andre Oboler, originally published by PresenTense Magazine in The Digital Issue, February 2010. This article is released under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 Unported License. You may repost it elsewhere provided you post it in full and include this notice.
