The Technology of Difference: ASCII[1], Hegemony and the Internet

Jason Nolan, PhD
Knowledge Media Design Institute, University of Toronto

 
This is an unpublished draft of a book chapter for The Pedagogy of Difference, edited by Peter Triphonas. To be published by Routledge in 2002. Copyright © D. Jason Nolan 2001

 

Introduction

Saturday night. My partner, Yuka, is translating Orphan at My Door by Canadian children's author Jean Little into Japanese[2] (Little 2001). Her plan is to translate it into a web-page diary in a format known as a blog (Cooper 2001; Power 2000). Blogging will allow her to maintain an online journal, where each entry is not just a static web page, but forms part of a chronologically ordered interactive database. A group of Japanese scholars, friends and her editor will be able to follow the progress and make suggestions, and the finished product will be downloaded by the publisher and printed. A novel approach: translating an orphan's diary using an online diary-writing tool.

All of the software we looked at (Blogger, Livejournal, Wikiweb, and GreyMatter[3]), however, is either monolingual or privileges English to the extent that functioning in Japanese is virtually impossible. I settled on GreyMatter because the code is open source and can be run on my Internet server (achieve.utoronto.ca). This way, I could see if I could force the code to 'do Japanese'. I located a half dozen locations where the code needed to be modified, to tell the script "Hey, there are more languages than English, eh?", and to allow both the input and display of Kanji, Hiragana, Katakana and Romaji (roman characters), the four scripts used in written Japanese, by adding the code: "<META HTTP-EQUIV="Content-Type" CONTENT="text/html;CHARSET=x-sjis">" (Lunde 1993).
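In a generated page, that tag sits inside the page's <head>, telling the browser to interpret the bytes that follow as Shift-JIS rather than as Western characters. A minimal sketch of such a generated page (the title and body here are invented; GreyMatter's real templates are far more elaborate):

<html>
<head>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html;CHARSET=x-sjis">
<title>A diary entry</title>
</head>
<body>
... diary entries in Kanji, Hiragana, Katakana and Romaji ...
</body>
</html>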

Yuka's next complaint was somewhat obvious. None of the environments (software/programs) had any way to save your work. All these environments allow you to post your messages to the Internet, and you can edit previous posts, but there is no way to save your work locally to ensure that if you get disconnected from the Internet you do not lose everything. No programmer thought of adding one, it seems. There are myriad ways to work around this, but Yuka does not take well to work-arounds; an attitude that has informed much thinking about user compliance and the whims of the programmer (Cooper 1999; Nolan 2001; Nolan and Hogbin 2001).

Thus Emma Jane Hogbin. With the promise of dinner, she arrives to hack a save button. Self-motivated and self-taught, Emma is a Hacker. Her bookshelves inspire fear in the non-digiterati (Gray 2001, p. 48). We all talk through dinner (my job) as the printer processes the 200-plus pages of code/text that make up GreyMatter. She reads the printout, making notes, and I conduct keyword searches for lines of code at her direction. Suddenly she looks up with a laugh and announces, "Sheeeeeeeeeeet. This guy needs a life!" in shock at the featuritis of the code (Raymond 2001). I am unsure if the comment is one of admiration. And we continue to work.

Program of Inquiry

This scenario occurred just as I was settling down to write this chapter. The ideas have been percolating for half a decade, and this experience was just another in a long list of experiences in which we are confronted by the arbitrary whims of a technologically positivist metanarrative that decenters people, cultures, language, gender and orientation, and most of all locates power and privilege within the grasp of a small group of individuals: generally those participating in the white male North American English discourse that informs postmodern technology (Cooper 1999; Noble 1997; Wertheim 2000).

The intention of this chapter is to engage the deep structures of the hegemony of the digital technology revolution as represented by the Internet, at levels beneath those addressed by most contemporary critical discourses. This time, I am not working with the more obvious examples of the Technology of Difference, such as the relationship of access to safe drinking water and basic rights of women's education to global attempts to bridge the digital divide (Nolan 2000), the future of educational technology in North America (Nolan and Hogbin 2001), or the potential of zero-cost computing and telephony technologies and indigenous-language software environments. The goal is to extend the dialogue of difference and access from locations of communication and community to a consideration of the locations of control: locations over which disadvantaged groups of users and non-users of communication technologies have little or no control, and even less information or understanding. I am looking not at the content/information/data that is presented through the various media of the Internet, but at the bias inherent in the medium itself (Jones 2000).

The Internet is first and foremost a learning environment, in both formal and informal learning contexts (Nolan and Weiss 2002). It is, of course, one that presents itself as value-neutral: a manifestation of McLuhan's global village where bias and difference all meld into a stream of bits (McLuhan 1995). There is a great deal of pedagogy and curriculum about the Internet that both challenges and reinforces difference (Cummins and Sayers 1995; Harasim et al. 1995; Haynes and Holmevik 1998). But there is very little curriculum or curriculum theorizing that engages the software, code, discourse and metanarrative of the Internet itself, leaving current pedagogy to function in a sea of assumptions about what can be done and said and accomplished online. There is an "anti-intellectualism" similar to what Giroux describes as present in the classroom: a lack of interest in the sub-surface discourse of the code and software of the Internet (Gray 2001; Giroux 1992, p. 116).

McLuhan's "the medium is the message" mantra is ever current in our thoughts as we are infected with the latest rash of technological developments. However, as educators and researchers confront the dominant and subversive ideologies presented online, very few are willing to critique, or aware of the need to critique, the imposition of the locations of power that have brought the Internet into existence. There is a need for us all to be aware of the levels of implicit colonization that accompany the proliferation of Internet-informed culture (Said 1993). There are a variety of layers that must be unpacked and brought into the light of inquiry, "to know as much as possible about the house that technology built, about its secret passages and its trapdoors" (Franklin 1992, p. 12). First and foremost is the foundation and genesis of the Internet itself, located in the Cold War desire for a computer network designed to withstand nuclear war (Krol 1992). Who made the Internet? Who are its informal architects? What culture was this creation located in? Secondly, we have to look at the software that runs the Internet, the servers that move information, and the software that extends its purview to our desktops at home and in our workspaces. Thirdly, there is the post-1994 World Wide Web, which opened this brave new world to both the general public and to the commercial influences that followed them (Berners-Lee 1998). Fourthly, we are faced with the Internet representing technology and discourse as the informing metanarrative of the new global economy. Finally, I will suggest some strategies to help educators encounter difference in our pedagogy, practice and inquiry. These will serve to point to locations, and potential avenues, for a radical repositioning of the discourse at the nexus of the educator and her performative/transformational capacity as creator of learning environments (Nolan 2001).

The foundation and genesis of the Internet

Most of us are aware of the genesis of the Internet at the hands of the Advanced Research Projects Agency of the US Department of Defense, which in the late 1960s funded research that led to the linking of computers at universities in the southwestern US (Cailliau 1995; Gray 2001; Krol 1992; Mitchell 1998). This foundation has morphed into an ostensibly uncontrolled and uncontrollable global phenomenon that has exploded the opportunities for voice and communication around the world. It has gone down in Western history alongside Gutenberg's and Caxton's moveable type revolutions, which propelled text out of the medieval modes of production and privilege (McLuhan 1995). And just as the print revolution was about the technology of the printing press, the Internet is as much about the software code and Internet protocols (originally TCP/IP, Telnet, SMTP and FTP, and more recently HTTP) that bring the Internet into existence as it is about what we do on it. Those who controlled the printing presses still controlled what could be and was said. Someone needed to control a printing press in order to have a voice; as time went on more people had access, and differing voices could make themselves heard. Of course, concomitant with this means of production, one needed access to networks of distribution, a limitation that still restricts the diversity of voices that are heard in both media-rich and media-poor cultures/languages. Today, access to public consciousness via the medium of print is seen as widespread, but in many situations individuals and groups are still voiceless (OECD 2000).

The Internet stands now as a force within our collective worlds. But control is still located in corporate and government institutions. In 1992, the US government relaxed the rules governing acceptable use of Internet resources, opening the Internet up to business, and since then corporations have taken over much of the Internet (Cerny 2000; Hochheiser and Ric 1998). Individuals must purchase or rent time on expensive machines made by an ever-shrinking number of multinational corporations. Organizations such as the various Freenets (Scott 2001), FIDOnet (Vest 2001) and the Free Software Foundation (Stallman 1999) are still challenging the hegemony of institutional and corporate interests, but their influence is small and localized.

Technologies of Resistance

The Internet is not the free-for-all anarchic space that business, the media, Libertarians, and cyborgs would have us believe (Gray 2001). Though chaotic and anarchic activities do exist, and these are very important locations of resistance, every act of resistance or conformity occurs under the graces of the protocols of the Internet. These protocols are governed by various institutions, governments, and administrative agreements. The most fundamental of these is the TCP/IP protocol, invented by Vinton Cerf and Bob Kahn. Almost all Internet traffic must conform to the TCP/IP protocol or it is rejected by the servers that pass information from computer to computer. All traffic must pass along telephone lines, from the POTS (plain old telephone service) lines in our homes to the major Internet backbones maintained by Telcos (Krol 1992). How that information is encoded is governed by standards developed and maintained by groups such as the W3C (World Wide Web Consortium), ICANN (Internet Corporation for Assigned Names and Numbers), IEEE (Institute of Electrical and Electronics Engineers), and JPEG (Joint Photographic Experts Group) (Champeon n.d.). These regulatory bodies, organizations, and protocol standards control what can be and is done on the Internet. Many of these groups are transnational, and do have representation from institutions around the world. But they contain a very narrow selection of interests, ones aligned with the goals of the West.

There is no question that the software and hardware we use are primarily informed by multinational corporations: Microsoft, Sun Microsystems, Intel, AMD, AOL Time Warner, Apple Computer, IBM, Yahoo, Hewlett-Packard, Sony, and the like, along with their support companies and organizations, control at the most basic level how we communicate online. Even putting aside the hardware manufacturers in this discussion, and focussing on the software with which we do what we can do with computers, every keystroke invokes a software response that mediates our communication. This software is not value-neutral. It is culturally and linguistically embedded in a technologically positivist metanarrative that sees the technology itself, and those who create and use it, at the apogee of human cultural experience (Lyotard 1984). This predisposition is encoded in the software itself.

There are a number of technologies and movements that challenge the consumerist/corporatist profit-driven models of the Internet, positing a somewhat prosumerist[4] model: "As prosumers we have a new set of responsibilities, to educate ourselves. We are no longer a passive market upon which industry dumps consumer goods but a part of the process, pulling toward us the information and services that we design from our own imagination" (Finely 2000). The open source movement is the key idea that brings these otherwise competing interests together; it is one of the most important ideas in computing of the late 1990s, and will probably be one of the dominant forces into the next century (Scoville 1998; Raymond 1998; O'Reilly & Associates 2000). The Open Source Initiative and the GNU Project are two organizations influenced by specific individuals: GNU by Richard Stallman, and Open Source by Eric Raymond (Scoville 1998). In general terms, both want to promote software that is free, freely available, and open to the Hacker community. Both projects support the traditional notion of sharing resources among members of a community.

The Free Software Foundation is clearly immersed in the Hacker philosophy that information wants to be free. I can, and have, put my own invention V.A.S.E. under a Free Software Foundation GPL license (Nolan 2000). I did not need to ask for permission. I only needed to include the text that identifies this license in my code, and abide by the license myself.
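For example, the standard notice for version 2 of the GPL, placed at the top of a source file, reads roughly as follows (this is the FSF's own boilerplate, abridged; the copyright line is a placeholder to be filled in by the author):

Copyright (C) [year] [name of author]

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.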

The Free Software Foundation's GNU General Public License (GPL) was first brought forth in 1991:

The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Library General Public License instead.) You can apply it to your programs, too.

When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things.

To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it. For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code (Stallman 1999).

The Linux operating system builds on this open source philosophy. It is both a software revolution and a conceptual revolution, one that has changed computing in ways we could not have imagined. Because of its success, it is also an important pedagogical signpost, showing an alternative direction to the commercialization of online exploration of difference. "Linux is a free Unix-type operating system originally created by Linus Torvalds with the assistance of developers around the world. Developed under the GNU General Public License, the source code for Linux is freely available to everyone" (Online 1994-2000). As such, it represents a movement that critical education can follow, through vehicles such as the GNU Project, to allow individuals and organizations to maintain ownership while freely sharing their work with a larger community. Torvalds created Linux while a student at the University of Helsinki. While Linus Torvalds is world famous, he does not directly make money from his Linux kernel (Online 1994-2000).

Due to the very nature of Linux's functionality and availability, it has become quite popular worldwide, and a vast number of software programmers have taken Linux's source code and adapted it to meet their individual needs (Online 1994-2000). Most major computer companies have recognized the importance of Linux: Dell, IBM, Apple and others sell Linux-compatible hardware. Intel, which makes the CPUs on which the Windows operating system runs, actually owns a large amount of stock in Red Hat, which produces the most popular distribution of Linux (Raymond 1999).

The rise in importance of Linux (www.linux.org) is predicated on the fact that it is an open source operating system. The dynamic potential of the CVEs (Collaborative Virtual Environments) I work with is likewise fundamentally due to their open source existence. This means that the raw source code of the system is publicly available under a license that allows anyone to use it and modify it for their own purposes, under relatively flexible conditions laid out in the license (Nolan and Weiss 2002; Nolan 2001). The result is that many thousands of users are motivated not only to modify and add to Linux for their own purposes, but also to share what they have created with the entire Linux community. The strength of Linux comes from the openness of the system and the community that surrounds it. This does not mean that Linux is necessarily always free, but that it is freely available. Red Hat (www.redhat.com), one of the most important companies in the Linux movement, repackages Linux, creating its own flavor and support, which it then sells. All the components are freely available from places like www.redhat.com and www.linux.org, and even the entire Red Hat release can be downloaded for free, although it is easier and more convenient to spend the money for the packaged version on CD-ROM.

There are obvious advantages to hundreds of independent and corporate programmers adding their little bit to a product that can then both be made available without charge online, and be compiled and variously packaged by any company to be resold to those users who want to purchase a particular distribution, such as those Red Hat, Corel and other companies offer. In this scenario we are moving beyond the present notion of consumerism, toward a shifting series of intersections between programmer, distributor and user, where an individual or company can participate in one, two or three roles simultaneously. Many of them become prosumers, both producer and consumer, in concert with a company like Apple or the Linux meta-organization, which is itself no longer merely a producer but also a consumer of the products produced by its users. This circular relationship is obviously appealing in terms of challenging control over the means of production, because the production of technology is a cultural production first and foremost.

Cultural Production

These initiatives and software do not challenge the Western bias outlined in this chapter. They do, however, challenge the multinational corporations' ability to control what software we use, and how software can be modified for our own uses. Open source initiatives offer individuals and groups interested in social justice not only valuable allies who are often underutilized, but most
importantly a model of resistance that seeks to transform debates and relocalize them within social, as opposed to corporate, purviews.

There are other communities of resistance, but they would not be positioned as communities of difference in the terms set out for this edited collection. I am not sure if I can or want to make an argument for their inclusion either, but they exist, and should be considered by communities of difference as potential allies. These groups of geeks, radicals and digerati think that information and knowledge should be free. Warez sites, 2600.com, alt.2600, MOOs, and the GNU Project are all children of the early Hacker communities (Anonymous 2000; Babcock 2001; Curtis and Nichols 1993).

Hackers are the first community of the Internet. Many of the original members are the programmers who hacked the Internet together in the first place. They were the first to subvert the dominant discourse of the Internet to human, communicative, social ends (Ruffin 2001; Sterling 1993). Hackers are not the malicious Crackers and virus programmers who strike fear into corporations and are vilified in the popular media (e-cyclopedia 1999; Raymond 2000; Stoll 1989). They are not destroyers, but travelers, seekers, and creators of alternatives and solutions to barriers to accessing knowledge and information. Their mantra is that information wants to be free (Gray 2001, pp. 48-54). They are also predominantly ultra-privileged young educated heteronormative white males, but they are philosophically opposed to the hegemony of corporate and governmental interests. I work with queer Hackers, cyborgwomen and cybergirls, and the work of Stone on the transgendered body (Stone 1992), Haraway's cyborg (Haraway 2000), and Hayles's posthuman (Haraway 2000; Hayles 1999) collectively reveals how the interfacing of women and technology is relocalizing the discourse of hacking in gendered spaces. As the technologies and influence of technology on the body are engaged by women, they are staking territory in the realm of the Hacker. The roots of the community, however, are located in this opposition to institutions that want to control information and access to resources.

The Warez movement is a more nefarious offshoot that makes available copyright-protected commercial software, serial numbers and passwords; it is the black market, the Pirates of the Internet (Tetzlaff 2000). This illegal activity, one that most philosophically conscious Hackers eschew, exists using the various technologies that make the Internet impervious to domination, control or destruction. It is difficult to locate Warez sites on the Internet, as they appear wherever unprotected holes and nodes are located, and disappear as soon as authorities find them. But it is possible to trace their existence through search engines and archived references to sites that no longer exist. Information is passed through IRC (Internet relay chat) servers, and content appears on web services that allow for free websites, such as geocities.com. At their most subversive they use pre-WWW technologies such as Gopher, Archie and Veronica servers (Krol 1992; Nolan 1994), and Warez sites have recently become havens for illegal materials such as child pornography. But they remain examples of an active subculture that actively opposes Western commercial and governmental interests, and uses all the tools that are situated deep within the woof and warp of the Internet.

2600: The Hacker Quarterly is the Hacker's bible. With writers including Noam Chomsky and famous Hackers Kevin Mitnick and Phiber Optik, 2600 (named after 2600 Hz, the frequency AT&T used to indicate an unused phone line) is the public window onto this otherwise hidden world. Hackers, or Phreaks as Telco Hackers are called, exploited this frequency to make free long distance phone calls. The journal is full of radical philosophy and how-to hacking guides, and provides a forum for Hackers to describe their latest exploits in exposing the attempts of institutions to thwart the desire of information to be free. Though 2600 has its roots in telephony, the fact that voice and data communication travel the same wires means that 2600 is the public domain of an element of the Hacker community, and a must-read for anyone interested in maintaining or compromising computer security. 2600 is about digital mastery and machismo, but it also represents, in the manner of the cattle ranchers of nineteenth-century America, resistance to the enclosure, commercialization and commodification of the virtual free range. This range is an important potential location for homesteaders of difference who want to stake out territory beyond the ken of Western commercial interests.

Hacking is not all a shady underworld act that can result in security services carting off your PC in a pre-dawn raid. There are social learning environments, collectively called Collaborative Virtual Environments (CVEs), such as MOOs, where individuals and groups construct/program/hack out virtual spaces and communities (Bruckman 1997; Cicognani 1998; Fanderclai 1995; Nolan and Weiss 2002; Rheingold 1993; Schank et al. 1999; Turkle 1995; Wertheim 2000). I have been involved in CVEs since the late 1980s, and have developed two MOOs. MOO is an acronym for Object Oriented MUD, itself an acronym often unpacked as Multi-User Domain/Dungeon/Discourse (Curtis 1992; Curtis and Nichols 1993). My MOOs, MOOkti and Project Achieve, are virtual places where participants from as far away as Taiwan, Iceland, Brazil and Russia "create representations of people, places and things and share them with others" (Nolan 2001). The key to these constructionist, polysynchronous[5] (integrating synchronous and asynchronous communication) spaces is that people not only communicate online in a multimedia, open source software environment, but can collaboratively create and program these spaces according to whatever criteria they choose to conceptualize and describe (Nolan 1995; Davie et al. 1998; Davie and Nolan 1999). Though MOOs still suffer from their English-only roots, we can and have worked simultaneously in English, Chinese, Japanese, Russian, Icelandic, German and French, and we are conceiving a MOO dedicated to being a polylingual programming, construction and communication environment. A polylingual space, as opposed to a multilingual one, suggests that not only can many languages be accommodated, but that no one language reigns supreme; multiple, intersecting language events and spaces can be created, and participants can work within the language(s) of their choice without being mediated by an overall dominant language.

In MOOs, identity and gender are not only performed but constructed and experimented with (Bruckman 1992; Stone 1992). Identity is liquid, continually open to rethinking, reconstruction and negotiation. Among our many activities on Project Achieve (http://achieve.utoronto.ca), we have been running annual workshops on gender and identity with the Triangle Project, of the Toronto District School Board, Canada's only school for Queer Youth. Many of the students' projects are publicly viewable, allowing visitors to experience their work. But all aspects of their self-representation and the presentation of their ideas are under their control, negotiated with their teacher in accordance with a mutually agreed upon curriculum. They maintain ownership of their work and govern how it is viewed. If they find something missing or required, they can create it themselves, collaborate with others, or interact with the wider community to access resources important to achieving their goals. Many of the hundreds of public CVEs across the Internet allow for this level of construction and governance. As such, they are important and powerful tools that offer communities of difference the potential to represent themselves beyond external control and influence.

The Internet is Written in English

These three strands (the Internet infrastructure, the mega-corporations, and the technologies/groups that challenge them) are primarily English/male/Western-dominated discourses. All three strands are Western in voice. Linux, with its roots in Linus Torvalds's Finland, is a rare example of an innovation that has entered the popular consciousness but is not of G7 genesis. More importantly, software is written in programming languages such as C, C++, Objective-C and Java, or in scripting/markup languages like Perl, PHP (PHP Hypertext Preprocessor), HTML (HyperText Markup Language), XML (eXtensible Markup Language) and SGML (Standard Generalized Markup Language). Though it is possible to use these languages to express written languages other than English through various encodings, they were created by speakers of English to be used by speakers of English. You cannot participate in the creation of software without using English in the programming, scripting or markup of content, and thus without participating in the hegemony of English, even if you do not have the ability to speak or write English.

What does this mean in terms of issues of difference in technology? Simply put, it means that it is practically impossible to participate in the world of technology without privileging English. The Internet is written in English. A programmer who wants to write a word processor for Icelandic writes the word processor in English, using a language like Java or C++. The software is installed into, say, a Windows, Linux or Apple operating system that has been localized into Icelandic. But the files created still require, in most instances, a .doc, .txt or .html suffix, all derived from English. These localized versions are localized as an afterthought. The major operating systems, and the various software packages, are almost all written for English-speaking consumers first, tested and made available to English-speaking consumers, and then ported to other languages if the software company feels that it is profitable to do so. In 1997, Microsoft was pressured into porting one of its versions of Windows to Icelandic by the Icelandic government, highlighting the fragility of languages in the face of English and corporate interests (Ford 2001).
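A fragment of that hypothetical Icelandic word processor makes the point concrete (a sketch of mine, not drawn from any real program; the Icelandic identifiers are invented for illustration). Every name the programmer controls can be Icelandic, but the keywords, the library and the file suffixes cannot:

import java.io.FileWriter;
import java.io.IOException;

// "Ritvinnsla" (word processing): the identifiers are Icelandic,
// but every keyword and library name remains English.
public class Ritvinnsla {
    public static void main(String[] args) throws IOException {
        String skjal = "Halló heimur"; // the text of the document
        // "import", "public", "class", "new" and "FileWriter" are fixed
        // by the language, as are the .java and .txt suffixes.
        FileWriter uttak = new FileWriter("skjal.txt");
        uttak.write(skjal);
        uttak.close();
    }
}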

Though many operating systems, such as Macintosh OS X and Linux, are now sufficiently international to ship with multi-language package options, and as localized versions for a few major language markets, there is very little available that is not Anglo-centric. There are non-English environments in which one can work, such as the Assembler programming language, which predates the C language on which the Internet and most commercial software are based, and there are embedded proprietary languages that are not directly available to the public or working on the Internet. The hegemonic influence of English in the computer languages running the Internet, however, means that a concerted effort is required from educators of difference who are willing to work towards the creation of alternative language spaces.

The 26 letters of the English alphabet form the basis of how most content moves across the Internet, encoded as ASCII (American Standard Code for Information Interchange) text. The Internet functions primarily using the 94 printable characters that make up the ASCII character set:

abcdefghijklmnopqrstuvwxyz
ABCDEFGHIJKLMNOPQRSTUVWXYZ
0123456789
!"#$%&'()*+,-./:;<=>?@[\]^_`{|}~ (Lunde 1993, p. 36)

And when Japanese is displayed on your computer, the characters look and act like Japanese, but the encoding method still involves ASCII characters in the background.

In this example, the two groups of characters, ka na and kan ji[6], are presented with their ASCII encodings. These characters, which represent the words describing the two main writing scripts in Japanese (ka na and kan ji), are each encoded into four ASCII characters in order to be processed by a computer and communicated over the Internet. That is, the Japanese language must be encoded through the agency of the American Standard Code for Information Interchange to be communicated to another computer, even within Japan.
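A small Java sketch of my own (not from the chapter; it assumes a Java runtime that supplies the ISO-2022-JP charset, as common ones do) shows this dependence directly. The Japanese word exists on the wire only as bytes produced by an encoding, and both ends of a connection must name the same encoding or the text is garbled:

import java.io.UnsupportedEncodingException;

public class KanaBytes {
    public static void main(String[] args) throws UnsupportedEncodingException {
        String kana = "\u304b\u306a"; // the hiragana word "kana"
        // Under ISO-2022-JP each character becomes two seven-bit bytes,
        // bracketed by escape sequences that shift between character sets.
        byte[] bytes = kana.getBytes("ISO-2022-JP");
        for (byte b : bytes) {
            System.out.printf("%02x ", b);
        }
        System.out.println();
        // Decoding must name the same encoding to recover the original text.
        System.out.println(new String(bytes, "ISO-2022-JP").equals(kana)); // true
    }
}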

When you write a program, script or electronic document, it must be written using an English-based programming language, such as Java, C, Perl or HTML. The following example from MOOca.java describes connecting to our MOO and setting the parameters for encoding non-English characters:

// Convert outgoing Unicode strings to the user's requested encoding (e.g. "SJIS"):
outputWriter = new OutputStreamWriter(mSocket.getOutputStream(), mRequestedEncoding);

// Decode incoming bytes from the MOO using the current encoding:
s = new String(inputBuffer, 0, byteCount, getEncoding());

(Nolan and Goulden 1996-2001)

Here mRequestedEncoding is taken from the applet parameters and is a string such as "SJIS", "ASCII" or "UTF8" [Unicode], which tells MOOca how it should encode the characters it sends to the MOO. getEncoding() is usually the same as mRequestedEncoding above; the only time it would be different is if someone asked for an encoding that did not exist, such as "japlish", in which case it would fall back to the user's default encoding (Goulden 2001).

It is possible to see this form of encoding as unproblematic: a minor price to pay for global communication. But the hegemony of English is even more profound: "Does the interiorization of media such as letters alter the ratio among our senses and change mental processes?" (McLuhan 1995, p. 119). When you send an e-mail message, regardless of the language in which you compose your text, your e-mail program must talk to a server. A message is sent to the server on port 25, and the first thing the client says is "HELO", an abbreviation of "hello", in order to initiate the process that gets your e-mail moving on its way (the commands sent to initiate communication are the telnet and helo lines below):

telnet achieve.utoronto.ca 25
Trying 128.100.163.159...
Connected to achieve.utoronto.ca.
Escape character is '^]'.
220 achieve.utoronto.ca ESMTP Sendmail 8.9.3/8.9.3; Tue, 6 Nov 2001 14:36:15 -0500
helo achieve.utoronto.ca
250 achieve.utoronto.ca Hello envvirtual.utoronto.ca [128.100.163.131], pleased to meet you

This means that every e-mail ever sent on the Internet, by anyone, in any language, to any country, is couched in, or bracketed by, English. Communication is initiated in English and concluded in English. As language theorist George Steiner notes in After Babel: "So far as language is the mirror or counter statement to the world, or most probably an interpretation of the reflective with the creative along an 'interface' of which we have no formal model, it changes as rapidly and in as many ways as human experience itself" (Steiner 1998, p. 468). Steiner's ideas, in the context of the problematic dominance of English throughout the woof and warp of the Internet, suggest that what we can do and think with technology is forever informed by a language that, though in flux itself, forces the expression of human experience to conform to the influence of a single language, and perhaps to the metanarrative of those who thus situated it. The phenomenon that is the Internet has come upon the human species fast and unanticipated. Educators are playing catch-up in their critical awareness of the foundations of the Internet, with the result that aspects of this revolution that should be challenged and problematized have slipped by, perhaps unnoticed.

The Educator as a Creator of Learning Environments

I am writing this chapter in English, as it is the only language in which I can claim fluency, but I choose to code this chapter in raw HTML, the hidden markup language that forms the background of all web pages. What I have written looks to me like this:

</blockquote>

<p>The educator as a Creator of Learning Environments</p>

<blockquote>

<p>I am writing this chapter in English

I am also writing using programs such as Pico and BBEdit Lite, free and simple text editors. These choices remove, even if just temporarily, a level of commercial influence over the production of text, and a level of isolation from what goes on behind the location of the presentation of text. In order to participate in the publication of this book, however, I will have to convert my text into Microsoft's Word format, but I will e-mail it to the editor using open source software. These vaguely symbolic acts do highlight how an educator can position herself within alternatives to corporatist agendas, and model a practice that can both be emulated by students and stimulate awareness of, and inquiry into, alternatives.

These are, however, the most superficial locations of resistance, and an entire volume such as this would be required to engage the possible examples and experiences of CVEs, MOOs, Cyborgs and Hackers (Gray 2001; Stone 1992). Technology and computer literacy, as taught in our faculties of education and in the classroom, rarely takes even this stance. Review any curriculum, and you will see that it is largely infused with corporate technologies and corporate interests. The goal of technology-based curriculum is that of teaching users to be consumers of products in the name of global competitiveness and efficiency. Often the technologies are no more than computer-assisted learning and evaluation tools. When the technologies are used for expressive communication, the teaching is limited to what is prescribed in the manual. Rarely do even the most pedagogically aware educators, informed by critical pedagogy and aware of the need to promote alternative voices, critique the technologies in which those voices are located. And if/when they are aware, they lack the time or resources to really explore the options that are hidden away by security-conscious systems administrators (Nolan 2001). Today most decisions as to which technologies are used in learning environments are made by technology specialists and administrators, and are handed to educators with little or no consultation. There is even less awareness on the part of educators and students that alternatives exist.

This is an untenable position. If valid and sustained strides are to be made to embed alternative choices in the global culture, they must be found within the technologies we use. These technological alternatives cannot be made for us, either. They must be made with us, and by us, in community with our peers and students. To control the conceptualization, creation, development, implementation, co-habitation and governance of these spaces, we must learn to code and program, and create our own software and environments that reflect our diverse needs and goals; and we must share them freely with others, allowing others to revise and relocalize what we share according to their own criteria and needs. This must be undertaken with the same vigor as that of the alternative press, which has given voice to its own communities and experiences. There really is no other alternative. For if we do not actively participate in the creation of our discursive spaces, they will be created by someone else, and we will find ourselves at the most disempowered end of the power relationship (Foucault 1991; Illich 1970). If we do not govern our own spaces, our pedagogy, curriculum, writing and thinking are open to commodification, and we are no longer creators, but consumers.

As a first step, transforming ourselves from consumers into prosumers, involved in communities of discourses, technologies and narratives that we co-create and inhabit, allows us to share and interact with stories of difference. The much vaunted and rarely experienced virtual community becomes a potential reality when we are able to (re)construct and embrace both collective and infinitely differentiating representations of ourselves, as we see ourselves, and reflect upon how we see others and are seen by them (Fernback and Thompson 1995; Rheingold 1993). These potential dialogues are, however, only realizable when we control (or consciously yield control over) the means of our own representation. The situation is a hazardous simulation when dialogues are mediated by technologies of which we have minimal understanding and over which we have scant influence (Baudrillard 1988; Fernback and Thompson 1995; Stone 1992; Turkle 1995).

The influence of the Internet on the diverse languages and cultures of the world, those represented and under-represented by technology, is in many instances an act of relocalization of culture(s) from the real to the virtual. Cultural topologies are (re)constructed, and cultural experience must find new strategies of expression and resistance to survive and thrive (Nolan 1998; Ostrom 1990; Rheingold 1993). But more importantly, this relocalization is an act of translation of cultural experience (Steiner 1998). And without the concerted effort of educators informed by the ideas of the pedagogies of difference (educators who are able to engage and dialogue with the Englishness of the Internet below the surface level of written texts, down to the level of the code and encodings that make the Internet happen), we are situating struggles within colonizing dialogues, aware of whose hands we are playing into, but unaware of how deeply the cards are stacked against us (Giroux 1992). Where Steiner hypothesizes that "the proliferation of mutually incomprehensible tongues stems from an absolutely fundamental impulse in language itself [and] that the communication of information, of ostensive and verifiable 'facts', constitutes only one part, and perhaps a secondary part, of human discourse" (Steiner 1998, p. 497), I think that we are not only engaged in a struggle to liberate Internet discourses from the hegemony of English, but are engaged in a struggle fundamental to the defense of all aspects of difference. There is nothing that can be done to exorcise English as the fundamental language informing all communication on the Internet, but it is incumbent on all educators engaged in struggles to defend or demarginalize communities of difference to extend their program of inquiry and resistance to an engagement with how the language and cultural influences inherent in the structure of the Internet translate/encode discourse and experience within a dominant cultural ideology.

Bibliography

Anonymous. 2000. alt.2600 FAQ Revision .014 (1/4) [HTML], May 29, 2000 [cited October 1 2001]. Available from http://www.faqs.org/faqs/alt-2600/faq/.

Babcock, Jim. 2001. 2600 - a whatis definition [HTML], July 21, 2001 [cited October 1 2001]. Available from http://whatis.techtarget.com/definition/0,,sid9_gci211496,00.html.

Baudrillard, Jean. 1988. Simulacra and Simulations. In Selected Writings, edited by M.
Poster. Stanford: Stanford UP.

Berners-Lee, Tim. 1998. Tim Berners-Lee: A short history of web development [HTML] [cited August 2 2001]. Available from http://www.w3.org/People/Berners-Lee/ShortHistory.

Bruckman, Amy. 1992. Identity Workshop: Emergent Social and Psychological Phenomena in Text-Based Virtual Reality [Postscript document] [cited December 12 1999].

. 1997. MOOSE Crossing: Construction, Community, and Learning in a Networked Virtual World for Kids [Web page, PhD dissertation] [cited 1998]. Available from http://www.cc.gatech.edu/fac/Amy.Bruckman/thesis/index.html.

Cailliau, Robert. 1995. A Little History of the World Wide Web [HTML]. W3C, October 3, 1995 [cited August 4 1999].

Cerny, Jim. 2000. Who Runs the Internet? [HTML] [cited November 1 2001]. Available from http://www.unh.edu/Internet/web/whoruns.html.

Champeon, Steve. n.d. RTFM: A Guide to Online Research [HTML]. Wired Digital Inc. [cited November 1 2001]. Available from http://hotwired.lycos.com/webmonkey/templates/print_template.htmlt?meta=/webmonkey/00/08/index2a_meta.html.

Cicognani, Anna. 1998. On the Linguistic Nature of Cyberspace and Virtual Communities. Virtual Reality 3:16-24.

Cooper, Alan. 1999. The Inmates are Running the Asylum: Why High-tech Products Drive Us Crazy and How to Restore the Sanity. Indianapolis, Indiana: SAMS.

Cooper, Charles. 2001. When blogging came of age [HTML]. CNET News.com [cited September 20 2001]. Available from http://news.cnet.com/news/0-1272-210-7242676-1.html?tag=bt_bh.

Cummins, Jim, and Dennis Sayers. 1995. Brave New Schools: Challenging Cultural Illiteracy through Global Learning Networks. Toronto: O.I.S.E. Press.

Curtis, Pavel, and Doug Nichols. 1993. MUDs grow up: Social virtual reality in the real world. Paper read at Third International Conference on Cyberspace, at Austin, TX.

Curtis, Pavel. 1992. Mudding: Social Phenomena in Text-Based Virtual Realities. Intertrek 3 (3):26-34.

Davie, Lynn, Hema Abeygunawardena, Katherine Davidson, and Jason Nolan. 1998. Universities, Communities, and Building Sites: An Exploration of Three Online Systems. Paper read at Educational Computing Organization of Ontario, at Toronto, ON.

Davie, Lynn, and Jason Nolan. 1999. Doing Learning: Building Constructionist Skills for Educators, or, Theatre of Metaphor: Skills Constructing for Building Educators. Paper read at TCC, at Maui, Hawaii.

e-cyclopedia. 1999. BBC News | e-cyclopedia | Cracking: Hackers turn nasty [HTML]. e-cyclopedia@bbc.co.uk, August 31, 1999 [cited April 19 2000]. Available from http://news.bbc.co.uk/hi/english/special_report/1999/02/99/e-cyclopedia/newsid_434000/434498.stm.

Fanderclai, T. L. 1995. MUDs in Education: New Environments, New Pedagogies. Computer-Mediated Communication 2 (1):8.

Fernback, Jan, and Brad Thompson. 1995. Virtual Communities: Abort, Retry, Failure? [HTML] [cited August 16 1999]. Available from http://www.well.com/user/hlr/texts/VCcivil.html.

Finely, Michael. 2000. Alvin Toffler and the Third Wave [HTML]. www.mastersforum.com, January 31, 2000 [cited April 18 2000]. Available from http://www.mastersforum.com/toffler/toffler.htm.

Ford, Peter. 2001. Need software in, say, Icelandic? Call the Irish [HTML]. Christian Science Monitor, February 6, 2001 [cited October 15 2001]. Available from http://www.csmonitor.com/durable/2001/02/06/fp1s3-csm.shtml.

Foucault, M. 1991. Governmentality. In The Foucault Effect: Studies in Governmental Rationality, edited by G. Burchell, C. Gordon and P. Miller. Hertfordshire: Harvester Wheatsheaf.

Franklin, Ursula. 1992. The Real World of Technology. Toronto: Anansi.

Giroux, Henry A. 1992. Border Crossings: Cultural Workers and the Politics of Education. New York: Routledge.

Goulden, David. 2001. raw MOOca. Toronto, November 7, 2001.

Gray, Chris Hables. 2001. Cyborg Citizen: Politics in the Posthuman Age. New York: Routledge.

Harasim, Linda, Starr Roxanne Hiltz, Lucio Teles, and Murray Turoff. 1995. Learning Networks: A Field Guide to Teaching and Learning Online. Cambridge, MA: MIT Press.

Haraway, Donna. 2000. A Cyborg Manifesto: Science, Technology and Socialist-Feminism in the Late Twentieth Century. In The Cybercultures Reader, edited by D. Bell and B. Kennedy. London: Routledge.

. 2000. How Like a Leaf. New York: Routledge.

Hayles, Katherine. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: Chicago.

Haynes, Cynthia, and Jan Rune Holmevik. 1998. Highwired: On the Design, Use and Theory of Educational MOOs. Ann Arbor: Michigan.

Hochheiser, Harry, and Robin Ric. 1998. Who Runs the Internet? [HTML]. Computer Professionals for Social Responsibility, May 3, 1998 [cited November 1 2001]. Available from http://www.cpsr.org/onenet/whoruns.html.

Illich, Ivan. 1970. Deschooling Society. New York: Harper & Row.

Jones, Steve. 2000. The Bias of the Web. In The World Wide Web and Contemporary Cultural Theory, edited by A. Herman and T. Swiss. New York: Routledge.

Krol, Ed. 1992. The Whole Internet User's Guide & Catalog. Sebastopol: O'Reilly.

Little, Jean. 2001. Orphan at My Door: The Home Child Diary of Victoria Cope, Guelph, Ontario, 1897. Toronto: Scholastic.

Lunde, Ken. 1993. Understanding Japanese Information Processing. Sebastopol: O'Reilly.

Lyotard, Jean-François. 1984. The Postmodern Condition: A Report on Knowledge. Theory and History of Literature, v. 10. Manchester: Manchester University Press.

McLuhan, Marshall. 1995. The Gutenberg Galaxy. In Essential McLuhan, edited by E. McLuhan and F. Zingrone. Toronto: Anansi.

Mitchell, William. 1998. City of Bits: Space, Place and the Infobahn. Cambridge, MA: MIT Press.

Noble, David. 1997. The Religion of Technology: The Divinity of Man and the Spirit of Invention. New York: Knopf.

Nolan, Jason. 2001. The Techneducator Effect: Colliding Technology and Education in the Conceptualization of Virtual Learning Environments. PhD dissertation, Curriculum Teaching and Learning, Ontario Institute for Studies in Education, University of Toronto, Toronto.

. 1994. The Dark Side of the Internet [HTML], 1998 [cited August 5 1999]. Available from http://noisey.oise.utoronto.ca/gbut/.

. 1995. Educators in MOOkti: A Polysynchronous Collaborative Virtual Learning Environment, February 28, 1999 [cited April 25 1999]. Available from http://noisey.oise.utoronto.ca/jason/dissertation-proposal.html.

. 2000. Project Achieve & VASE: Virtual Learning Environments. Paper read at Teaching, Learning and Research in Today's University: Information Technology and the University Professor, at University of Toronto, Toronto, Ontario.

. 2000. Unpacking Transnational Policy: Learning to Bridge the Digital Divide. Educational Technology & Society 1 (1).

Nolan, Jason. 2000. VASE: The Virtual Assignment Server Environment [HTML] [cited September 27 2001]. Available from http://achieve.utoronto.ca/vase.

Nolan, Jason, and David Goulden. 1996-2001. MOOca.java (3.0) [Java applet]. Project Achieve. Available from http://www.zanid.com/mooca/.

Nolan, Jason, and Emma Jane Hogbin. 2001. Internet Literate: A Report on Future Trends for Online Learning Environments in North America. Toronto: Vivendi.

Nolan, Jason, Jeff Lawrence, and Yuka Kajihara. 1998. Montgomery's Island in the Net: Metaphor and Community on the Kindred Spirits E-mail List. Canadian Children's Literature 24:3/4 (91/92):64-77.

Nolan, Jason, and Joel Weiss. 2002. Learning Cyberspace: An Educational View of Virtual Community. In Building Virtual Communities: Learning and Change in Cyberspace, edited by K. A. Renninger and W. Shumar. Cambridge: Cambridge.

O'Reilly & Associates, Inc. 2000. Welcome to the O'Reilly Open Source Center [HTML]. O'Reilly & Associates, Inc. [cited April 18 2000]. Available from http://opensource.oreilly.com/.

OECD. 2000. Learning to Bridge the Digital Divide, Schooling for Tomorrow. Paris: OECD Publications.

Online, Linux. 1994-2000. The Linux Home Page at Linux Online [HTML]. Linux Online Inc. [cited April 18 2000]. Available from http://www.linux.org/.

Ostrom, Elinor. 1990. Governing the Commons: The Evolution of Institutions for Collective Action. Edited by J. A. D. North, The Political Economy of Institutions and Decisions. Cambridge: Cambridge.

Power, Edward. 2000. Joe Blogs on the Internet [HTML]. Irish Times, Dublin, October 28, 2000 [cited October 31 2001]. Available from http://www.ireland.com/newspaper/features/2000/1028/features4.htm.

Raymond, Eric S. 1998. Open Source: Software Gets Honest [HTML] [cited April 18 2000]. Available from http://www.opensource.org/.

. 1999. The Rampantly Unofficial Linus Torvalds FAQ [HTML], December 22, 1999 [cited April 18 2000]. Available from http://www.tuxedo.org/~esr/faqs/linus/.

. 2000. How to Become a Hacker [HTML], March 24, 2000 [cited April 18 2000]. Available from http://www.tuxedo.org/~esr/faqs/hacker-howto.html.

. 2001. Creeping Featuritis [HTML] [cited November 1 2001]. Available from http://www.tuxedo.org/~esr/jargon/html/entry/creeping-featuritis.html.

Reach, Global. 2001. Global Internet Statistics (by Language) [HTML], September 30, 2001 [cited November 1 2001]. Available from http://www.glreach.com/globstats/index.php3.

Rheingold, Howard. 1993. The Virtual Community. New York: Harper.

Ruffin, Oxblood. 2001. The Hacktivismo FAQ v1.0 [HTML]. cDc communications 2001 [cited October 20 2001]. Available from http://www.cultdeadcow.com/cDc_files/HacktivismoFAQ.html.

Said, Edward. 1993. Culture and Imperialism. New York: Vintage.

Schank, Patricia, Jamie Fenton, Mark Schlager, and Judi Fusco. 1999. From MOO to MEOW: Domesticating Technology for Online Communities [HTML]. SRI International, Center for Technology in Learning [cited April 7 2000]. Available from http://kn.cilt.org/cscl99/A64/A64.HTM.

Scott, Peter. 2001. Free-Nets and Community Networks [HTML]. Lights.com [cited October 15 2001]. Available from http://www.lights.com/freenet/.

Scoville, Thomas. 1998. Whence the Source: Untangling the Open Source/Free Software Debate [HTML]. O'Reilly & Associates, Inc. [cited April 18 2000]. Available from http://opensource.oreilly.com/news/scoville_0399.html.

Stallman, Richard. 1999. GNU's Not Unix! - the GNU Project and the Free Software Foundation (FSF) [HTML]. Free Software Foundation, Inc. [cited April 18 2000]. Available from http://www.fsf.org/.

Steiner, George. 1998. After Babel: Aspects of Language and Translation. third ed. New York: Oxford University Press. Original edition, 1975.

Sterling, Bruce. 1993. The Hacker Crackdown: Law and Disorder on the Electronic Frontier. New York: Bantam.

Stoll, Cliff. 1989. The Cuckoo's Egg: Tracking a Spy through the Maze of Computer Espionage. New York: Pocket Books.

Stone, Allucquere Rosanne. 1992. Will the Real Body Please Stand Up?: Boundary Stories about Virtual Cultures. In Cyberspace: First Steps., edited by M. Benedikt. Cambridge, MA: MIT Press.

Tetzlaff, David. 2000. Yo-Ho-Ho and a Server of Warez. In The World Wide Web and Contemporary Cultural Theory, edited by A. Herman and T. Swiss. New York: Routledge.

Turkle, Sherry. 1995. Life on the Screen. New York: Simon & Schuster.

Vest, Frank. 2001. FidoNews - The FidoNet Dialup BBS Community Weekly Newsletter [HTML]. FidoNews Editor, October 15, 2001 [cited October 15 2001]. Available from http://www.fidonews.org/.

Wertheim, Margaret. 2000. The Pearly Gates of Cyberspace: A History of Space from Dante to the Internet. New York: Norton.

[1] ASCII stands for American Standard Code for Information Interchange, and is the encoding method by which most text-based information moves around the Internet.

[2] I use Japanese as the example of a non-English language because it is the one in which I have the greatest fluency. It is also the language that makes up 9.2% of all web pages on the Internet (Reach 2001).

[3] http://www.blogger.com; http://www.livejournal.com; http://c2.com/cgi-bin/wiki?WikiWikiWebFaq; http://www.noahgrey.com/greysoft/

[4] A prosumer, in general, is an individual who partakes in both production and consumption.

5 "Polysynchronous is a term coined to describe the nature of MOOs where communication is an embedded combination of both synchronous and asynchronous communication (Nolan, 1998; Davie and Nolan, 1999). An IRC chat group is completely synchronous. Users communicate in real time, and there is usually no record kept of the communication unless one member personally creates a transcript of the interaction as a log. Asynchronous communication refers to the what happens on bulletin boards and via e-mail where a message is composed and transmitted to another individual or group. In a MOO, communication can be synchronous or asynchronous, but it can also be a combination of both. A conversation can be encoded into an object for others to read. MOO objects can be programmed to listen to conversations between members and generate responses that become part of the MOO-space itself for other participants to listen to later. As well, a conversational interaction may take the form of direct synchronous speech and the co-manipulation of MOO objects. It is possible to talk with another person, hand her virtual objects for her to look at, co-program MOO objects, and record the conversation for a third party to read later. This type of polysynchrony is particular to MOO-type environments, but reflects the direction that collaborative virtual environments are anticipated to follow in the future" (Nolan 2001).

[6] The ka na and kan ji characters and their encodings have been created by Ken Lunde for this publication.