Internet

When examining electronic networks, the Internet is a good example as it is one of the biggest and most spectacular. It is possible to compare the development of the Internet with earlier, radical infrastructural changes caused for example by the arrival of the railway or the motor car. Viewed in this way, the Internet is a logical continuation of a development that began with the wire telegraph and continued with the telephone. As a hybrid network, it exhibits all the phenomena associated with both orderly expansion and evolutionary growth.

The Internet grew out of a project initiated in the 1960s by the American Advanced Research Projects Agency (ARPA). This agency was subordinate to the US Department of Defense, and its mission was to create a large network for communication between military research computers. The network acquired the name ARPANET and soon came to include many universities pursuing military research. Its main node was located in Washington DC.

The original concept was to make the network resistant to a nuclear war. One or several nodes could be destroyed without devastating consequences for the network as a whole, thanks to its distributed character. The big innovation was that data transmission was based on ‘packet switching’. This technique divides the information content of a message into small electronic packets of equal length, each equipped with an address tag. Each packet can be routed along different paths through the network, a very practical feature in the case of bottlenecks or breakdowns.
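The principle can be illustrated with a few lines of code. The sketch below, with invented names and an unrealistically small packet size, splits a message into equal-length packets carrying an address tag and a sequence number and then reassembles them regardless of arrival order; it is a toy model of the idea, not a real protocol.

    # Toy model of packet switching: cut a message into fixed-length packets,
    # each with an address tag and a sequence number, so the pieces can travel
    # by different routes and still be reassembled at the destination.

    PACKET_SIZE = 8  # payload bytes per packet (real networks use far larger sizes)

    def make_packets(message: bytes, destination: str) -> list[dict]:
        """Split a message into equal-length packets with address and sequence tags."""
        packets = []
        for seq, start in enumerate(range(0, len(message), PACKET_SIZE)):
            packets.append({
                "to": destination,                      # address tag
                "seq": seq,                             # position in the original message
                "data": message[start:start + PACKET_SIZE],
            })
        return packets

    def reassemble(packets: list[dict]) -> bytes:
        """Restore the original message, even if packets arrived out of order."""
        return b"".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

    if __name__ == "__main__":
        msg = b"Packets may take different routes through the network."
        pkts = make_packets(msg, destination="host-42")
        pkts.reverse()                                  # simulate out-of-order arrival
        assert reassemble(pkts) == msg
        print(f"{len(pkts)} packets delivered and reassembled correctly")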

As more and more universities from all parts of the world were connected to the network, its character changed to that of an international university communications facility. The network soon became a conglomerate consisting of the different nations’ own networks and was taken over by the American National Science Foundation under the name of the Internet. It can now be characterized as a ‘network of networks’. For experts in the area, the term for all integrated networks of the world is the World Net or the Web. The Internet, however, must be regarded as the core of this phenomenon, with its more than thirty thousand individual networks.

The Internet is today an example of natural, ‘anarchistic’ growth from below, where the wish to connect was the main shaping factor. There is no board of directors or central command authority for the network; from several points of view it manages itself. It can thus be characterized as unhierarchical, uncentralized and unplanned. In spite of this, some kind of Invisible Hand exerts a certain control without authority. But wherever the Internet arises, rebels also arise to resist human control and realize their individuality. After the modest start at the end of the 1960s, the Internet has exploded and is now the backbone of global data communication, with more than 1 billion computer users, about 10-15 per cent of humanity. In Sweden, for example, every third person uses the Internet daily and more than 70 per cent of the population can access it.

In the last ten to fifteen years, the Internet has been transformed from a matter for a few specialists into an everyday medium (in the rich world). In a way, it is possible to speak of two different Internets: one “idealistic” and one “commercial”. Both are breaking up the geographically determined conditions we have been born into. Now we can choose what we want to see, read and listen to, and with whom we want to speak. We can communicate with people without needing to share a home district, workplace or language. Of course, problems are created when a common past or shared frames of reference are broken up. On the other hand, this creates a certain liberation from collective identities not of our own choosing. It opens the possibility of a self-chosen spirit of community: political, religious, professional, etc. All is within our own choice.

In the commercial Internet we can select from a limited number of pre-produced alternatives. It is an international shopping centre combined with a multimedia entertainment store, delivering diversion to mainly passive consumers.

Unfortunately, the main part of the Internet has been conquered by heavy commercial interests with the help of politicians unable to let things alone: copyrights, software patents, and an enormous advertising apparatus.

However, the romantic dream of an “alternative”, non-hierarchical and non-commercial free Internet is by no means broken. Commercial interests have contributed to putting the Internet within reach of many. Its technology has become more user-friendly and relatively cheap, and has spread throughout the world. One example is the phenomenon of “Internet cafés” found all over the world.

The excess profits of music and film companies are being undermined by file-sharing programs like Kazaa. A dominant giant such as Microsoft is challenged by free operating systems like Linux. With e-mail, home pages and discussion groups, global networks are created outside the control of states and big business.

From a technical point of view, the Internet is a digital highway with a capacity of 34 Mbit/s on its main links. It is a typical peer-to-peer network where the local branches have a capacity of 2 Mbit/s or below. More and more telephone companies are now offering what are called broadband subscriptions with a capacity of 10 Mbit/s or more. As 11 Mbit/s is considered to be the maximum information input potential of human beings, this will allow the transmission of high-quality video via the Internet.
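As a rough illustration of what these figures mean in practice, the following sketch computes the transfer time of a 10-megabyte file at the capacities mentioned above. Protocol overhead and network congestion are ignored, so the numbers are only indicative.

    # Back-of-the-envelope transfer times for a 10 MB file at the link
    # capacities mentioned in the text (2, 10 and 34 Mbit/s).

    def transfer_time_seconds(size_megabytes: float, capacity_mbit_per_s: float) -> float:
        size_mbit = size_megabytes * 8          # 1 byte = 8 bits
        return size_mbit / capacity_mbit_per_s

    for capacity in (2, 10, 34):
        print(f"10 MB over {capacity:>2} Mbit/s: {transfer_time_seconds(10, capacity):5.1f} s")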

Typical of the capacity explosion in digital networks is the performance of the main Swedish university communication links, as presented below.

In a global network of such dimensions, certain problems emerge out of necessity. These include questions about the kind of information that may be distributed, and the ‘human information rights’ of the participants. For example, a special group, the Electronic Frontier Foundation (EFF), has been founded to protect human rights in the electronic world. Politicians and lawyers are continually trying to figure out whether to govern or police the network and, if so, how. Authorities speak about legally responsible publishers for bulletin boards, censorship to prevent the distribution of pornography, and the prohibition of information encryption.

Cryptography on the Internet is a problem area of its own. Much of the information transmitted on the network must be considered attractive for various reasons. Business information, stolen records, military secrets, passwords and private mail are constantly available to those capable of reading them. Probably there are no completely safe systems on the Internet, and computer security experts state that at least one million passwords are stolen every year.

The need for electronic privacy has therefore prompted a growing use of cryptography. In the United States, the official attitude to this trend is said to be positive, on condition that the authorities hold a key to a ‘trapdoor’ in every system. This is the idea behind the criticized Clipper system, which holds the official standard for information encryption. Other countries, such as France, have a still stricter attitude towards cryptography; there, all cryptography must be licensed, a legitimate reason for its use established, and a copy of the key delivered to the authorities.

The fact that many new digital technologies, like cellular telephones, are untappable lies behind the new US Digital Telephony Act. This law states that telephone companies must use communication software which permits the authorities to tap and read the bit-stream. A cause célèbre, however, is that the Clipper system has been partially cracked by a scientist from Bell Laboratories in New Jersey.

The business world, together with the majority of other users of the Internet, is rather cool vis-à-vis all attempts to share codes and keys with ‘Big Brother’. The idea in itself makes secrecy a corrupted concept, and many have turned to what are called public key codes, or PKC. In contrast to a classical code, such a system has two keys. One of them, the public key, is given to the person who wants to send information to you and is used for encoding. The other is a ‘private’ key, used by the receiver to decode the message and impossible in practice to derive from the public one. It resides in the personal computer, where it is less likely to be stolen. If someone gets hold of the public key, nothing can be done with it; neither the message nor the public key itself reveals anything about the private part. (Consequently, some people distribute their public key on the network.)
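The principle can be demonstrated with the well-known RSA scheme, here sketched with deliberately tiny numbers purely for illustration; real keys are hundreds of digits long and the code below offers no actual security.

    # Toy RSA illustration of the public-key principle: anyone may hold the
    # public key (n, e), but only the private exponent d can decode.

    p, q = 61, 53                      # secret primes (far too small for real use)
    n = p * q                          # 3233, part of the public key
    phi = (p - 1) * (q - 1)            # 3120
    e = 17                             # public exponent
    d = pow(e, -1, phi)                # 2753, the private exponent (Python 3.8+)

    def encrypt(message: str) -> list[int]:
        """Encode each character with the public key (n, e)."""
        return [pow(ord(ch), e, n) for ch in message]

    def decrypt(ciphertext: list[int]) -> str:
        """Decode with the private exponent d, known only to the receiver."""
        return "".join(chr(pow(c, d, n)) for c in ciphertext)

    secret = encrypt("PKC demo")
    print(secret)                      # meaningless to anyone without d
    print(decrypt(secret))             # -> "PKC demo"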

The use of codes in itself indicates that certain information is being kept secret from other persons. There are now more discreet methods on the Internet which even hide the fact that something is hidden. They may be characterized as methods of ‘hidden meaning’. Hiding messages in innocent text and pictures is a long-established technique; doing it in the relevant digital bit-stream is easy, cheap and undetectable. Within the millions of bytes representing everything from sound files and high-resolution pictures to private letters or financial transactions, every kind of secret information can be hidden. Even cryptographic experts admit that such hidden messages rarely leave enough of a pattern to be detected and decoded.
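A minimal sketch of the idea, assuming the carrier is just a sequence of sample values standing in for pixels or sound samples: the message bits are written into the least significant bit of successive samples, where a change of one unit is imperceptible. It is a toy example, not a hardened tool.

    # Hide a short message in the least significant bits of a list of sample
    # values, and recover it again; the 'cover' data here is invented.

    def hide(samples: list[int], message: bytes) -> list[int]:
        bits = [(byte >> i) & 1 for byte in message for i in range(8)]
        out = list(samples)
        for pos, bit in enumerate(bits):
            out[pos] = (out[pos] & ~1) | bit          # overwrite the lowest bit only
        return out

    def reveal(samples: list[int], length: int) -> bytes:
        bits = [s & 1 for s in samples[:length * 8]]
        return bytes(
            sum(bits[i * 8 + j] << j for j in range(8)) for i in range(length)
        )

    cover = list(range(200))                          # pretend these are image samples
    stego = hide(cover, b"meet at noon")
    print(reveal(stego, len(b"meet at noon")))        # -> b'meet at noon'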

Serviceable and widely available encryption algorithms are the prerequisite for what may be called digital signatures. A digital signature is a piece of code which originates from a certain person, or rather from his personal computer. If such a signature can be considered entirely trustworthy, it can be used to confirm and acknowledge all kinds of transactions in a network. Such hidden transactions could easily become the backbone of an alternative economy unrestricted by governments, and a nightmare for tax assessment authorities.
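The mechanism can be sketched by combining a cryptographic hash with the toy RSA numbers used above; the example is illustrative only and far too weak for real use, but it shows how anyone holding the public key can check that a signature matches a message.

    # Toy digital signature: the sender signs a hash of the message with the
    # private exponent; anyone with the public key can verify the result.

    import hashlib

    n, e = 3233, 17        # public key (as in the toy example above)
    d = 2753               # private exponent, kept on the signer's own computer

    def digest(message: bytes) -> int:
        return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

    def sign(message: bytes) -> int:
        return pow(digest(message), d, n)             # only the key holder can do this

    def verify(message: bytes, signature: int) -> bool:
        return pow(signature, e, n) == digest(message)

    order = b"transfer 100 credits to account 7"
    sig = sign(order)
    print(verify(order, sig))                         # True
    print(verify(b"transfer 900 credits", sig))       # False: the message was altered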

It is, however, doubtful whether it will ever be possible to regulate the Internet, because what is forbidden in the US will be allowed in Finland, and so on. For purely technical reasons, too, it is at present impossible to censor the network, since it is constructed to work around blockages and censorship through its self-repairing qualities.

Of course, there is a real danger that the Internet may be threatened by the very qualities that supported its growth. It is influenced from all sides: veteran users try to protect it, governments want to control it and pornographers try to exploit its freedom. The greatest current threat seems to be the commercial interests which strive to make money on it. If they were to grow too strong they could cause a sudden collapse of the Internet, one of the most promising cultural phenomena of the 1990s.

The earliest applications on the network were electronic mail (E-mail) and computer conferencing. Classified as electronic mail is everything which is not ordinary mail (‘snail mail’), such as telefax, videotext, searching in external databases, conferencing, etc. The use of E-mail is in fact much more rapid than the use of an ordinary fax device. In addition, an E-mail message is safer from wire-tapping during transmission and quite private, in contrast to the fax message, which can be read by anyone at the receiving end. The need for E-mail is one of the major reasons for the growth of the Internet.

Although E-mail is still one of the most used applications, practically everything which can be transformed into an electronic bit-stream can be communicated via the Internet. Political debate, literary criticism, stock market tips and matchmaking are some examples. Archives, libraries and databases are available around the clock, and very often without a fee. Today, a researcher can publish a report on the Internet and receive an immediate response instead of waiting several months.

Many academic institutions around the world have their papers and publications stored in databases retrievable by both researchers and the general public. The latest highlight is that the Encyclopedia Britannica has found its way onto the Internet. Some 44 million words and many thousands of pictures are stored here in a database.

Free electronic magazine subscriptions are common on the Internet. One example is the Internet gazette Refractions which contains news about various electronic forums. There are already more than 3000 forums for different discussion topics.

Some of the main areas of information accessible on the Internet include:

  • Scientific data, for example, star catalogues, earthquake danger zones, research papers, etc.
  • Sound and pictures. Internet users digitize pieces of music or pictures and share them with others. Today, short digitized moving video sequences are to be found in several databases. Several radio stations have also established themselves on the Internet with daily transmissions.
  • Electronic newspapers, magazines, books and various kinds of fact files. Project Gutenberg in the USA and Project Runeberg in Sweden have digitized several hundred books, which are available free of charge as their copyright has expired.
  • Computer programs, normally what is called shareware or other kinds of programs which can be distributed free of charge.
  • Internet trading, including banking. More and more companies are introducing themselves with homepages, and by using credit card numbers it is easy to buy, for example, books from bookstores. Private bank transactions are relatively easy to manage on the Internet, although they require a small encryption device attached to the computer.

A main criticism of the Internet has been that network services were complicated to use, especially when transferring files between universities and enterprises. FTP, or file transfer protocol, has been used to do this for many years. Its user interface is rather primitive and bears a certain resemblance to earlier DOS versions. To improve matters for the operator and to facilitate the use of the network, simple graphical interfaces have been introduced. Instead of cryptic file and catalogue names, longer explanations of documents and archives are given in a graphical environment. With these modern interfaces and their hyper-menu systems the user has no need to know on what computer his information is stored. By choosing a certain menu it is possible to navigate throughout the network. Some of the menus connect directly to the requested information, which is presented as text on the screen.
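For comparison, this is roughly what such a file transfer looks like when scripted with Python's standard ftplib; the host name, directory and file name are placeholders for any archive accepting anonymous login.

    # Fetch a directory listing and one file from a (hypothetical) FTP archive.

    from ftplib import FTP

    with FTP("ftp.example.org") as ftp:          # placeholder server
        ftp.login()                              # anonymous login
        ftp.cwd("/pub/papers")                   # move into an archive directory
        ftp.retrlines("LIST")                    # the 'primitive' catalogue listing
        with open("report.ps", "wb") as local:
            ftp.retrbinary("RETR report.ps", local.write)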

The predominant way of accessing information on the Internet today is by means of World Wide Web (WWW) addresses. This is a further development of the older interfaces, with graphics, pictures and hypertext. Hypertext implies that specially indicated words or phrases in a text are equipped with links to further bodies of information. These links criss-cross the network in all directions. Today it is common to use the Internet as a work of reference, to do a Net Search. Specially designed software packages called search engines are easily accessible on the Web for immediate use.
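A small sketch of hypertext in action, using only the Python standard library: fetch one page and list the links it contains, which is essentially what both a browser user and a search-engine crawler follow. The URL is a placeholder.

    # Collect the href targets of all anchor tags on a single WWW page.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collect the href targets of all anchor (<a>) tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.extend(value for name, value in attrs if name == "href")

    with urlopen("http://example.org/") as response:          # placeholder URL
        page = response.read().decode("utf-8", errors="replace")

    collector = LinkCollector()
    collector.feed(page)
    for link in collector.links:
        print(link)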

In 1984, there were about 1000 connected computers, most of them at academic institutions. Since then the development can be highlighted by the following rates of growth:

1989  About 150,000 computers

1991  More than 1,000,000 computers

1993  About 2,000,000 computers and about 6,000 servers

1995  About 25,000 different networks with 6,000,000 connected computers; some 50,000,000 persons were considered to use the net and more than 100,000 servers were in action

1996  About 13,000,000 Internet-connected computers

1997  1,000,000 web servers in action

1999  More than 5,000,000 servers

Today almost all well-known computer enterprises store their drivers and reference articles on Internet servers. Another interesting field of application involving WWW servers is the connection of indicators. Some research institutes in the US have coupled their Geiger counters to the network, thus making continuous registration of radiation available everywhere.
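A sketch of how such a connected indicator might publish its readings, using Python's standard http.server; the port and the simulated radiation values are invented for illustration.

    # Minimal HTTP server that answers every request with the latest
    # (here simulated) radiation reading.

    import random
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class GeigerHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            reading = f"{random.uniform(0.05, 0.20):.3f} microsievert/h\n"  # simulated value
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(reading.encode())

    if __name__ == "__main__":
        HTTPServer(("", 8000), GeigerHandler).serve_forever()   # reachable at port 8000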

All this makes the Internet more of an intellectual working tool than a communication medium. The network’s refinements have multiplied and it has become increasingly user-friendly. Anyone can now sit at his desk and help himself to the accumulated knowledge of the world. The Gutenberg art of printing created the foundation of the modern nation state; the printed word was the glue which joined together the different parts of a nation. In the same way, the international networks are on their way to creating a world-wide community.

Although the base of users is constantly broadening, the majority of them are university graduates and persons in private employment who have access to the Internet via their job. It is still expensive for a private person to connect via a modem and a personal computer at home.

Many communication enthusiasts now speak of the Internet and its users as the modern ‘network society’ existing in ‘cyberspace’ and populated by ‘cyber-citizens’ following the ‘netiquette’. A special kind of ‘net-culture’ has emerged where researchers, technicians and students exchange ideas and information, often in the form of computerized conferences.

The net-culture is based upon a sort of anarchic ethic, embraced by the students and ‘hackers’ who took part in the early build-up of the network. Among its implicit basic rules are the following:

  • All information should be free.
  • All access to computers should be unlimited and total.
  • Promote decentralization and mistrust authority.

Much of the information residing on the Internet is in fact free. Also to be counted as free information are all those computer programs which can be downloaded for personal use. Many of them enhance access to various parts of the network and may be regarded as its self-organizing agents.

Based on the written word and the English alphabet, extended with various punctuation marks and computer characters, a special writing style has become part of the net-culture. Certain acronyms and signs are commonly used to express frequent expressions and emotions. Some examples are given in the list below. (If the head is turned sideways, the signs expressing emotions are quite striking!)

BTW     by the way                     :-)     a smile
IMHO    in my humble opinion           ;-)     a wink
FYI     for your information           :-(     displeasure or sadness

Warning! Do not type in capital letters. (IT IS LIKE SHOUTING!)

The small constellations in the right-hand column are called ‘smileys’ or ‘emoticons’. They are used to express the body language and feelings present in an oral conversation but lost in the E-mail system.

Being a two-way medium, the Internet must be considered a many-to-many facility. As such, it has dramatically changed the social rules governing who may talk to whom and who may listen. In fact, it is today easy to contact politicians and those in power. Most networkers are aware that they have direct access to the computer screen of America’s president, at least in theory if not in practice. The White House now receives about 4000 E-mail messages a week, stored, handled and filtered by a special group of aides. How many of these messages really reach the eyes of the President is an open question; probably no more than can be counted on the fingers of one hand, as a certain filter technique has been developed to avoid total information input overload. Filters for ordinary networkers who believe that their lives will be both saner and better if they can avoid reading messages from certain detestable individuals, so-called bozo filters, are common.
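A bozo filter amounts to very little code. The sketch below, with invented addresses, drops incoming messages whose sender appears on a personal block list before they ever reach the reader.

    # Keep only messages whose sender is not on a personal block list.

    BOZO_LIST = {"ranter@example.com", "spammer@example.net"}   # hypothetical senders

    def filter_mail(messages: list[dict]) -> list[dict]:
        """Return the messages whose sender is not on the block list."""
        return [m for m in messages if m["from"].lower() not in BOZO_LIST]

    inbox = [
        {"from": "colleague@example.edu", "subject": "Draft paper"},
        {"from": "ranter@example.com", "subject": "You are all wrong!!!"},
    ]
    for message in filter_mail(inbox):
        print(message["from"], "-", message["subject"])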

Many enthusiasts (including former US Vice President Al Gore) have stated that the new network technology will change society completely. The changes are going to be as radical as when the industrial revolution superseded the peasant culture. We may perhaps be about to enter a new democracy like that of ancient Athens. But changes of this kind are often revolutionary, taking place beyond parliament. From that point of view, a vice president cannot influence the development very much.

However, even as we speculate about a future network society, the old division between information consumers and producers is already loosening up. Today, anybody can be a producer or a publisher and have at his disposal a mass medium in the network. Historically, distribution has always been the problem, but with modern networks this is no longer the case.

Thirty years ago, nobody had a feeling that the computer, which was then the controlling instrument of ‘Big Brother’, would be a tool for freedom and democracy. Thanks to the networks we now have the whole world within reach of our index finger. Thirty years ago, the only persons with the world within reach of an index finger were the American and Soviet presidents and then in a very devastating sense.

Source: Skyttner, Lars (2006), General Systems Theory: Problems, Perspectives, Practice, 2nd Edition, World Scientific Publishing Co.
