Internet: A Medium or a Message?
Essays regarding the Internet, E-Commerce, E-Publishing, and Information Technology (IT)
By: Dr. Sam Vaknin
These essays were published by the Israeli (Hebrew) edition of PC
Magazine back in 1996, when the Internet was in its formative epoch. I have
left them essentially unchanged, except for a few minor errata I corrected. I
find time travel fascinating. It is interesting to recall the mainstream view, ten years ago, of the Internet, its goals, its role, and its future.
State of the Net
An Interim Report about the Future of the Internet
Who are the participants who constitute the Internet?
The fate of each of these components - separately and in combination - will determine the fate of the Internet.
The first phase of the Internet's history was dominated by computer wizards. Thus, any attempt at predicting its future dealt mainly with its hardware and software components.
Media experts, sociologists, psychologists, advertising and marketing executives were left out of the collective effort to determine the future face of the Internet.
As far as content is concerned, the Internet cannot be currently defined as a medium. It does not function as one - rather it is a very disordered library, mostly incorporating the writings of non-distinguished megalomaniacs. It is the ultimate Narcissistic experience. The forceful entry of publishing houses and content aggregators is changing this dismal landscape, though.
Ever since the invention of television, nothing has begged to become a medium as much as the Internet.
Three analogies spring to mind when contemplating the Internet in its current state:
These metaphors prove to be very useful (even business-wise). They permit us to define the commercial opportunities embedded in the Internet.
Yet, they fail to assist us in predicting its future in its transformation into a medium.
How does an invention become a medium? What happens to it when it does become one? What is the thin line separating the initial functioning of the invention from its transformation into a new medium? In other words: when can we tell that some technological advance gave birth to a new medium?
This work also deals with the image of the Internet once transformed into a medium.
The Internet has the most unusual attributes in the history of media.
It has no central structure or organization. It is hardware and software independent. It (almost) cannot be subjected to legislation or to regulation. Consider the example of downloading music from the Internet - is it tantamount to an act of recording music (a violation of copyright laws)? This has been the crux of the legal battle between Diamond Multimedia (the manufacturers of the Rio MP3 device), MP3.com and Napster and the recording industry in America.
The Internet's data transfer channels are not linear - they are random. Most of its "broadcast" cannot be "received" at all. It allows for the narrowest of narrowcasting through the use of e-mail mailing lists, discussion groups, message boards, private radio stations, and chats. And this is but a small portion of an impressive list of oddities. These idiosyncrasies will also shape the nature of the Internet as a medium. Growing out of bizarre roots - it is bound to yield strange fruit as a medium.
So what business opportunities does the Internet represent?
I believe that they are to be found in two broad categories:
The Map of Terra Internetica
How many Internet users are there? How many of them have access to the Web (World Wide Web - WWW) and use it? There are no unequivocal statistics. Those who presume to give the answers (including the ISOC - the Internet SOCiety) - rely on very partial and biased resources. Others just bluff.
Yet, everyone seems to agree that there are at least 100 million active participants in North America (the Nielsen and CommerceNet reports).
The future is, inevitably, even vaguer than the present. Authoritative consultancy firms predict 66 million active users in 10 years' time. IBM envisages 700 million users. MCI is more modest with 300 million. At the end of 1999 there were 130 million registered (though not necessarily active) users.
The Internet - An Elitist and Chauvinistic Medium
The average user of the Internet is young (30), with an academic background and a high income. The proportion of the educated and the well-to-do among the users of the Web is three times as high as their share of the general population. This is changing fast as their children join them (6 million already had access to the Internet at the end of 1996 - and were joined by another 24 million by the end of the decade). It may change further thanks to presidential initiatives to bridge the "digital divide" (from Al Gore's in the USA to Mahathir Mohamad's in Malaysia), corporate largesse, and institutional involvement (e.g., the Open Society Institute in Eastern Europe, Microsoft in the USA). These efforts will spread the benefits of this all-powerful tool among the less privileged. A bit less than 50% of all users are men, but they are responsible for 60% of the activity on the net (as measured by traffic).
Women seem to limit themselves to electronic mail (e-mail) and to electronic shopping of goods and services, though this is changing fast. Men prefer information, either due to career requirements or because knowledge is power.
Most of the users are of the "experiencer" variety: innovative leaders of social change. This breed inhabits universities, fashionable neighbourhoods, and trendy vocations. This is why some wonder whether the Internet is not just another fad, albeit an incredibly resilient and promising one.
Most users have home access to the Internet - yet, they still prefer to access it from work, at their employer's expense, though this preference is slight and being eroded. Most users are, therefore, exploitative in nature. Still, we must not forget that there are 37 million households of the self-employed and this possibly distorts the statistical picture somewhat.
The Internet - A Western Phenomenon
Not African, not Asian (with the exception of Israel and Japan), not Russian, nor a Third World phenomenon. It belongs squarely to the wealthy, sated world. It is the indulgence of those who have everything and whose greatest concern is their choice of nightly entertainment. Between 50% and 60% of all Internet users live in the USA, 5-10% in Canada. The Internet is catching on in Europe (mainly in Germany and in Scandinavia) and, in its mobile form (i-mode), in Japan. In France the Internet lost to the Minitel because the latter provides more locally relevant content and because of the high costs of communications and hardware.
Most computer owners still possess a 28,800 bps modem. This is much like riding a bicycle on a German Autobahn. The 56,000 bps modem is gradually replacing its slower predecessor (48% of computers with modems) - but even this is hardly sufficient. To begin to enjoy video and audio (especially the former), data transfer rates need to be 50 times faster.
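The "50 times faster" claim is easy to check with a little arithmetic. A sketch in Python, where the size of the video clip is an assumption of mine and the line rates are the essay's own (real throughput over a dial-up line is somewhat lower than the rated speed):

```python
# Back-of-envelope check of the modem-speed claims above.

def transfer_seconds(size_bytes: int, bps: int) -> float:
    """Seconds to move size_bytes over a line rated at bps."""
    return size_bytes * 8 / bps

CLIP = 5 * 1024 * 1024  # a ~5 MB video clip (assumed size)

for label, bps in [("28,800 bps modem", 28_800),
                   ("56,000 bps modem", 56_000),
                   ("~50x faster link", 28_800 * 50)]:
    print(f"{label:>18}: {transfer_seconds(CLIP, bps) / 60:.1f} minutes")
```

At 28,800 bps the clip takes roughly 24 minutes; at 50 times that speed it takes about half a minute - which is why video on demand waits for broadband.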
Half the households in the USA have at least 2 telephones and one of them is usually dedicated to data processing (faxes or fax-modems).
The ISDN could constitute the mid-term solution. This data transfer network is fairly speedy and covers 70% of the territory of the USA. It is growing by 100% annually and its sales topped 10 billion USD in 1995/6.
Unfortunately, it is quite clear that ISDN is not THE answer. It is too slow, too user-unfriendly, interfaces badly with other network types, and requires special hardware. There is no point in investing in temporary solutions when the right solution is staring the Internet in the face, though it is not implemented due to political circumstances.
A cable modem is 80 times speedier than ISDN and 700 times faster than a 14,400 bps modem. However, it does have problems in accommodating two-way data transfer. There is also a need to connect the fibre optic infrastructure which characterizes the cable companies to the old copper infrastructure which characterizes telephony. Cable users engage specially customized LANs (Ethernet) and the hardware is expensive (though equipment prices are forecast to collapse as demand increases). Cable companies simply did not invest in developing the technology. The law (prior to the 1996 Telecommunications Act) forbade them to do anything other than one-way transfer of video via cables. Now, with the more liberal regulatory environment, it is a mere question of time until the technology matures.
Actually, most consumers single out bad customer relations as their biggest problem with the cable companies - rather than technology.
Experiments conducted with cable modems led to a doubling of usage time (from an average of 24 to 47 hours per month per user), wholly attributable to the increased speed. This comes close to a cultural revolution in the allocation of leisure time. Numerically speaking: 7 million households in the USA are fitted with two-way cable modems. This is a small number and it is anyone's guess whether it constitutes a critical mass. Sales of such modems amount to 1.3 billion USD annually.
50% of all cable subscribers also have a PC at home. To me it seems that the merging of the two technologies is inevitable.
Other technological solutions - such as DSL, ADSL, and the more promising satellite broadband - are being developed and implemented, albeit slowly and inefficiently. Coverage is sporadic and frustrating waiting periods are measured in months.
Hardware and Software
Most Internet users (82%) work with the Windows operating system. About 11% own a Macintosh (much stronger graphically and more user-friendly). Only 7% continue to work on UNIX based systems (which, historically, fathered the Internet) - and this number is fast declining. A strong entrant is the open-source Linux operating system.
Virtually all users surf through browsing software. A fast dwindling minority (26%) use Netscape's products (mainly Navigator and Communicator) and the majority use Microsoft's Explorer (more than 60% of the market). Browsers are now free products and can be downloaded from the Internet. As late as 1997, major Internet consultancy firms predicted that browser sales would top $4 billion by the year 2000. Such misguided predictions ignored the basic ethos of the Internet: free products, free content, free access.
Browsers are in for a great transformation. Most of them are likely to have 3-D, advanced audio, telephony / voice / video mail (v-mail), instant messaging, e-mail, and video conferencing capabilities integrated into the same browsing session. They will become self-customizing, intelligent, Internet interfaces. They will memorize the history of usage and user preferences and adapt themselves accordingly. They will allow content-specificity: unidentifiable smart agents will scour the Internet, make recommendations, compare prices, order goods and services and customize contents in line with self-adjusting user profiles.
Two important technological developments must be considered:
PDAs (Personal Digital Assistants) - the ultimate personal (and office) communicators, easy to carry, they provide Internet (access) Everywhere, independent of suppliers and providers and of physical infrastructure (in an aeroplane, in the field, in a cinema).
The second trend: wireless data transfer and wireless e-mail, whether through pagers, cellular phones, or through more sophisticated apparatus and hybrids such as smart phones. Geotech's products are an excellent example: e-mail, faxes, telephone calls and a connection to the Internet and to other, public and corporate, or proprietary, databases - all provided by the same gadget. This is the embodiment of the electronic, physically detached, office. Wearable computing should be considered a part of this "ubiquitous or pervasive computing" wave.
We have no way of gauging - or intelligently guessing - the share of the mobile Internet in the total future Internet market, but it is likely to outweigh the "fixed" part. Wireless Internet meshes well with the trend of pervasive computing and the intelligent home and office. Household gadgets such as microwave ovens, refrigerators and so on will connect to the Internet via a wireless interface to cull data, download information, order goods and services, report their condition, and perform basic maintenance functions. Location-specific services (navigation, shopping recommendations, special discounts, deals and sales, emergency services) depend on the technological confluence between GPS (satellite-based geolocation technology) and the wireless Internet.
Suppliers and Intermediaries
"Parasitic" intermediaries occupy each stage in the Internet's food chain.
Access to the Internet is still provided by "dumb pipes" - the Internet Service Providers (ISP).
Content is still the preserve of content suppliers and so on.
Some of these intermediaries are doomed to gradually fade or to suffer a substantial diminishing of their share of the market. Even "walled gardens" of content (such as AOL) are at risk.
By way of comparison, even today, ISPs have four times as many subscribers (worldwide) as AOL. Admittedly, this adversely affects the quality of the Internet - the infrastructure maintained by the phone companies is slow and often succumbs to bottlenecks. The unequivocal intention of the telephony giants to become major players in the Internet market should also be taken into account. The phone companies will, thus, play a dual role: they will provide access to their infrastructure to their competitors (sometimes as a de facto monopoly) - and they will compete with their clients. The same can be said about the cable companies. Controlling the last mile to the user's abode is the next big business of the Internet. Companies such as AOL are disadvantaged by these trends. It is imperative for AOL to obtain equal access to the cable companies' backbone and infrastructure if it wants to survive. Hence its merger with Time Warner.
No wonder that many of the ISPs judge this intrusion on their turf by the phone and cable companies to constitute unfair competition. Yet, one should not forget that the barriers to entry are very low in the ISP market. It takes a minimal investment to become an ISP. 200 modems (at 200 USD each) are enough to satisfy the needs of 2000 average users, who generate an income of 500,000 USD per annum for the ISP. Routers are equally cheap nowadays. This is a nice return on the ISP's capital, undoubtedly.
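The ISP arithmetic above is worth spelling out. A minimal sketch, using only the essay's own figures (200 modems at 200 USD each, 2,000 users, 500,000 USD of annual revenue):

```python
# Back-of-envelope economics of a small dial-up ISP.

modems = 200
modem_cost = 200          # USD per modem
users = 2000
annual_revenue = 500_000  # USD per year

capital = modems * modem_cost            # up-front outlay in modems
users_per_modem = users / modems         # oversubscription ratio
revenue_per_user = annual_revenue / users

print(f"capital outlay:   {capital:,} USD")
print(f"users per modem:  {users_per_modem:.0f}")
print(f"revenue per user: {revenue_per_user:.0f} USD/year")
```

A 40,000 USD bank of modems oversubscribed ten-to-one against half a million dollars of yearly revenue - which is exactly why the barriers to entry are so low.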
The Web houses the equivalent of 100 billion pages. Search engine applications are used to locate specific information in this impressive, constantly proliferating library. They will be replaced, in the near future, by "Knowledge Structures" - gigantic encyclopaedias whose text will contain references (hyperlinks) to other, relevant, sites. The far future will witness the emergence of the "Intelligent Archives" and the "Personal Newspapers" (read further for detailed explanations). Some software applications will summarize content; others will index and automatically reference and hyperlink texts (virtual bibliographies). An average user will have an on-going interest in 500 sites. Special software will be needed to manage address books ("bookmarks", "favourites") and contents ("Intelligent Addressbooks"). The phenomenon of search engines dedicated to searching a number of other search engines simultaneously ("hyper-engines" or "meta-engines") will grow. Meta-engines will work in the background and download hyperlinks and advertising (the latter is essential to secure the financial interest of site developers and owners). Statistical software which tracks ("how long was what done"), monitors ("what did they do while in the site") and counts ("how many") visitors to sites already exists. Some of these applications have back-office facilities (accounting, follow-up, collections, even tele-marketing). They all provide time trails and some allow for auditing.
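The core of the meta-engine idea can be sketched in a few lines: query several engines, merge their hit lists, and rank each URL by how many engines returned it. The engine names and results below are hypothetical stand-ins for live queries, not real services:

```python
# A toy meta-engine: rank URLs by cross-engine agreement.
from collections import Counter

def meta_search(results_per_engine: dict[str, list[str]]) -> list[str]:
    """Rank URLs by the number of engines that returned them."""
    votes = Counter()
    for hits in results_per_engine.values():
        votes.update(set(hits))          # at most one vote per engine
    return [url for url, _ in votes.most_common()]

sample = {
    "engine_a": ["site1.example", "site2.example"],
    "engine_b": ["site2.example", "site3.example"],
    "engine_c": ["site2.example", "site1.example"],
}
print(meta_search(sample))   # site2 ranks first: all three engines agree
```

Real meta-engines must also normalize URLs and merge ranking scores, but cross-engine voting is the heart of the technique.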
This is but a small fragment of the rapidly developing net-scape: people and enterprises who make a living off the Internet craze rather than off the Internet itself. Everyone knows that there is more money in lecturing about how to make money on the Internet - than in the Internet itself. This maxim still holds true despite the 32 billion US dollars in E-commerce in 1998. Business to Consumer (B2C) sales grow less vigorously than Business to Business (B2B) sales and are likely to suffer another blow with the advent of Peer to Peer (P2P) computer networks. The latter allow PCs to act as servers and thus enable the swapping of computer files among connected users (with or without a central directory).
This is the underprivileged sector of the Internet. They all lose money (even e-tailers which offer basic, standardized goods - books, CDs - with the exception, until September 11, of sites connected to tourism). No one thanks them for content produced with a great investment of effort and money. A truly high-quality, fully commerce-enabled site costs up to 5,000,000 USD, excluding site maintenance and customer and visitor services. Content providers are constantly criticized for lack of creativity or for too much creativity. More and more is asked of them. They are exploited by intermediaries, hitchhikers and other parasites. This is all an off-shoot of the ethos of the Internet as a free content area.
More than 100 million men and women constantly access the Web - but this number stands to grow (the median prediction: 300 million). Yet, while the Web is used by 35% of those with access to the Internet - e-mail is used by more than 60%. E-mail is by far the most common function ("killer app") and specialized applications (Eudora, Internet Mail, Microsoft Exchange) - free or ad sponsored - keep it accessible to all and user-friendly.
Most of the users like to surf (browse, visit sites) the net without reason or goal in mind. This makes it difficult to apply traditional marketing techniques.
What is the meaning of "targeted audiences" or "market shares" in this context?
If a surfer visits sites which deal with aberrant sex and nuclear physics in the same session - what to make of it?
The public and legislative backlash against the gathering of surfers' data by Internet ad agencies and other web sites - has led to growing ignorance regarding the profile of Internet users, their demography, habits, preferences and dislikes.
People like the very act of surfing. They want to be entertained, then they use the Internet as a working tool, mostly in the service of their employer, who usually foots the bill. Users love free downloads (mainly software).
"Free" is a key word on the Internet: it used to belong to the US Government and to a bunch of universities. Users like information, with emphasis on news and data about new products. But they do not like to shop on the net - yet. Only 38% of all surfers made a purchase during 1998.
67% of them adore virtual sex. 50% of the sites most often visited are porn sites (this is reminiscent of the early days of the Video Cassette Recorder - VCR). People dedicate the same amount of time to watching video cassettes or television as they do to surfing the net. The Internet seems to cannibalize television.
Sex is followed by music, sports, health, television, computers, cinema, politics, pets and cooking sites. People are drawn to interactive games. The Internet will shortly enable people to gamble, if not hampered by legislation. 10 billion USD in gambling money are predicted to pass through the net. This makes sense: nothing like a computer to provide immediate (monetary and psychological) rewards.
Commerce on the net is another favourite. The Internet is a perfect medium for the sale of software and other digital products (e-books). The problem of data security is on its way to being solved with the SET (or other) world standard.
As early as 1995, the Internet had more than 100 virtual shopping malls visited by 2.5 million shoppers (and probably double this number in 1996).
The predictions for 1999 - between 1 and 5 billion USD of net shopping (plus 2 billion USD through on-line information providers, such as CompuServe and AOL) - proved woefully inaccurate. The actual number in 1998 was 7 times the prediction for 1999.
It is also widely believed that circa 20% of the family budget will pass through the Internet as e-money and this amounts to 150 billion USD.
The Internet will become a giant inter-bank clearing system and varied ATM type banking and investment services will be provided through it. Basically, everything can be done through the Internet: looking for a job, for instance.
Yet, the Internet will never replace human interaction. People are likely to prefer personal banking, window shopping and the social experience of the shopping mall to Internet banking and e-commerce, or m-commerce.
Some sites already sport classified ads. This is not a bad way to defray expenses, though most classified ads are free (it is the advertising they attract that matters).
Another developing trend is website rating and critique. It will be treated the way today's printed reviews are. It will have a limited influence on the consumption decisions of some users. Browsers already sport buttons labelled "What's New" and "What's Hot". Most search engines recommend specific sites. Users are cautious. Studies discovered that no user, no matter how heavy, has consistently re-visited more than 200 sites - a minuscule number. The 10 most popular web sites (Yahoo!, MSN, etc.) attracted more than 50% of all Internet traffic. Site recommendation services often produce random - at times, wrong - selections for their users. There are also concerns regarding privacy issues. The backlash against Amazon's "readers' circles" is an example.
Web Critics, who work today mainly for the printed press, will publish their wares on the net and will link to intelligent software which will hyperlink, recommend and refer. Some web critics will be identified with specific applications - really, expert systems which will incorporate their knowledge and experience.
Where will the capital needed to finance all these developments come from?
Again, there are two schools:
One says that sites will be financed through advertising - and so will search engines and other applications accessed by users.
Certain ASPs (Application Service Providers which rent out access to application software which resides on their servers) are considering this model.
The second version is simpler and allows for the existence of non-commercial content.
It proposes to collect negligible sums (cents or fractions of cents) from every user for every visit ("micro-payments") or a subscription fee. These accumulated cents or subscription fees will enable the owners of old sites to update and to maintain them and encourage entrepreneurs to develop new ones. Certain content aggregators (especially of digital textbooks) have adopted this model (Questia, Fathom).
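The mechanics of micro-payments are simple: accrue fractions of a cent per visit and settle only when the balance is worth the transaction cost. A sketch in Python, where the settlement threshold is my assumption and amounts are kept as integer thousandths of a cent to avoid floating-point drift:

```python
# Micro-payments: accrue tiny per-visit fees, settle in bulk.
from collections import defaultdict

ledger = defaultdict(int)          # site -> accrued millicents
SETTLE_AT = 100_000                # settle once a site is owed 1 USD

def record_visit(site: str, millicents: int) -> None:
    """Credit a site with a fraction of a cent for one visit."""
    ledger[site] += millicents

def settle(site: str) -> float:
    """Return the amount due in USD if over threshold, else 0."""
    if ledger[site] >= SETTLE_AT:
        due = ledger[site] / 100_000
        ledger[site] = 0
        return due
    return 0.0

for _ in range(1000):                  # a thousand visits...
    record_visit("example-site", 500)  # ...at half a cent each
print(settle("example-site"))          # -> 5.0 USD accrued
```

The whole scheme stands or falls on keeping the bookkeeping cost per visit far below the half-cent being collected.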
The adherents of the first school pointed at the 5 million USD invested in advertising during 1995 and to the 60 million or so invested during 1996.
Its opponents pointed at exactly the same numbers: ridiculously small when contrasted with more conventional advertising modes. The potential of advertising on the net is limited to 1.5 billion USD annually in 1998, thundered the pessimists (many thought that even half that would be very nice). The actual figure was double the prediction but still woefully small and inadequate to support the Internet's content development.
Compare these figures to the sale of Internet software ($4 billion), Internet hardware ($3 billion), Internet access provision ($4.2 billion) in 1995.
Hambrecht & Quist estimated that Internet related industries scooped up 23.2 billion USD annually (a report released in mid-1996).
And what follows advertising is hardly more encouraging.
The consumer interacts and the product is delivered to him. This - the delivery phase - is a slow and enervating epilogue to the exciting affair of ordering through the net at the speed of light. Too many consumers still complain that they do not receive what they ordered, or that delivery is late and products defective.
The solution may lie in the integration of advertising and content. Pointcast, for instance, integrated advertising into its news broadcasts, continuously streamed to the user's screen, even when inactive (they provided a downloadable active screen saver and ticker in a "push technology"). Downloading of digital music, video and text (e-books) will lead to immediate gratification of the consumer and will increase the efficacy of advertising.
Whatever the case may be, a uniform, agreed upon system of rating as a basis for charging advertisers, is sorely needed. There is also the question of what does the advertiser pay for?
Many advertisers (Procter and Gamble, for instance) refuse to pay according to the number of hits or impressions (=entries, visits to a site). They agree to pay only according to the number of times that their advertisement was actually viewed (page views).
This different basis for calculation is likely to upset all revenue scenarios.
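How different the two billing bases can be is easy to illustrate. The rates and traffic figures below are hypothetical, chosen only to show the gap between charging per impression and charging per ad view:

```python
# Two billing bases for the same campaign, on illustrative numbers.

site_impressions = 1_000_000     # visits where the ad was served
ad_views = 12_000                # times the ad itself was viewed

cpm_rate = 5.0                   # USD per 1,000 impressions
per_view_rate = 0.25             # USD per confirmed ad view

by_impressions = site_impressions / 1000 * cpm_rate
by_views = ad_views * per_view_rate

print(f"billed per impression: {by_impressions:,.0f} USD")
print(f"billed per ad view:    {by_views:,.0f} USD")
```

On these (made-up) numbers the publisher's revenue falls from 5,000 to 3,000 USD the moment the advertiser insists on the second basis - hence the upset revenue scenarios.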
Very few sites of important, respectable newspapers are on a subscription basis. Dow Jones (Wall Street Journal) and The Economist, to mention but two.
Will this become the prevailing trend?
The Internet as a Metaphor
Three metaphors come to mind when considering the Internet "philosophically".
The Internet as a Chaotic Library
1. The Problem of Cataloguing
The Internet is an assortment of billions of pages containing information. Some of them are visible and others are generated from hidden databases by users' requests ("Invisible Internet").
The Internet displays no discernible order, classification, or categorization. As opposed to "classical" libraries, no one has invented a cataloguing standard (remember Dewey?). This is so needed that it is amazing that it has not been invented yet. Some sites indeed apply the Dewey Decimal System (Suite101). Others default to a directory structure (Open Directory, Yahoo!, LookSmart and others).
Had such a standard existed (an agreed upon numerical cataloguing method) - each site would have self-classified. Sites would have an interest to do so to increase their penetration rates and their visibility. This, naturally, would have eliminated the need for today's clunky, incomplete and (highly) inefficient search engines.
A site whose number starts with 900 will be immediately identified as dealing with history, and multiple classification will be encouraged to allow finer cross-sections to emerge. An example of such an emerging technology of "self classification" and "self-publication" (though limited to scholarly resources) is the "Academic Resource Channel" by Scindex.
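The self-classification idea is mechanical enough to sketch: a site declares a Dewey-style number, and any catalogue can place it without crawling or guessing. The table below holds the real top-level Dewey classes; the site numbers fed to it are made up:

```python
# Placing a self-declared Dewey-style number into its top class.

DEWEY_TOP = {
    0: "Computer science, information & general works",
    1: "Philosophy & psychology",
    2: "Religion",
    3: "Social sciences",
    4: "Language",
    5: "Science",
    6: "Technology",
    7: "Arts & recreation",
    8: "Literature",
    9: "History & geography",
}

def classify(site_number: int) -> str:
    """Map a three-digit self-declared number to its top-level class."""
    return DEWEY_TOP[site_number // 100]

print(classify(940))   # a site numbered 940 files itself under history
```

Finer cross-sections fall out of the same scheme: the second and third digits narrow the class, and a site carrying several numbers appears in several drawers of the catalogue.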
Users will not be required to remember reams of numbers. Future browsers will be akin to catalogues, very much like the applications used in modern day libraries. Compare this utopia to the current dystopia. Users struggle with reams of irrelevant material to finally reach a partial and disappointing destination. At the same time, there likely are web sites which exactly match the poor user's needs. Yet, what currently determines the chances of a happy encounter between user and content - are the whims of the specific search engine used and things like meta-tags, headlines, a fee paid, or the right opening sentences.
2. Screen versus Page
The computer screen, because of physical limitations (size, the fact that it has to be scrolled), fails to effectively compete with the printed page. The latter is still the most ingenious medium yet invented for the storage and release of textual information. Granted: a computer screen is better at highlighting discrete units of information. So, this draws the battle lines: structures (printed pages) versus units (screen), the continuous and easily reversible versus the discrete.
The solution is an efficient way to translate computer screens to printed matter. It is hard to believe, but no such thing exists. Computer screens are still hostile to off-line printing. In other words: if a user copies information from the Internet to his Word Processor (or vice versa, for that matter) - he ends up with a fragmented, garbage-filled and non-aesthetic document.
Very few site developers try to do something about it - even fewer succeed.
3. The Internet and the CD-ROM
One of the biggest mistakes of content suppliers is that they do not mix contents or have a "static-dynamic interaction".
The Internet can now easily interact with other media (especially with audio CDs and with CD-ROMs) - even as the user surfs.
A shopping catalogue can be distributed on a CD-ROM by mail. The Internet Site will allow the user to order a product previously selected from the catalogue, while off-line. The catalogue could also be updated through the site (as is done with CD-ROM encyclopedias).
The advantages of the CD-ROM are clear: very fast access time (dozens of times faster than the access to a site using a dial up connection) and a data storage capacity tens of times bigger than the average website.
Another example: a CD-ROM can be distributed, containing hundreds of advertisements. The consumer will select the ad that he wants to see and will connect to the Internet to view a relevant video.
He could then also have an interactive chat (or a conference) with a salesperson, receive information about the company, about the ad, about the advertising agency which created the ad - and so on.
CD-ROM based encyclopedias (such as the Britannica, Encarta, Grolier) already contain hyperlinks which carry the user to sites selected by an Editorial Board.
But CD-ROMs are probably a doomed medium. This industry chose to emphasize the wrong things. Storage capacity increased exponentially and, within a year, desktops with 80 Gb hard disks will be common. Moreover, the Network Computer - the stripped down version of the personal computer - will put at the disposal of the average user terabytes in storage capacity and the processing power of a supercomputer. What separates computer users from this utopia is the communication bandwidth. With the introduction of radio, satellite and ADSL broadband services, cable modems and compression methods - video (on demand), audio and data will be available speedily and plentifully.
The CD-ROM, on the other hand, is not mobile. It requires installation and the utilization of sophisticated hardware and software. This is no user friendly push technology. It is nerd-oriented. As a result, CD-ROMs are not an immediate medium. There is a long time lapse between the moment they are purchased and the moment the first data become accessible to the user. Compare this to a book or a magazine. Data in these oldest of media is instantly available to the user and allows for easy and accurate "back" and "forward" functions.
Perhaps the biggest mistake of CD-ROM manufacturers has been their inability to offer an integrated hardware and software package. CD-ROMs are not compact. A Walkman is a compact hardware-cum-software package. It is easily transportable, it is thin, it contains numerous, user-friendly, sophisticated functions, it provides immediate access to data. So does the discman or the MP3-man. This cannot be said of the CD-ROM. By tying its future to the obsolete concept of stand-alone, expensive, inefficient and technologically unreliable personal computers - CD-ROMs have sentenced themselves to oblivion (with the possible exception of reference material).
4. On-line Reference Libraries
These already exist. A visit to the on-line Encyclopaedia Britannica exemplifies some of the tremendous, mind boggling possibilities:
Each entry is hyperlinked to sites on the Internet which deal with the same subject matter. The sites are carefully screened (though more detailed descriptions of each site should be available - they could be prepared either by the staff of the encyclopaedia or by the site owner). Links are available to data in various forms, including audio and video. Everything can be copied to the hard disk or to CD-ROMs.
This is a new conception of a knowledge centre - not just an assortment of material. It is modular, can be added to and subtracted from. It can be linked to a voice Q&A centre. Queries by subscribers can be answered by e-mail or fax, posted on the site, or sent as hard copies by post. This "Trivial Pursuit" service could be very popular - there is considerable appetite for "Just in Time Information". The Library of Congress - together with a few other libraries - is in the process of making just such a service available to the public (CDRS - Collaborative Digital Reference Service).
5. The Feedback Option
Hard to believe, but very few sites encourage their guests to express an opinion about the site, its contents and its aesthetics. This indicates an ossified mode of thinking about the most dynamic mass medium ever created, the only interactive mass medium yet. Each site must absolutely contain feedback and rating questionnaires. These have the side benefit of creating a database of the visitors to the site.
Moreover, each site can easily become a "knowledge centre".
Let us consider a site dedicated to advertising and marketing:
It can contain feedback questionnaires (what do you think about the site, suggestions for improvement, mailto and leave message facilities, etc.).
It can contain rating questionnaires (rate these ads, these TV or radio shows, these advertising campaigns).
It can allocate some space to clients to create their home pages in (these home pages could lead to their sites, to other sites, to other sections of the host site - and, in any case, will serve as a display of the creative talent of the site owners). This will give the site owners a picture of the distribution of the areas of interest of the visitors to the site.
The site can include statistical, tracking and counter software.
Such a site can refer to hundreds of useful shareware applications (which deal with different aspects of advertising and marketing, for instance). Developers of applications will be able to use the site to promote their products. Other practical applications could also be referred to from - or reside on - the site (browsers, games, search engines).
And all this can be organized in a portal structure (for instance, by adopting the open software of the Open Directory Project).
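Tallying the rating questionnaires described above is straightforward to sketch. A minimal Python example - the advertisement identifiers and scores are invented for illustration, not taken from any real site:

```python
from collections import defaultdict

def aggregate_ratings(responses):
    """Average the scores submitted for each advertisement.

    responses is a list of (ad_id, score) pairs, one per visitor answer.
    """
    totals = defaultdict(lambda: [0, 0])  # ad_id -> [sum of scores, count]
    for ad, score in responses:
        totals[ad][0] += score
        totals[ad][1] += 1
    return {ad: s / n for ad, (s, n) in totals.items()}

# Hypothetical visitor responses to a "rate these ads" questionnaire
responses = [("tv-spot-1", 4), ("tv-spot-1", 5), ("print-ad-7", 2)]
print(aggregate_ratings(responses))  # → {'tv-spot-1': 4.5, 'print-ad-7': 2.0}
```

The same accumulation, run against the site's visitor database, yields the rating tables a "knowledge centre" would publish.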
6. Internet Derived CD-ROMs
The Internet is an enormous reservoir of freely available, public-domain information.
With a minimal investment, this information can be gathered into coherent, theme oriented, cheap CD-ROMs. Each such CD-ROM can contain:
Addresses of web sites specific to the subject matter:
The Internet is the world's largest "publisher", by far. It "publishes" FAQs (Frequently Asked Questions regarding almost every technical matter in the world), e-zines (electronic versions of magazines, not a very profitable pursuit), the electronic versions of dailies (together with on-line news and information services), reference and other e-books, monographs, articles and minutes of discussions ("threads"), among other types of material.
Publishing an e-zine has a few advantages: it promotes the sales of the printed edition, it helps to sign on subscribers and it leads to the sale of advertising space. The electronic archive function (see next section) saves the need to file back issues, the space required to do so and the irritating search for data items.
The future trend is a combined subscription: electronic (mainly for the archival value and the ability to hyperlink to additional information) and printed (easier for browsing the current issue).
The electronic daily presents other advantages:
It allows for immediate feedback and for flowing, almost real-time, communication between writers and readers. The electronic version, therefore, acquires a gyroscopic function: a navigation instrument, always indicating deviations from the "right" course. The content can be instantly updated and immediacy has its premium (remember the Lewinsky affair?).
Strangely, this (conventional) field was the first to develop a "virtual reality" facet. There are virtual "magazine stalls". They look exactly like the real thing and the user can buy a paper using his mouse.
Specialty hand held devices already allow for downloading and storage of vast quantities of data (up to 4000 print pages). The user gains access to libraries containing hundreds of texts, adapted to be downloaded, stored and read by the specific device. Again, a convergence of standards is to be expected in this field as well (the final contenders will probably be Adobe's PDF against Microsoft's MS-Reader).
Broadly, e-books are treated either as:
A continuation of print books (p-books) by other means; or
A whole new publishing universe.
Since p-books are a more convenient medium than e-books - they will prevail in any straightforward "medium replacement" or "medium displacement" battle.
In other words, if publishers persist in the simple and straightforward conversion of p-books to e-books - then e-books are doomed. They simply cannot match the price, comfort, tactile delights, browsability and scannability of p-books.
But e-books - being digital - open up a vista of hitherto neglected possibilities. These will only be enhanced and enriched by the introduction of e-paper and e-ink. Among them:
The technology is still not fully there. Wars rage in both the wireless and the e-book realms. Platforms compete. Standards clash. Gurus debate. But convergence is inevitable and with it the e-book of the future.
8. The Archive Function
The Internet is also the world's biggest cemetery: tens of thousands of deadbeat sites, still accessible - the "Ghost Sites" of this electronic frontier.
This, in a way, is collective memory. One of the Internet's main functions will be to preserve and transfer knowledge through time. It is called "memory" in biology - and "archive" in library science. The history of the Internet is being documented by search engines (Google) and specialized services (Alexa) alike.
The Internet as a Collective Brain
Drawing a comparison from the development of a human baby - the human race has just commenced to develop its neural system.
The Internet fulfils all the functions of the Nervous System in the body and is, both functionally and structurally, pretty similar. It is decentralized and redundant (each part can serve as a functional backup in case of malfunction). It hosts information which is accessible in several ways, it contains a memory function, and it is multimodal (multimedia - textual, visual, audio and animation).
I believe that the comparison is not superficial and that studying the functions of the brain (from infancy to adulthood) - amounts to perusing the future of the Net itself.
1. The Collective Computer
To carry the metaphor of "a collective brain" further, we would expect the processing of information to take place in the Internet, rather than inside the end-user's hardware (the same way that information is processed in the brain, not in the eyes). Desktops will receive the results and communicate with the Net to receive additional clarifications and instructions and to convey information gathered from their environment (mostly, from the user).
This is part of the philosophy of the Java programming language. It deals with applets - small bits of software - and links different computer platforms by means of software.
Future servers will contain not only information (as they do today) - but also software applications. The user of an application will not be forced to buy it. He will not be driven into hardware-related expenditures to accommodate the ever growing size of applications. He will not find himself wasting his scarce memory and computing resources on passive storage.

Instead, he will use a browser to call a central computer. This computer will contain the needed software, broken into its elements (=applets, small applications). Anytime the user wishes to use one of the functions of the application, he will siphon it off the central computer. When finished - he will "return" it. Processing speeds and response times will be such that the user will not feel at all that he is not working with his own software (the question of ownership will be very blurred in such a world).

This technology is available and it has provoked a heated debate about the future shape of the computing industry as a whole (desktops - really power packs - or network computers, little more than dumb terminals). Applications are already offered to corporate users by ASPs (Application Service Providers).
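The borrow-an-applet model described above can be caricatured in a few lines of Python. The "central server" here is just a dictionary of functions; all names are illustrative, not a real protocol:

```python
# A toy model of the "collective computer": applications live on a
# central server and the client merely borrows the functions it needs.
CENTRAL_SERVER = {
    "spellcheck": lambda text: text.replace("teh", "the"),
    "word_count": lambda text: len(text.split()),
}

def run_remote(app_name, *args):
    """'Siphon off' an applet from the central computer, run it,
    and 'return' it (here, simply by not keeping a reference)."""
    applet = CENTRAL_SERVER[app_name]  # borrow the applet
    return applet(*args)               # process, then implicitly return it

print(run_remote("word_count", "the quick brown fox"))  # → 4
```

In the essay's scenario the dictionary lookup would be a network call, and the desktop would hold neither the application nor a licence to it - only the transient result.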
In the last few years, scientists have harnessed the combined power of the computers linked to the internet at any given moment to perform astounding feats of distributed parallel processing. Millions of PCs connected to the net co-process signals from outer space, crunch meteorological data and solve complex equations. This is a prime example of a collective brain in action.
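The chunk-process-merge pattern behind such projects can be sketched as follows. Threads stand in for the millions of volunteer PCs, and the "signal analysis" is a toy sum of squares - both are stand-ins, not any real project's code:

```python
from concurrent.futures import ThreadPoolExecutor

def analyse_chunk(chunk):
    """Stand-in for the analysis each volunteer PC performs on its work unit."""
    return sum(x * x for x in chunk)

def distribute(data, workers=4):
    """Split a large job into work units, farm them out, and merge the
    results - the pattern distributed-computing projects use over the net."""
    chunks = [data[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(analyse_chunk, chunks))

print(distribute(list(range(1000))))  # equals the sum of squares 0..999
```

The real systems add what this sketch omits: volunteers come and go, so the coordinator must reissue lost work units and cross-check results from independent machines.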
2. The Intranet - a Logical Extension of the Collective Computer
LANs (Local Area Networks) are no longer a rarity in corporate offices. WANs (Wide Area Networks) are used to connect geographically dispersed organs of the same legal entity (branches of a bank, daughter companies, a sales force). Many LANs are wireless.
The intranet / extranet and wireless LANs will be the winners. They will gradually eliminate both fixed line LANs and WANs. The Internet offers equal, platform-independent, location-independent and time-of-day-independent access to all the members of an organization. Sophisticated firewall security applications protect the privacy and confidentiality of the intranet from all but the most determined and savvy hackers.
The Intranet is an intra-organizational communication network, constructed on the platform of the Internet, which enjoys all its advantages. The extranet is open to clients and suppliers as well.
The company's server can be accessed by anyone authorized, from anywhere, at any time (at local - rather than international - communication costs). The user can leave messages (internal e-mail or v-mail), access information - proprietary or public - and participate in "virtual teamwork" (see next chapter).
By the year 2002, a standard intranet interface will emerge. This will be facilitated by the opening up of the TCP/IP communication architecture and its availability to PCs. A billion USD will go just to finance intranet servers - or, at least, this is the median forecast.
The development of measures to safeguard server routed inter-organizational communication (firewalls) is the solution to one of two obstacles to the institution of the Intranet. The second problem is the limited bandwidth which does not permit the efficient transfer of audio (not to mention video).
It is difficult to conduct video conferencing through the Internet. Even the voices of discussants who use internet phones come out (slightly) distorted.
All this did not prevent 95% of the Fortune 1000 from installing intranets. 82% of the rest intend to install one by the end of this year. Medium to large American firms have 50-100 intranet terminals for every internet one.
At the end of 1997, there were 10 web servers for every other type of server in organizations. The sale of intranet related software was projected to multiply by 16 (to 8 billion USD) by the year 1999.
One of the greatest advantages of the intranet is the ability to transfer documents between the various parts of an organization. Consider Visa: it pushed 2 million documents per day internally in 1996.
An organization equipped with an intranet can (while protected by firewalls) give its clients or suppliers access to non-classified correspondence. This notion has its charm. Consider a newspaper: it can give access to all the materials which were discarded by the editors. Some news items are fit to print - yet are discarded because of space limitations. Still, someone is bound to be interested. It costs the newspaper close to nothing (the material is, normally, already computer-resident) - and it might even generate added circulation and income. It can even be conceived as an "underground, non-commercial, alternative" newspaper for a wholly different readership.
The above is but one example of the possible use of the intranet to communicate with the organization's consumer base.
3. Mail and Chat
The Internet (its e-mail possibilities) is eroding traditional mail. The market share of the post office in conveying messages by regular mail has dwindled from 77% to 62% (1995). E-mail has expanded to capture 36% (up from 19%).
90% of customers with on-line access use e-mail from time to time and 60% work with it regularly. More than 2 billion messages traverse the internet daily.
E-mail applications are available as freeware and are included in all browsers. Thus, the Internet has completely assimilated what used to be a separate service, to the extent that many people make the mistake of thinking that e-mail is a feature of the Internet. Microsoft continues to incorporate previously independent applications in its browsers - a behaviour which led to the 1999 anti-trust lawsuit against it.
The internet will do to phone calls what it has done to mail. Already there are applications (Intel's, Vocaltec's, Net2Phone) which enable the user to conduct a phone conversation through his computer. The voice quality has improved. The discussants can cut into each other's words, argue and listen to tonal nuances. Today, the parties (two or more) engaging in the conversation must possess the same software and the same (computer) hardware. In the very near future, computer-to-regular-phone applications will eliminate this requirement. And, again, simultaneous multi-modality: the user can talk over the phone, see his party, send e-mail, receive messages and transfer documents - without obstructing the flow of the conversation.
The cost of transferring voice will become so negligible that free voice traffic is conceivable in 3-5 years. Data traffic will overtake voice traffic by a wide margin.
This beats regular phones.
The next phase will probably involve virtual reality. Each of the parties will be represented by an "avatar", a 3-D figurine generated by the application (or the user's likeness mapped into the software and superimposed on the avatar). These figurines will be multi-dimensional: they will possess their own communication patterns, special habits, history, preferences - in short: their own "personality".
Thus, they will be able to maintain an "identity" and a consistent pattern of communication which they will develop over time.
Such a figure could host a site, accept, welcome and guide visitors, all the time bearing their preferences in its electronic "mind". It could narrate the news, like "Ananova" does. Visiting sites in the future is bound to be a much more pleasant affair.
In 1996, the four corporate giants (Visa, MasterCard, Netscape and Microsoft) agreed on a standard for effecting secure payments through the Internet: SET. Internet commerce is supposed to mushroom by a factor of 50 to 25 billion USD. Site owners will be able to collect rent from passing visitors - or fees for services provided within the site. Amazon instituted an honour system to collect donations from visitors. Dedicated visitors will not be deterred by such trifles.
5. The Virtual Organization
The Internet allows simultaneous communication between an almost unlimited number of users. This is coupled with the efficient transfer of multimedia (video included) files.
This opens up a vista of mind boggling opportunities which are the real core of the Internet revolution: the virtual collaborative ("Follow the Sun") modes.
A group of musicians will be able to compose music or play it - while spatially and temporally separated;
Advertising agencies will be able to co-produce ad campaigns in a real time interactive mode;
Cinema and TV films will be produced from disparate geographical spots through the teamwork of people who never meet, except through the net.
These examples illustrate the concept of the "virtual community". Locations in space and time will no longer hinder a collaboration in a team: be it scientific, artistic, cultural, or for the provision of services (a virtual law firm or accounting office, a virtual consultancy network).
Two ongoing developments are the virtual mall and the virtual catalogue.
There are well over 300 active virtual malls on the Internet. In 1998, 32.5 million shoppers frequented them for goods and services. The intranet can also be thought of as a "virtual organization", or a "virtual business".
The virtual mall is a computer "space" (pages) on the internet, wherein "shops" are located. These shops offer their wares using visual, audio and textual means. The visitor passes through a gate into the store and looks through its offerings until he reaches a buying decision. Then he engages in a feedback process: he pays (with a credit card), buys the product and waits for it to arrive by mail. The manufacturers of digital products (intellectual property such as e-books or software) have begun selling their merchandise on-line, as file downloads.
Yet, slow communications and limited bandwidth constrain the growth potential of this mode of sale. Once these are solved - intellectual property will be sold directly from the net, on-line. Until such time, the intervention of the Post Office is still required. So, the virtual mall is nothing but a glorified computerized mail catalogue or Buying Channel, the only difference being the exceptionally varied inventory.
Websites which started as "specialty stores" are fast transforming themselves into multi-purpose virtual malls. Amazon.com, for instance, has bought into a virtual pharmacy and into other virtual businesses. It is now selling music, video, electronics and many other products. It started as a bookstore.
This contrasts with a much more creative idea: the virtual catalogue. It is a form of narrowcasting (as opposed to broadcasting): a surgically accurate targeting of potential consumer audiences. Each group of profiled consumers (no matter how small) is fitted with their own - digitally generated - catalogue. This is updated daily: the variety of wares on offer (adjusted to reflect inventory levels, consumer preferences and goods in transit) - and prices (sales, discounts, package deals) change in real time.
The user will enter the site and there delineate his consumption profile and his preferences. A customized catalogue will be immediately generated for him.
From then on, the history of his purchases, preferences and responses to feedback questionnaires will be accumulated and added to a database.
Each catalogue generated for him will come replete with order forms. Once the user concluded his purchases, his profile will be updated.
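The profile-to-catalogue loop described above - filter the inventory by stated interests, then fold each concluded purchase back into the profile - can be sketched like this. The inventory, categories and prices are all invented:

```python
def generate_catalogue(products, profile):
    """Cut the full inventory down to the visitor's stated interests,
    cheapest first - a toy version of the per-profile catalogue."""
    matches = [p for p in products if p["category"] in profile["interests"]]
    return sorted(matches, key=lambda p: p["price"])

def record_purchase(profile, product):
    """Update the stored profile after each concluded purchase."""
    profile["history"].append(product["name"])
    profile["interests"].add(product["category"])

# Hypothetical inventory and visitor profile
products = [
    {"name": "jazz CD", "category": "music", "price": 15},
    {"name": "thriller", "category": "books", "price": 8},
    {"name": "opera CD", "category": "music", "price": 22},
]
profile = {"interests": {"music"}, "history": []}
print([p["name"] for p in generate_catalogue(products, profile)])
# → ['jazz CD', 'opera CD']
```

Run daily against current inventory levels and prices, this is the "updated daily" catalogue of the text; the growing purchase history is the database the retailer accumulates.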
There are no technological obstacles to implementing this vision today - only administrative and legal ones. Big retail stores are not up to processing the flood of data expected to arrive. They also remain highly sceptical regarding the feasibility of the new medium. And privacy concerns hamper data mining and the effective collection and usage of personal data.
The virtual catalogue is a private case of a new internet off-shoot: the "smart (shopping) agents". These are AI applications with "long memories".
They draw detailed profiles of consumers and users and then suggest purchases and refer to the appropriate sites, catalogues, or virtual malls.
They also provide price comparisons and the new generation (NetBot) cannot be blocked or fooled by using differing product categories.
In the future, these agents will also refer to real life retail chains and issue a map of the branch or store closest to an address specified by the user (the default being his residence). This technology can be seen in action in a few music sites on the web and is likely to be dominant with wireless internet appliances. The owner of an internet-enabled (third generation) mobile phone is likely to be the target of geographically-specific marketing campaigns, ads and special offers pertaining to his current location (as reported by his satellite GPS - Global Positioning System - receiver).
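At its simplest, the price-comparing agent described above reduces to normalizing product labels - so that differing labels cannot hide an offer - and taking the cheapest match. A sketch with invented shops and listings:

```python
def best_offer(listings, product):
    """Compare prices for the same product across several shops.

    Labels are normalized so a shop cannot dodge comparison by
    varying capitalization or whitespace (a crude version of the
    category-matching the text attributes to newer agents).
    """
    def normalise(name):
        return name.lower().strip()

    offers = [l for l in listings if normalise(l["product"]) == normalise(product)]
    return min(offers, key=lambda l: l["price"]) if offers else None

# Hypothetical listings gathered from three virtual shops
listings = [
    {"shop": "mall-a", "product": "Blank CD ", "price": 1.20},
    {"shop": "mall-b", "product": "blank cd", "price": 0.95},
    {"shop": "mall-c", "product": "Blank DVD", "price": 2.10},
]
print(best_offer(listings, "Blank CD")["shop"])  # → mall-b
```

A production agent would, of course, match on far richer product descriptions and keep the "long memory" of user profiles the essay mentions; the comparison-and-minimum core stays the same.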
6. Internet News
Internet news enjoys clear advantages. It can be updated frequently and dynamically (unlike static print news) and it is always accessible (like print news), immediate and fresh.
The future will witness a form of interactive news. A special "corner" in the site will be open to updates posted by the public (the equivalent of press releases). This will provide readers with a glimpse into the making of the news, the raw material news is made of. The same technology will be applied to interactive TVs. Content will be downloaded from the internet and displayed as an overlay on the TV screen or in a square in a special location. The contents downloaded will be directly connected to the TV programming. Thus, the biography and track record of a football player will be displayed during a football match and the history of a country when it receives news coverage.
Terra Internetica - Internet, an Unknown Continent
This is an unconventional way to look at the Internet. Laymen and experts alike talk about "sites" and "advertising space". Yet, the Internet was never compared to a new continent whose surface is infinite.
The Internet will have its own real estate developers and construction companies. The real life equivalents derive their profits from the scarcity of the resource that they exploit - the Internet counterparts will derive their profits from the tenants (the content).
A few companies bought "Internet Space" (pages, domain names, portals), developed it and make commercial use of it by:
Internet Space can be easily purchased or created. The investment is low and getting lower with the introduction of competition in the field of domain registration services and the increase in the number of top domains.
Then, infrastructure can be erected - for a shopping mall, for free home pages, for a portal, or for another purpose. It is precisely this infrastructure that the developer can later sell, lease, franchise, or rent out.
At the beginning, only members of the fringes and the avant-garde (inventors, risk assuming entrepreneurs, gamblers) invest in a new invention. The invention of a new communications technology is mostly accompanied by devastating silence.
No one can say what the optimal uses of the invention are (in other words, what its future is). Many - mostly members of the scientific and business elites - argue that there is no real need for the invention and that it substitutes a new and untried way for old and tried modes of doing the same things (so why assume the risk?).
These criticisms are usually well-founded:
To start with, there is, indeed, no need for the new medium. A new medium invents itself - and the need for it. It also generates its own market to satisfy this newly found need.
Two prime examples are the personal computer and the compact disc.
When the PC was invented, its uses were completely unclear. Its performance was lacking, its abilities limited, it was horribly user unfriendly.
It suffered from faulty design, absent user comfort and ease of use and required considerable professional knowledge to operate. The worst part was that this knowledge was unique to the new invention (not portable).
It reduced labour mobility and limited one's professional horizons. There were many gripes among those assigned to tame the new beast.
The PC was thought of, at the beginning, as a sophisticated gaming machine, an electronic baby-sitter. As the presence of a keyboard was detected and as the professional horizon cleared, it was thought of in terms of a glorified typewriter or spreadsheet. It was used mainly as a word processor (and its existence was justified solely on these grounds). The spreadsheet was the first real application and it demonstrated the advantages inherent in this new machine (mainly flexibility and speed). Still, it was more (speed) of the same - a quicker ruler, or pen and paper. What was the difference between this and a hand-held calculator (some of which already had computing, memory and programming features)?
The PC was recognized as a medium only 30 years after it was invented, with the introduction of multimedia software. All this time, the computer continued to spin off markets and secondary markets, needs and professional specialities. The talk, as always, centred on how to improve on existing markets and solutions.
The Internet is the computer's first important breakthrough. Hitherto the computer was only quantitatively different - the multimedia and the Internet have made it qualitatively superior, actually, sui generis, unique.
This, precisely, is the ghost haunting the Internet:
It has been invented, is maintained and is operated by computer professionals. For decades these people have been conditioned to think in Olympic terms: more, stronger, higher. Not: new, unprecedented, non-existent. To improve - not to invent. They stumbled across the Internet - it invented itself despite its own creators.
Computer professionals (hardware and software experts alike) - are linear thinkers. The Internet is non linear and modular.
It is still the age of hackers. There is still a lot to be done in improving technological prowess and powers. But their control of the contents is waning and they are being gradually replaced by communicators, creative people, advertising executives, psychologists and the totally unpredictable masses who flock to flaunt their home pages.
These all are attuned to the user, his mental needs and his information and entertainment preferences.
The compact disc is a different tale. It was intentionally invented to improve upon an existing technology (basically, Edison's Gramophone). Market-wise, this was a major gamble: the improvement was, at first, debatable (many said that the sound quality of the first generation of compact discs was inferior to that of its contemporaneous record players). Consumers had to be convinced to change both software and hardware and to dish out thousands of dollars just to listen to what the manufacturers claimed was better quality Bach. A better argument was the longer life of the software (though contrasted with the limited life expectancy of the consumer, some of the first sales pitches sounded absolutely morbid).
The computer suffered from unclear positioning. The compact disc was very clear as to its main functions - but had a rough time convincing the consumers.
Every medium is first controlled by the technical people. Gutenberg was a printer - not a publisher. Yet, he is the world's most famous publisher. The technical cadre is joined by dubious or small-scale entrepreneurs and, together, they establish ventures with no clear vision, market-oriented thinking, or orderly plan of action. The legislator is also dumbfounded and does not grasp what is happening - thus, there is no legislation to regulate the use of the medium. Witness the initial confusion concerning copyrighted software and the copyrights of ROM embedded software. Abuse or under-utilization of resources grow. Recall the sale of radio frequencies to the first cellular phone operators in the West - a situation which repeats itself in Eastern and Central Europe nowadays.
But then more complex transactions - exactly as in real estate in "real life" - begin to emerge.
This distinction is important. While in real life it is possible to sell an undeveloped plot of land - no one will buy "pages". The supply of these is unlimited - their scarcity (and, therefore, their virtual price) is zero.
The second example involves the utilization of a site - rather than its mere availability.
A developer could open a site wherein first-time authors will be able to publish their first manuscript - for a fee. Evidently, such a fee would be a fraction of what it takes to publish a "real life" book. The author could collect money for every download of his book - and split it with the site developer. The potential buyers will be provided with access to the contents and to a sample chapter of each book. This is currently being done by a few fledgling firms but a full scale publishing industry has not yet developed.
The Life of a Medium
The internet is simply the latest in a series of networks which revolutionized our lives. A century before the internet, the telegraph, the railways, the radio and the telephone were similarly heralded as "global" and transforming.
Every medium of communications goes through the same evolutionary cycle:
The Public Phase
At this stage, the medium and the resources attached to it are very cheap, accessible, under no regulatory constraints. The public sector steps in: higher education institutions, religious institutions, government, not for profit organizations, non governmental organizations (NGOs), trade unions, etc. Bedevilled by limited financial resources, they regard the new medium as a cost effective way of disseminating their messages.
The Internet was not exempt from this phase which ended only a few years ago. It started with a complete computer anarchy manifested in ad hoc networks, local networks, networks of organizations (mainly universities and organs of the government such as DARPA, a part of the defence establishment, in the USA). Non commercial entities jumped on the bandwagon and started sewing these networks together (an activity fully subsidized by government funds). The result was a globe encompassing network of academic institutions. The American Pentagon established the network of all networks, the ARPANET. Other government departments joined the fray, headed by the National Science Foundation (NSF) which withdrew only lately from the Internet.
The Internet (with a different name) became semi-public property - with access granted to the chosen few.
Radio took precisely this course. Radio transmissions started in the USA in 1920. Those were anarchic broadcasts with no discernible regularity. Non commercial organizations and not for profit organizations began their own broadcasts and even created radio broadcasting infrastructure (albeit of the cheap and local kind) dedicated to their audiences. Trade unions, certain educational institutions and religious groups commenced "public radio" broadcasts.
The Commercial Phase
When the users (e.g., listeners in the case of the radio, or owners of PCs and modems in the example of the Internet) reach a critical mass - the business sector is alerted. In the name of capitalist ideology (another religion, really) it demands the "privatization" of the medium. This harps on very sensitive strings in every Western soul: the efficient allocation of resources which results from competition; the corruption and inefficiency naturally associated with the public sector ("Other People's Money" - OPM); the ulterior motives of members of the ruling political echelons (the infamous American Paranoia); a lack of variety and of catering to the tastes and interests of certain audiences; the equation private enterprise = democracy; and more.
The end result is the same: the private sector takes over the medium from "below" (makes offers to the owners or operators of the medium - that they cannot possibly refuse) - or from "above" (successful lobbying in the corridors of power leads to the appropriate legislation and the medium is "privatized").
Every privatization - especially that of a medium - provokes public opposition. There are (usually well-founded) suspicions that the interests of the public have been compromised and sacrificed on the altar of commercialization and ratings. Fears of the monopolization and cartelization of the medium are evoked - and justified, in due time. Alternatively, there is fear of the concentration of control of the medium in a few hands. All these things do happen - but the pace is so slow that the initial fears are forgotten and public attention reverts to fresher issues.
A new Communications Act was legislated in the USA in 1934. It was meant to transform radio frequencies into a national resource to be sold to the private sector, which would use it to transmit radio signals to receivers. In other words: radio was passed into private and commercial hands. Public radio was doomed to marginalization.
The American administration withdrew from its last major involvement in the Internet in April 1995, when the NSF ceased to finance some of the networks and, thus, privatized its hitherto heavy involvement in the net.
A new Telecommunications Act was legislated in 1996. It permitted "organized anarchy". It allowed media operators to invade each other's territories.
Phone companies will be allowed to transmit video and cable companies will be allowed to transmit telephony, for instance. This is all phased over a long period of time - still, it is a revolution whose magnitude is difficult to gauge and whose consequences defy imagination. It carries an equally momentous price tag - official censorship. "Voluntary censorship", to be sure, with somewhat toothless standardization and enforcement authorities, to be sure - still, censorship with its own institutions to boot. The private sector reacted by threatening litigation - but, beneath the surface, it is caving in to pressure and temptation, constructing its own censorship codes both in the cable and in the Internet media.
This phase is the next in the Internet's history - though the Internet itself, it seems, is not yet aware of it.
It is characterized by enhanced legislative activity. Legislators, on all levels, discover the medium and lunge at it passionately. Resources which were considered "free" are suddenly transformed into "national treasures, not to be dispensed with cheaply, casually, or frivolously".
It is conceivable that certain parts of the Internet will be "nationalized" (for instance, in the form of a licensing requirement) and tendered to the private sector. Legislation will be enacted which will deal with permitted and disallowed content (obscenity? incitement? racial or gender bias?).
No medium in the USA (not to mention the wider world) has escaped such legislation. There are sure to be demands to allocate time (or space, or software, or content, or hardware) to "minorities", to "public affairs", to "community business". This is a tax that the business sector will have to pay in order to fend off the eager legislator and his nuisance value.
All this is bound to lead to the monopolization of hosts and servers. The important broadcast channels will diminish in number and be subjected to severe content restrictions. Sites which do not succumb to these requirements will be deleted or neutralized. Content guidelines (a euphemism for censorship) exist, even as we write, in all the major content providers (CompuServe, AOL, Geocities, Tripod, Prodigy).
This is the phase of consolidation. The number of players is severely reduced. The number of browser types will be limited to 2-3 (Netscape, Microsoft - and who else?). Networks will merge to form privately owned mega-networks. Servers will merge to form hyper-servers run on supercomputers in "server farms". The number of ISPs will be considerably cut.
50 companies ruled the greater part of the media markets in the USA in 1983. The number in 1995 was 18. At the end of the century they will number 6.
This is the stage when companies - fighting for financial survival - strive to acquire as many users/listeners/viewers as possible. Programming is dumbed down to the lowest (and widest) common denominator. Shallow programming dominates as long as the bloodbath proceeds.
From Rags to Riches
Tough competition produces four processes:
1. A Major Drop in Hardware Prices
This happens in every medium but it doubly applies to a computer-dependent medium, such as the Internet.
Computer technology seems to abide by "Moore's Law", which states that the number of transistors that can be put on a chip doubles every 18 months. As a result of this miniaturization, computing power doubles every 18 months and an exponential series ensues. Organic-biological (DNA) computers, quantum computers, chaos computers - prompted by vast profits and spawned by inventive genius - will ensure the longevity and continued applicability of Moore's Law.
The Internet is also subject to "Metcalfe's Law".
It states that when we connect N computers to a network, its utility and processing potential grow in proportion to N to the second power. And these N computers become more powerful every year, in accordance with Moore's Law.
The growth of computing power in networks is a multiple of the effects of the two laws. More and more computers, each with ever-increasing computing power, get connected, and the network's aggregate potential grows at a compounded exponential rate: when both the number of nodes and the power of each node double, the network's potential grows eightfold.
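The compounded growth described above can be made concrete with a toy calculation. The sketch below is purely illustrative: the helper `network_value`, the node counts and the doubling-per-period assumptions are invented for this example, not part of the essay or of either law's formal statement.

```python
# Toy model (hypothetical figures): compound a Moore-style doubling of
# per-node power with a Metcalfe-style N-squared value term.

def network_value(n_nodes: int, periods: int, power0: float = 1.0) -> float:
    """Value after `periods` 18-month intervals, assuming the node count
    and the per-node computing power both double each period, and the
    network's value scales as power * N^2."""
    n = n_nodes * (2 ** periods)       # assumed: number of nodes doubles each period
    power = power0 * (2 ** periods)    # Moore's Law: per-node power doubles each period
    return power * n ** 2              # Metcalfe-style N^2 value term

base = network_value(1_000, 0)
after_one_period = network_value(1_000, 1)
print(after_one_period / base)  # each period multiplies the value by 2 * 2**2 = 8
```

Under these (assumed) doubling rates, each 18-month period multiplies the network's value eightfold - the point being the compounding, not the particular numbers.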
2. Free Availability of Software and Connection
This is prevalent on the Net, where even potentially commercial software can be downloaded for free. In many countries television viewers still pay for television broadcasts - but in the USA and many other countries in the West, the basic package of television channels comes free of charge.
As users / consumers form a habit of using (or consuming) the software - it is commercialized and begins to carry a price tag. This is what happened with the advent of cable television: contents are sold for subscription and usage (Pay Per View - PPV) fees.
Gradually, this is what will happen to most of the sites and software on the Net. Those which survive will begin to collect usage fees, access fees, subscription fees, downloading fees and other, appropriately named, fees. These fees are bound to be low - but it is the principle that counts. Even a few cents per transaction will accumulate to hefty sums with the traffic which will characterize the Net (or, at least its more popular locales).
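How quickly do "a few cents per transaction" become hefty sums? A back-of-the-envelope calculation, using purely hypothetical figures for the fee and the traffic:

```python
# Illustrative arithmetic only: the fee and traffic figures below are
# assumptions for the example, not data from the essay.
fee_cents = 3                  # assumed fee per transaction, in US cents
daily_transactions = 500_000   # assumed traffic at a popular site

daily_usd = fee_cents * daily_transactions / 100
yearly_usd = daily_usd * 365
print(f"${daily_usd:,.0f} per day -> ${yearly_usd:,.0f} per year")
```

At these assumed volumes, three cents per transaction already yields millions of dollars a year - which is why the principle of charging matters more than the size of the fee.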
Advertising revenues will allow ISPs to offer free communication and storage volume. Gradually, the connect-time charges imposed by the phone companies will be eroded by tough competition from the likes of the cable companies. Accessing the Internet might well be free of all charges in 10 years' time.
3. Increased User Friendliness
As long as the computer is less user-friendly and less reliable (predictable) than television - less of a black box - its potential (and its future) is limited. Television attracts 3.5 billion viewers daily. The Internet will attract - under the most exuberant scenario - less than one tenth that number of people. The only reasons for this disparity are (the lack of) user-friendliness and reliability. Even browsers, among the most user-friendly applications ever, are not sufficiently so. The user still needs to know how to use a keyboard and must possess some basic acquaintance with the operating system.
The more mature the medium, the friendlier it becomes. Eventually, it will be operated using speech or common language. There will be room for users' "hunches" and built-in flexible responses.
4. Social Taxes
Sooner or later, the business sector has to mollify the god of public opinion with offerings of a political and social nature. The Internet is an affluent, educated, yuppie medium. It necessitates a command of the English language, a lively interest in information and its various uses (scientific, commercial, other), and a lot of resources (free time, money to invest in hardware, software and connect time). It empowers - and thus deepens the divide between the haves and the have-nots, the knowledgeable and the ignorant, the computer-literate and the computer-illiterate.
In short: the Internet is an elitist medium. Publicly, this is an unhealthy posture. "Internetophobia" is already discernible. People (and politicians) talk about how unsafe the Internet is and about its possible uses for racist, sexist and pornographic purposes. The wider public is in a state of awe.
So, site builders and owners will do well to begin to improve their image: provide free access to schools and community centres, bankroll internet literacy classes, freely distribute contents and software to educational institutions, collaborate with researchers and social scientists and engineers.
In short: encourage the view that the Internet is a medium catering to the needs of the community and the underprivileged, a mostly altruistic endeavour. This also happens to make good business sense, by educating a future generation of users. Someone who visited a site free of charge as a student will pay to do so once he becomes an executive. Such a user will also pass the information on, within and without his organization. This is called media exposure.
The future will, no doubt, witness public Internet terminals, subsidized ISP accounts, free Internet classes and an alternative "non-commercial, public" approach to the Net.
The Internet: Medium or Chaos?
There has never been a medium like the Internet. The way it has formed, the way it was (not) managed, its hardware-software-communications specifications - are all unique.
The Internet has no central (or even decentralized) structure. In reality, it hardly has a structure at all. It is a collection of 16 million computers (at the end of 1996) connected through thousands of networks. There are organizations which purport to set Internet standards (like the aforementioned ISOC, or ICANN, which administers domain names) - but they are all voluntary organizations, with no binding legal, enforcement, or adjudication powers. The result is often mayhem.
Many erroneously call the Internet the first democratic medium. Yet, it hardly qualifies as a medium and by no stretch of terminology is it democratic. Democracy has institutions, hierarchies, order. The Internet has none of these things. There are some vague understandings as to what is and is not allowed. This is a "code of honour" (more reminiscent of the Sicilian Mob than of the British Parliament, let's say). Violations are punished by excommunication (of the violating site or person).
The Internet has culture - but no education. Freedom of speech is entrenched. Members of this virtual community react adversely to ideas of censorship, even when applied to hard-core pornography. In 1999, hackers attacked major government sites following an FBI initiative against hacking-related crimes. Government initiatives (in the USA, in France, the lawsuit against the General Manager of AOL in Germany) are acutely criticized. In the meantime, the spirit of the Internet prevails: it is the small man's medium. What seems to be emerging, though, is self-censorship by content providers (such as AOL and CompuServe).
The Internet is not dependent upon a given hardware or software. True, it is accessible only through computers and there are dominant browsers.
But the Internet accommodates any digital (bit transfer) platform. The Internet will be incorporated in the future into portable computers, palmtops, PDAs, mobile phones, cable television, telephones (with a voice interface), home appliances and even wrist watches. It will be accessible to all, regardless of hardware and software.
The situation is, obviously, different with other media. There is standard hardware (the television set, the radio receiver, the digital print equipment). Data transfer modes are standardized as well. The only variable is the contents - and even this is standardized in an age of American cultural imperialism. Today, one can see the same television programs all over the globe, regardless of cultural or geographical differences.
Here is a reasonable prognosis for the Internet:
It will "broadcast" (it is, of course, a PULL medium, not a PUSH medium - see next chapter) to many kinds of hardware. Its functions will be controlled by 2-5 very common software applications. But it will differ from television in that contents will continue to be decentralized: every point on the Net is a potential producer of content at low cost. This is the equivalent of producing a talk show using a single home video camera. And the contents will remain varied.
Naturally, marketing content (sites) will remain an expensive art. Sites will also be richer or poorer, in accordance with the investment made in them.
Non Linearity and Functional Modularity
The Internet is the first medium in human history that is non-linear and totally modular.
A television program is broadcast from a transmitter, through the airwaves to a receiver (=the television set). The viewer sits opposite this receiver and passively watches. This is an entirely linear process. The Internet is different:
When communicating through the Internet, there is no way to predict how the information will reach its destination. The routing of information through the network is dynamic and unpredictable, very much like the principle governing the telephone system (but on a global scale). The latter is not a point-to-point linear network. Rather, it is a network of networks. Our voice is transmitted back and forth inside a gigantic maze of copper wires and optic fibres. It seeps through any available wire - until it reaches its destination.
It is the same with the Internet.
Information is divided into packets. An address is attached to each packet, which - using the TCP/IP data transfer protocols - is dispatched to roam this worldwide labyrinth. The path from one neighbourhood of London to another may well traverse Japan.
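The packetization described above can be sketched in a few lines. This is a deliberately minimal illustration - the names (`Packet`, `packetize`, `reassemble`) are invented for the example, and real TCP/IP adds headers, checksums, acknowledgements and retransmission on top of this basic idea:

```python
# Minimal sketch of packetization and reassembly (illustrative only).
from dataclasses import dataclass
import random

@dataclass
class Packet:
    dest: str       # destination address (in the spirit of an IP address)
    seq: int        # sequence number, so packets can be reordered on arrival
    payload: bytes

def packetize(message: bytes, dest: str, size: int = 8) -> list[Packet]:
    """Split a message into fixed-size, individually addressed packets."""
    count = (len(message) + size - 1) // size
    return [Packet(dest, i, message[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets: list[Packet]) -> bytes:
    """Packets may arrive in any order; sort by sequence number and rejoin."""
    return b"".join(p.payload for p in sorted(packets, key=lambda p: p.seq))

pkts = packetize(b"from London to London, via Japan", "203.0.113.7")
random.shuffle(pkts)  # simulate packets taking different routes, arriving out of order
assert reassemble(pkts) == b"from London to London, via Japan"
```

Because each packet carries its own address and sequence number, the network is free to route each one independently - which is precisely why no single path, and no single failure, matters.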
The really ingenious thing about the Internet is that each computer (each receiver or end user) indeed burdens the system by imposing on it its information needs (as is the case with other media) - but it also assists in the task of pushing information packets on to their destinations. It seems that this contribution to the system outweighs the burdens imposed upon it.
The network has a growth potential which is always bigger than the number of its users. It is as though television sets assisted in passing the signals they receive on to other television sets. Every computer which is a member of the network is both a message (content) and a medium (an active information channel), both a transmitter and a receiver. If 30% of all computers on the Net were to crash - there would be no operational impact (there is enormous built-in redundancy). Obviously, some contents would no longer be available (information channels would be affected).
The interactivity of this medium is a guarantee against the monopolization of contents. Anyone with a thousand dollars can launch his/her own (reasonably sophisticated) site, accessible to all other Internet users. Space is available through home page providers.
The name of the game is no longer production - it is creative design, the content itself and, above all, the marketing of the site.
The Internet is an infinite and unlimited resource. This goes against the grain of the most basic concept of economics (scarcity). Each computer that joins the Internet strengthens it exponentially - and tens of thousands join daily. The Internet's infrastructure (with the possible exception of the communication backbones) can accommodate an annual growth of 100% to the year 2020. It is the user who decides whether to expand the Internet's infrastructure by connecting his computer to it. By comparison: it is as though it were possible to produce and broadcast radio programmes from every radio receiver. On the Internet, each computer is a combination of studio and transmitter.
In reality, there is no other interactive medium except the Internet. Cable TV does not allow two-way data transfer (from user to cable operator). If the user wants to buy a product - he has to phone. Interactive television is an abject failure (the Sony and TCI experiments were terminated). All this holds notwithstanding the combination of the Internet with satellite capabilities (VSAT) or with resurgent digital television.
The television screen is inferior to the computer screen. Only the Internet offers a true two-way channel. The technological problems that besieged it are slowly dissipating.
The Internet allows for one-dimensional and two-dimensional interactivity.
One-dimensional interactivity: fill in and dispatch a form, send and receive messages (through e-mail or v-mail).
Two-dimensional interactivity: talk to someone while both parties work on an application, see your interlocutor, talk to him and transfer documents to him for his perusal as the conversation continues apace.
This is no longer science fiction. In less than five years this will be as common as the telephone - and it will have a profound effect on the traditional services provided by the phone companies. Internet phones and Internet videophones will be serious competitors, and the phone companies are likely to react once they begin to feel the heat. This will happen when the Internet acquires black-box features. Phone companies, software giants and cable TV operators are likely to end up owning big chunks of the lucrative future market of the Net.
The Solitary Medium
The Internet is NOT a popular medium. It is the medium of affluent executives who fully master the English language, as part of a wider general education.
Alternatively, it is the medium of academia (students, lecturers), or of children of the former, well-to-do group. In any case, it is not the medium of the "wide public". It is also a highly individualistic medium.
The Internet was an initiative of the DOD (the Department of Defence of the USA). It was later "requisitioned" by the National Science Foundation (NSF). This continuous involvement of the administration came to an end in 1995, when the medium was "privatized".
This "privatization" was a recognition of the civilian roots of the Internet. It was - and is still being - formed by millions of information-intoxicated users. They formed networks to exchange bits and pieces of mutual interest. Thus, as opposed to all other media, the Internet was not invented, nor was its market. The inventors of the telephone, the telegraph, the radio, the television and the compact disc - all invented previously non-existent markets for their products. It took time, effort and money to convince consumers that they needed these "gadgets".
By contrast, the Internet was invented by its own consumers and so was the market for it. Only when the latter was fully forged did producers and businessmen join in. Microsoft began to hesitantly test the internet waters only in 1995!
On Line Memories
The Internet is the only medium with an online memory, very much like the human brain. The memories of both - the Net and the brain - are immediately accessible. In both, memory is stored in sites, and in both it neither ages nor is erased. It is possible to find sites which commemorate events the same way that the human mind registers them. This is Net memory. The history of a site can be reviewed. The Library of Congress stores the consecutive development phases of sites. The Internet is an amazing combination of data-processing software, data, a record of all the activities which took place in connection with the data, and the memory of these records. Only the human brain boasts comparable capacities: one language serves all these functions - the language of the neurons.
Even in computers (not to mention more conventional media, such as television) there is a much clearer separation between data, their processing, and the record of that processing.
Raw English - the Language of Raw Materials
The following - apparently trivial - observation is critical:
All the other media provide us with processed, censored, "clean" content.
The Internet is a medium of raw materials, partly well organized (the rough equivalent of a newspaper) - and partly still in raw form, yesterday's supper.
This is a result of the immediate and absolute access afforded each user: access to programming and site-publishing tools - as well as access to storage space on servers. This leads to varying degrees of quality among contents and content providers and this, in turn, prevents the monopolization and cartelization of the information supply channels.
The users of the Internet are still undecided: do they prefer drafts or newspapers? They frequent well-designed sites. There are even design competitions and awards. But they display a preference for sites that are constantly updated (i.e., closer in nature to a raw material than to a finished product). They prefer sites from which they can download material to process quietly at home, alone, on their PCs, at their leisure.
Even the concept of "interactivity" points at a preference for raw materials with which one can interact. For what is interactivity if not the active involvement of the user in the creation of content?
Internet users love to be involved, to feel the power at their fingertips; they are all addicted to one form of power or another.
Similarly, a car driven and navigated completely automatically is not likely to sell well. Part of the experience of driving - the sensation of power ("power steering") - is critical to the purchase decision.
It is not in vain that the metaphor for using the Internet is "surfing" (and not, let's say, browsing).
The problem is that the Internet is still a predominantly English-language medium (though this is fast changing). It discriminates against those whose mother tongue is different. All software applications work best in English. Otherwise they have to be adapted and fitted with special fonts (Hebrew, Arabic, Japanese, Russian and Chinese each present a different set of problems to overcome). This situation might change with the attainment of a critical mass of users (some say, 2 million per non-Anglophone country).
Comprehensive (Virtual) Reality
This is the first (though, probably, not the last) medium which allows the user to conduct his whole life within its boundaries.
Television presents a clear division: there is a passive viewer, whose task is to absorb information and subject it to minimal processing. The Internet, by contrast, embodies a complete and comprehensive (virtual) reality, a full-fledged alternative to real life.
The illusion is still in its infancy - and yet already powerful.
The user can talk to others, see them, listen to music, watch video, purchase goods and services, play games (alone or with others scattered around the globe), converse with colleagues or with users who share the same hobbies and areas of interest, and play music together (separated by time and space).
And all this is very primitive. In ten years time, the Internet will offer its users the option of video conferencing (possibly, three dimensional, holographic). The participants' figures will be projected on big screens. Documents will be exchanged, personal notes, spreadsheets, secret counteroffers.
Virtual Reality games will become reality in less time. Special end-user equipment will make the player believe that he, actually, is part of the game (while still in his room). The player will be able to select an image borrowed from a database and it will represent him, seen by all the other players. Everyone will, thus, end up invading everyone else's private space - without encroaching on his privacy!
The Internet will be the medium of choice for phone and videophone communication (including conferencing).
Many mundane activities will be done through Internet: banking, shopping for standard items, etc.
The above are examples of the Internet's power and of its ability to replace our reality in due time. A world out there will continue to exist - but, more and more, we will interact with it through the enchanted interface of the Net.
A Brave New Net
The future of a medium in the making is difficult to predict. Suffice it to mention the ridiculous prognoses which accompanied the PC (it is nothing but a gaming gadget; it is a replacement for the electric typewriter; it will be used only by business). The telephone also had its share of ludicrous statements: no one - claimed the "experts" - would want to forgo eye contact while talking. Or television: only the Nazi regime seems to have fully grasped its potential (in the 1936 Berlin Olympics). And Bill Gates thought that the Internet had a very limited future as late as 1995!
Still, this medium has a few characteristics which differentiate it from all its predecessors. Were these traits to be continuously and creatively exploited - a few statements can be made about the future of the Net with relative assurance.
Time and Space Independence
This is the first medium in history which does not require the simultaneous presence of people in space-time in order to facilitate the transfer of information. Television requires the presence of studio technicians, narrators and others on the transmitting side - and the availability of a viewer on the receiving side. The phone depends on the simultaneous presence of two or more parties.
With time, tools to bridge the time gap between transmitter and receiver were developed. The answering machine and the video cassette recorder both accumulate information sent by a transmitter - and release it to a receiver in a different space and time. But they are discrete, their storage volume is limited and they do not allow for interaction with the transmitter.
The Internet does not have these handicaps.
It facilitates the formation of "virtual organizations / institutions / businesses / communities". These are groups of users who communicate from different points in space and at different times, united by a common goal or interest.
A few examples:
The Virtual Advertising Agency
An account executive from the USA will manage the account of a hi-tech firm based in Sydney. He will work with technical experts from Israel and with a French graphics studio. They will all post their work on the Net (through an intranet), to be studied by the other members of this virtual group, who will enter the right site after clearing firewall security software. They will all be engaged in flexiwork (flexible working times) and work from their homes or offices, as they please. Obviously, they will all abide by a general schedule.
They will exchange audio files (the jingle, for instance), graphics, video, colour photographs and text. They will comment on each other's work and make suggestions using e-mail. The client will witness the whole creative process and will be able to contribute to it. There is no technological obstacle preventing the participation of the client's clients, as well.
It is difficult to imagine that "virtual" performances will replace real-life ones.
The mass rock concert has its own inimitable sounds, sights and smells. But the virtual production of a record is on the cards, and it is tens of percent cheaper than a normal production. Again, the participants will interact through an intranet. They will swap notes, play their own instruments, make comments by e-mail, and play together using appropriate software. If one of them is seized by inspiration in the middle of (his) night, he will be able to preserve his ideas and pass them on through the Net. The creative process will be aided by novel applications which enable the simultaneous transfer of sound over the Net. The processes which are already digitized (the mix, for one) will pose no problem to a digitized medium. Other applications will let the users listen to the final versions and even ask the public for its opinion in previews.
Thus, even creative processes which are perceived as demanding human presence - will no longer do so with the advent of the Net.
Perhaps it is easier to understand a Virtual Law Firm or Virtual Accountants Office.
In the extreme case, such a firm will have no physical offices at all. Its only address will be an e-mail address. Dozens of lawyers from all over the world, with hundreds of specialities, will be partners in such an office. Such an office will be truly multinational and multidisciplinary. It will be fast and effective, because its members will swap information electronically (precedents, decrees, laws, opinions, research and plain ideas or professional experience).
It will be able to service clients in every corner of the globe. It will involve the transfer of audio files (NetPhones), text, graphics and video (crucial in certain types of litigation). Today, such information is sent by post and messenger services. Whenever different types of information are to be analysed - a physical meeting is a must. Otherwise, each type of information has to be transferred separately, using unique equipment for each one.
Simultaneity and interactivity - this will be the name of the game on the Internet. The professional term is "coopetition" (cooperation, over the Internet, between potential competitors).
Other possibilities: a virtual production of a movie, a virtual research and development team, a virtual sales force. The harbingers of the virtual university, the virtual classroom and the virtual (or distance) medical centre are here.
The Internet - Mother of all Media
The Internet is the technological solution to the mythological "home entertainment centre" debate.
It is almost universally agreed that, in the future, a typical home will have one apparatus which will give it access to all types of information. Even the most daring did not talk about simultaneous access to all the types of information or about full interactivity.
The Internet will offer exactly this: access to every conceivable type of information simultaneously, the ability to process it all at the same time, and full interactivity. The future image of this home centre is fairly clear - it is the timing that is not. It all depends on the availability of a wide (information) band - through which it will be possible to transfer big amounts of data at high speed, using the same communications line. Fast modems and optic fibres were coupled with faulty planning and a poor vision of future needs. The cable television industry, for instance, is totally unprepared, technologically, for the age of interactivity. This is only partly the result of unwise, restrictive legislation which prohibits data vendors from stepping on each other's toes. Phone companies were not permitted to provide Internet services or to transfer video through their wires - and cable companies were not allowed to transmit phone calls.
It is a question of time until these fossilized remains are removed by the almighty hand of the market. When this happens, the home centre is likely to look like this:
A central computer attached to a big screen divided into windows. Television is broadcast in one window. A software application runs in another. This could be an application connected to the television program (deriving data from it, recording it, collating it with pertinent data it picks out of databases). Or it could be an independent application (a computer game, say).
Updates from the New York Stock Exchange flash at the corner of the screen and an icon blinks to signal the occurrence of a significant economic event.
A click of the mouse (?) and the news flash is converted into a voice message. Another click and your broker is on the InternetPhone (possibly seen in a third window on the screen). You talk, you send him a fax containing instructions, and you compare notes. The fax was composed in a word-processing application which opened in yet another window.
Many believe that communication with the future generation of computers will be by voice. This is difficult to believe. It is weird to talk to a machine (especially in the presence of other humans). We are seriously inhibited this way. Moreover, voice would interrupt other people's work or pleasure. It is also close to impossible to develop efficient voice-recognition software. Not to mention mishaps such as accidental activation.
The Friendly Internet
The Internet will not escape the processes experienced by all other media.
It will become easy to operate - user-friendly, in professional parlance.
At present, it requires too much specialized knowledge. It is not accessible to those who lack basic hardware and (Windows) software concepts.
Alas, most of the population lacks them. Only 30 million copies of the "Windows" operating system had been sold worldwide by the end of 1996. Even if this constitutes 20% of all the copies in use (the rest being pirated versions) - it still represents less than 3% of the population of the world. And this, needless to say, is the world's most popular software (after the DOS operating system).
The Internet must rely on something completely different. It must have sophisticated, transparent-to-the-user search engines to guide users through the cavernous, chaotic libraries which will typify it. The search engines must include complex decision-making algorithms. They must understand common languages and respond in everyday speech. They will be efficient and incredibly fast because they will form their own search strategies (supplanting the user's faulty use of syntax).
These engines, replete with smart agents, will refer the user to additional data and to cultural products which reflect the user's history of preferences (or pronounced preferences expressed in answers to feedback questionnaires). All the decisions and activities of the user will be stored in the memory of his search engine and assist it in designing its decision-making trees. The engine will become an electronic friend, advising the user even on professional matters.
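The core of such a preference-learning engine can be sketched in a few lines. The following is a minimal, purely illustrative model (the class, topic labels, and scoring scheme are all invented for this sketch, not any real engine's API): every choice the user makes is recorded, and subsequent results are ranked by how well their topics overlap with that history.

```python
from collections import Counter

class PersonalSearchAgent:
    """Toy sketch of a preference-learning search engine.
    Names and scoring are illustrative only."""

    def __init__(self):
        # topic -> how often the user has chosen items with that topic
        self.history = Counter()

    def record_choice(self, topics):
        # Every selection the user makes is stored and later biases ranking.
        self.history.update(topics)

    def rank(self, results):
        # results: list of (title, topics); score by overlap with history.
        def score(item):
            _, topics = item
            return sum(self.history[t] for t in topics)
        return sorted(results, key=score, reverse=True)

agent = PersonalSearchAgent()
agent.record_choice(["finance", "stocks"])
agent.record_choice(["finance"])
ranked = agent.rank([("Gardening tips", ["hobby"]),
                     ("Market report", ["finance", "stocks"])])
print(ranked[0][0])  # the finance item ranks first
```

A real engine would, of course, weigh far richer signals (time spent, explicit feedback, decay of old preferences), but the principle - stored choices biasing future rankings - is the same.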
The cessation of hostilities between the Internet and some off-the-shelf software applications heralds the commencement of the integration between the desktop computer and the Net. This is a small step for the user - and a big one for humanity. The animosity which prevailed until recently between the UNIX systems and the HTML language, and between most of the standard applications (headed by the word processors), has officially ended with the introduction of Office 97, which incorporates full HTML capabilities. With the Office 2000 products, the distinctions between a web computing environment and a PC computing environment have all but vanished. Browsers can replace operating systems; word processors can browse, download and upload - the PC has finally been entirely absorbed by its offspring, the Internet.
The Portable Document Format (PDF) enables the user to work with Internet content off-line. In other words: text files will be loaded into word processors and edited off-line. The same applies to other types of files (audio, video).
Downloading time will also be shortened (today, it takes so long to download an audio or video file that it is often impracticable).
This is not a trivial matter. The ability to switch between on-line and off-line states and to continue the work, uninterrupted - this ability means the integration of the PC in the Internet.
There are two competing views concerning the future of computer hardware and both of them acknowledge the importance of the Internet.
Bill Gates - Microsoft's legendary boss - says that the PC will continue to advance and to strengthen its processing and computing powers. The Internet will be just another tool, available through telecommunications rather than through the ownership of hard copies of software and data. The Internet is perceived as a tremendous external database, available for processing by tomorrow's desktops. Lately, this view has been gradually reversed in view of the incredible vitality and power of the Internet.
Gates is converging on the worldview held by Sun Microsystems.
The future desktop will be a terminal, albeit powerful and with considerable processing, computing and communications capabilities. The name of the game will be the Internet itself. The terminal will access Internet databases (containing raw or processed data) and satisfy its information needs.
This terminal - equipped with languages the likes of Java - will tap into libraries of software applications. It will make use of components of different applications as needs arise. When finished with a component, the terminal will "return" it to the virtual "shelf" until the next time it is needed.
This will minimize memory resources in the desktop.
The truth, as always, is probably somewhere in the middle.
Tomorrow's computer will be a home entertainment centre. No consumer will accept total dependence on telecommunications and on the Net. They will all ask for processing and computing powers at their fingertips, a-la Bill Gates.
But tomorrow's computer will also function as a terminal when needed: when retrieving data, or when using non-standard software applications. Why purchase rarely used, expensive applications when they are available, for a fraction of the cost, on the Net?
In other words: no consumer will subjugate his frequent word processing needs to the whims of the local phone company, or to those of the site operator. That is why every desktop is still likely to include hard (or optical) disk-resident word processing software. But very few will buy CAD-CAM, animation, graphics, or publishing software which they are likely to use infrequently. Instead, they will access these applications, resident on the Net, and use only those parts that are needed. This is usage tailored to the client's needs. This is also the integration of a desktop (not of a terminal) with the Net.
Decentralized Lack of Planning
The course adopted by content creators (producers) in the last few years proves the maxim that it is easy to repeat mistakes and difficult to derive lessons from them. Content producers are constantly buying channels to transfer their contents. This is a mistake. A careful study of the history of successful media (e.g., television) points to a clear pattern:
Content producers do not grant life-long exclusivity to any single channel - especially not by buying into it. They prefer limited-term contracts with content distributors (their broadcast channels). They work with all of them, sometimes simultaneously.
In the future, the same content will be sold on different sites or networks, at different times. Sometimes it will be found with a provider which is a combination of cable TV company and phone company - at other times, it will be found with a provider with expertise in computer networks. Much content will be created locally and distributed globally - and vice versa. The repackaging of branded contents will be the name of the game in both the media firms and the firms which control contents distribution (=the channels).
No exclusivity pact will survive. Networks such as CompuServe are doomed and have been doomed since 1993. The approach of decentralized access, through numerous channels, to the same information - will prevail.
The Transparent Language
The Internet will become the next battlefield between have countries and have-not countries. It will be a cultural war zone (English against French, Japanese, Chinese, Russian and Spanish). It will be politically charged: those wishing to restrict the freedom of speech (authoritarian and dictatorial regimes, governments, conservative politicians) against free speech advocates. It will become a new arena of warfare and an integral part of actual wars.
Different peer groups, educational and socio-economic strata, ethnic groups, and sexual preference groups will all fight in the eternal fields of the Internet.
Yet, two developments are likely to pacify the scene:
Automatic translation applications (like Accent and the Alta Vista translation engines) will make every bit of information accessible to all. The linguistic (and, by extension, ethnic or national) source of the information will be disguised. A feeling of a global village will permeate the medium. Ignorance of the English language will no longer hinder one's access to the Net. Equal opportunities.
The second trend will be new methods of classifying contents on the Net, together with the availability of chips intended to filter offensive information. Obscene material will not be available to tender souls. Anti-Semitic sites will be blocked to Jews and communists will be spared Evil Empire speeches. Filtering will usually be done using extensive and adaptable lists of keywords or key phrases.
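Keyword-list filtering of this kind is mechanically simple. The sketch below shows the basic technique - a case-insensitive check of a page against a blocklist of phrases; the sample blocklist and function name are invented for illustration, not taken from any actual filtering product.

```python
# Minimal sketch of keyword-list content filtering.
# The blocklist below is illustrative only.
BLOCKED_PHRASES = ["evil empire", "forbidden sample phrase"]

def is_blocked(page_text, blocklist=BLOCKED_PHRASES):
    """Return True if the page contains any blocked phrase
    (case-insensitive substring match, as simple filters work)."""
    text = page_text.lower()
    return any(phrase in text for phrase in blocklist)

print(is_blocked("A speech about the Evil Empire"))  # True
print(is_blocked("A harmless cooking recipe"))       # False
```

The crudeness of the technique - plain substring matching, with no grasp of context - is precisely what produces both over-blocking and the "cultural ghettos" discussed below.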
This will lead to the formation of cultural Internet Ghettos - but it will also considerably reduce tensions and largely derail populist legislative efforts aimed at curbing or censoring free speech.
Public Internet - Private Internet
The day is not far when every user will be able to define his areas of interest, order of priorities, preferences and tastes. Special applications will scour the Net for him and retrieve the material befitting his requirements. This material will be organized in any manner prescribed.
A private newspaper comes to mind. It will have a circulation of one copy - the user's. It will borrow its contents from a few hundreds of databases and electronic versions of newspapers on the Net. Its headlines will reflect the main areas of interest of its sole subscriber. The private paper will contain hyperlinks to other sites in the Internet: to reference material, to additional information on the same subject. It will contain text, but also graphics, audio, video and photographs. It will be interactive and editable with the push of a button.
Another idea: the intelligent archive.
The user will accumulate information, derived from a variety of sources in an archive maintained for him on the Net. It will not be a classical "dead" archive. It will be active. A special application will search the Net daily and update the archive. It will contain hyperlinks to sites, to additional information on the Net and to alternative sources of information. It will have a "History" function which will teach the archive about the preferences and priorities of the user.
The software will recommend to him new sites and subjects similar to those in his history. It will alert him to movies, TV shows and new musical releases - all within his cultural sphere. If convinced to purchase, the software will order the wares from the Net. It will then let him listen to the music, see the movie, or read the text.
The Internet will become a place of unceasing stimuli, of internal order and organization, and of friendliness in the sense of a personally rewarding acquaintance. Such an archive will be a veritable friend. It will alert the user to interesting news, leave messages and food for thought in his e-mail (or v-mail). It will send the user a fax if he does not respond within a reasonable time. It will issue reports every morning.
This, naturally, is only a private case of the archival potential of the Net.
A network connecting more than 16.3 million computers (at the end of 1996) is also the biggest collective memory effort in history since the Library of Alexandria. The Internet possesses the combined power of all its constituents. Search engines are, therefore, bound to be replaced by intelligent archives which will form universal archives, storing all the paths to the results of past searches plus millions of recommended searches.
Compare this to a newspaper: it is much easier to store back issues of a paper on the Internet than physically. It is, obviously, much easier to search - and a digital copy never wears out. Such an archive will let the user search by word, by key phrase, or by contents, search the bibliography, and hop to other parts of the archive or to other territories on the Internet using hyperlinks.
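Searching such an archive by word or key phrase rests on a classic data structure, the inverted index: a map from each word to the documents containing it. The sketch below is illustrative only (the archive contents and function names are invented), but it is the same mechanism real search engines of the era used.

```python
from collections import defaultdict

def build_index(archive):
    """Build a word -> set(doc_id) inverted index.
    archive: dict mapping document id -> text."""
    index = defaultdict(set)
    for doc_id, text in archive.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, *words):
    # Return ids of documents containing ALL the query words.
    sets = [index.get(w.lower(), set()) for w in words]
    return set.intersection(*sets) if sets else set()

archive = {1: "market report on stocks",
           2: "gardening tips for spring"}
idx = build_index(archive)
print(search(idx, "market", "stocks"))  # {1}
```

Because the index is built once and consulted many times, queries cost almost nothing regardless of archive size - which is what makes searching decades of back issues trivial where leafing through physical copies is not.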
We have already mentioned SET, the safety standard which will facilitate credit card transactions over the Net. These transactions are safe even today - but there is a vested interest in claiming otherwise. Newspapers are afraid that advertising budgets will migrate to the Web. Television harbours the same fears. More commerce on the Net means more advertising dollars diverted from established media. Too many feel unhappy when confronted with this inevitability. They spread lies which feed off ignorance about how safe paying with a credit card on the Net really is. Safety standards will terminate this propaganda and transform the Internet into a commercial medium.
Users will be able to buy and sell goods and services on the Net and receive them by post. Certain things will be directly downloaded (software, e-books). Many banking transactions and EDI operations will be conducted through bank-client intranets. All stock and commodity exchanges will be accessible and the role of brokers will be minimized. Foreign exchange will be easily tradable and transferable. Initial Public Offerings of shares, day trading of stocks and other activities traditionally connected with physical ("pit") capital markets will become a predominant feature of the Internet. The day is not far when the likes of Merrill Lynch will offer full services (including advisory services) through the Internet. The first steps towards electronic trading of shares (with discounted fees) were already taken in mid-1999. Home banking, private newspapers, subscriptions to cultural events, tourism packages and airline tickets - all are candidates for Net-trading.
The Internet is here to stay.
Commercially, it would be an extreme strategic error to ignore it. A lot of money will flow through it. A lot more people will be connected to it. A lot of information will be stored on it.
It is worth being there.
Partially Revised: 7/00.
Appendix - Ethics and the Internet
The "Internet" is a very misleading term. It's like saying "print". Professional articles are "print" - and so are the sleaziest porno brochures.
So, first, I think it would be useful to make a distinction between two broad categories:
Content-driven and Interaction-driven
Most content driven sites maintain reasonable ethical standards, roughly comparable to the "real" or "non-virtual" media. This is because many of these sites were established by businesses with a "real" dimension to start with (Walt Disney, The Economist, etc.). These sites (at least the institutional ones) maintain standards of privacy, veracity, cross-checking of information, etc.
Personal home pages are a sub-category of content-driven sites. These cannot seriously be considered "media". They represent the new phenomenon of extreme narrowcasting. They do not adhere to any ethical standards, with the exception of those upheld by their owners.
The interaction-orientated sites and activities can, in turn, be divided into e-commerce sites (such as Amazon), which adhere to commercial law and to commercial ethics, and interactive sites.
The latter - discussion lists, mailing lists and so on - are a hotbed of unethical, verbally aggressive, hostile behaviour. A special vocabulary developed to discuss these phenomena ("flaming", "mail bombing" etc.).
Where the aim is to provide consumers with another venue for the dissemination of information, or to sell them products or services, the standards of ethics maintained reflect those upheld outside the realm of the Internet. Additionally, commercial law - a codified morality - is adhered to.
Where the aim is interaction, or the dissemination of the personal opinions and views of site owners, ethical standards are still in the making. A rough set of guidelines has coalesced into the "netiquette" - a set of rules of peaceful co-existence intended to prevent flame wars and eruptions of interpersonal verbal abuse. Since it lacks effective means of enforcement, it is often violated and constitutes an expression of goodwill rather than a binding code.
The Internet Cycle