

Chapter Summaries
of
GEEKS BEARING GIFTS
by Ted Nelson

© 2008 Theodor Holm Nelson.
 All rights reserved.

The system of conventions called 'Computer Literacy' makes little sense and can only be understood historically.  Here is the briefest possible digest.

[Introduction not included in summary]

-27  Hierarchy  (ancient beginnings)

Hierarchy is the official metaphysic of the computer world.  (Aristotle, the medieval Catholic Church and the Dewey Decimal System have all reinforced this concept.)  Many tekkies think all structure is hierarchical, and have arranged not to see any other kinds.

They say if you have a hammer everything looks like a nail.  Today's hierarchical computer tools (especially object-oriented languages and XML) make hierarchy an imposition, not an option.  Current tools cannot represent cross-connection, interpenetration, overlap, or the other tangles of the real world-- let alone opinions about it.
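A minimal sketch in Python (not from the book) of the imposition described above: XML demands strict nesting, so two spans of text that cross each other cannot both be marked as elements-- the parser simply refuses.

    import xml.etree.ElementTree as ET

    nested = "<p><b>bold <i>bold-italic</i></b><i> italic</i></p>"   # a proper tree
    crossing = "<p><b>bold <i>bold-italic</b> italic</i></p>"        # overlapping spans

    ET.fromstring(nested)             # parses fine
    try:
        ET.fromstring(crossing)       # the cross-connection is simply forbidden
    except ET.ParseError as err:
        print("XML cannot say this:", err)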

 


-26  Alphabets  (ancient beginnings)

Phonetic alphabets are only one form of writing, but it's the form that runs the computer world.  Our alphabet derives from the Phoenician; lower case is introduced under Charlemagne in the 800s.

Then text is represented electrically-- in upper case because God was thought to require it.  Three men in particular make writing electric: Morse, Baudot and Bemer.  Morse represents a character with dots and dashes, Baudot with bits; and Bemer sets the number of bits in a character to eight (obsoleting many computers) and creates the ASCII alphabet (1960).
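A two-line illustration in Python (not the book's) of what making writing electric comes down to: each character is stored as a small number, and that number as bits-- 'A' is 65 in ASCII.

    for ch in "ABC":
        print(ch, ord(ch), format(ord(ch), "08b"))
    # A 65 01000001
    # B 66 01000010
    # C 67 01000011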

ASCII becomes the bloodstream of UNIX, which is built on textfiles, ASCII file paths and ASCII process intercommunication.

ASCII is now being replaced by Unicode, able to handle thousands of characters-- but which?  Language issues worldwide are now being decided by tekkie committees-- with a harsh impact on the traditions of  many cultures.

 


-25  Punctuation  (ancient beginnings)

MEANWHILE, punctuation begins as a way of separating, clarifying, later emphasizing.  Early Greek writing goes back-and-forth (boustrophedon) with no spaces or periods, which is disorienting and inconvenient.  The innovations of space and period are followed by such helpers and conceits as comma, question mark, colon, semicolon, exclamation point, pilcrow, interrobang, irony point.

But what matters are the accidental and committee decisions that select punctuation for the ASCII alphabet: the slash, the backslash, the @-sign, all of which take on special meaning.  And through the structure of UNIX file search commands, some of the characters come to be unusable in filenames.

(But Unicode now allows lots of strange punctuation.)
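A small sketch (Python; the filename is invented) of why some of those characters became unusable: UNIX path syntax claims the slash as its separator, so a name containing one is read as a deeper directory instead.

    import os.path

    name = "draft/v2.txt"              # intended as a single filename
    print(os.path.split(name))         # ('draft', 'v2.txt') -- the slash is taken as a path boundary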

 


-24  Encryption  (ancient beginnings)

Once there are alphabets you have more to hide. Julius Caesar does it.  Edgar Allan Poe popularizes it. Kings, commoners, governments, armies encrypt their writings for thousands of years, but the early methods are pretty simple.  The big breakthrough comes in the 1970s with trapdoor codes, which the U.S. government tries to suppress.  The government gives up and now much private communication is based on trapdoor code-- secure socket transmission, digital  signatures, intranets, firewalls and more.
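A toy illustration of the trapdoor idea, in Python with the standard textbook numbers rather than anything from the book: anyone can lock a message with the published pair (e, n), but unlocking needs d, which is easy to compute only if you know the secret factors of n.

    p, q = 61, 53
    n = p * q                    # 3233, published
    e = 17                       # published
    d = 2753                     # secret: the inverse of e modulo (p-1)*(q-1)

    message = 65
    locked = pow(message, e, n)       # anyone can lock
    unlocked = pow(locked, d, n)      # only the key-holder can unlock
    print(locked, unlocked)           # 2790 65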

But of course everyone is sure the NSA is ahead of us all.

 


-23  Making Documents Hierarchical  (18th Century)

The French *philosophes* and Encyclopedists sought to put all knowledge into sequence and hierarchy, as did Melvil Dewey with his library decimal system.

The Harvard outline, in the 20th Century, has had a powerful influence; people are told that organizing hierarchically is "thinking logically", though it means cutting many connections and associations to select a few.  Many technical people believe this is correct and necessary.  However, essayists work differently.

Software for writing offers no good rearrangement systems (see Chapter 5); instead we get Outline Processors, limited to hierarchical organization.

Since the World Wide Web, hierarchy has been imposed on documents several different ways: the exposed hierarchies of the directories and pages; internal hierarchical markup (HTML, XML); and Cascading Style Sheets.  There will undoubtedly be more.


-22  They All Invented Computers (hardware)  (1822)

A crabby London generalist, a genial physicist with a squad of brilliant youngsters, the Bletchley Park crypto team, a high-school kid collaborating with an engineer he met on CB radio-- all come separately to the concept of a machine that follows a plan, misnamed "computer".

But who should get the prize?  Perhaps the little-known businessman-scientist who is first to create a working programmable digital computer, and first to use binary numbers on a computer-- all under the nose of the Nazis.  (Plus the first to design a computer language.)

 


-21  They All Idealized Computers   (software idealizations)  (1843)

There is another way to invent computers: to see a magical unified ideal.  There are many ways you can think about the computer, which lead to different styles of using it-- which can be enacted in software.

Different people see computers differently from the beginning, and describe their idealized systems.  Abstract visions of the computer are advanced by Lord Byron's daughter, a sexually indiscreet U-boat hunter, a Hungarian from the Manhattan Project, a University of Chicago economist, a lady admiral, a Norwegian politician-puppeteer, not to mention various mathematicians and engineers.

Like the blind men and the elephant, none of them is wrong, but unlike the blind men, some of them get to build the elephant to their own specifications.

All are correct.  Each offers a different kind of computer life.

 


-20   Database  (1880s)

A database is any principled arrangement of information that you can look things up in.  Naturally it was always the dream of both management and hobbyists: we'd use computers to keep track of  everything!  And exchange that information!  Just a small problem of specifics.

Automatic databasing starts with Herman Hollerith's punch cards, in the 1890s, leading to IBM and its 80-column mentality.  Relational Database (now standard) is designed by a feisty Oxford mathematician-RAF pilot who insists on thirteen axioms (called his "12 rules"), which are now universally ignored.  He is appalled by the SQL language which IBM later develops (now standard).  Meanwhile, Object-Oriented databases fit better with today's computer languages-- but nobody can agree on their structure.
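A minimal sketch of the relational idea, using Python's built-in sqlite3 (table and column names invented for illustration): data lives in separate tables, and an SQL query relates them on the fly.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT)")
    db.execute("CREATE TABLE loans (person_id INTEGER, book TEXT)")
    db.execute("INSERT INTO people VALUES (1, 'Ada')")
    db.execute("INSERT INTO loans VALUES (1, 'Dream Machines')")

    rows = db.execute(
        "SELECT people.name, loans.book "
        "FROM people JOIN loans ON loans.person_id = people.id"
    ).fetchall()
    print(rows)    # [('Ada', 'Dream Machines')]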

The real problem is: for databases to be universal and shared, everyone has to agree not just on the structure, but on the categories and terminology (now called Ontologies), for all time.  Uh-huh.  Or trust committees for that.  Uh-huh.  But there's a big German company that will take care of you, promising that all databases in your company VILL work together.

 


-19   Voting Machines  (1892)

The original voting machine of 1892 could be examined by anybody to verify its fair operation.  No more.  Several brands of electronic voting machine have now been widely deployed in the USA, all dubious at best.

Barbara Simons (past president of the ACM) and the Electronic Frontier Foundation keep trying to tell Americans: today's "electronic voting machines" are INTRINSICALLY dishonest, in that no technical verification exists, or possibly can.  To verify the software would take a deep clean-room, white/black-hat operation; to actually verify each unit in the field would take a big team.

Yet these machines are "verified" by clueless certification committees who study how they respond.  But any computer scientist, hacker or teenage programmer can tell you that no amount of studying its external behavior will tell you what it's going to do on election day.  It's really a video game programmed to look like a democratic input device.  In the future we may never know the true vote count, just what some hidden technician programs these machines to report.

 


-18  Intellectual property  (1923)

Intellectual property issues have been waiting for us all along, especially copyright and patent (among many branches of intellectual property law).

Copyright determines the legal right to copy.  Anyone may hold a copyright, including artists, writers and big companies.

Today on the Internet, an us-versus-them attitude has developed, where ripping off copyrighted material from big companies has become a sport (first through Napster, now the ingenious BitTorrent, which is said to occupy a large part of world bandwidth).  The big companies have responded with the Digital Millennium Copyright Act (whose safe-harbor provisions allow ISPs to operate), Digital Rights Management (which is an unworkable mess) and lawsuits against people who least expect it.

Varying licenses (such as Creative Commons) are well-intentioned but do nothing to loosen the grip of major publishers on their content; microsale, an ignored possibility, might fare better.

Meanwhile, Apple's clever iPod system has won big.  Content for the iPod has to be loaded through the iTunes program, which has become a major sales gateway.

 


-17  The Mainframe Era

After developing the ENIAC during the war, Eckert and Mauchly get backing from Remington Rand (under the name Univac).  Seeing this, IBM quickly takes over most of the computer business in the 1950s through its ferocious salesmanship.  IBM arranges for computers to be run by vast teams in big bureaucracies, which they control.  That system lasts some thirty years.

Also in the 1950s, real-time computers with graphics are built for air traffic control and running the nuclear war that doesn't happen.  These developments seep through to the civilian sector, with minicomputers and graphic displays from DEC and others.  But it's a small part of the market.

After the 8-bit byte is established by Bemer as the fundamental unit of information, IBM throws out its old 36-bit machines and brings out the 360 series, announced 1964.  This hugely enlarges the market, which is still dominated by computer centers brainwashed by IBM.  Other manufacturers follow with 8-bit machines.  But none of them imagines what's coming.

 


-16  Computer Sound and Music   (1947)

In 1947 they wire up the ENIAC to let them hear its program counter-- helpful for debugging-- and so computer sound and music begins.  Academic composers make an early start; digital music goes to the public when a high-school kid invents playing from music samples in the 1970s.  'We'll listen to music on computers', predicts a 1974 book; then Philips creates the CD and the special-purpose computer that plays it.  But few realize it's a computer.

Now the world is a Babel of audio formats, and the music industry is trying to prevent their use.  And thousands of guys are building 'prosumer' studios, hoping to be the next Wendy Carlos, but the problem is always distribution-- even with free downloads.

 


-15  Computer Graphics in Two Dimensions  (1950)

They put a CRT on the MIT Whirlwind computer in 1955 and so computer graphics begins.  Immediately you could draw pictures, crudely at first, by program.  But early computer graphics mostly uses character printout-- it's all that most guys have.  (The Defense Department funds computer graphics, but that doesn't make graphics military.  And that money went for a lot of game-playing, too.)

Then comes Sutherland's Sketchpad system for line drawing, which instead of mimicking a fixed-size sheet of paper can expand to the size of a football field; it expands a lot of minds.

As bits become cheaper and color displays arrive, paint systems make the pictures fancier.  As with audio, a Babel of formats appears, and converting these formats becomes a huge issue.  Then two brothers thousands of miles apart-- a movie special-effects guy and an academic-- program a graphics file converter that grows and grows and creates an enormous industry.

 


-14  Computer Games  (1951)

Back in the '50s, computer games are a furtive amusement when you sneak time on the lab machine.  No one imagines the immensity of the industry to come, whose revenue now outstrips Hollywood.

Early text games like Hunt The Wumpus and Zork become MUDs and MOOs and WOOs.  A company called Atari puts out a minimalist game called Pong with minimalist directions (AVOID MISSING BALL FOR HIGH SCORE).  From there it's only a few years to the spectacular explosion of Pac-Man, 1981-- with movie-size revenues-- and that's before networks or realistic graphics.

First-person shooters start with 'Star Trek' in the mid-seventies and now explode visually in blood and phlegm.  Multi-player role games, starting with Dungeons and Dragons (no computers), have become vast industries.  Today's Serious Gamer (oxymoron) is a white guy with no girlfriend and a computer costing ten grand or more.  Go figure.

Now the computer game industry is supposedly bigger than Hollywood.  The University of California at Santa Cruz creates a department of computer games and their enrollment shoots instantly past computer science.

 


-13  Disk Drive Wars   (1956)

IBM creates the RAMAC, a computer with a disk drive, in 1956.  This leads to the issue of how to organize information on disks.

Guys with different views create different kinds of filing and naming conventions.  The program and tables required for each disk idea are called a filesystem.

The main filesystems right now are FAT (Windows, left over from Tim Paterson's Quick and Dirty Operating System); NTFS, better than FAT, created for Windows; HFS, the Macintosh filesystem, which has gotten good; and Ext2, the main filesystem for Linux.  There are many more.  (The three-letter extension often used for file types is also left over from QDOS.)

Different filesystems allow different alphabets and alphabetical orders.  This makes it hard to have your files line up in a given sequence, especially if you use different computers or have drives with more than one filesystem.
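A small Python sketch of the line-up problem: plain byte order (the UNIX habit) puts every capital letter before every lower-case one, while a case-folding sort (the habit of some other filesystems) interleaves them.

    names = ["Zebra.txt", "apple.txt", "Banana.txt"]
    print(sorted(names))                      # ['Banana.txt', 'Zebra.txt', 'apple.txt']
    print(sorted(names, key=str.casefold))    # ['apple.txt', 'Banana.txt', 'Zebra.txt']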

There are few differences among filesystems; most closely follow the Unix model.  And alas, one of the few innovators in filesystem design is now in prison for murder.

 


-12  Engelbart's NLS   (1958)

The story in brief:  A soft-spoken and lovable farm boy, nicknamed in the navy "EagleBeak" for his fierce profile, Engelbart sets out to solve the world's hardest problems with superpowerful collaboration tools.  It is Doug Engelbart who first puts document work on screens and invents multiple windows on a screen, links among texts, shared work on screens, and much more-- including the mouse (which he disdains as a small matter).

Everyone else is taking small steps, but Engelbart thinks big.  Why worry about little problems when it's the big ones that we must face?  This means collaborating on a scale and with a clarity and scope that has never existed before.

Engelbart's fabulous prototype NLS (oN-Line System), demonstrated at the Fall Joint Computer Conference in 1968, wows the computer world.

But that world's attention shifts.  Engelbart's ideas are too radical and his interface too daunting, and in the next decade the computer world goes for fonts on paper instead of collaboration on screens.  Doug's work is overlooked by acclamation.  Worse, his vision of accelerating collaborative power is forgotten or considered impossible.  Doug's vision, like Tesla's, is too sweeping for mortals.  But some believe the time for Engelbart's ideas will come again.

 


-11  Xanadu   (1960)

A young filmmaker-intellectual and friends, envisioning the total replacement of paper by the computer screen, contrive a radical system of side-by-side documents with visible connection, and methods for a new world-wide electronic publishing industry.

Xanadu is not (as often supposed) an attempt to create the World Wide Web; it is a sweeping design for text, audio and video as an endless fabric of strips and streams visibly connected by links and identities of content.  This minimal design intrinsically solves a dozen problems.  Rather than imitating the past (like paper and movies that can only be sequential), Xanadu is a radical generalization of media, work systems and copyright to do everything previously possible and much more that still is not.

For half a century the project expands and contracts, boggled by opposing computer traditions, premature optimization, culture clashes and infighting.

Xanadu is far simpler than the Web, but utterly incompatible because of the Web's iron browser standard, which forbids all Xanadu methods and views.  "We fight on", say Xanadu diehards.


-10  Computer Graphics in Three Dimensions   (1960)

Computer 3D is with us everywhere today-- realistic fake movies, photographs mixed with fantasy elements, gaming and Second Life.  But it took decades to get here.

There are many techniques for presenting it-- the fastest way, used by gamers, is OpenGL (or the Microsoft competitor, DirectX); the slowest is ray-tracing, the ultimate fine-grain method, looking down a virtual soda straw one pixel at a time; the middle way is RenderMan, which uses tricks and heuristics to present huge scenes in seeming detail without ray tracing.

There are many 3D representations.  OpenGL uses triangles stitched in 3D and painted with pictures (oddly called textures).  But there are also spline meshes, constructive solid geometry, metaballs (not meatballs)-- especially suited to fantastic writhing undersea shapes.  And then there's building a creature up from moving bones (inverse kinematics), pioneered by Dennis Muren and the ILM team for "Jurassic Park".
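A minimal sketch (plain Python, no graphics library) of the triangles-stitched-in-3D representation: a shape is nothing but a list of points plus triples of indices into that list.

    vertices = [
        (0.0, 0.0, 0.0),    # 0
        (1.0, 0.0, 0.0),    # 1
        (0.0, 1.0, 0.0),    # 2
        (0.0, 0.0, 1.0),    # 3
    ]
    triangles = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]    # a tetrahedron

    for a, b, c in triangles:
        print(vertices[a], vertices[b], vertices[c])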

BEING THERE.  When interactive 3D arrives, some want to colonize it as "virtual reality" (a term invented by a French theatrical director).  But the moneymakers wonder, how to sell real estate in it?  That's what Second Life has figured out.  (See "Web 2.0".)

3D GAMES are now high voltage (see Games)-- on line, or on today's superboxes (Playstation and Xbox).

3D MOVIES.  In the '60s and '70s people rolled their own 3D systems.  Now lots are available, from zero up to stratospheric prices.  (See Movies.)

 


-9 - The ARPANET - Getting the Message Across   (1962)

A jovial psychologist proposes communication amongst everybody by computer.  The Defense Department (ARPA division) thinks it's a good idea.

A Rand Corporation engineer, grimly worried about nuclear war, is not the only one to invent packet switching.  Packet switching-- in fact developed to keep a thermonuclear exchange from escalating to doomsday, and opposed by telephone veterans-- works.  It triumphs over telephone traditions and networking methods developed by IBM and ISO.  The Pentagon funds it and it barely works, but not all the information arrives across the network.  A Frenchman says, Why not have the endpoints take care of completing the exchange?  The result is TCP/IP, a new way to push-pull information across a tangle of connections.  It barely works, and connects a lot of researchers on their separate networks across a big supernetwork that barely works.
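A toy sketch of the endpoint idea (pure simulation in Python, no real network): the channel in the middle is allowed to lose packets, and the sender simply resends each one until it is acknowledged.

    import random

    def lossy_channel(packet):
        """Pretend network: silently drops about half of what it carries."""
        return packet if random.random() > 0.5 else None

    def reliable_send(packets):
        received = []
        for seq, data in enumerate(packets):
            while True:
                delivered = lossy_channel((seq, data))
                if delivered is not None:        # receiver got it and acknowledges
                    received.append(delivered[1])
                    break
                # no acknowledgment: retransmit and try again
        return received

    print("".join(reliable_send(["all ", "the ", "data ", "arrives"])))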

First it's called the ARPAnet, run with the flavor of military culture.  But the military-- who inspired it in the first place-- sees that it's totally insecure and gives it to the public.  It still barely works.  (It's still totally insecure, but we now call it the Internet.  See below.)

 


-8  Instant Messaging and Texting   (1960s)

Instant messaging begins on early time-sharing systems like CTSS (MIT) and Multics, back in mainframe days.

Chatting and messaging aren't magic.  They use the very same techniques as email, just differently packaged: texts going back and forth without subject lines or message containers.  Chat, chat rooms and SMS are essentially the same.  Packaged as virtual spaces with two users or twenty, the internals are pretty much alike.

(Sometimes chatrooms are enhanced by Chatbots, programs that are set up to reply as if they were people, first done by Joe Weizenbaum's ELIZA at MIT-- Turing-tests for the gullible.)

For no particular reason, instant messaging through cellphones is called something different-- text messaging, SMS, or just texting.  Many young people send dozens of texts a day for vast amounts of money (and vast profit to phone companies).  Various medical authorities say texting is Officially Addictive.  (President-elect Obama is supposedly an addict.)

 


-7  Computer Movies (1963)

As soon as the computer can put one picture on a screen, it can do two, then a series of pictures and thus movies.  That starts in the sixties.

People are highly motivated.  Ken Knowlton at Bell Labs craftily mixes his movie assignments with personal art projects.  John Whitney moves from analog computers to digital for his art.  Jim Blinn naps 24/7 in a cold computer room to maximize his time on the machine.

A lot of people see what's coming.   Hundreds of academics make early movies, but two key institutions-- ILM and what would become Pixar-- lead the drive to full seamless movie realism.

THE BIG TWO.  Industrial Light and Magic, George Lucas' special effects arm, having made the models for the Star Wars series, seamlessly merges computer graphics into real-actor films (most amazingly, "Jurassic Park" and "Pirates of the Caribbean").  What would become Pixar, making all-animated 3D, begins as a department at the New York Institute of Technology.  NYIT expects to take over Hollywood; eventually the group DOES take over Hollywood, and indeed Disney.

Now anyone can make a feature animated film with a thousand dollars in software and  thousands of hours of work.  But of course the problem is always distribution.

 


-6   Shared Texts   (1965)

The Compatible Time-Sharing System at MIT (1965) allowed the sharing of texts.  This grew in many directions: Newsgroups, Forums, Discussion Groups, USENET, Mailing Lists, Bulletin Boards, the Web, Blogs.  In many ways these are all the same thing: someone posts (publishes) a document, others comment on it.

The first publicly available system is Community Memory, firing up in 1973, with Teletypes all over Berkeley.  Usenet, growing out of the Unix-to-Unix Copy protocol (UUCP), fires up in 1979 and offers computer professionals and hangers-on a vast system of forums.  General consumers are allowed in when The Source and CompuServe start offering consumer information services in 1979.  At the same time, the Bulletin Board industry springs up, with many individuals offering free and paid document services by modem from dinky machines in their living rooms.

The World Wide Web, a page storage and transfer method with a standard protocol and client (and free content the default), supersedes most of these things in the 1990s.   Web subsystems like wikis and blogs have also caught on as ways of adding comments, in a profusion of formats.

Meanwhile, Project Gutenberg has been patiently digitizing thousands of texts for free use.  So has Google, but the results are different: the Gutenberg texts are available in digital text form, whereas Google gives us only images.

 


-5  Email  (1965)

Once there is sharing of files and texts, it's easy to put "Dear Charlie" at the beginning of a file and email is inevitable.  The next step is sending it TO someone.  Local email starts quickly, e.g. on CTSS at MIT.  Then the network of networks starts to fire up.

The ARPANET has a variety of email address formats, some requiring that you list all the computers on the way to the destination; Bob Taylor at ARPA insists on unifying these address formats, which leads to the "@" convention.

Wrangling in the Email Committee is smoothed by Dave Crocker, resulting in the present system-- especially the extensible set of fields, From, To, CC, BCC, Priority, Subject, Body.  (A big problem is the Priority field, which people immediately start misusing, inflating the importance of their messages.)
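A minimal sketch of that header-plus-body format, using Python's standard email library (the addresses are invented):

    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "doug@example.org"
    msg["To"] = "ted@example.org"
    msg["Cc"] = "committee@example.org"
    msg["Subject"] = "Field order"
    msg.set_content("Dear Charlie, ...")

    print(msg.as_string())    # headers first, a blank line, then the body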

Most email is now spam, sent out by unwitting Windows users who don't know their machine has been secretly enslaved.   (See Malware.)

"Email is forever" (unless you want to keep it, in which case you're more likely to lose it-- Murphy's Law).  If you like, you can go voyeuring in the Enron email corpus, placed on line after their court case.

 


-4   Hypertext Goes the Wrong Way (1967)

A hypertext project at Brown University leaves a bitter legacy.  A supposed Xanadu implementation turns sour, with the main ideas of hypertext thrown out in an ugly atmosphere.  Simplifying traditionalists dumb the Xanadu concepts down to a one-way link structure with views locked to printout (early WYSIWYG), insisting that screen content be printable-- which reduces hypertext to what fits on paper.

The resulting system, HES, is a clean but vacuous design, with one-way links and text editing constrained to be printable.  But it is stunning at the time, when computer screens still astonish the public.

This design is copied, successively becoming FRESS, Intermedia and NoteCards, and influencing HyperCard.  Now traditional, it becomes the prevailing notion of hypertext, and passes on to become the World Wide Web.  The Xanadu guy has published his regrets about participating.

 


-3  Object-Oriented Programming  (1967)

The Simula language, developed in Norway by Kristen Nygaard (pron. 'nugard') and Ole-Johan Dahl, turns programming inside out.  Instead of structuring the program, the programmer specifies the objects to be dealt with and how they respond to each other.

Since then "OO", as it is now affectionately called, has largely taken over the computer world.  C++, an OO language, is now the main system programming language.  There are many approaches and warring doctrines about OO, including UML, Booch Method, Gang Of Four, Patterns, and more.

The problem is that OO usually forces a hierarchical model, requiring going up and down through layers.  And once the boundaries of objects are laid down, they cannot evolve in different directions.
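A tiny Python sketch (not Simula, but the same idea): instead of one master procedure, you declare kinds of objects and how each responds to a message-- and notice how a hierarchy creeps in from the very first line.

    class Vehicle:
        def describe(self):
            return "some vehicle"

    class Bus(Vehicle):
        def describe(self):
            return "a bus full of passengers"

    class Tram(Vehicle):
        def describe(self):
            return "a tram on rails"

    for thing in [Bus(), Tram()]:
        print(thing.describe())    # each object answers in its own way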


-2   Local Networking   (1970)

(Paradoxically, Local Area Networks (LANs) don't take off till work on the great ARPANET has begun.)  The first reported LAN is the Octopus network, at Lawrence Radiation Laboratory, 1970.

FIGHTS:  There are many different early initiatives and standards for local area networking in the late mainframe, Dinky and early Personal eras.  Some initiatives emphasize the hardware and some emphasize the protocol.  Contenders include StarLAN, AppleTalk, Datapoint ARC (the best), and IBM's clumsy Token Ring (which loses).

Ethernet wins.  Originally it requires honking big cables with exact spacing for different machines, but now it's down to a familiar thinnish wire.

 


-1   Datapoint-- the Personal Computer with a Mainframe Mentality   (1970)

Datapoint straddles the mainframe and dinky eras and might win in personal computing, but they can't imagine it.

THE STORY IN BRIEF:  Harry and Vic, a high school kid and an engineer, meeting on CB radio, design a minimalist programmable chip that they think will make a good input terminal (replacing card-punches).  A company in San Antonio puts a sexy box around it as the Datapoint 2200, with cassette drives to record keystrokes.  But Datapoint doesn't realize it's a computer until a guy in Denmark creates an operating system for it.

Fabulous success follows.  With an excellent local network and robust database search, by 1981 the Datapoint line is the most effective office computer system.  Unfortunately, Datapoint management pays no attention to the well-known developments in the personal computer world or at Xerox PARC.  After the IBM PC hits the world, Datapoint goes straight down.

But on a billion desktops the Datapoint computer lives on, because Harry and Vic's design became today's Intel architecture.  You could in principle run Datapoint programs on your stock PC today.

 


0   UNIX*-- Modern Computer History Begins   (1970)
*What the hell to call it?  Supreme trademark mess.
The universe officially begins on Jan 1, 1970 (when Unix time kicks off-- now the official timing system of most of the computer world).
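Two lines of Python are enough to see it: Unix time is just seconds counted from the start of 1970.

    import datetime, time
    print(datetime.datetime.fromtimestamp(0, datetime.timezone.utc))   # 1970-01-01 00:00:00+00:00
    print(int(time.time()))                                            # seconds elapsed since then, right now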

Two guys at Bell Labs, noting the bloat of the operating system Multics and wanting to play space games, create a file system on a discarded computer with the disapproval of their boss.

It grows and grows.

This system, UNIX, becomes the framework for university computing and the development of much of today's world.  It is the framework for the development of the Internet, and for the style of the Internet's protocols.

It defines file systems and the structures of operating systems to come.  It becomes GNU, Linux and the post-2001 Macintosh.  And it would have been the standard operating system of today's personal-computer world if not for Bell's lawyers.

Clean design, power, effectiveness-- that's UNIX.  But so are the clumsy interface delays, the shortcuts that break, and the one-way links of the Web.

 


1   Malware and Security   (1973)

Nobody imagines in the early days of cooperation (Unix and the ARPANET) just how malicious and predatory the computer world will become.  Attacking the computers of corporations and innocent users from thousands of miles away, for fun and profit, is now a vast industry.  As an analogy, imagine the different ways you could attack people on the street if you were invisible-- from just tickling them for fun, on to bloody murder.  That's the Internet today.  But there is no way to seal up the cracks.  Defense has to be at the endpoints.

The number of possible attack methods (called "exploits") is astronomical and growing daily-- spam, Nigerian, virus, worm, Trojan, phishing, pharming, distributed denial of service, cross-site scripting, zombie botnets, cracking and hacking (a Stanford professor has apologized for the latter word).  Make no mistake: some of the best minds of our time are the worst people, and they may be out to get YOU.

It's best to hire a professional.  But Security is a profession with two sides (White Hat and Black Hat)-- two vast industries, one furtive and profitable, the other expensive.

 


2   The Era of Dinky Computers: Kits and Stunts  (1974)

The dinky era of personal computer kits is a time of wild fanaticism and ferment-- from 1974, when personal computers are announced, to 1977, when they go from kits to reliable prebuilt machines.

The story in brief:  When programmable chips become available, different companies bring out hobby computer kits-- notably the Altair, which makes its debut in 1974.  Many competing and incompatible machines come out-- but they have to be physically assembled from parts, and programs have to be typed in by hand.  Memories are painfully small.  Nevertheless, enthusiasts, feeling themselves on the cusp of the future, fanatically buy kits and magazines.  Warbling cassettes, 20-character screens, exciting new hardware boards fill the air.

But this is a hobby world for determined guys working evenings and Sundays in their basements, whose wives rarely understand the point of what they're doing.  Though they feel they are participating in the great wave of the future, few of the hobbyists will contribute anything to it.

 


3   The PUI Dumbs Down the Computer   (1974)

Nearly all of today's computers wear the PARC User Interface or PUI (often called "the modern GUI").  It's not what people think.

THE STORY IN BRIEF.  The hard-driving tekkies at Xerox Palo Alto Research Center, claiming to "design the future," settle on a simplified future based on the disguising of hierarchical directories with pictures of office folders.

The PUI consists of a "desktop" (vertical, unlike worldly desktops), "folders" (in no way different from directories), a "wastebasket" (in no way different from deletion of a file in UNIX, except delayed), and an icon-language to represent the few operations that are allowed the user.  Also WYSIWYG documents and a "clipboard" (see next chapters).

Xerox can't sell the PUI-- their package is much too expensive, and they don't know how to market it anyway-- but then Apple implements the PUI under the name Macintosh and succeeds big.  Gates then implements the PUI gradually at Microsoft under the name Windows.

The PUI is a figleaf on the standard computer and Unix-inspired operating system, hiding its more obtrusive aspects.  Underneath the PUI are standard file structure, standard file structure operations, standard running of programs.  Little has changed except for what has been taken away.


4   Paperdigm: the PUI Dumbs Down the Document (and Turns the Computer into a Paper Simulator)  (1974)
 

The PARC guys, further claiming to "design the future," settle on a simplified future based on the imitation of paper (Xerox wanted to sell printing machines).  Inspired by a Quaker mediator brandishing a slogan from a transvestite TV comedian, they dumb down the computer to a paper simulator.  This slashes away a universe of possibilities, but it helps Xerox sell printers.

The PARC guys are first to have bit-mapped screens with pretty fonts, which electrifies all the visitors.  They dismiss the Engelbart and Xanadu notions of connection and instead go after appearance.

Instead a new slogan rules: "WYSIWYG", standing for "What you see is what you get".  This turns the computer into a paper simulator.

Project Bravo, Charles Simonyi's WYSIWYG project, goes from PARC to Microsoft and becomes Microsoft Word.  John Warnock's Interpress Project, using an astronomer's language brought into the text field by Cap'n Crunch Draper, creates PostScript and Acrobat-- the perfect simulator of paper under glass.

 


5  The PUI Takes Away Our Ability To Write and Organize (1974)
 

The term "cut and paste", as used by writers and editors for many years, refers to rearrangement of paper manuscripts by actually cutting them and physically rearranging them on desktop or floor.  It is a process of parallel contemplation and rearrangement, where you look at all the parts, move pieces around, put them in temporary nonsequential arrangements, and choose a final sequence.

In 1984 this changes: when the Macintosh comes out, they change the meaning of "cut" to *hide* and the meaning of "paste" to *plug*.  To be fair, many people, not just the PARCies, imagine that the serial process of hiding and plugging contents is THE SAME as the parallel process of rearrangement.

This original parallel rearrangement process is fundamental to writing and revision, especially in large-scale projects.  Because no good rearrangement software exists, it is far harder to organize large projects on the computer than it was with typewriters, before the PUI-- unless you print out and physically cut and paste the old way.

 


6   Personal Computing   (1977)
 

To the existing computer establishment it's unthinkable.  To many dinky-computer hobbyists it's beyond the horizon.  But a dogged few dream a world of personal computing-- not a mainframe, not a kit, but a computer and screen on every desk. (In emotional and intellectual involvement, computing has ALWAYS been personal for its practitioners, but that's another level.)

The big year it happens is 1977.  That's when the Commodore PET and the TRS-80 come out.  But the one that hits the bullseye, the Apple II, is created by Steve'n'Steve (Jobs'n'Woz)-- Woz designs it, Jobs sees the potential and gives it a hard-shell box that looks friendly and appropriate for a home.  Besides a happy opening program called HELLO, it offers games, graphics and four levels of programming.  But what gets it into corporations is a program called VisiCalc (below).

It takes four years for the big gun to roll out.  IBM, reacting gradually, builds a huge piece of iron, big as a refrigerator top, much more expensive and unfriendly than the Apple with four times the size and ten times the weight, but it has the IBM name.  (Instead of HELLO, its startup program is called AUTOEXEC.BAT-- a very bad sign.)  IBM hires Microsoft to build the operating system, and that is the beginning of their end.  Apple brings out other machines and the game is on.

These systems have now changed the world, enslaving people at billions of desks who don't even feel their shackles.


7  The World Wars  (1977): Consumer Operating Systems: Microsoft Versus Apple Versus Everybody
 

To use a computer, you have to have a way to store data and connect programs.  That's what an operating system does.  It is a whole world.  There are two such worlds for personal computers, and they have been at war for some time.

Others have come and gone, and Linux bubbles in the background, but the fight between Microsoft and Apple (and literally, Bill and Steve) has been an enduring soap opera at center stage for thirty years now.  Apple begins when Woz, a chip genius, puts together a minimalist computer to impress the guys at the club, but it is Jobs who sees the consumer potential.  Microsoft begins when Bill, a freshman, steals time on the Harvard computer to build a BASIC interpreter with his pal Paul.  MS-DOS is created by a 22-year-old programmer who doesn't work for Microsoft.

The rest of the story is leverage, accident, intimidation, hoopla, and sheer luck.  The pingpong interaction of Apple and Microsoft has a zoological interest not unlike a fight between male elephants.

They are selling two slightly different but incompatible PUIs, which enthusiasts imagine represent profound philosophical differences.  But the systems' differences are really about the way Jobs and Gates run their empires.  The real difference is: Jobs deeply knows the soul of the user; Gates at least knows what a user IS; and the Linux people haven't a clue.

However, millions of mortal users are now using UNIX without knowing it, since it's inside every new Macintosh, making it far more reliable than Windows.


8  Spreadsheet (1979)
 

The first computer spreadsheet program, VisiCalc (designed by Bricklin, programmed by Frankston), is startlingly simple.  It allows you to fiddle with numbers and see results immediately, in a conceptually simple 2D layout (though with formulas invisible).  It allows a kind of programming that doesn't seem like programming.  It allows you to visualize what's happening in your finances and your company-- to an extent.
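A toy sketch in Python (nothing like VisiCalc's real internals) of the construct: cells hold numbers or formulas over other cells, and changing one number changes every result that depends on it.

    def value(cells, name):
        v = cells[name]
        return v(cells) if callable(v) else v    # a formula cell is just a function of the sheet

    cells = {
        "A1": 120,
        "A2": 80,
        "A3": lambda c: value(c, "A1") + value(c, "A2"),    # a formula cell
        "B1": lambda c: value(c, "A3") * 1.2,               # formulas can chain
    }

    print(value(cells, "A3"), value(cells, "B1"))    # 200 240.0
    cells["A2"] = 100
    print(value(cells, "A3"), value(cells, "B1"))    # 220 264.0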

VisiCalc saves Apple and puts the Apple II in offices.  But a teacher of transcendental meditation adds graphics with a program called Tiny Troll, which then grows into Lotus 1-2-3 and smashes VisiCalc (and saves the IBM PC as VisiCalc had saved Apple).

Microsoft Excel has since become the dominant spreadsheet program.

Unfortunately many spreadsheets are as empty as the Sahara and almost as big.  This is a fundamental problem with the construct.  And it is said that half of all spreadsheet models are wrong.

 


9  The Domain Name System, DNS   (1983)
As ARPANET expands, it gets harder and harder to keep a list on each computer of all the other computers and their actual numeric addresses on the network.

At first ARPANET is at only a few dozen universities and military installations; but as the number of nodes grows, nobody can keep a list of machines any more, so they centralize the listings.  They divide the addresses into different types-- .MIL, .EDU, .GOV, .NET, and, as an afterthought, .COM.  Then, as users are let in from abroad, they add the country codes from a list that's handy (ISO 3166), but .MIL and .GOV still refer to the U.S. military and government respectively.
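A small sketch of the two eras in Python (the old-style host names and numbers here are invented): first a hand-kept table on every machine, now a question put to the domain-name system.

    import socket

    hosts_file_era = {            # the old HOSTS file: one hand-typed line per machine
        "sri-nic":  "10.0.0.73",
        "mit-dm":   "10.1.0.6",
    }
    print(hosts_file_era["mit-dm"])

    print(socket.gethostbyname("example.com"))    # today: ask the resolver instead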

MYTH:  The domain-name system is a U.S. imperialist trick.  No, it's just an arrangement they threw together when the HOSTS file ran out.  Nevertheless, other countries rant about it at world meetings, not seeing they can change the whole system any time.

¿HOW COME?  How did Catalonia, the Bohemian/anarchist province of Spain, get its own domain as if it were a country (.CAT)?

OPPORTUNITY:  You can get your own top-level domain!  (Like, .BASQUE!)  It will set you back $180,000.

 


10   Open Source (and Linux)  (1983)
 

A proud, combative idealist-- a top programmer living a Spartan life-- announces that software should be free, and that he, personally, is going to duplicate UNIX in a free version.  All by himself.  Yeah, sure.  Nobody's ever written such a big program single-handed.  (This is not early UNIX we're talking about.)

But he does it.  Richard Stallman builds his own version of UNIX (called GNU), which takes off (but only runs on non-consumer machines).  In the process, Stallman founds the Open Source movement and creates legal history with the unprecedented powers (legal and psychological) of his software license.  The Open Source movement and methods, with Linux as their figurehead, become a fierce new engine of software development, harnessing the volunteer labor of many thousands and bringing surprising reliability to the development process.

And Stallman's GNU is at the apex, taking over much of the Internet-- but under the name Linux, because  of the one part Stallman left out.

 


11  The Internet-- Enjoy It While You Can   (1989)

The ARPANET is renamed the Internet and opened to the general public.  ISPs and long-line providers come into the game.

Now the Internet is an ocean of communication. By lowering the price of content transfer to nearly zero, the Internet has revolutionized the world.  Millions, now billions of users join up.  Previously unimaginable opportunities, problems and dangers infest our lives.  Many lament the "digital divide" which sets apart those who do not yet have these opportunities, problems and dangers.

All the oldest issues of freedom and tyranny, privacy and speech and press and legal rights of every kind, are thrown onto this stage to fight and fight over.

The Internet has no head or center.  No one owns it.  Its use cannot be controlled.  Its technical and political aspects are overseen by various committees but they can only recommend-- anyone with a server on the Internet can choose to do things differently.  And anyone who sees a new opportunity for mischief or crime can take advantage of it, sending packets to probe, attack and steal.

Governments hate the freedom it gives to citizens.  But the Internet may be only our brief moment of freedom, like Periclean Athens; many things may bring it down.

 


12  The Simple Early Web  (1989)
The story in brief:  An Oxford-educated idealist at a nuclear research facility creates a system for exchanging writings among physicists, a page distribution scheme with a format, one-way pointers and a simple protocol.  It crudely equates a document with a place, an address on the ARPANET.  This curious hybrid structure takes off.
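A minimal sketch of how simple that protocol is (plain Python sockets, fetching example.com just to show the shape of the exchange): open a connection, ask for a page by name, read back text.

    import socket

    s = socket.create_connection(("example.com", 80))
    s.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = b""
    while chunk := s.recv(4096):
        reply += chunk
    s.close()
    print(reply.decode(errors="replace")[:200])    # a status line, some headers, then the page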

The World Wide Web is like karaoke-- anyone can do it.  But its rough and beguiling simplicity pushes dozens of problems into the laps of users and creates a maintenance nightmare, resulting in the Content Management industry and millions of broken links.  Its pages simulate paper; and as with the WYSIWYG pages of the PUI, there is no way to annotate or overlay these pages.

Talked out of a clumsier name, Berners-Lee says, "okay, let's call it the World Wide Web".  Little does he (or anyone) know.

 


13  PUI on the Internet-- the Browser Salad (1992)
 

In a sense the real creators of the Web are two students at the University of Illinois.  Berners-Lee's audaciously-named World Wide Web page distribution system-- files, protocol and one-way connections-- catches the eye of students Marc Andreessen and Eric Bina, who see great possibilities for the Web system and colonize it.

THE FRAMEUP:  They design a frame for it, the NCSA Web browser (now just called "the browser"), a frame for windowing and decorating Web pages.  In other words, it is the PARC User Interface (PUI) on the Internet, but with one-way links in addition (as are traditional from the Unix filesystem).

WHY'D IT CATCH ON?  Most important for its catching on, they add pictures.  And an easy interface.  And formatting.  And bookmarks.  And cookies.  And an editor for simple pages.  And the visible URL line.  The boys expansively design a whole way of life, little knowing how many it will ensnare.

The browser catches on like wildfire and use grows geometrically.  It hits the public in 1994.  It starts hosting major commerce around 1996.  It reaches a million users, then a billion, and is still growing.

Others rush to put more kinds of interaction into the browser, which becomes Coney Island-- tarted up with JavaScript, Cascading Style Sheets, Flash, Ajax, streaming audio and video.  But all of them are locked to a paper-like rectangle.  It's still the PARC model of paper under glass, but with interactive acrobatics inside the rectangular page and one-way links between the pages.

The browser defines what is possible on the Web-- and what is impossible.


 

14   Cyberfashion  (1993)
 

Under every new regime there is a new social and fashion elite.

THE BARLOW DECREE.  The founding manifesto of the Internet fashion elite comes in 1996, when John Perry Barlow demands that network users be exempted from all terrestrial laws.  His eloquent if baffling plea makes it into a lot of textbooks.

The fashion center of today's computer world is Wired magazine, a garish monthly designed to make readers feel they're at the center of the action.  Like Apple's products, Wired is as much a fashion statement as a substantive product, to be worn by those who want to be With It.

John Brockman, a literary agent, is the shadowy figure at the top of the cyberfashion food chain.  Brockman tells book publishers (slightly clueless) who and what matters.  The current decade's books bear the Brockman imprint, which is close to that of Wired.

Internet fashion statements are being made worldwide.  The most charming is the Chinese couple who try to name their child "@".


15  The URL Rejiggers Net Addresses (1994)

To fit his World Wide Web distribution scheme, Berners-Lee creates the URL (a uniform way of addressing anything on the Internet).  The idea is to reconcile all the different file systems and reference methods and reduce them all to one standard notation.

This makes the network uniformly traversable without special cases, hiding the variety of filesystems.

That said, the URL is possibly the most user-hostile item that ordinary users ever see; the tangled complexity of that single line in the browser is a model of the difficulties of all the computer world.
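A small sketch of what is packed into that single hostile line, using Python's urllib to pull an invented example apart:

    from urllib.parse import urlsplit

    parts = urlsplit("http://www.example.com:8080/fish/chips.html?sauce=tartar#middle")
    print(parts.scheme, parts.hostname, parts.port, parts.path, parts.query, parts.fragment)
    # http www.example.com 8080 /fish/chips.html sauce=tartar middle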

 


16  Web Biz:  The Dot-Com World and the New Monopolies (1995)
 

Few expect what will happen when business starts up on the Web.  In the days of a few professors exchanging files by ARPANET, with the .COM domain (for commerce) an afterthought, nobody imagines the business expansion of the Internet that the World Wide Web will bring.

Public awareness of the Web begins on a large scale about 1994.  Shop windows appear, secure payment systems begin, 'shopping cart' conventions build up, and people begin actually buying things.  Smashing old industries on every side, threatening every form of commerce and business and providing startling new opportunities, the new way of doing business confuses everyone into a variety of sometimes-idiotic strategies.  To do nothing seems worse than doing something, anything.

Then comes the dot-com crash.  The world picks itself up again and now the dot-com cycle is repeating.

Meanwhile, the old order changeth: big old companies lie by the wayside, and we see the new monopolies-- Adobe, Amazon, Ebay (which began by auctioning a broken laserpointer), PayPal, Wikipedia, CraigsList, BitTorrent, Google, YouTube, Lulu.com.  These companies are like nothing that ever happened before, each with its own odd story.


17  Streaming Goes Private (1995)
 

Streaming is not magic.  It uses the same techniques of transmission as email and file transfer, differently treated.  (It's considered part of "the Web" now because it uses Web pages as portals, though its protocols are as different from the Web's as email's.)

Based on the Unix and Internet cooperative spirit, there used to be a great dream of standard formats for streaming audio and video, including VOIP, Voice Over Internet Protocol.

But now all of that is swept away: the open standards are diminishing to niche status, pushed aside by private methods, and monopolies have moved in.  What we see now are mostly the special methods of RealNetworks, Apple QuickTime, Skype and YouTube.

 


18  Google (1996)
 

Two clever and careful Stanford students take over the world one step at a time, but moving audaciously from the beginning.  First with a super search engine.  Then they figure out how to make money, getting the trick from another company.  Then they go public their own way (kind of), for apocalyptic profit.

Google now has the largest computer-- i.e., unified computer system-- in the world (thought to be well over a million processors), with the most reliable operating system in the world-- a distributed parallel Linux that never stops, even as disk drives die constantly.  (Hey, if something's missing, how would you know?)

ORGANIZATION.  Google is a very flat company (no new project is vital).  It's very tough to get hired, requiring many interviews.  Employees make bargains with their bosses on their proposed projects, and then better deliver.  The company is tightly run by two software veterans, Eric Schmidt and Peter Norvig; the founders get to jet around and hold pep rallies for the employees.

Google threatens every content industry, publishing industry and library industry.  "This is a wake-up call," says an executive at the Bibliothèque de France.  "We thought we could take much longer to get around to it."

 


19  The World Wars Go Mobile-- PDAs, Cellphones
 

Mobile phones (called cellphones in the USA) are originally just telephones (though designed around wiretapping).

Then manufacturers and services realize they can profitably offer more.  For instance, answering services and changeable ringtones.  Cellphones begin getting fancy, with complicated buttons, chambers and services no one over forty can understand.  Text messaging (see elsewhere) especially divides the world on age levels and is vastly profitable to the phone companies.

Who thinks up the camera phone as a product?  Nobody.  A Frenchman wanting to send out pictures of his baby daughter jury-rigs a camera to his mobile phone, and the camera phone is born.  There are now a billion out there.

A NEW WORLD, NOT THE DESKTOP.  As digital capabilities get smaller, the mobile phone can in principle do everything your "computer" (more traditional packaging) can do.  How to design such a new world?  And how to sell it, how to persuade the public to use it?

Steve Jobs knows (with the iPhone).  Google THINKS they know (with Android).

 


20  Web 2.0-- Walled Gardens, Cattle Pens, Collaboration Places Sort Of
 

"Web 2.0" and "social media" are journalists' phrases to lump together a bunch of stuff on the Net, made to seem oh so new.  They generally refer to Facebook, MySpace, Second Life and Wikipedia (and ever so many other wannabes).  These services, like so many others on the net, are contrived to capture customers.

But what's so new about social media?  Email was always a social medium!  Facebook, MySpace and their imitators use essentially the same internal techniques as email and the Web, just different packaging.

Journalists lump Wikipedia with Facebook and MySpace but it's quite different.  Yes, anyone can edit it-- temporarily, but not really.  Your "edit" might just as well be put in a submission window, because it will be scrutinized and judged by the REAL editors.  (The internal politics are fierce.)

BLOGS.  The term "blog", short for "weblog", is a loaded term.  It refers to the articles and columns that people self-publish on the net.  In the current fashion, a blog "posting" (publication) may be followed by random comments by any number of readers, some slovenly and illiterate, some elegantly written.  There is currently no principled method for controlling this kind of forum.

 


[Post-historical chapters not included in summary]
 

=30=