Chapter 17: A Future So Bright You'll Need to Wear Sunglasses
by Philip Greenspun, part of Philip and Alex's Guide to Web Publishing
The most successful Internet punditry is a lot like Rabkin's survey of science fiction. University labs got corporate money in the 1980s to invent virtual reality. Magazines and newspapers got advertising dollars in the 1990s to tell the public about the amazing new development of virtual reality. All of this was greatly facilitated by Ivan Sutherland and Bob Sproull. They placed an array of sensors in a ceiling to track the user's head position and attitude. With a head-mounted display and real-time information about the user's position, Sutherland and Sproull were able to place synthetic chairs in the room (Sutherland joked that the ultimate computer display would let the user sit down in the chair). They built a completely functioning virtual reality system, using government research funds, and published the results to the world with papers, photographs, and movie demonstrations. The year? 1966.
Note: You can read more about Ivan Sutherland at http://www.sun.com/960710/feature3/.
I did a careful study of book and magazine Internet punditry, graphing the author's wealth and fame versus the novelty of the ideas presented. Based on this research, here are my predictions for the future:
Can we learn anything general from my results? Absolutely. Armies of hardware engineers will work anonymously in cubicles like slaves for 30 years so that the powerful computers used by pioneers in the 1960s will be affordable to everyone. Then in the 1990s rich people and companies will use their PR staffs to take credit for the innovations of the pioneers in the 1960s, without even having the grace to thank the hardware geeks who made it possible for them to steal credit in the first place. Finally, the media will elect a few official pundits who are (a) familiar enough with the 1960s innovations to predict next year's Cyberlandscape for the AOL crowd, but (b) not so familiar with the history of these innovations that they sound unconvincing when crediting them to the rich people and companies of the 1990s.
Where does that leave me? I'm not one of the pioneers of the 1960s; I was born in 1963. I'm not a rich person of the 1990s; I forgot to get rich during the Great Internet Boom. I'm not an official pundit, except once for an Italian newsweekly (see http://philip.greenspun.com/personal/ for the full story); I guess I must have done a bad job for those Italians.
I may be a failure, but they can't take away my aspirations. There isn't much point in aspiring to be a pioneer of the 1960s. The '60s are over, even if some folks in my hometown (the People's Republic of Cambridge, Massachusetts) haven't admitted it. There isn't much point in my aspiring to be an official real dead-trees media pundit. My friends would only laugh at me if I started writing for Wired magazine. However, being a rich person of the 1990s has a certain indefinable appeal for me. Perhaps this comment I made in http://philip.greenspun.com/materialism/ sums it up: "Not being a materialist in the U.S. is kind of like not appreciating opera if you live in Milan or art if you live in Paris. We support materialism better than any other culture. Because retailing and distribution are so efficient here, stuff is cheaper than anywhere else in the world. And then we have huge houses in which to archive our stuff."
Materialism is definitely more fun when one is rich. How to get there, though? Conventional wisdom in Italy has it that "There are three ways to make money. You can inherit it. You can marry it. You can steal it." Based on my survey of the computer industry, the third strategy seems to be the most successful. With that in mind, I'm going to present the following ideas (mostly stolen from smarter people):
What people need and, with the ubiquitous Internet, can finally get, are collaborative Web-based applications. Web-based apps let people use computers without becoming mired in system administration. Web-based apps help people collaborate. Web-based apps can weave an individual's contribution into a larger work produced by many people over the decades.
The future is WimpyPoint, not PowerPoint.
If Web-based apps are so great, why aren't we all using them now? Desktop apps serve one user at a time and tend to be copies of systems from the '60s and '70s. Web-based apps serve thousands of users simultaneously and oftentimes are based on completely new service ideas. Thus Web-based apps require programmers with great skill, imagination, and taste.
Are we moving into an era in which wizards will be overpaid or underpaid? Here's a thought experiment:
You're managing the information system for the British gas pipeline company. Thanks to Margaret Thatcher, customers get to buy their gas from their choice among 20 vendors. However, since each household only has one physical gas line, these purchasing choices are really an accounting fiction maintained in a relational database. How big a database? The pipeline company divides the country into eight regions, each of which is supported by an HP Unix box running a 400 GB Oracle database. If a sloppy programmer leaves out an AND clause in a SQL statement that does JOINs against a 400 GB database, query time will rise from 1/10th of a second to one million seconds and the entire server may be effectively frozen while the query is executing. If an inexperienced programmer can't elegantly solve the little logic puzzles that writing SQL presents, your system may be delivering the correct answers but perhaps by chewing up 1,000 times as many computing resources as necessary. How much are you willing to pay for a programmer who is guaranteed to be careful and experienced?

Here's a real-life example drawn from this installation: Reports from the 400 GB Oracle databases were being generated every night and FTP'd out to regional offices. A good programmer spent a few weeks writing a Web application to deliver these reports only on demand. How much did this guy save the company? Let's just say that they'd previously been FTP'ing 180,000 reports every night.
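To see how one missing join condition produces the explosion described above, here is a toy reconstruction using SQLite on a miniature version of the problem. All table and column names are invented for illustration; the real system was Oracle, not SQLite, but the arithmetic is the same: leave out the join clause and the database computes a Cartesian product.

```python
import sqlite3

# Hypothetical miniature version of the pipeline company's schema;
# table and column names are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE households (household_id INTEGER, region INTEGER)")
db.execute("CREATE TABLE gas_accounts (household_id INTEGER, vendor TEXT)")
db.executemany("INSERT INTO households VALUES (?, ?)",
               [(i, i % 8) for i in range(1000)])
db.executemany("INSERT INTO gas_accounts VALUES (?, ?)",
               [(i, "vendor_%d" % (i % 20)) for i in range(1000)])

# Careful query: the WHERE clause joins the two tables on household_id.
correct = db.execute(
    "SELECT COUNT(*) FROM households h, gas_accounts a "
    "WHERE h.household_id = a.household_id").fetchone()[0]

# Sloppy query: the join condition was left out, so every household row
# is paired with every account row -- a Cartesian product.
sloppy = db.execute(
    "SELECT COUNT(*) FROM households h, gas_accounts a").fetchone()[0]

print(correct)  # 1000
print(sloppy)   # 1000000
```

With 1,000 rows per table the sloppy query already examines a million row pairs; scale the tables up to a 400 GB database and the query runs for days while holding the server hostage.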
Modern computing systems, which generally incorporate massive relational databases, greatly amplify the benefits delivered by good programmers and the mistakes made by bad ones.
The compressed development schedules in the Web/db world also contribute to high programmer salaries. Fortune 500 customers come to my little consulting company (arsdigita.com) with absurdly ambitious projects. They want to take an entire existing business with thousands of customers and make the whole thing Web-based. In three months. Having always sold to retailers, they now want to sell product direct to consumers. Starting in two and a half months. The inherently conservative IT departments of these companies would laugh these guys out of the glass room. Three months? That's not enough time to call up Andersen Consulting and pay for them to write a project plan, much less write any code, do any testing, or install a production system. The only programmers who can do something in three months are programmers who've built a substantially similar system before.
Suppose that you're a business executive with an important new idea. The Web/db application has to be live in three months. Your internal IT people have refused to touch the project. If you picked up one of those "business secrets from Microsoft" books at the airport, you'd hire a bunch of kids fresh out of college, let them spend a few years working through their mistakes, and be a little late to a market in which you already had a monopoly. Sadly, however, you realize that nobody has a monopoly in the Web service world. If you don't go live soon, someone else will and then you'll have to spend millions of dollars advertising your way into users' thoughts. The answer? Find the best and most experienced Web service developers that you can and pay them whatever they ask. What if you can get 500 competent programmers in Bangalore for the same price? That's great, but it will take you at least three months just to develop a management plan to yoke those 500 programmers together. Save them for a three-year project.
The above picture may seem a bit bleak for the manager, who won't enjoy scrambling to find a competent and available Web/db developer. Nor will the manager enjoy paying programmers $250 an hour. However, think about it from the programmer's point of view. When I graduated from MIT in 1982, my classmates and I had but one choice if we wanted to get an idea to market: Join a big organization. When products, even software, needed to be distributed physically, you needed people to design packaging, write and mail brochures, set up an assembly line, fill shelves in a warehouse, fulfill customer orders, etc. We went to work for big companies like IBM and Hewlett-Packard. Our first rude surprise was learning that even the best engineers earned a pittance compared with senior management. Moreover, because of the vast resources that were needed to turn a working design into an on-sale product, most finished designs never made it to market. "My project was killed" was the familiar refrain among Route 128 and Silicon Valley engineers in 1982.
How does the Web/db world circa 1998 look to a programmer? If Joe Programmer can grab an IP address on a computer already running a well-maintained relational database, he can build an interesting service in a matter of weeks. By himself. If built for fun, this service can be delivered free to the entire Internet at minimal cost. If built for a customer, this service can be launched without further effort. Either way, there is only a brief period of several weeks during which a project can be killed. That won't stop the site from being killed months or years down the road, but very seldom will a Web programmer build something that never sees the light of day (during my entire career of Web/db application development, 1994-1998, I have never wasted time on an application that failed to reach the public).
So is this the Golden Age for programmers? We work at a high level of abstraction and rely on powerful subsystems (e.g., Oracle, the Internet, and ubiquitous browsers) and therefore get a lot done in brief periods. Companies want things done within weeks rather than years and hence are willing to pay huge bucks to programmers with good track records. Development schedules are so short and it is so easy to release a Web application to the world that very rarely do we work hard to develop something that doesn't get released.
All of the foregoing is true and yet what makes this the Golden Age for programmers is that Web publishers are often willing to distribute source code. Deeply ingrained in the culture of engineering companies is that the way to get rich is to keep technology from falling into the hands of competitors. You make your engineers sign over all their intellectual property rights and you sit on that intellectual property, even if you can't find a use for it. The goal is to achieve Bill Gates-style world domination where nobody else can think of competing with you. Publishing doesn't work this way. Companies explicitly recognize that their main assets are name recognition, readers, graphics, etc. Publishers don't realistically think that they can get 100 percent of the book market, 100 percent of the movie market, 100 percent of the magazine readers, or 100 percent of the TV viewing audience. They can give away all of their source code to a publisher with a slightly different demographic and not be any worse off. As George Bush said to Michael Dukakis, "A short man never got any taller by sawing the legs off of a tall man".
When people share source code, it means that programmers spend more time doing the interesting work of solving new problems and less time doing the uninteresting work of reverse-engineering a solution to an old problem. Source code sharing also means that programmers have more chances to become personally recognized for their achievement, rather than their corporate bosses getting the credit.
Does that mean all software will become free? No. But I predict a continued growth in the popularity and power of free software for Web applications. Later in this chapter we look at whether non-free software could be sold in a way that is less destructive for users and software developers.
"Those weren't small companies," he replied. "In 1900, a company with 100 employees would have been considered large. Without the telephone, it wasn't really practical to manage a larger organization."
People are bad at extrapolating from the early years of a technology. Edison thought the phonograph would be used for recording business dictation, not the Bee Gees singing Saturday Night Fever. In July 1998, I heard a computer science professor say "the Web is really good for distributing papers but I don't see it being used for collaboration."
Suppose Professor Forwardthinker is wrong and we are able to engineer truly great Web-based collaboration tools. At that point, it might become feasible to drop some of the assumptions about corporate and project management. Here are some of the things that shape current corporate structures:
How does that change the work that Web technologists must do? Right now we tend to see projects in terms of Internet or intranet. People say "We have some really great applications, but we can't show them to you because they're behind the firewall." If work gets organized more along project lines than corporation lines, this kind of thinking becomes a serious impediment to progress.
For example, in 1998 if people from Companies A, B, and C need to work together, they'd expect to be able to call up the phone company and ask it to set up a conference call in 15 minutes. In 2018, it is possible that cross-company collaboration will be far more prevalent. In that case, people will expect to be able to ask the phone company to set up a Web-based collaboration environment, in 15 minutes.
Absolutely.
If you think about it a bit more, it begins to seem absurd even without the box. Users are out of the system administration business. Why should they have to consciously pay for, download, and install software products?
If you think about it yet still more, it comes to seem absurd even in the primitive world of the 1990s.
It is the way that software is sold that keeps software technology mired in the 1950s. We put it into packages and sell it like tables or chairs because we have a highly efficient distribution and retail system for tables and chairs and because we've been buying things like tables and chairs for centuries. It would all work out beautifully for everyone if only tables and chairs needed periodic upgrades, if tables and chairs required documentation and support, if tables and chairs could be downloaded over networks, if users developed significant investments in the interfaces of tables and chairs, and if it cost $30 million to develop a slightly better table or chair from scratch.
Look at the choices that current software pricing forces people to make.
Suppose that Johnny buys PhotoShop. Adobe gets $500 and is happy. Johnny gets manuals and support and he's working efficiently. Johnny doesn't have to drive anywhere, so society doesn't suffer from increased pollution and traffic congestion. Unfortunately, probably not too many people would pay $500 for software that they're only going to use for a day or two. Also, when Johnny next wants to use the software, he'll probably find that the version he has no longer runs with Apple's new operating system, or that Apple has gone belly-up and his version doesn't run on his new Linux machine, or that the instructor wants him to use a program feature that is only in the latest version of PhotoShop.
Let's be realistic. Johnny probably isn't going to buy PhotoShop. He's going to steal it from Adobe by borrowing the CD-ROM from his friend's friend. He'll spend his $500 on a spring break trip to Florida. Unfortunately for Johnny, PhotoShop is almost impossible to use without the manuals. Johnny drives to the bookstore and spends $30 on an "I stole the program and now I need a book on how to use it" book. Johnny wastes some time; Adobe gets no money; society has to breathe Johnny's exhaust fumes and wait behind his jalopy at intersections.
If Johnny is remarkably honest, he may go to Kinko's and rent a Macintosh running PhotoShop. This is great, except that the network was supposed to free users from having to physically move themselves around. Johnny is inconvenienced and society is inconvenienced by the externalities of his driving.
Let's revisit the same people under the new model . . .
Amanda still wants her users to be able to employ the familiar Lotus user interface. It is now in Lotus's interest to tell other programmers how to call their user interface code. Because licensing consortium revenues are apportioned according to usage, every time a Lotus menu is displayed or command is run, Lotus is going to get some extra money from the consortium. Amanda's company will get paid when her new spreadsheet core program is executing. Lotus and Amanda's company are sharing revenue and it is in both of their interests to make the user productive.
Adobe sponsors conferences to help other software developers call PhotoShop's internals. Adobe will not file any look-and-feel lawsuits because they're getting paid every time someone uses their user interface code.
My personal prediction is that two kinds of consortia would emerge in this New World. One kind would cater to business. Users would pay $x per year and get the old familiar software. Consortia catering to home users, however, would offer a $0 per year deal: You can use any software you want, but we're going to replace those startup screens and hourglasses with ads for McDonald's and Coke. Ask for a spell check in your word processor? While it is loading, an ad for Rolaids will ask you how you spell relief. Ask PhotoShop to sharpen a big PhotoCD scan? That thermometer progress bar will be buried underneath an ad reminding you how sharp you'd feel if you were dressed from head to toe in L.L. Bean clothing.
However, renting software would not solve the deeper problem created by software developers standing on each other's toes rather than each other's shoulders.
Web publishers will need to operate in a world where the typical client is not a desktop PC running a manually-installed browser. That doesn't have many implications for service design as long as most of the clients still have the familiar keyboard, mouse, and monitor. Oops.
Do I believe in this explosion of Internetworking because I'm a technology optimist? Have I decided to write for Wired magazine after all? No. I believe this because I've become a technology pessimist.
An engineer's age is thus determinative of his or her attitude toward home networking. Young engineers think that we'll have home appliance networking because it will make life easier for consumers. Gerry Sussman, my former advisor at MIT, is a bit grizzled and probably wouldn't argue with my characterization of him as an old engineer. Gerry loves to pull a huge N ("Navy") RF connector out of his desk drawer to show students how it can be mated with the small BNC ("Bayonet Navy Connector") for expediency. "These were both designed during World War II," Gerry will say. "You don't get strain relief but it makes a perfectly good contact in an emergency. The guys who designed these connectors were brilliant. On the other hand, there has been a commission meeting in Europe for 15 years trying to come up with a common power-plug standard."
The problems of home appliance networking are human and business problems, not technical problems. There is no reason why a Sony CD player shouldn't have been able to communicate intelligently with a Pioneer receiver 10 years ago. Both machines contain computers. How come when you hit "Play" on the CD player, the receiver doesn't turn itself on and switch its input to CD?
Why can't a Nikon camera talk to Minolta's wireless flash system? Or, for that matter, why can't this year's Nikon camera talk intelligently to last year's Nikon flash?
Computer engineers are confused into thinking that companies care about interoperability. In fact, the inherently monopolistic computer industry was dragged kicking and screaming toward interoperability by the United States federal government, the one buyer large enough to insist on it. Many of the standards in the computer industry are due to federal funding or conditions in government purchasing contracts. Buyers of home appliances are too disorganized to insist on standards. General Electric's appliance division, the market leader in the United States, isn't even a sponsor of the Consumer Electronics Bus consortium. IBM is. AT&T Bell Labs is. Hewlett-Packard is.
Does this mean you have to figure out how to fry an egg on your PC or telephone before you'll have a really smart house? No. As I have hinted, I think that companies such as GE will start to put Internet interfaces into their appliances as soon as about 20 percent of American households are wired for full-time Internet, for example with cable modems (see Chapter 6). But they won't do it because they think it is cool for your GE fridge to talk to your Whirlpool dishwasher. They'll do it because it will cut the cost of tech support for them. Instead of paying someone to wait on the 800 line while you poke around with your head underneath the fridge looking for the serial number, they'll want to ping your fridge across the Internet and find out the model, its current temperature, and whether there are any compressor failures.
So I watched as the sites I'd built for big publishers got tarted up with imagemaps and tables and frames and flashing GIFs and applets. If it looks okay in Netscape Navigator on a Mac or a PC, then ship it. Don't even worry whether it is legal HTML or not. Then one day WebTV came out. Suddenly there was a flurry of e-mail on the group mailing lists. How to redesign their sites to be compatible with WebTV? I had to fight the urge to reply, "I looked at my personal site on a WebTV the other day; it looked fine."
WebTV was a big shock to a lot of publishers. Yet WebTV is much more computer-like than any of the other household appliances that consumers will be connecting to the Internet. Be ready: Focus on content. Standard HTML plus semantic tags can make your content useful to household devices with very primitive user interface capabilities.
This chapter started off by saying that average users shouldn't be forced to maintain computer systems. Then we threw rocks at the 100 MB desktop apps that people buy in computer stores. Then we looked at whether people might not just browse Web content from their stoves and microwave ovens. Does that mean that corporate information systems staffers will be attacking the challenge of reformatting Web content to fit 8x80-character displays? No. They will be busy attacking the challenge of meaningfully communicating with other companies.
Acme is also a modern company. They have an integrated order entry, inventory, and billing system backed by an RDBMS. As soon as the order goes into the system, it sets into motion a synchronized chain of events in the factory.
How does the data for the 2,500-widget order get from Spacely to Acme? Each decade had its defining technology:
What stops Spacely's computer from talking directly to Acme's?
In the old (pre-Internet) days, we would say that it was the impossibility of getting the bits from Acme's glass room to Spacely's. Now we have an Internet and any computer in the world can talk to any other. But sadly it turns out that they have nothing to say.
Security is the first thing that worries most companies when they hook critical systems to the Internet. Can we be sure that Spacely's computer won't attempt any naughty transactions on Acme's computer? For example, if Spacely had full access to Acme's RDBMS, it could mark lots of invoices as having been paid. The issue of security is an anthill, however, compared to the craggy mountain of data model incompatibility.
Column names may be different. Acme's programmers choose "part_number" and Spacely's use "partnum". To humans they look the same, but to the computer they might as well be completely different. Worse yet are differences in the meaning of what is in that column. Acme has a different part number for the same widget than does Spacely. Nor need there be a one-to-one mapping between columns. Suppose Spacely's data model uses a single text field for shipping address and Acme's breaks up the address into line_1, line_2, city, state, postal_code, and country_code columns? Nor finally need there be a one-to-one mapping between tables. Spacely could spread an order across multiple tables. An order wouldn't contain an address at all, just a factory ID. You'd have to JOIN with the factories table if you wanted to print out one order with a meaningful shipping address. Acme might just have one wide table with some duplication of data. Multiple orders to the same factory would just contain duplicate copies of the factory address.
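The glue code that one of these company-to-company links implies might look like the following sketch. Every record layout, part number, and address format here is invented for illustration, and a real converter would need far more error handling, but it shows the three kinds of mismatch just described: renamed columns, remapped part numbers, and a single text address split into structured fields.

```python
# Hypothetical order record in Spacely's format; all field names and
# values are invented for illustration.
spacely_order = {
    "partnum": "W-3341",
    "qty": 2500,
    "ship_to": "Spacely Sprockets, 1 Orbit City Plaza, Orbit City, OC 99999, US",
}

# Acme has a different part number for the same widget, so the converter
# needs a translation table maintained by hand.
PART_NUMBER_MAP = {"W-3341": "ACME-0017"}

def spacely_to_acme(order):
    """One of the n*(n-1) unidirectional converters: Spacely -> Acme."""
    # Split Spacely's single text field into Acme's structured columns.
    line_1, line_2, city, state_zip, country = [
        s.strip() for s in order["ship_to"].split(",")]
    state, postal_code = state_zip.split()
    return {
        "part_number": PART_NUMBER_MAP[order["partnum"]],  # renamed & remapped
        "quantity": order["qty"],
        "line_1": line_1,
        "line_2": line_2,
        "city": city,
        "state": state,
        "postal_code": postal_code,
        "country_code": country,
    }

acme_order = spacely_to_acme(spacely_order)
print(acme_order["part_number"])  # ACME-0017
```

Note that even this toy converter embeds a hand-maintained part-number mapping and a guess about address punctuation; multiply that fragility by every pair of trading partners and the scale of the problem becomes clear.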
We could fix this problem the way GM did. Go over to Germany and buy some data models from SAP (www.sap.com). Then make every division of the company use these data models and the same part numbers for the same screws. Total cost? About $1 billion. A smart investment? How can you doubt GM? This is the company that spent $5 billion on robots at a time when they could have purchased all of Toyota for about the same sum. Anyway, the bureaucrats at MIT were so fattened by undergrads paying $23,000 a year and so impressed by GM's smart move that they bought SAP data models, too. My advisor was skeptical that data models designed for a factory would work at a university. "Sure they will," I said, "You just have to think of each major as an assembly line. You're probably being modeled as a painting robot."
Was my faith in SAP shaken when, two calendar years and 40 person-years into the installation process, MIT still wasn't really up and running? Absolutely not. SAP is the best thing that ever happened to computer people. It appeals to businesses that are too stupid to understand and model their own processes but too rich to simply continue relying on secretaries and file cabinets. So they want to buy SAP or a set of data models from one of SAP's competitors. But since they can't understand their business processes well enough to model them themselves, they aren't able to figure out which product is the best match for those processes. So they hire consultants to tell them which product to buy. A friend of mine is one of these consultants. If you take what you've learned from this book and score a $2000 per day Web consulting gig, don't bother to gloat in front of David. His time is worth $6,500 a day. And he doesn't even know SQL! He doesn't have to do any programming. He doesn't have to do any database administration. He doesn't have to do any systems administration. David just has to fly first class around the world and sit at conference tables with big executives and opine that perhaps PeopleSoft would be better for their company than SAP.
There are plenty of rich stupid companies on the Web. Is it therefore true that the same "convert everyone to one data model" approach will achieve our objective of streamlined intercompany communication? No. There is no central authority that can force everyone to spend big bucks converting to a common data model. Companies probably won't spend much voluntarily either. Company X might have no objection to wasting billions internally, but management is usually reluctant to spend money in ways that might benefit Company Y.
What does that leave us with? n companies on the Web technically able to share data but having n separate data models. Each time two companies want to share data, their programmers have to cooperate on a conversion system. Before everyone can talk to anyone, we'll have to build n*(n-1) unidirectional converters (for each of n companies we need a link to n-1 other companies, thus the n*(n-1)). With just 200 companies, this turns out to be 39,800 converters.
If we could get those 200 companies to agree on a canonical format for data exchange, we'd only need to build 400 unidirectional converters. That is a much more manageable number than 39,800, particularly when it is obvious that each company should bear the burden of writing two converters (one into and one out of its proprietary format).
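The arithmetic behind those two figures can be checked in a couple of lines:

```python
def converters_without_standard(n):
    # Each of n companies needs a unidirectional link to each of the
    # n-1 others, hence n*(n-1) converters in total.
    return n * (n - 1)

def converters_with_standard(n):
    # With a canonical format, each company writes exactly two
    # converters: one into and one out of its proprietary format.
    return 2 * n

print(converters_without_standard(200))  # 39800
print(converters_with_standard(200))     # 400
```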
The fly in the ointment here is that developing canonical data models can be extremely difficult. For something like hotel room booking, it can probably be achieved by a committee of volunteer programmers. For manufacturing, it apparently is tough enough that a company like SAP can charge tens of millions of dollars for one copy of its system (and even then they haven't really solved the problem because they and their customers typically customize about 20 percent of their systems). For medical records, it is a research problem.
That's why the next section is so interesting.
Suppose I wanted to build a database for indexing photographs. When I was 14, I would have sat down and created a table with precisely the correct number of columns and then used it forever. Today, though, I would build a Web front-end to my database and let other photographers use my software. I'd give them the capability of extending the data model just for their images. After a few months, I'd look at the extensions that they'd found necessary and use those to try to figure out new features that ought to be common in the next release of the software.
Note: If this example sounds insufficiently contrived, it is because it is one of my actual back burner projects; check www.photo.net to see if I've actually done it.
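One way to let each photographer extend the data model without issuing per-user ALTER TABLE commands is to park the custom fields in an extension table keyed on user and field name. This is only a minimal sketch of the idea; all table and field names are invented, and a production version would need types, constraints, and an admin interface.

```python
import sqlite3

# Invented miniature schema: a fixed core table plus a per-user
# extension table holding the fields each photographer added.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE photos (photo_id INTEGER, owner TEXT, caption TEXT)")
db.execute("""CREATE TABLE photo_extensions
              (photo_id INTEGER, owner TEXT, field TEXT, value TEXT)""")

db.execute("INSERT INTO photos VALUES (1, 'philg', 'Samoyed in snow')")
db.execute("""INSERT INTO photo_extensions
              VALUES (1, 'philg', 'film_stock', 'Velvia')""")
db.execute("""INSERT INTO photo_extensions
              VALUES (1, 'philg', 'lens', '70-200/2.8')""")

# A few months later, tally which extension fields users actually
# created; popular ones become real columns in the next release.
popular = db.execute("""
    SELECT field, COUNT(*) FROM photo_extensions
    GROUP BY field ORDER BY COUNT(*) DESC""").fetchall()
print(popular)
```

The payoff is exactly the survey described above: the GROUP BY query tells the software author which user-invented fields deserve to be promoted into the canonical data model.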
Ditto for the SPAM mailing list manager system described ad nauseam in Chapter 15. The interesting thing to do with it would be to let each publisher add extra columns to his or her private data model and then see what people really wanted to do with the system.
A much more challenging problem is building a computer system that can find commonality among the extensions that users have made to a data model and automatically spit out a new release of the canonical data model that subsumes 85 percent of the custom modifications (you want as much capability in the canonical data model as possible because off-the-shelf software will only be convenient when working with the standard portions of the model).
Why this obsession with data modeling? Computers can't do much to help us if we can't boil our problems down to formal models. The more things that we can formally model, the more that computers can help us. The Web is the most powerful tool that we've ever had for developing models. We don't need focus groups. We don't need marketing staff reading auguries. We don't need seven versions of a product before we get it right. We have a system that lets users tell us directly and in a formal language exactly what they need our data model to do.
Hog farmers are smarter than Web publishers. A hog farmer would not spend $500,000 on a state-of-the-art payroll software system, the market-leading purchasing package, and an award-winning accounts receivable management system. A hog farmer would buy Hogfarmix, the same integrated software package used by other smart hog farmers. Hogfarmix represents the information that is critical to the hog farming business. If its payroll, purchasing, and receivables components are less than state-of-the-art, they still work better for a hog farmer than the market leading products. All the information in these components is interpreted in a hog-farming light.
Does that mean that publishers must build everything from scratch? Not necessarily.
A retail bank that wishes to start offering on-line services should try to find an "on-line bank in a box" package. If it can't find adequate software, it should team up with banks in other regions to fund the construction of a good package that they can all use.
A magazine should ask itself whether the community software package outlined in Chapter 3 wouldn't be better than starting from scratch. As noted in the "Case Studies" chapter, it doesn't matter whether or not the discussion forum software in the community system is better or worse than that in a standalone package. It is integrated with the users table and enables the publisher to ask, "Show me the documents that people were reading immediately before asking questions" or "Show me users who've answered more than 50 questions and who've read at least 60 percent of our static content" (these are people who might be promotable to co-moderators).
A company with an existing information system should find Web applications that will run from their current data model. It is insanely costly to maintain multiple sets of the same data.
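The magazine example's cross-table queries only work because the forum, page views, and users all live in one data model. Here is a minimal sketch of the "promotable to co-moderator" query against an invented miniature schema (the real community system described in Chapter 3 would have many more tables):

```python
import sqlite3

# Invented miniature schema for an integrated community system.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE answers (user_id INTEGER, question_id INTEGER)")
db.execute("CREATE TABLE page_views (user_id INTEGER, page_id INTEGER)")
db.execute("CREATE TABLE static_pages (page_id INTEGER PRIMARY KEY)")

db.executemany("INSERT INTO users VALUES (?, ?)", [(1, "eager"), (2, "lurker")])
db.executemany("INSERT INTO static_pages VALUES (?)", [(p,) for p in range(10)])
# User 1 answers 60 questions and reads 7 of the 10 static pages.
db.executemany("INSERT INTO answers VALUES (1, ?)", [(q,) for q in range(60)])
db.executemany("INSERT INTO page_views VALUES (1, ?)", [(p,) for p in range(7)])
# User 2 answers 3 questions and reads 2 pages.
db.executemany("INSERT INTO answers VALUES (2, ?)", [(q,) for q in range(3)])
db.executemany("INSERT INTO page_views VALUES (2, ?)", [(p,) for p in range(2)])

# "Users who've answered more than 50 questions and read at least
# 60 percent of our static content" -- candidates for co-moderator.
promotable = db.execute("""
    SELECT u.name
    FROM users u
    WHERE (SELECT COUNT(*) FROM answers a
           WHERE a.user_id = u.user_id) > 50
      AND (SELECT COUNT(DISTINCT page_id) FROM page_views v
           WHERE v.user_id = u.user_id)
          >= 0.6 * (SELECT COUNT(*) FROM static_pages)""").fetchall()
print(promotable)  # [('eager',)]
```

A standalone discussion-forum package, however polished, cannot answer this question, because its answers table and the publisher's page-view logs would live in separate databases with separate notions of who a user is.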