What if there is a silver bullet
by Brad J. Cox, Ph.D.
Few programmers could develop a compiler, word processor or spreadsheet to compete in today's crowded software market. The cost and complexity of modern-day applications exceed the financial and intellectual capacity of even the rarest of individuals. Even large-granularity sub-components like window systems, persistent object databases and communication facilities are more than most individuals could handle. But individuals can build smaller (so-called 'reusable') software components that others could assemble into larger objects; components as small as Stacks and Queues.
So why don't we? Why do we drudge away our lives in companies with the financial, technical, and marketing muscle to build the huge objects we call applications? Why don't we start software companies, like Intel, to invent, build, test, document, and market small-granularity objects for other companies to buy? Think of the reduction in auto emission pollution if more of us stayed home to build small-granularity components for sale! Think of not having to get along with the boss!
Object-oriented programming technologies have brought us tantalizingly close to making this dream technically, if not economically, feasible. Subroutines have long been able to encapsulate functionality into modules that others can use without needing to look inside, just as with Intel's silicon components. Object-oriented programming languages have extended our ability to encapsulate functionality within Software-ICs that can support higher-level objects than subroutines ever could. Such languages have already made the use of pre-fabricated data structure and graphical user interface classes a viable alternative to fabricating cut-to-fit components for each application. All this is technically feasible already, even though the software industrial revolution [COX] has hardly begun.
Yet these technical advances have not really changed the way we organize to build software. They've simply provided better tools for building software the same way as before. The pre-fabricated small components of today are not bought and sold as assets in their own right. They are bundled (given away) inside something larger. Sometimes they are bundled to inflate the value (and price!) of some inexpensive commodity item, as in Apple's ROM software that turns a $50 CPU chip into a $5000 Macintosh computer. Sometimes they play the same role with respect to software objects, as in the libraries that come with object-oriented compilers.
There is no robust way to market the small active objects that we call reusable software components, at least not today. The same is true of the passive objects we call data. For example, nearly half of our landfill bulk is newspapers and magazines. This could be eliminated if we could only break the habit of fondling the macerated remains of some forest critter's home as we drink our morning coffee. But this is hardly a bad habit from the viewpoint of newspaper publishers. If they distributed news electronically, how would they charge for their labor?
Paper-based information distribution makes certain kinds of information unavailable even when the information is easily obtainable. For example, I hate price-comparison shopping and would gladly pay for high-quality information as to where to buy groceries and gasoline inexpensively. This information is avidly collected by various silver-haired ladies in my community. But they collect it solely for their own use. The lack of robust marketing mechanisms for such objects removes any incentive for them to distribute their expertise to potential customers such as myself.
What if entrepreneurs could market electronic information objects for other people to buy? Couldn't geographically specialized but broadly relevant objects like my gasoline price example be the killer apps that the hardware vendors are so desperately seeking? Think of what it could mean to today's saturated hardware market if everyone who buys gasoline and groceries started buying computers to access Aunt Nellie's coupon-clipping acumen!
Information Age Economics
These questions outline the fundamental obstacle to the transition from the manufacturing age to the information age. The human race is adept at selling tangible goods such as Twinkies, automobiles, and newspapers. But we've never developed a commercially robust way of buying and selling easily copied intangible goods like electronic data and software.
Of course, there are more obstacles to a robust market in electronic objects than I could ever mention here. Many of them are technological deficiencies that could easily be corrected, such as the lack of suitably diverse encapsulation and binding mechanisms in today's object-oriented programming languages, insufficient telecommunications bandwidth and reliability, and the dearth of capable browsers, repositories and software classification schemes. My second book, What, if anything, is an Object? considers these obstacles in detail to show that each one could in principle be overcome once suitable incentives were in place.
The biggest obstacle of all is that electronic objects can be copied so easily that there is no way to collect revenue the way Intel does, by collecting a fee each time another copy of a silicon object is needed. More than any other reason, this is why nobody would ever quit their day job to build small-granularity software components for a living.
A striking vestige of manufacturing age thinking is the still-dominant practice of charging for information age goods like software by the copy. Since electronic goods can be copied easily by every consumer, the producers must inhibit copying with such abominations as shrinkwrap license agreements and copy protection dongles. But since these are being vehemently rejected by software consumers, SPA (Software Publishers Association) and BSA (Business Software Alliance) are using handcuffs and jail sentences as copy protection technologies that actually do work even for information age products like software.
The lack of robust information age incentives explains why so many corporate reuse library initiatives have collapsed under a hail of user complaints. "Poorly documented. Poorly tested. Too hard to find what I need. Does not address my specific requirements." Except for the often rumored "Not invented here" syndrome, the problem is only occasionally a demand side problem. The big problem is on the supply side. There are no robust incentives to encourage producers to provide minutely specialized, tested, documented and (dare I hope?) guaranteed components that quality-conscious engineers might pay good money to buy. As long as reuse repositories are waste disposal dumps where we throw poorly tested and undocumented trash for garbage pickers to "reuse", quality-conscious engineers will rightly insist, "Not in my backyard!"
Paying for software by the copy (or reusing it for free) is so widespread today that it may seem like the only option. But think of it in object-oriented terms. Where is it written that we should pay for an object's instance variables (data) according to usage (as network access charges) yet pay for methods (software) by the copy? Shouldn't we also consider incentive structures that could motivate people to buy and sell electronic objects in which the historical distinction between program and data is altogether hidden from view?
Let's consider a different approach that might work for any form of computer-based information. It is based on the following observation. Software objects differ from tangible objects in being fundamentally unable to monitor their copying but trivially able to monitor their use. For example, it is easy to make software count how many times it has been invoked, but hard to make it count how many times it has been copied. So why not build an information age market economy around this difference between manufacturing age and information age goods?
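This asymmetry is easy to demonstrate. Below is a minimal sketch (the class and its names are invented for illustration; no real metering API is implied) of a reusable component that counts its own invocations, something no component can do for the copies made of it:

```python
class MeteredStack:
    """A tiny reusable component that meters its own use.

    Counting invocations is trivial from inside the object;
    counting copies of the file it lives in is not possible at all.
    """

    def __init__(self):
        self._items = []
        self.invocations = 0  # usage is observable from within

    def _meter(self):
        self.invocations += 1

    def push(self, item):
        self._meter()
        self._items.append(item)

    def pop(self):
        self._meter()
        return self._items.pop()


stack = MeteredStack()
stack.push(1)
stack.push(2)
stack.pop()
print(stack.invocations)  # three metered calls; copies remain uncountable
```

The same counter could just as easily live in a spreadsheet, a window system, or a Stack class; the point is only that use, unlike copying, leaves a trace the software itself can see.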
If revenue collection were based on monitoring the use of software inside a computer, vendors could dispense with copy protection altogether. They could distribute electronic objects for free in expectation of a usage-based revenue stream.
Legal precedents for this approach already exist. The distinction between copyright (the right to copy or distribute) and useright (the right to 'perform', or to use a copy once obtained) is long-established in copyright law. These laws were stringently tested in court a century ago as the music publishers came to terms with broadcast technologies such as radio and TV.
When we buy a record, we acquire ownership of a physical copy. We also acquire a severely limited useright that only allows us to use the music for personal enjoyment. Conversely, large television and radio companies often have the very same records thrust upon them by the publishers for free. But they pay substantial fees for the useright to play the music on the air. The fees are administered by ASCAP (American Society of Composers, Authors and Publishers) and BMI (Broadcast Music, Inc.) by monitoring how often each record is broadcast to how large a listening audience.
Dr. Ryoichi Mori, the head of the Japanese industry-wide consortium JEIDA (Japan Electronic Industry Development Association), is developing an analogous approach for software. Each computer is thought of as a station that broadcasts, not the software itself, but the use of the software, to an audience of a single 'listener' [MORI]. The approach is called superdistribution because, like superconductivity, it lets information flow freely, without resistance from copy protection or piracy.
Its premise is that copy protection is exactly the wrong idea for intangible, easily copied goods such as software. Instead, superdistribution turns ease of copying into an asset. It actively encourages the free distribution of information age goods via whatever distribution mechanism you please. You are positively encouraged to acquire superdistribution software from networks, to give it away to your friends, or even send it as junk mail to people you've never met. Broadcast my software from satellites if you want. Please!
This generosity is possible because the software is actually 'meterware'. It has strings attached that make revenue collection independent of how the software was distributed. The software contains embedded instructions that make it useless except on machines that are equipped for this new kind of revenue collection. The computers that can run superdistribution software are otherwise quite ordinary. In particular, they run ordinary pay-by-copy software just fine. They just have additional capabilities that only superdistribution software uses.
In JEIDA's current prototype, these services are provided by a silicon chip that plugs into a Macintosh coprocessor slot. Electronic objects (not just applications, but active and/or passive objects of every granularity) that are intended for superdistribution invoke this hardware to ensure that the revenue collection hardware is present, that prior usage reports have been uploaded, and that prior usage fees have been paid.
The hardware is not complicated (the main complexities are tamper-proofing, not base functionality). It merely provides several instructions that must be present before superdistribution software can run. The instructions count how many times they have been invoked by the software, storing these usage counts temporarily in a tamper-proof persistent RAM. Periodically (say monthly) this usage information is uploaded to an administrative organization for billing, using public key encryption technology to discourage tampering and to protect the secrecy of this information.
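The metering-and-upload cycle can be sketched in software. This is an illustrative approximation only: JEIDA's design keeps the counters in tamper-proof hardware and protects reports with public key encryption, which I stand in for here with a keyed hash (HMAC) and an in-memory table; every name and key below is invented:

```python
import hashlib
import hmac
import json

DEVICE_KEY = b"secret-device-key"  # stands in for the chip's protected key

# component id -> invocation count (persistent RAM in the real hardware)
usage_counts = {}


def metered_call(component_id):
    """The 'instruction' every superdistribution component must invoke."""
    usage_counts[component_id] = usage_counts.get(component_id, 0) + 1


def monthly_report():
    """Package the counts with a tamper-evident authentication tag."""
    payload = json.dumps(usage_counts, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"usage": payload.decode(), "mac": tag}


# A month of simulated use.
for _ in range(5):
    metered_call("spreadsheet")
metered_call("stack-class")

report = monthly_report()

# The administrative organization verifies the tag before crediting vendors.
ok = hmac.compare_digest(
    report["mac"],
    hmac.new(DEVICE_KEY, report["usage"].encode(), hashlib.sha256).hexdigest(),
)
print(ok, json.loads(report["usage"]))
```

A shared secret key is used here only to keep the sketch short; the real scheme would use public key encryption so the billing organization never holds the device's secret.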
The end-user gets a monthly bill for their usage of each top-level component. Their payments are credited to each component's owner in proportion to usage. These accounts are then debited according to each application's usage of any sub-components. These are credited to the sub-component owners, again in proportion to usage. In other words, the end-user's payments are recursively distributed through the producer-consumer hierarchy. The distribution is governed by usage metering information collected from each end-user's machine, plus usage pricing data provided to the administrative organization by each component vendor.
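The recursive distribution of an end-user's payment can be made concrete with a small sketch. The component hierarchy, the revenue shares, and the dollar amounts below are all invented for illustration:

```python
# Hypothetical producer-consumer hierarchy: for each component, the fraction
# of its revenue owed to each sub-component (derived, in the real scheme,
# from metered usage and vendor-supplied pricing).
subcomponents = {
    "app": {"gui-kit": 0.4, "stack-class": 0.1},
    "gui-kit": {"stack-class": 0.2},
    "stack-class": {},
}


def distribute(component, revenue, credits):
    """Recursively credit a payment down through the hierarchy.

    Each component passes the agreed share on to its sub-components
    and keeps whatever remains.
    """
    passed_on = 0.0
    for sub, share in subcomponents[component].items():
        fee = revenue * share
        distribute(sub, fee, credits)
        passed_on += fee
    credits[component] = credits.get(component, 0.0) + (revenue - passed_on)


credits = {}
distribute("app", 100.0, credits)  # an end-user's $100 monthly bill for "app"
print({name: round(amount, 2) for name, amount in credits.items()})
```

Note that "stack-class" is credited twice, once through "app" directly and once through "gui-kit", and that the credits always sum to the original bill: the end-user pays once, and the money trickles down to every vendor whose component did work.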
Since communication is infrequent and involves only a small amount of metering information, the communication channel could be as simple as a modem that autodials a hardwired 800 number each month. Many other solutions are viable, such as flash cards or even floppy disks mailed back and forth each month.
A Revolutionary Approach
Whereas software's ease of replication is a liability today (by disincentivizing those who would provide it), superdistribution turns this liability into an asset by allowing software to be distributed for free. Whereas software vendors must spend heavily to overcome software's invisibility, superdistribution thrusts software out into the world to serve as its own advertisement. Whereas the personal computer revolution isolates individuals inside a standalone personal computer, superdistribution establishes a cooperative/competitive community around an information age market economy.
Of course, there are many obstacles to this ever happening for real. A big one is the information privacy issues raised by usage monitors in every computer from video games to workstations to mainframes. Although we are accustomed to usage monitoring for electricity, telephone, gas, water and electronic data services, information privacy is an explosive political issue. Superdistribution could easily be legislated into oblivion out of the fear that the usage information would be used for other than billing purposes.
A second obstacle is the problem of adding usage monitoring hardware to a critical number of computers. This is where today's computing establishment could be gravely exposed to those less inclined to maintain the status quo. It is significant that superdistribution was not developed by the American computer establishment, which presently controls 70% of the world software market. It was developed by JEIDA, an industry-wide consortium of Japanese computer manufacturers.
What if the Japanese superdistribution metering instructions were built into every next-generation CPU chip, much as ADD and JSR instructions are built in today?
Review the benefits I've discussed in this column and then ask: Whose computers would you buy? Whose computers would Aunt Nellie and her friends buy? What if superdistribution really is a Silver Bullet for the information age issues that I've raised in this column? And what if the competition builds it first?
Brad Cox, Ph.D.
[COX] Brad J. Cox; What, if anything, is an Object?; Addison-Wesley; in development. Planning the Software Industrial Revolution; IEEE Software; Nov 1990. There is a Silver Bullet; Byte; Oct 1990.
[MORI] Ryoichi Mori and Masaji Kawahara; Superdistribution: An Overview and the Current Status; Technical Research Reports of the Institute of Electronics, Information and Communication Engineers; Vol. 89, No. 44. What lies ahead; Byte; Jan 1989; pp. 346-348. On Superdistribution; Byte; Sep 1990; p. 346.