Local Access to the Information Infrastructure
Branko J. Gerovac
David C. Carver
a white paper presented at America in the Age of Information:
A Forum on Federal Information and Communications R&D
July 6 and 7, 1995
Center for Technology, Policy, and Industrial Development
Massachusetts Institute of Technology
A stated objective of many initiatives toward an Information Infrastructure is to improve individual access, for example: provide advanced and "widely-accessible information services," facilitate personal information processing, allow formation of virtual groups performing over distances, encourage broader use of information technologies, and promote personal skills development and training. However, when translated into research and development activities, an individual's access to the emerging information infrastructure remains largely neglected or dismissed as something someone will deliver given the right legislative or regulatory incentive.
Indeed, local access presents many fundamental technical, economic, and policy issues. For example: (a) There is no demonstrated technology that will deliver high bandwidth connectivity to homes and small businesses that we commonly enjoy in advanced institutions. (b) There is no convincing business model or market evolution that is sustainable. (c) There is no consensus on or clear description of government's role and objectives in promoting development and use. The interplay of these technical, economic, and policy issues is far more complex than commonly portrayed. Though they are inextricably linked, it is clear that this is not just a business or legislative issue - the technology simply does not exist.
Local access is a critical component of, and potential barrier to, achieving the promises of an information infrastructure (Community, National, or Global). Local access in this context means the physical plant and facilities, equipment and services, appliances, and standards and practices used to connect to the information infrastructure from homes, schools, clinics, small businesses, etc. Failure to develop effective and sustainable local access will ultimately prevent businesses from reaching customers, vendors, and other businesses, will prevent government from reaching its constituencies, and will constrain individuals' options for work, learning, and personal development.
While large institutions can capitalize and amortize the costs of their own dedicated high bandwidth access to the information infrastructure, individuals and small enterprises depend on the development of local access. Though costs are coming down, individuals and small enterprises are still barred by entry costs and are limited to whatever primitive communications are provided to them. It may be tempting to pool individual resources into a pseudo-institution (as some communities propose to do) to participate among institutional networks, but this only temporarily sidesteps the issues, limits individual mobility, and undermines the economies of scale that would result from addressing local access more broadly.
Industries are investing heavily in upgrading traditional systems and while many acknowledge the developing information infrastructure, few are willing to risk the investment necessary to address its broader requirements. Local access is not served by providing only a limited set of applications as defined by historical technology and market partitions. Yet, existing or proposed access schemes are often service and/or industry specific. Cross-industry harmonization is lacking.
Government has a dilemma. Government will not be able to completely fund local access deployment. It is recognized that industry will have to make the investment, but there is difficulty in setting conditions that balance industry's commercial viability with government's responsibility to promote the public good.
Government can promote the development of key missing technologies enabling an infrastructure that is open, robust, secure, useful, attractive, and affordable.
Many component technologies (microprocessors, DRAM, mass storage, etc.) are following well defined evolutionary curves - generally along the lines of achieving twice the performance or half the price every two years. Historically, these technologies have continued to follow these projections. Even when there appeared to be major technical obstacles to maintaining the curve, new techniques appeared. There is every reason to believe that these technologies will continue to progress along the same curves.
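The doubling behavior described above can be stated as a simple formula. The following sketch (the function name and sample figures are hypothetical, for illustration only) projects a component's performance under the stated assumption of doubling every two years:

```python
def projected_performance(base, years, doubling_period=2.0):
    """Project component performance assuming it doubles every
    `doubling_period` years (the evolutionary curve described above)."""
    return base * 2 ** (years / doubling_period)

# A component at 100 units today, projected 10 years out:
# five doublings yield 3200.0 units.
print(projected_performance(100, 10))
```

The same expression, read in reverse, describes the halving of price at constant performance.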
On the other hand, local access, as needed for the information infrastructure, does not yet have a curve. Communications in general has its technology curves. Telephone data modems have evolved to the present 28.8 Kbps, local area networks expand with Ethernet, Fast Ethernet, FDDI, ..., and long lines have progressed: T1, T3, OC-xx, .... But none of these address the needs of high bandwidth interactive communications for individuals and small enterprises.
Exploratory technologies include: hybrid fiber coax, cable, fiber to the neighborhood/home, twisted pair (ISDN/HDSL/ADSL), mini/micro-cellular, single frequency networks, geosynchronous satellite, low earth orbit satellite (LEOS), etc. Many of these technologies may have their place, but none is currently readily deployable, nor is there a clear set of technical objectives and requirements to address.
The following requirements are key to achieving the sought after benefits of the information infrastructure and putting local access on the "technology curve". Many of the requirements are interrelated and/or interdependent. Some are further refinements of others. Thus, the descriptions here are to be taken as a whole.
Of course, everyone would like to have infinite bandwidth at zero latency. And some people predict that technology is approaching being able to do that, i.e., "bandwidth is free". However, as a practical matter, there will always be situations where communications is constrained - due to unforeseen application demands, congestion such as during emergency events, cost, media constraints (e.g., wireless vs wireline), scalability for low cost devices, service-specific allocation, etc.
It is tempting to simply use historical media types and requirements. But this doesn't answer the question of what the critical levels of network access performance are - the levels at which entire new classes of applications become possible, and below which applications or features of applications become unusable. This understanding needs to be formalized and used to guide system requirements and specifications.
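Such a formalization might take the shape sketched below. The threshold figures here are purely hypothetical placeholders - the text argues precisely that the real critical levels have yet to be established:

```python
# Hypothetical access-rate thresholds at which classes of applications
# become usable; the actual critical levels are what the text argues
# need to be formalized through research.
THRESHOLDS_KBPS = [
    (28.8,     "text and low-rate data"),
    (128.0,    "telephone-quality audio, still images"),
    (1_500.0,  "compressed video"),
    (10_000.0, "multi-stream interactive media"),
]

def usable_classes(rate_kbps):
    """Return the application classes usable at a given access rate."""
    return [name for limit, name in THRESHOLDS_KBPS if rate_kbps >= limit]

print(usable_classes(200.0))  # rates above 28.8 and 128 qualify
```

A table of this kind, once grounded in real measurements, could guide system requirements directly.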
Those who argue for asymmetry often view themselves as the provider of relevant content and see the viewer's role as simply that of a consumer of that content. Hence, the only interaction they are interested in is selection and payment. An asymmetric design often corrupts the system architecture so that symmetry is never achievable, thus maintaining anachronistic producer/consumer boundaries. It is attractive to providers to set themselves apart technically from their customers and from potential competitors, thereby establishing an artificial distinction that allows them to act as gatekeepers with the privileged access necessary to provide services.
In a symmetric system, though aggregate bandwidth may be tuned to usage patterns, the bandwidth availability is inherently symmetric, with no barriers to becoming an information producer.
Can the infrastructure be designed to always accommodate symmetry regardless of whether early deployment constraints warrant some degree of statistical asymmetry - i.e., a symmetric-ready system? How are protocols defined to smoothly transition between asymmetric and symmetric transports, services, and applications?
Minimum service guarantees, negotiated transport characteristics, negotiated graceful degradation, willingness to pay, quality vs cost, virtual vs abstract control, etc., all imply new needed flexibility in the underlying infrastructure. Existing communications systems partition applications based on requirements into distinct transports (telephone, cable, radio broadcast, packet networks, etc.). The move to all digital coding promises but is not sufficient to remove the partitions and merge applications to be employed in more useful ways. The ability to negotiate quality of service is critical to achieving a common communications infrastructure built on a common bearer service.
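One way to picture quality-of-service negotiation over a common bearer service is the following sketch. The type, field names, and selection policy are hypothetical illustrations, not a proposed protocol:

```python
from dataclasses import dataclass

@dataclass
class Transport:
    bandwidth_kbps: float   # sustained data rate
    latency_ms: float       # one-way delay bound
    cost_per_min: float     # price of this transport offer

def negotiate(requested, offers, max_cost):
    """Hypothetical negotiation: accept an offer that meets the request;
    otherwise degrade gracefully to the best offer within the user's
    willingness to pay (max_cost)."""
    affordable = [o for o in offers if o.cost_per_min <= max_cost]
    if not affordable:
        return None  # no service within willingness to pay
    meets = [o for o in affordable
             if o.bandwidth_kbps >= requested.bandwidth_kbps
             and o.latency_ms <= requested.latency_ms]
    pool = meets or affordable  # graceful degradation if nothing qualifies
    return max(pool, key=lambda o: o.bandwidth_kbps)
```

The point of the sketch is the shape of the exchange - requested characteristics, offered characteristics, cost constraint, negotiated fallback - rather than any particular policy.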
A peer-to-peer environment permits direct access among equals on the network - all members of the network have access to the same capabilities and features. Whereas a master-slave environment requires all access to be mediated by the master - the master has critical capabilities not available to the slave. Industry is investing in facilities to support master-slave services like video-on-demand.
Master-slave confers on the access provider complete control over what services are available on the system - the opposite of common carriage. Peer-to-peer enables a marketplace of small business service providers where individuals are free to set up shop in a timely manner (market responsiveness) without having to go through or be delayed by a broker for permission and access - i.e., much the way business operates now. In a master-slave environment, every transaction carries an inherent payoff to the broker, which makes it attractive to set oneself up as a broker.
Master-slave encourages partitioned markets so that for a particular service area the master can achieve and maintain dominance. Master-slave encourages monopolistic behavior, reducing the potential of smaller service providers.
A peer-to-peer environment offers greater freedom of choice as to how services are provided (e.g., client-server). For example, it is readily possible to provide a master-slave like service in a peer-to-peer environment (but not vice versa). A peer-to-peer environment has collateral benefits for the development of new markets.
More and more, people and their activities are becoming mobile - not tied to a particular office, building, or geographic location. To be properly addressed, a solution to local access must not be just local - a decoupling of network and services interfaces from underlying physical and logical topology. Technologies need to be developed to support dynamic attachment, dynamic reconfiguration, automatic failure discovery and recovery (self healing), and migration of one's communications and processing environment (including naming and addressing, authentication, billing and payment, etc.).
The scale of investment required to deploy local access infrastructure motivates a design that achieves significant sharing of infrastructure resources, but sharing an infrastructure, especially one that is open, with neighbors and local institutions exacerbates privacy, security, and network management issues. Traditional local area networks and wide area networks have established systems of security through a combination of architectural, institutional, and physical measures. How or even if these approaches translate to deploying open networks to homes, neighborhoods, and municipal areas remains largely unexplored.
The rapid pace of technology overwhelms the traditional exhaustive standards development process. It is becoming critical to identify the few pivotal interfaces that are the foundation for smooth and effective system evolution.
These topics are not new. They are many of the architectural goodness criteria that are a foundation of communications development. But, local access provides a new context in which they need to be considered.
Different industries represent different strengths and liabilities; no one industry can deliver a complete answer. Furthermore, some in industry continue to invest in service-specific access systems in order to preserve old market partitions.
Computing, communications, and media (including consumer electronics) are adopting a common set of base digital technologies. The common technology base and the economies of scale in the marketplace are inextricably driving the industries together. The result is much more than simple technological leverage across industries; instead, there is a spreading interplay and merger of the industries, products, and services themselves.
Unfortunately, digital transmission technology is not as well developed as is necessary to be broadly deployable. In many cases, this is not just a cost and technology curve issue, but rather an area that requires new motivation for basic technology R&D, e.g., RF modulation schemes, link protocols, and network topologies for broadband interactive communication on a hybrid fiber coax wire plant.
Simultaneously achieving universality and sustainable competition in a communications marketplace motivates the deployment of open interoperable systems. Open, to allow for competitive evolution of individual components. Interoperable, to ensure the efficient sharing of information (audio/video, voice, and data) across technologies, equipment, and services.
The deployment of a universal bearer service would provide interoperability at the network level. Realistically, however, several kinds of networks are likely to develop, pushing the issue of interoperability up to the data interchange level.
For example, a self-identifying header facilitates access to information services. A header, common to all environments, plays a key role in realtime communications by maintaining separation between content, method of transmission, and type of application. It enables data to be recognized and usable outside its original context and permits data driven communication.
The pace of technological change demands that forethought be given to accommodating new functionality in the design of a system. Broad deployment in a consumer market makes it particularly difficult to make major changes that obsolete previous systems. The use of presentation level headers and descriptors enable the construction of an extensible system. New media formats and information tags can be integrated without obsoleting equipment and services already deployed.
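A self-identifying, presentation-level header of this kind might be sketched as follows. The encoding (JSON), field names, and version scheme here are purely illustrative assumptions, not a proposed standard:

```python
import json

def make_header(content_type, coding, extensions=None):
    """Sketch of a self-identifying header: it names the content and its
    coding independently of transport and application, so data remains
    recognizable and usable outside its original context.  New descriptor
    fields can be added without invalidating older readers."""
    header = {"version": 1, "content-type": content_type, "coding": coding}
    header.update(extensions or {})  # extensible descriptors
    return json.dumps(header)

def read_header(raw):
    """An older reader keeps only the fields it understands and silently
    ignores extensions added after it was deployed."""
    known = {"version", "content-type", "coding"}
    return {k: v for k, v in json.loads(raw).items() if k in known}
```

The extensibility property is the key point: a reader deployed today still makes sense of a header carrying descriptors invented later, which is what lets new media formats enter without obsoleting fielded equipment.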
Scalability is an objective at several levels. Data streams should scale to accommodate a wide range of transmission conditions. The cost of equipment should scale according to quality and capability. Services should scale to accommodate an ever increasing number of users without serious degradation in performance. A transmission scheme that could intermingle low-cost as well as high performance streams is a clear opportunity for further research and development.
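One reading of "data streams should scale" is a layered coding, in which a single encoding serves receivers across a wide range of access rates: each receiver decodes as many enhancement layers as its rate allows. The layer rates below are purely illustrative assumptions:

```python
# Hypothetical layered stream: a base layer plus enhancement layers,
# with illustrative per-layer rates in Kbps.
LAYERS_KBPS = [64, 192, 512, 1500]

def layers_for(rate_kbps):
    """Count how many cumulative layers a receiver at the given access
    rate can decode; more layers mean higher quality."""
    total, count = 0, 0
    for layer in LAYERS_KBPS:
        if total + layer > rate_kbps:
            break
        total += layer
        count += 1
    return count

print(layers_for(300))  # decodes the base layer plus one enhancement
```

A scheme of this shape lets low-cost and high-performance receivers share one transmission, which is the intermingling opportunity the text identifies.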
Industries are restructuring to survive. Traditional industry divisions as represented by television, telephone, and computing are defined by distinct vertically integrated systems of content, distribution, and usage - i.e., stovepipes - and operate with different fundamental business models and under different regulatory constraints. But, as the markets restructure, dominant business practices collide. And, each is threatened by a new market where old distinctions cease to exist or be relevant and new distinctions remain unclear or require fundamental changes in how business is conducted.
A new model emerges where the significant distinction is made between content, communications, and interaction. Who knows what future structure will evolve? Yet, we have a basis from which to start: interoperability across previously distinct systems will be required to capitalize on the information marketplace.
Industries most directly involved are telephone, cable television, terrestrial television, and data networking (Internet and private data networks), but others are also vying for position: publishing, consumer electronics, computer, satellite, beepers, mobile communications, and practically anyone holding substantial rights-of-way (e.g., power and gas companies). Industry response to changing technology and markets is often varied: lead, follow, obstruct, ignore, usually in some combination.
In the long run (or not so long run), government(s) cannot permit the information infrastructure to be based on proprietary technology and cannot allow markets to be partitioned geographically or by segment. However, it is very difficult for companies to fully endorse this view. Proprietary solutions and partitioned markets are a natural pursuit of successful companies. It's hard to do otherwise. But, companies need to find another approach.
An efficient market enables barrier-free entry for new useful enterprises and services. Wide adoption of critical standards, interfaces, and practices is needed. Strictly speaking, proprietary interfaces can be used, but only if universally available and easily licensed.
In order to have an effective local access system for the consumer, there cannot be a different set of technologies and standards used in different geographic locations or for different services. Some in industry continue to invest in service-specific access systems in order to preserve old market partitions. There is often an ill-perceived benefit, in either cost or performance, in optimizing an interface for a particular application. Technology is changing so rapidly that any such cost or performance benefits are short lived, making this approach ill-advised for mass deployment.
To promote universal access/service in a competitive environment, it is important that the system is based on non-proprietary technology and not partitioned by service or constrained by anachronistic industry divisions. Further, wide use requires attention to plug-n-play equipment and services, location independence/mobility, personal privacy, intellectual property, among others.
There is tension between technological, business, and public concerns. Successful business plan models are still wanting. Existing and proposed local access systems should support incremental deployment and subsequent incremental upgrade.
Most everyone acknowledges that deployment of the information infrastructure must be achieved through private investment and competition. Nevertheless, development and deployment funds are often insufficient or misplaced, as industries are investing heavily in expanding traditional systems in pursuit of elusive "killer applications" (e.g., video on demand, 500 channels, home shopping, HDTV, etc.).
The single most important endeavor government could undertake is to set and clearly articulate national and public policy objectives (the ground rules). Without this, the heavy private investment required will remain inherently risky and ineffective. Of almost equal importance is government's role as facilitator. If done well, facilitation can be a low cost way to influence industry and private investments to achieve national objectives in a mutually beneficial (and highly leveraged) manner.
Government should ensure the development of pivotal technologies and standards through direct funding of research and development and pre-competitive collaboration. This defrays the cost of developing technologies to meet objectives that are too broad to be met in a strict business context. For local access, this means removing technical barriers so that industry can move away from anachronistic structures toward a shared structure that is flexible, efficient, and scalable.
Antitrust is a particularly critical issue as we move from a regulated monopoly structure to competition. Local access systems necessarily depend upon scarce resources (RF spectrum and public rights-of-way) and major investments. These combined with anti-competitive behavior could create undue barriers to entry and threaten the sought after benefits of competition.
Local access systems, particularly wireless, depend upon the use of public resources (e.g., RF spectrum). Government needs to update its approach to how these resources are allocated. For example, the FCC continues to statically allocate RF spectrum for service-specific purposes, whereas a dynamic allocation scheme built upon a bearer service would result in a more efficient and scalable system.
A basic decision needs to be made about adopting the objective of universal communications. It is hard to see how public benefits normally associated with the information infrastructure (e.g., re-education and training and delivery of public services) can be achieved if universality is not an objective. In any case, a new approach is needed. Universal service as formally defined does not translate to a service environment that changes at the pace of technology. Universal access, the ability for anyone to attach to the network in a barrier-free manner, is clearly a step in the right direction, but is this enough? Should the government be the provider of last resort?
Concerns over privacy and security in the information infrastructure are pervasive and severe. An effective system of electronic commerce necessarily relies on robust security, privacy, and intellectual property protection. And, just as businesses have been cautious about connecting to the Internet for security reasons, consumers will not accept digital networks in their home without confidence that their data is secure and their privacy is not invaded. Moreover security and privacy needs to be pervasive and robust, otherwise the communications will remain partitioned along security domains (i.e., separate networks for commerce, health-care, law enforcement, banking, education, etc.) undermining the economies of scale.
Public policy issues concerning privacy and security in some circumstances overwhelm the technical issues discussed earlier and to a large degree technical solutions cannot be developed and validated until policy objectives and criteria are better established. Nevertheless, policy formation needs to be informed by a clearer understanding of what is technically feasible (what can and cannot be done and what can and cannot be prevented). Some in industry (and government) are looking for technological fixes for problems that aren't technical and the trend is towards developing systems that punish those who would willingly comply and that fail to deter those who would profit from non-compliance.
These issues are rooted in fundamental questions, e.g., what are an individual's rights to privacy under the law in the context of the information infrastructure? Should security/privacy expectations in the information infrastructure be any different than those in regular society? If networks are considered public places and communicating over networks equivalent to communicating in public, are existing laws sufficient or can they be slightly modified so that a common set of laws apply to both? How is the concept of "in the privacy of one's home" reconciled with the need for mobility?