The single-system image is a reality. In the future, cheap and powerful workstation technology will be available to everyone, with truly distributed applications using processing power wherever it is
available and providing information wherever it is needed. In the future, information will be available for use by owners and authorized users, without the constant need for professional systems developers and their complex programming languages. The
future will bring information captured at its source and available immediately to authorized users.
The future will provide information from data in its original form: image, video, audio, graphics, document, spreadsheet, or structured data, without the need to be aware of specific software for each form. Successful organizations of the
future, those that are market-driven and competitive, will be the ones using client/server as an enabling technology to add recognized value to their product or service. The future is now for early adopters of technology. By the turn of the century,
the enterprise on the desk will be the norm for all successful organizations. Laggards will not be price competitive, will not provide competitive customer services, and soon will cease to exist.
Trends in computer hardware clearly indicate that D-RAM and processor MIPS are going to become very cheap. Object technologies based on the CORBA model and represented today by Sun's DOE project will enable the resources of a network of
machines, each processor available as client and server, to participate in providing business solutions. Networked computing provides an opportunity for whole new classes of client/server computing. OS/2, various versions of UNIX, and Windows NT provide the necessary components: shared memory, preemptive multitasking, database servers, communications servers, and GUI services. Suddenly, because of the conjunction of these components, truly distributed, peer-to-peer computing is a reality. Applications will find their servers without the need for the application developer's help. This new environment has been intriguingly labeled the "post-scarcity personal computing environment" by two IBM OS/2 architects, Robert Orfali and Dan Harkey.[1]
The power available on each processor enables architects to layer software through application program interfaces (APIs) that hide the underlying platform hardware and software from the developer. APIs show the developer a single-system image across a
heterogeneous network of processors. Platforms will be selected for their cost effectiveness in meeting a particular business need rather than as upgrades to existing installed equipment. Hardware and software vendors will compete based on their capability to provide the platform that best meets the business need. The real competition will revolve around who provides the best user/developer productivity. Effective application maintenance and enhancement will be the primary criteria for product selection.
Object-oriented development (OOD) can facilitate the system development environments (SDE) described throughout this book. The premise behind OOD is code reuse. The traditional concept of code reuse involves creating repositories of
software that can be reused by developers. The object-oriented concept takes this traditional view and recycles it with greater formalism and improved repository management tools. The good news is that code reuse and OOD work; we have measured significant productivity improvements for development and maintenance, compared to standard development methodologies based on sound structured development practices. However, there is a steep learning curve that must be climbed before these gains are realized. OOD is not a new technology; it has been around for more than 15 years. A true OOD standard development environment has yet to be established. Until this standardization occurs, the full potential of OOD described in this chapter will not be reached.
OOD probably will be accepted for its contribution to zero defect development. The capability to reuse previously tested components is fundamental to most engineering and manufacturing processes. It is now becoming fundamental to the systems development process as well.
Client/server computing describes a model for building application systems, along with the core hardware and software technology that helps in building these systems. The material in the following paragraphs describes aggregations of these core
technologies that have created enabling technologies. Enabling technologies are combinations of hardware and software that can be used to assist in creating a particular kind of application system.
The main business advantage of using expert systems technology is the opportunity to protect business know-how. Many organizations are severely short of experts. With the aging of the work force, and as a large number of experienced workers
retire together, this shortage will become worse. Encapsulating the rules that describe an expert's response to a business situation into a stored electronic rules base provides the substantive opportunity for higher productivity and reduced costs as
these stored rules are consistently applied.
In applications using expert systems products, such as those from Trinzic Corp., Neuron Data (Nexpert), and others, objects are created that include this expert knowledge. These objects can be reused in applications to support decision making throughout the entire
organization. Figure 10.1 illustrates the benefits organizations can obtain by using expert systems technology.
Figure 10.1. A knowledge-based system (KBS).
Many expert systems products are merely glitz. They are simplified to enable trivial applications to be developed but are not useful because they do not have the capacity to handle the complexity of real-life business processes. A major flaw in many
products is their inability to integrate with a company's information systems and databases. If a product cannot be integrated into the organizational SDE or cannot use the organization's databases directly, it is not useful. It isn't practical to create
multiple development environments and copies of data to support real-time decision making. Some expert systems products are used for after-the-fact analysis, but the best products are integrated into the business. Figure 10.2 illustrates a typical expert systems application architecture.
Figure 10.2. Architecture of a typical expert systems application.
Expert systems applications are well-suited to the client/server model. In addition to the advantages offered by the user interface component, the rules base benefits from the processing power and ease of use at the workstation. In most
implementations, rules should be managed by a knowledgeable user and not by a professional programmer, because the user is the only one intimately familiar with how his or her job works, a job the expert system must emulate.
The rules are enforced by the inference engine, a CPU-intensive process that takes advantage of the low-cost processing and RAM available with client/server technology. Most applications will be implemented using existing databases on host-based DBMSs.
The client/server model provides the necessary connectivity and services to support access to this information.
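The rules-and-inference flow just described can be sketched in a few lines. This is a minimal, hypothetical illustration, not any vendor's product: the rules, facts, and the simple forward-chaining loop are all assumptions made for the example.

```python
# Minimal sketch of a rules base and inference engine (all names hypothetical).
# Each rule pairs a condition over known facts with a conclusion to assert.

def infer(facts, rules):
    """Forward-chain: keep applying rules until no new facts are derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Rules a knowledgeable user might maintain for a loan decision.
rules = [
    (lambda f: "income_verified" in f and "debt_ratio_ok" in f, "credit_ok"),
    (lambda f: "credit_ok" in f and "collateral_ok" in f, "approve_loan"),
]

result = infer({"income_verified", "debt_ratio_ok", "collateral_ok"}, rules)
print("approve_loan" in result)  # True
```

The loop is CPU-intensive for a large rules base, which is exactly why the low-cost processing power of the workstation is attractive for this component.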
Expert systems currently are used mainly in government and financial organizations. In the government sphere, knowledgeable personnel create rule bases to determine people's eligibility for programs such as welfare aid. Welfare programs, in particular,
change these rules rapidly, and an expert system that manages and applies these rules can improve the fairness and decrease the cost of adjudication for the program.
The financial community has built loan determination rules that can significantly reduce the time required to analyze a loan application and determine the loan's risk of default. A new application of expert systems is network management. In particular,
remote LAN management is an ideal application for expert systems technology. The network alerts are processed by a rules-based analyzer to help diagnose problems. Historical data is captured and maintained for subsequent comparison. The rules base also can
invoke regular preventive maintenance.
The retail sector is beginning to use expert systems for real-time management. In an ideal scenario, a manager uses a rules base to describe the expected results for products as they are introduced or repackaged. The system audits the reality against
the expectation in real time. Only when results differ from the defined expectation is the manager notified. This allows unexpected results, good or bad, to be detected early and allows the manager to concentrate on customers or new programs when
expectations are being met.
Geographic information systems (GISs) provide the capability to view the topology of a landscape, including features such as roads, sewers, electrical cables, and mineral and soil content. GIS is a technology that has promised much and finally
is beginning to deliver. As with the expert systems technology, GISs are truly useful when they integrate with the business process. From a technological perspective, GISs must operate on standard technologies, integrate with the organization's SDE, and
directly access the organizational databases.
Conceptually, GISs enable users to store virtually unlimited geographic information as a series of layers. Some layers, such as street layouts, compose the base map. Other layers, such as wetlands and subterranean water sources, are
thematic layers that serve a specific, sometimes narrow purpose. A GIS user can custom design a printed map to fill a particular need by simply selecting the relevant layers. Selecting the street layer and the wetlands layer would produce a map of
wetlands and their relationship to the streets. Selecting the subterranean water sources layer and the wetlands layer would show the wetlands superimposed on the features of the underlying aquifer.
Each line, curve, and symbol in a map is fixed in space by a series of numbers, called the spatial data. Spatial data describes the precise positioning of map objects in three-dimensional space.
Besides storing map objects such as street segments and wetland boundaries, GISs enable designers to specify attributes the users want to associate with any map object. Such attributes may be descriptive data, detailed measurements of any kind, dates,
legal verbiage, or other comments. When viewing a map on-screen, the user can click any map object, and a data-entry window will open to display the attributes associated with that object. Attribute information is usually stored in RDBMS tables, and each
map layer can draw attributes from multiple tables.
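The layering model above can be sketched as a simple data structure: a base map plus thematic layers, each holding map objects with spatial data and attributes. All layer names, coordinates, and attributes here are hypothetical illustrations.

```python
# Hypothetical sketch of GIS layering: each map object carries spatial data
# (x, y, z coordinates) and RDBMS-style attributes.

layers = {
    "streets":  [{"id": "S1", "coords": [(0, 0, 0), (5, 0, 0)], "attrs": {"name": "Main St"}}],
    "wetlands": [{"id": "W1", "coords": [(2, 1, 0), (3, 2, 0)], "attrs": {"protected": True}}],
    "aquifer":  [{"id": "A1", "coords": [(1, 1, -30)], "attrs": {"depth_m": 30}}],
}

def compose_map(selected):
    """Build a custom map from the selected layers only."""
    return [obj for name in selected for obj in layers[name]]

# A wetlands-versus-streets map, as in the example above.
custom = compose_map(["streets", "wetlands"])
print([obj["id"] for obj in custom])  # ['S1', 'W1']
```

Selecting a different pair of layers, such as the aquifer and the wetlands, yields the second map described above without storing any map twice.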
GIS applications are naturals for client/server technology. Powerful workstations manage the mapping. Connectivity enables shared access to the layers maintained by various departments. The GIS database is related to attributes stored in other
databases that provide considerably more value in combination. For example, combining the voters list with the street maps allows polling booths to be located with easy access for all and ensures that no natural or artificial barriers are blocking the way.
Point-of-service (POS) technologies, traditionally known as point-of-sale technologies, are ubiquitous. Every restaurant, supermarket, most department stores, and even auto service stations use POS technology at the site for pricing,
staff management, accounting, product distribution, and inventory control. POS is one of the most widely installed examples of client/server technology. Implementations use an intelligent cash register, bar code scanner, scale, or gas pump as the client
working with a UNIX or OS/2 server.
The integration of technology, business process, and management information in POSs is a model for the implementation of client/server applications. Some older implementations continue to use dumb client devices, but lower technology costs and the
growing use of object-oriented development techniques are moving more processing to the client. These applications have a specific set of characteristics; namely, they run in a large number of distributed sites and are frequently used by users with little
training in a business environment demanding rapid change. Appendix A describes a large POS application built and implemented for the United States Post Office.
There is a growing demand for POS-style applications to improve service and reduce costs. Self-service customs and excise processing, postal counters, help services, libraries, and even vending machines are demanding the processing power and ease
of use that can be provided by this technology.
Imaging is the conversion of documents from a physical medium (for example, paper) to a digital form where they can be manipulated by computers. Imaging should be viewed as an enabling technology. Information that is available in
machine-readable form never should be converted to paper and scanned back into machine-readable form. The business process should strive to maintain and use information in machine-readable form from the earliest moment.
There is an unfortunate tendency to automate existing processes by converting recycled paper to digital form without considering whether the information printed on the form can be captured elsewhere and used without rekeying. Optical character
recognition (OCR) is an existing technology that offers powerful capabilities to convert the image of typed information on a form to text. Intelligent character recognition (ICR) enables handwritten input to be recognized. Our experience in text
form processing shows this technology to be capable of a high degree of reliability. Even more efficiency can be gained, whether through EDI or capture-at-source techniques, so the information on the form can be maintained in machine-readable form at all
times and communicated electronically where needed.
Figure 10.3 shows a typical document imaging system. Information is entered into the system from a scanner. The scanner, similar to a fax machine, converts the paper image into digital form. This image is stored on a permanent medium, such as a
magnetic or optical disk. Information must be indexed on entry so it can be located after it is stored. The index usually is stored in a relational database on a high-speed magnetic disk. Access to stored images is always initiated by an index search.
High-resolution screens enable users to view the images after storage. Laser printers are used to recreate the image on paper as required.
Document images are stored and accessed through standard data access requests. The only difference between the image of an application form and the textual information keyed from the form is the amount of space required to store the image. Typically, a
black-and-white image occupies 35K of storage space. The keyed data from a form typically occupies less than 2K of storage space.
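The split between the small keyed index and the bulky image, and the rule that access is always initiated by an index search, can be sketched as follows. The table columns and document ids are hypothetical; in a real system the index would live in an RDBMS and the images on an optical or magnetic image server.

```python
# Sketch of index-driven image retrieval (names hypothetical): the ~2K of
# keyed data lives in a fast index; the ~35K image lives in bulk storage.

index = [  # stands in for an RDBMS index table on fast magnetic disk
    {"doc_id": 101, "customer": "A. Smith", "doc_type": "application"},
    {"doc_id": 102, "customer": "B. Jones", "doc_type": "application"},
]

image_store = {  # stands in for the optical/magnetic image server
    101: b"...roughly 35K of scanned bitmap data...",
    102: b"...roughly 35K of scanned bitmap data...",
}

def find_images(**criteria):
    """Access is always initiated by an index search, never a scan of images."""
    hits = [row for row in index
            if all(row.get(k) == v for k, v in criteria.items())]
    return [(row["doc_id"], image_store[row["doc_id"]]) for row in hits]

matches = find_images(customer="A. Smith")
print([doc_id for doc_id, _ in matches])  # [101]
```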
As Figure 10.3 illustrates, images can be accessed by any workstation with access to the image server. Note that the image server replaces the filing cabinet but provides the additional advantage of allowing multiple access to the same documents or
folders. The movement toward standards for the creation, distribution, indexing, printing, display, and revision of images has enabled a large number of vendors to enter the market with products. This has led to a dramatic reduction in the price of these
components. Figure 10.4 plots this price change.
Figure 10.3. A typical document imaging system.
Figure 10.4. Declining costs of imaging peripherals.
The concepts of electronic image management relate to the manipulation of information contained in forms, blueprints, x-rays, microfilm, fingerprints, photographs, and typewritten or handwritten notes. Electronic document management adds the capability
to manipulate information from other media, such as audio and video. In addition, the "folder" device gives the user an intuitive interface to information equivalent to, but more flexible than, a filing cabinet. The information is always
available on the desktop. Several users can use the folder simultaneously. Folders are always refiled in the correct location. Billions of these documents exist and are used daily in the process of providing government services. Consider that the Los
Angeles County municipal hospitals alone have 5 billion pieces of paper, x-rays, scans, photographs, audio reports, and videos (and so on) filed to maintain patient records. Currently, the cost of this technology is prohibitively high for most
organizations, but these systems will come down in price as all computer components do.
Figure 10.5 illustrates the range of information sources that can be manipulated digitally. To make efficient and effective use of this information, the means must exist for rapid filing, retrieval, and sharing of this information among all persons.
This is the principle of making information available only to those with a "need and a right to know."
Figure 10.5. Multimedia technologies.
Electronic mail can be delivered routinely in seconds anywhere in the United States. Consumers can have direct access to suppliers. Goods can be ordered and paid for electronically. A retired engineer in Northern California can teach algebra to
disadvantaged children in Compton, located in the southern part of the state. A parent can deliver office work to an employer in downtown Los Angeles while he cares for children at home. Library and museum materials can be explored at the users' own pace,
with their personal interests in mind, to tap into a rich assortment of interactive, graphical how-to lessons. The community library can provide the conduit to government services: taking drivers' photographs and producing drivers' licenses on-site,
producing birth certificates, or searching the titles on properties. Lawyers can file case data, review their calendars, or locate their clients' criminal records all from the law office and under the safeguards provided by electronic passwords, user
auditing, and caller ID validation.
Each of these functions can be conveniently provided to citizens and consumers without them having to travel to an office location. The cost savings and environmental impact of this convenience are important considerations in today's society.
Businesses no longer need to rent or buy expensive office space close to clients and suppliers. Individuals can live where they want and commute electronically to work. It is easy to imagine how the provision of these services in such a convenient manner
can generate significant revenues that more than offset the cost of providing the service.
High-speed communications networks can provide the capability to distribute information other than voice conversations throughout a county, state, or country. With the advent of fiber-optic cabling, the capacity for information distribution to a
location, office, library, or home is essentially infinite. As this technology becomes readily available, we will be able to consider where best to store and use information without concern for transmission time or quality. This is particularly true within
a small geographical area, such as a county where the "right of way" is owned and private fiber-optic networks can be installed. High-speed networks in conjunction with new standards for data integrity ensure that information can be stored
throughout the network and properly accessed from any point in the network.
Electronic documents can be transmitted and received just like any other digital information. The same networks and personal computers can send and receive. The major stumbling blocks to widespread sharing of electronic documents have been the
incompatible formats in which various vendors store and distribute the digital image and the lack of a central repository of indexes to the documents. These indexes should describe the document content to enable users to select the correct folder and document.
Most information used by business and government today is contained in formats that cannot be manipulated through traditional data-processing techniques. This is consistent with the "need and a right to know," mentioned earlier. Los Angeles
County, for example, decided to overcome these problems through the definition of standards that must be adhered to by all products acquired for county projects.
An area of explosive growth, coincident with the availability of high-powered workstations and RISC servers, is full-text retrieval. Originally a technology used by the military to scan covert transmissions and media communications, full-text
retrieval is now a mainstream technology. Vendors such as Fulcrum and PLS have packaged their technology to be used in more traditional business applications. Northern Telecom bundles a product, Helmsman, with all its switch documentation to facilitate
document access. All major news services provide an electronic feed of their information. This information is continuously scanned by reporters, businesses, and government offices to identify significant events or locate trends. Dow Jones provides their
news retrieval system with access to 10^12 bytes (that's three billion pages) of textual data. Many criteria searches can be run against all this text in a few seconds. Figure 10.6 illustrates the flow of information in a full-text retrieval application.
The major hardware and software technologies that have made this technology production viable are Optical Character Recognition (OCR), ICR, optical storage, powerful workstations, large D-RAM, software algorithms, and high-resolution monitors. OCR and
ICR technologies convert the paper documents to text files. Companies such as Calera provide software to convert typewritten documents directly into WordPerfect format. Recent improvements in these algorithms provide support for most major fonts. Improvements in handwriting recognition promise to enable users to enter data from handwritten documents as well. Calera provides a fax link that enables documents to be entered by way of OCR as they are received from a fax. Mitek provides high-speed ICR
engines to be used with document workflow applications. Embedded diagrams are maintained in image format.
Figure 10.6. Text management process.
Full-text indexing of documents is a CPU-intensive function, and the availability of low-cost, high-powered workstations has made the technology viable. (See Figure 10.7.) PC products such as Lotus Magellan enable the casual user to create full-text
indexes of all their files. Viewers and launchers within the products enable users to access these files in their native format and manipulate them using a data editor of choice. With the advent of Object Linking and Embedding (OLE 2.x) and CORBA-based
object solutions such as DOE, full-text access will become much more common to support capture and inclusion of source material. For high-performance retrievals, the indexes must support boolean search requests. The availability of large and low-cost D-RAM
provides the necessary environment. High-resolution monitors are necessary as we move to a multiwindowed environment using facilities such as OLE and DOE. Extensive use of these facilities will not be viable without the appropriate resolution, because
eyestrain will discourage use. We recommend Super VGA, a resolution of 1024 by 768, as a minimum for this type of multiwindowed work.
Figure 10.7. Text management process.
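The full-text indexing and boolean retrieval described above can be sketched with an inverted index. The documents and query terms below are invented for illustration; production engines add stemming, phrase search, and relevance ranking on top of this core idea.

```python
# Minimal sketch of full-text retrieval: build an inverted index, then
# answer boolean AND/OR queries against it (all document text hypothetical).

def build_index(docs):
    """Map each word to the set of document ids containing it."""
    index = {}
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

docs = {
    1: "merger announced between two rail carriers",
    2: "quarterly earnings announced for rail sector",
    3: "new airline route opens",
}
index = build_index(docs)

# Boolean retrieval: AND intersects posting sets, OR unions them.
both = index.get("announced", set()) & index.get("rail", set())
either = index.get("airline", set()) | index.get("merger", set())
print(sorted(both), sorted(either))  # [1, 2] [1, 3]
```

Because each query reduces to set operations on precomputed postings, searches over very large text collections can complete in seconds, at the cost of the CPU-intensive indexing pass noted above.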
In the more than 40 years since the introduction of the stored program computer in 1951, we have seen tremendous advances in the capabilities of this technology. Computers have proven over and over that they can add numbers at mind-numbing rates. We
have extrapolated from this capability the functionality to maintain accounts, calculate bills, print checks, and create memos. All this functionality has enabled organizations to grow and do more work with fewer clerical and administrative staff.
As the world economy becomes more integrated, goods and services are provided by companies and individuals from all parts of the world. Consumers can and will buy the most cost-effective quality product and service available. This substantially
increases the necessity for organizations to demonstrate their value. Western economies, with their higher salaries and cost of plant, are particularly threatened by this trend. However, Western economies have the advantage of a highly educated population.
Educated staff are willing to accept decision-making responsibility and are better able to adapt to change. The challenge is to find ways in which technology can enable the West to capitalize on these advantages.
Many organizations and industries are finding solutions that use client/server technology to truly transform the working environment. The following are brief examples of business solutions and technology partnerships that apply this technology to
fundamentally change the business process. Several of these examples are further described in Appendix A, along with other client/server project examples.
Emergency (E911) dispatch operators are responsible for sending the right emergency response vehicles to an incident as quickly as possible and at the same time dealing with the crisis being reported over the telephone. This functionality must be
provided 24 hours per day, 365 days per year, with the maximum possible performance.
As you can imagine, most 911 callers are in a state of anxiety. The telephone switch provides the caller's telephone number and address to the dispatcher workstation. Traditional technical design of a 911 system involves the use of redundant
minicomputers connected to character-based terminals. This design solution provides the benefits of fault tolerance and high performance with the costs of complex user interfaces, considerable redundancy, and excess capacity.
Through the use of client/server computing, it is now possible to duplicate all of the functionality of such an existing traditional design with the additional advantages of better performance, a graphical user interface (GUI), a single point of
contact, higher reliability, and lower costs. With a client/server-based system, the dispatch operator is empowered to oversee how staff and equipment are allocated to each incident. The operator uses a GUI to dynamically alter vehicle selection and
routing. Maps may be displayed that show the location of all incidents, emergency response centers, and vehicles. Vehicles are tracked using automatic vehicle locator (AVL) technology. Obstacles, such as traffic congestion, construction, and environmental
damage (such as earthquakes) are shown on the map so the dispatcher can see potential problems at a glance.
The implementation of such an E911 service can dramatically improve the rate at which emergency calls can be answered and reduce the incidence of unnecessary dispatches. Workstation technology provides the dispatcher with a less stressful and more
functional user interface. The dispatcher can respond quickly to changes in the environment and communicate this information immediately to the vehicle operator. The system is remarkably fault-tolerant: as long as even a single workstation is operating, the dispatcher can continue to send emergency vehicles to the incident. This architecture is general enough to apply to any application that has reasonable quantities of transient data.
Electronic data interchange (EDI) technology enables unrelated organizations to conduct their business computer to computer without the need to use the same computer applications or technology in all locations. Combining just in time
(JIT) manufacturing with EDI truly transforms the process:
With EDI, a single entry by the person closest to the customer causes the facilities of the manufacturer and its suppliers to schedule appropriate production, shipping, and billing. The maximum possible time is allowed for all parties to process the
order, thus reducing their need to carry inventory. A further advantage comes when production is driven by orders, because only those products that will actually be sold are manufactured. Manufacturers are able to offer more flexibility in product
configuration, because they are manufacturing to order. The use of EDI standards allows organizations to participate in this electronic dialog regardless of differences among their individual technologies or application systems.
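The fan-out from a single order entry to production, supplier, and billing messages can be sketched as follows. The message types and bill of materials are hypothetical placeholders, not actual EDI standard transaction sets such as those defined by X12 or EDIFACT.

```python
# Hypothetical sketch of the EDI/JIT flow above: one order entry generates
# all the downstream messages for manufacturer, suppliers, and billing.

def place_order(product, qty, bill_of_materials):
    """A single entry fans out into scheduled production, purchasing, and billing."""
    messages = [{"to": "manufacturer", "type": "PRODUCTION_ORDER",
                 "product": product, "qty": qty}]
    # Each supplier receives an order sized to actual demand, not forecast,
    # so no party needs to carry speculative inventory.
    for part, per_unit in bill_of_materials.items():
        messages.append({"to": f"supplier:{part}", "type": "PURCHASE_ORDER",
                         "part": part, "qty": qty * per_unit})
    messages.append({"to": "customer", "type": "INVOICE",
                     "product": product, "qty": qty})
    return messages

msgs = place_order("widget", 100, {"gear": 2, "case": 1})
print([m["type"] for m in msgs])
# ['PRODUCTION_ORDER', 'PURCHASE_ORDER', 'PURCHASE_ORDER', 'INVOICE']
```

In practice each message would be translated into a standard EDI format before transmission, which is what lets the trading partners keep their differing internal systems.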
Financial analysts are overloaded with data. It is impossible for them to process all the data received. They must read it, looking for gems of information. Powerful workstation technology enables these analysts to specify personal filters to be
applied against the data in order to present only information of likely interest and to present it in order of most likely interest. These filters provide search criteria specific to each analyst and provide only information satisfying the filter criteria
to the analyst.
Improvements in technology enable the data to be scanned in real time. Alerts can be generated to the analyst whenever a significant event is detected. In this way, the analyst's job is transformed. He or she is now concerned with developing the rules
to drive the filters and with understanding how to react to the significant events that are detected. Meaningful and useful data is available to support the analyst's decision making. He or she has more time to make informed decisions.
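A personal filter of the kind described can be sketched as a scoring rule over the incoming feed. The keywords, feed items, and threshold below are invented for illustration; a real analyst's filter would carry far richer criteria.

```python
# Sketch of a personal news filter (criteria and feed are hypothetical):
# only items satisfying the analyst's rules surface, best match first.

def filter_feed(items, keywords, min_score=1):
    """Score each item by keyword hits; keep items at or above the threshold."""
    scored = []
    for item in items:
        words = item.lower().split()
        score = sum(words.count(k) for k in keywords)
        if score >= min_score:
            scored.append((score, item))
    return [item for score, item in sorted(scored, reverse=True)]

feed = [
    "central bank raises interest rates again",
    "local sports team wins championship",
    "bond markets react to interest rate decision",
]
alerts = filter_feed(feed, ["interest", "rates", "bond"])
print(len(alerts))  # 2
```

Run continuously against a real-time feed, the same scoring rule becomes the alert generator: the analyst tunes the keywords and threshold rather than reading every item.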
This book discussed the vision of an application of technology that provides a single-system image view to all users of the technology. In the single-system model, each user has access to all applications for which he or she has a "need and
right" of access, without regard to the technology of the workstation, the network, or the location of the business data and logic. In this model, technology is treated as a commodity to be chosen for its price, performance, and functionalitynot
for the color of its box.
Achieving this vision requires the system developer to be equally insensitive to the technology. If the developer is aware of the specific technology, he or she will develop in a manner specific to that technology. In the single-system image model, the
developer needs to know only the syntax in which the business logic is specified. Through client/server technology available today, it is possible for developers to design and develop systems to support this single-system image concept.
Attention to industry standards and the creation and use of a development environment that isolates the user from the technology is mandatory to enable platform technology to become a commodity. Object-oriented technology recognizes this fact and
offers the future promise of systems that are generated for an arbitrary target platform. Technology buyers will now be in control of their purchasing decisions and not subject to the whim of their current supplier. Applications can be developed in a
scalable manner and implemented on a platform appropriate for the workload at a particular location.
[1] Robert Orfali and Dan Harkey, Client-Server Programming with OS/2 (New York: Van Nostrand Reinhold, 1991), p. 75.