
— 1 —
The Business Opportunity

Executive Summary

We are in the midst of a fundamental change in both technology and its application. Organizations today expect to get more value from their investments in technology. In the "postscarcity era of computing"1 the availability of processing power is not a constraint. Cost of platform technology has become a minor factor in selecting among alternatives to build the business solution. The constraining factors are the organizational impact of reengineering the business process and the costs and time required for system development. In addition, the need to re-educate personnel to the required level of expertise can be an extremely expensive proposition.

Open systems enable organizations to buy off-the-shelf solutions to business problems. Open systems standards define the format in which data is exchanged, remote systems are accessed, and services are invoked. The acceptance of open systems standards supports the creation of system architectures that can be built from technology components. These standards enable organizations, for example, to integrate products from several vendors into a single business solution.

Contrary to the claims of groups ranging from the Open Software Foundation (OSF) to the user/vendor consortium Open User Recommended Solutions (OURS), open systems are not exclusively systems that conform to OSF or OURS committee recommendations, or necessarily to UNIX specifications.

The client/server model makes the enterprise available at the desk. It provides access to data that the previous architectures did not. Standards have been defined for client/server computing. If these standards are understood and used, organizations can reasonably expect to buy solutions today that can grow with their business needs without the constant need to revise the solutions. Architectures based on open systems standards can be implemented throughout the world, as global systems become the norm for large organizations.2 While a supportable common platform on a global scale is far from standardized, it certainly is becoming much easier to accomplish. From the desktop, enterprise-wide applications are indistinguishable from workgroup and personal applications.

Powerful enabling technologies with built-in conformance to open systems standards are evolving rapidly.

Examples include object-oriented development, relational and object-oriented databases, multimedia, imaging, expert systems, geographic information systems (GIS), voice recognition and voice response, and text management. These technologies provide the opportunity to integrate their generic capabilities—with the particular requirements of an organization—to create a cost-effective and customized business solution. The client/server model provides the ideal platform with which to integrate these enabling technologies. Well-defined interface standards enable integration of products from several vendors to provide the right application solution.

Enterprise systems are those that create and provide a shared information resource for the entire corporation. They do not imply centralized development and control, but they do treat information and technology as corporate resources. Enterprise network management requires all devices and applications in the enterprise computing environment to be visible and managed. This remains a major challenge as organizations move to distributed processing. Standards are defined and are being implemented within the client/server model. Client/server applications give greater viability to worker empowerment in a distributed organization than do today's host-centered environments.

Driving Forces in the 1990s

Opportunities are available to organizations and people who are ready and able to compete in the global market. A competitive global economy will ensure obsolescence and obscurity for those who cannot or will not compete. All organizations must look for ways to demonstrate value. We are finally seeing a willingness to rethink existing organizational structures and business practices.

Organizations are aggressively downsizing even as they try to expand their revenue base.

There is more willingness to continue improvement practices and programs to eliminate redundancy and increase effectiveness. Organizations are becoming market-driven while remaining true to their business vision.

To be competitive in a global economy, organizations in developed economies must employ technology to gain the efficiencies necessary to offset their higher labor costs. Reengineering the business process to provide information and decision-making support at points of customer contact reduces the need for layers of decision-making management, improves responsiveness, and enhances customer service.

Empowerment means that knowledge and responsibility are available to the employee at the point of customer contact. Empowerment will ensure that product and service problems and opportunities are identified and addressed. Client/server computing is the most effective source for the tools that empower employees with authority and responsibility.

The following are some key drivers in organizational philosophy, policies, and practices.

Business Process Reengineering

Competitiveness is forcing organizations to find new ways to manage their business in the face of fewer personnel, more outsourcing, a market-driven orientation, and rapid product obsolescence. Technology can be the enabler of organizational nimbleness.

Globalization—The World as a Market

To survive and prosper in a world where trade barriers are being eliminated, organizations must look for partnerships and processes that are not restrained by artificial borders. Quality, cost, product differentiation, and service are the new marketing priorities. Our information systems must support these priorities.

Operational Systems—Competition for Investment Dollars

Competition demands that information systems organizations justify their costs. Companies are questioning the return on their existing investments. Centralized IS operations in particular are under the microscope.

Market Driven—Flexible to Meet Needs

Product obsolescence has never been so pressing a factor. Buyers have more options and are more demanding.

Technology must enable organizations to anticipate demand and meet it.

Downsized Organizational Structure

Quality and flexibility require decisions to be made by individuals who are in touch with the customer. Many organizations are eliminating layers of middle management. Technology must provide the necessary information and support to this new structure.

Enterprise Network Management

If a business is run from its distributed locations, the technology supporting these units must be as reliable as the existing central systems. Technology for remote management of the distributed technology is essential in order to use scarce expertise appropriately and to reduce costs.

Information and Technology Viewed as a Corporate Asset

Each individual must have access to all information he or she has a "need and right" to access, without regard to where it is collected, determined, or located. We can use technology today to provide this "single-system image" of information at the desk, whatever the technology used to create it.

Cost Competitive—New Offerings

Standardization has introduced many new suppliers and has dramatically reduced costs. Competition is driving innovation. Organizations must use architectures that take advantage of cost-effective offerings as they appear.

Increasing Power and Capacity of Workstations

Desktop workstations now provide the power and capacity that mainframes did only a few years ago. The challenge is to effectively use this power and capacity to create solutions to real business problems.

Growing Importance of Workgroup Computing

Downsizing and empowerment require that the workgroup have access to information and work collectively. Decisions are being made in the workplace, not in the head office.

Expanded Network Access

Standards and new technologies enable workstation users to access information and systems without regard to location. Remote network management enables experts to provide support and central-system-like reliability to distributed systems. However, distributed systems are not transparent: data access across a network often returns unpredictable result sets, and performance on existing networks is often inadequate, requiring a retooling of the existing network infrastructure to support the new data access environment.

Open Systems—Multivendor Environment

Standards enable many new vendors to enter the market. With a common platform target, every product has the entire marketplace as a potential customer. With the high rate of introduction of products, it is certain that organizations will have to deal with multiple vendors. Only through a commitment to standards-based technology will the heterogeneous multiple vendor environment effectively service the buyer.

Client/Server Computing

Workstation power, workgroup empowerment, preservation of existing investments, remote network management, and market-driven business are the forces creating the need for client/server computing.

The technology is here; what is missing is the expertise to effectively apply it.

Major Issues of the 1990s

Organizational pressures to demonstrate value apply as much to the information systems (IS) functions as to any other element or operating unit of the business. This is a special challenge because most IS organizations have not previously experienced strong financial constraints, nor have they been measured for success using the same business justification "yardstick" as other value-creating units within the business enterprise. IS has not been under the microscope to prove that the role it plays truly adds value to the overall organization. In today's world, organizations that cannot be seen to add value are either eliminated or outsourced.

Complexity and Delivery Cost of IS Services

Fortune 1000 companies, on average, spend 90 percent of IS dollars maintaining existing systems. Major business benefits, however, are available only from "new" systems. Dramatic reductions in the cost of technology help cost-justify many systems. Organizations that adapt faster than their competitors demonstrate value and become the leaders in their marketplace. Products and services command a premium price when these organizations are "early to market." As they become commodities, they attract only commodity prices. This is true both of commercial organizations wishing to be competitive in the market with their products and of service organizations wishing to demonstrate value within their department or government sector.

Wise Use of Existing Investments

"It only took God seven days to create the world because he didn't have an existing environment to deal with."3 Billions of dollars have been invested in corporate computing infrastructure and training. This investment must be fully used. Successful client/server solutions integrate with the existing applications and provide a gradual migration to the new platforms and business models.

Connectivity—Management of Distributed Data Resources

To meet the goals of the 1990s, organizations are downsizing and eliminating middle-management positions. They want to transfer responsibility to empower the person closest to the customer to make decisions. Historically, computer systems have imposed the burden of data collection and maintenance on the front-line work force but have husbanded information in the head office to support decision making by middle management. Information must be made available to the data creators and maintainers by providing the connectivity and distributed management of enterprise databases and applications. The technology of client/server computing will support the movement of information processing to the direct creators and users of information.

Online Transaction Processing (OLTP)

OLTP applications traditionally have been used in insurance, financial, government, and sales-related organizations. These applications are characterized by their need for highly reliable platforms that guarantee that transactions will be handled correctly, no data will be lost, response times will be extremely low (less than three seconds is a good rule of thumb), and only authorized users will have access to an application. The IS industry understands OLTP in the traditional mainframe-centered platforms but not in the distributed client/server platforms.

Mission-Critical Applications

Organizations do (and will continue to) rely on technology to drive business. Much of the IS industry does not yet understand how to build mission-critical applications on client/server platforms. As organizations move to employee empowerment and workgroup computing, the desktop becomes the critical technology element running the business. Client/server applications and platforms must provide mainframe levels of reliability.

Executive Information Systems (EIS)

Executive information systems provide a single-screen view of "how well we are doing" by comparing the mass of details contained in their current and historical enterprise databases with information obtained from outside sources about the economy and competition. As organizations enter into partnerships with their customers and suppliers, the need to integrate with external systems becomes essential in order to capture the necessary information for an effective EIS.

Decision Support Systems (DSS)

Organizations want to use the EIS data to make strategic decisions. The DSS should provide "what if" analyses to project the results of these decisions. Managers define expectations, and the local processing capability generates decision alerts when reality does not conform. This is the DSS of the client/server model.

Enterprise Solutions

Information is now recognized as a corporate resource. To be truly effective, organizations must collect data at the source and distribute it, according to the requirements of "need and right to access," throughout the organization. Workgroups will select the platforms that best meet their needs, and these platforms must integrate to support the enterprise solution. Systems built around open systems standards are essential for cost-effective integration.

Single-System Image

Los Angeles County issued a request for information (RFI) stating simply that its goal was "to implement and operate a modern telecommunications network that creates a seamless utility for all County telecommunications applications—from desktop to desktop."4

The United States government has initiated a project—the National Information Infrastructure (NII)—that has the simple objective of "making the intellectual property of the United States available to all with a need and right to access."5

"Computers will become a truly useful part of our society only when they are linked by an infrastructure like the highway system and the electric power grid, creating a new kind of free market for information services."6

The feature that makes the highway and electric power grids truly useful is their pervasiveness. Every home and office has ready access to these services; thus, they are used—without thought—in the normal course of living and working. This pervasive accessibility has emerged largely because of the adoption of standards for interconnection. If there were no standards for driving, imagine the confusion and danger.

What if every wall plug were a different shape, or the power available on every plug were random? If using a service requires too much thought and attention, that service cannot become a default part of our living and working environment.

"Imagine the United States without its highways. Our millions of cars, buses, and trucks driven in our own backyards and neighborhood parking lots, with occasional forays by the daring few along uncharted, unpredictable, and treacherous dirt roads, full of unspeakable terrors."7 The parking lot analogy illustrated in Figure 1.1 represents the current information-processing environment in most organizations.

It is easy and transparent to locate and use information on a local area network (LAN), but information located on another LAN is almost inaccessible. End-user access to enterprise data often is unavailable except for predefined information requests. Although computers—from mainframes to PCs—are numerous, powerful, flexible, and widely used, they are still used in relative isolation. When they communicate, they usually do so ineffectively, through arcane and arbitrary procedures.

Figure 1.1. Islands of automation.

Information comes with many faces. As shown in Figure 1.2, it can take the form of text, drawings, music, speech, photographs, stock prices, invoices, software, live video, and many other entities. Yet once information is computerized, it becomes a deceptively uniform sequence of ones and zeros. The underlying infrastructure must be flexible in the way it transports these ones and zeros. To be truly effective—besides routing these binaries to their destinations—the infrastructure must be able to carry binaries with varying degrees of speed, accuracy, and security to accommodate different computer capabilities and needs.

Because computers are manufactured and sold by vendors with differing views on the most effective technology, they do not share common implementation concepts. Transporting ones and zeros around, however flexibly, isn't enough. Computers based on different technologies cannot comprehend each other's ones and zeros any more than people comprehend foreign languages. We therefore need to endow our IS organizations with a set of widely understood common information interchange conventions. Moreover, these conventions must be based on concepts that make life easier for humans, rather than for computer servants. Finally, the truly useful infrastructure must be equipped with "common servers"—computers that provide a few basic information services of wide interest, such as computerized white and yellow pages.

Figure 1.2. Multimedia technologies.

Technological innovation proceeds at a pace that challenges the human mind to understand how to take advantage of its capabilities. Electronic information management, technological innovation in the personal computer, high-speed electronic communication, and digital encoding of information provide new opportunities for enhanced services at lower cost.

Personal computers can provide services directly to people who have minimal computer experience. They provide low-cost, high-performance computing engines at the site where the individual lives, works, or accesses the service—regardless of where the information is physically stored. Standards for user interface, data access, and interprocess communications have been defined for the personal computer and are being adopted by a majority of the vendor community. There is no reason to accept solutions that do not conform to the accepted standards.

Most large organizations today use a heterogeneous collection of hardware, software, and connectivity technologies. There is considerable momentum toward increased use of technology from multiple vendors.

This trend leads to an increasingly heterogeneous environment for users and developers of computer systems. Users are interested in the business functionality, not the technology. Developers rarely are interested in more than a subset of the technology. The concept of the single-system image says that you can build systems that provide transparency of the technology platform to the user and—to the largest extent possible—to the developer.

Developers will need sufficient knowledge of the syntax used to solve the business problem, but will need little or no knowledge of the underlying technology infrastructure. Hardware platforms, operating systems, database engines, and communications protocols are necessary technological components of any computer solution, but they should provide services—not create obstacles to getting the job done. Services should be masked; that is, they should be provided in a natural manner without requiring the user to make unnatural gyrations to invoke them. Only by masking these services and by using standard interfaces can we hope to develop systems quickly and economically. At the same time, masking (known as encapsulation in object-oriented programming) and standard interfaces preserve the ability to change the underlying technology without affecting the application. There is value in restricting imagination when you build system architectures. Systems development is not an art; it is an engineering discipline that can be learned and used. Systems can be built on the foundations established by previous projects.
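The masking idea described above can be sketched in a few lines of Python. The class and method names here are hypothetical illustrations, not real products: application code depends only on a standard interface, so the underlying technology can be replaced without touching the caller.

```python
from abc import ABC, abstractmethod

# A standard interface masks the underlying technology: callers invoke
# services through it and never see which engine does the work.
class RecordStore(ABC):
    @abstractmethod
    def get(self, key: str) -> str: ...

class MainframeStore(RecordStore):
    def get(self, key: str) -> str:
        return f"host[{key}]"   # stand-in for a host database lookup

class LanStore(RecordStore):
    def get(self, key: str) -> str:
        return f"lan[{key}]"    # stand-in for a LAN database query

def lookup_customer(store: RecordStore, key: str) -> str:
    # Application code depends only on the interface; swapping the
    # platform underneath does not change this function.
    return store.get(key)
```

This is the encapsulation referred to in the text: `lookup_customer` compiles and runs unchanged whichever store is plugged in beneath it.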

Within the single-system image environment, a business system user is totally unaware of where data is stored, how the client and server processors work, and what networking is involved in gaining connectivity. How is this transparency accomplished?

Services are provided by the virtual "cloud" server in the sky. Figure 1.3 illustrates the user view of these services. The workstation on the desk appears to provide all services, "the enterprise at the desk."

Figure 1.3. Single-system image.

The complexity of a heterogeneous computing platform will result in many interfaces at both the logical and physical level. Organizations evolve from one platform to another as the industry changes, as new technologies evolve that are more cost effective, and as acquisitions and mergers introduce other installed platforms. All these advances must be accommodated. There is complexity and risk when attempting to interoperate among technologies from many vendors. It is necessary to engage in "proof of concept" testing to distinguish the marketing version of products and architectures from the delivered version.

Many organizations use a test lab concept called technology competency centers (TCCs) to do this "proof of concept." The TCC concept provides a local, small-scale model of all the technologies involved in a potential single-system, interoperable image.

Installing a proposed solution using a TCC is a low-cost means of ensuring that the solution is viable.

These labs enable rapid installation of the proposed solution into a proven environment. They eliminate the need to set up from scratch all the components that are necessary to support the unique part of a new application. Organizations—Merrill Lynch, Health Canada, SHL Systemhouse, BSG Corporation, Microsoft, and many others—use such labs to do sanity checks on new technologies. The rapid changes in technology capability dictate that such a resource be available to validate new products.

Client/Server Computing

The single-system image is best implemented through the client/server model. Our experience confirms that client/server computing can provide the enterprise to the desktop. Because the desktop computer is the user's view into the enterprise, there is no better way to guarantee a single image than to start at the desktop.

Unfortunately, it often seems as if the number of definitions of client/server computing depends on how many organizations you survey, whether they're hardware and software vendors, integrators, or IS groups. Each has a vested interest in a definition that makes its particular product or service an indispensable component.

Throughout this book, the following definitions will be used consistently:

Client/server computing is an environment that satisfies the business need by appropriately allocating the application processing between the client and the server processors. The client requests services from the server; the server processes the request and returns the result to the client. The communications mechanism is a message passing interprocess communication (IPC) that enables (but does not require) distributed placement of the client and server processes. Client/server is a software model of computing, not a hardware definition.

This definition makes client/server a rather generic model and fits what is known in the industry as "cooperative processing" or "peer-to-peer."

Because the client/server environment is typically heterogeneous, the hardware platform and operating system of the client and server are not usually the same. In such cases, the communications mechanism may be further extended through a well-defined set of standard application program interfaces (APIs) and remote procedure calls (RPCs).
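The request-reply flow in this definition can be illustrated with a minimal message-passing sketch. This Python example uses a local socket pair as the IPC purely for illustration; the same software model applies whether the two processes share a machine or sit across a network.

```python
import socket
import threading

def server(conn: socket.socket) -> None:
    # The server processes the request message and returns the result.
    req = conn.recv(1024).decode()
    conn.sendall(f"OK:{req.upper()}".encode())
    conn.close()

# A socket pair stands in for any message-passing IPC mechanism.
client_sock, server_sock = socket.socketpair()
t = threading.Thread(target=server, args=(server_sock,))
t.start()

# The client requests a service and waits for the reply.
client_sock.sendall(b"get balance")
reply = client_sock.recv(1024).decode()
t.join()
print(reply)  # OK:GET BALANCE
```

Note that nothing in the client's code depends on where the server runs; that placement independence is what makes client/server a software model rather than a hardware definition.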

The modern diagram representing the client/server model was probably first popularized by Sybase.

Figure 1.4 illustrates the single-system image vision. A client-user relies on the desktop workstation for all computing needs. Whether the application runs totally on the desktop or uses services provided by one or more servers—be they powerful PCs or mainframes—is irrelevant.

Effective client/server computing will be fundamentally platform-independent. The user of an application wants the business functionality it provides; the computing platform provides access to this business functionality. There is no benefit, yet considerable risk, in exposing this platform to its user.

Changes in platform and underlying technology should be transparent to the user. Training costs, business processing delays and errors, staff frustration, and staff turnover result from the confusion generated by changes in environments where the user is sensitive to the technology platform.

Figure 1.4. A modern client/server architecture.

It is easily demonstrated that systems built with transparency to the technology, for all users, offer the highest probability of solid ongoing return for the technology investment. It is equally demonstrable that if developers become aware of the target platform, development will be bound to that platform. Developers will use special features, tricks, and syntax found only in the specific development platform.

Tools that isolate developers from the specifics of any single platform assist them in writing transparent, portable applications. These tools must be available for each of the three essential components in any application: data access, processing, and interfaces. Data access includes the graphical user interface (GUI) and stored data access. Processing includes the business logic. Interfaces link services with other applications. This simple model, reflected in Figure 1.5, should be kept in mind when following the evolution to client/server computing.

The use of technology layers provides this application development isolation. These layers isolate the characteristics of the technology at each level from the layer above and below. This layering is fundamental to the development of applications in the client/server model. The rapid rate of change in these technologies and the lack of experience with the "best" solutions implies that we must isolate specific technologies from each other. This book will continue to emphasize and expand on the concept of a systems development environment (SDE) as a way to achieve this isolation. Figure 1.6 illustrates the degree of visibility to specific technology components required by the developers.
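The isolation that layering provides can be sketched as follows; the layer and field names are invented for illustration. Each function sees only the layer directly below it, so replacing the technology at one level leaves the levels above untouched.

```python
# Hypothetical three-layer sketch: each layer calls only the layer
# below, so a technology at one level can be replaced without
# touching the layer above.

def data_layer(key: str) -> float:
    # Stands in for a database engine behind a standard access API.
    return {"rate": 0.05}[key]

def logic_layer(amount: float) -> float:
    # Business rules; knows nothing of how the data is stored.
    return amount * (1 + data_layer("rate"))

def presentation_layer(amount: float) -> str:
    # Formats results; knows nothing of the rules or the storage.
    return f"Total: {logic_layer(amount):.2f}"

print(presentation_layer(100))  # Total: 105.00
```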

Figure 1.5. Simplified application model.

Figure 1.6. Degree of technology visibility to developer.

Developer tools are by far the most visible. Most developers need to know only the syntax of these tools to express the business problem in a format acceptable to the technology platform. With the increasing involvement of noncomputer professionals as technology users and application assemblers, technology isolation is even more important. Very few—perhaps none—of an organization's application development staff needs to be aware of the hardware, system software, specific database engines, specific communications products, or specific presentation services products. These are invoked through the APIs, message passing, and RPCs generated by tools or by a few technical specialists.

As you will see in Chapter 6, the development of an application architecture supported by a technical architecture and systems development environment (SDE) is the key to achieving this platform independence and ultimately to developing successful client/server applications.

As organizations increase the use of personal productivity tools, workstations become widely installed. The need to protect desktop real estate requires that host terminal capabilities be provided by the single workstation. It soon becomes evident that the power of the workstation is not being tapped and application processing migrates to the desktop. Once most users are connected from their workstation desktop to the applications and data at the host mainframe or minicomputer, there is significant cost benefit in offloading processing to these powerful workstations. The first applications tend to be data capture and edit. These simplify—but still use—the transaction expected by an already existing host application. If the workstation is to become truly integrated with the application, reengineering of the business process will be necessary. Accounting functions and many customer service applications are easily offloaded in this manner. Thus, workgroup and departmental processing is done at the LAN level, with host involvement for enterprise-wide data and enforcement of interdepartmental business rules.

Figure 1.7 illustrates an increasingly rare viewpoint of tradition-bound developers and MIS directors who do not yet appreciate the role of workstations as an integral part of the application solution. The power of the desktop workstation and client/server technology must be unleashed in order to achieve the cost effectiveness available from the low-cost and high-powered processors available today.

Figure 1.8 illustrates the existing environment in many organizations wherein desktop workstations have replaced the unintelligent terminal to access existing host-based applications.

Figure 1.7. An increasingly rare viewpoint.

Figure 1.8. Existing environment.

In this "dumb" terminal (IBM uses the euphemism nonprogrammable to describe its 327x devices) emulation environment, all application logic resides in the minicomputer or mainframe. Clearly, a desktop workstation costing $5000 or less is capable of much more than the character display provided by a $500 terminal. In the client/server model, the low-cost processing power of the workstation will replace host processing, and the application logic will be divided appropriately among the platforms. As previously noted, this distribution of function and data is transparent to the user and application developer.

Mainframe-Centric Client/Server Computing

The mainframe-centric model uses the presentation capabilities of the workstation to front-end existing applications. The character mode interface is remapped by products such as Easel and Mozart. The same data is displayed or entered through the use of pull-down lists, scrollable fields, check boxes, and buttons; the user interface is easy to use, and information is presented more clearly. In this mainframe-centric model, mainframe applications continue to run unmodified, because the existing terminal data stream is processed by the workstation-based communications API.

The availability of products such as UniKix and IBM's CICS OS/2 and CICS/6000 can enable the entire mainframe processing application to be moved unmodified to the workstation. This protects the investment in existing applications while improving performance and reducing costs.

Character mode applications, usually driven from a block mode screen, attempt to display as much data as possible in order to reduce the number of transmissions required to complete a function. Dumb terminals impose limitations on the user interface, including fixed-length fields, fixed-length lists, crowded screens, single or limited character fonts, limited or no graphics icons, and limited windowing for multiple application display. In addition, the fixed layout of the screen makes it difficult to support the display of conditionally derived information.

In contrast, the workstation GUI provides facilities to build the screen dynamically. This enables screens to be built with a variable format based conditionally on the data values of specific fields. Variable length fields can be scrollable, and lists of fields can have a scrollable number of rows. This enables a much larger virtual screen to be used with no additional data communicated between the client workstation and server.

Windowing can be used to pull up additional information such as help text, valid value lists, and error messages without losing the original screen contents.

The more robust GUI facilities of the workstation enable the user to navigate easily around the screen.

Additional information can be conveyed by varying the display's colors, fonts, graphic icons, scrollable lists, pull-down lists, and option boxes. Option lists can be provided to enable users to select input values quickly. Help can be provided, based on the context and the cursor location, using the same pull-down list facilities.
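The facilities described above can be illustrated with a small sketch that builds a screen description dynamically from the data itself; the widget vocabulary and record fields here are hypothetical, not any particular toolkit's API:

```python
# Build a screen description conditionally from data values rather
# than from a fixed terminal layout. All widget kinds are invented.

def build_screen(record: dict) -> list:
    widgets = [("label", "Order", record["order_no"])]
    # Conditionally derived information: a warning appears only
    # when the data calls for it.
    if record.get("credit_hold"):
        widgets.append(("warning", "Credit hold - contact finance"))
    # A scrollable list presents any number of line items without
    # extra transmissions between client and server.
    widgets.append(("scroll_list", "Items", record.get("items", [])))
    return widgets
```

A block-mode screen, by contrast, would have to reserve fixed space for the warning and for a fixed number of item rows on every transmission.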

Although it is a limited use of client/server computing capability, a GUI front end to an existing application is frequently the first client/server-like application implemented by organizations familiar with the host mainframe and dumb-terminal approach. The GUI preserves the existing investment while providing the benefits of ease of use associated with a GUI. It is possible to provide dramatic and functionally rich changes to the user interface without host application change.

The next logical step is the provision of some edit and processing logic executing at the desktop workstation. This additional logic can be added without requiring changes in the host application and may reduce the host transaction rate by sending up only valid transactions. With minimal changes to the host application, network traffic can be reduced and performance can be improved by using the workstation's processing power to encode the datastream into a compressed form.
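A sketch of this desktop edit logic in Python: the client validates a transaction locally, so only valid transactions are sent up, and compresses the outbound record to reduce network traffic. The field rules and record format are assumptions for illustration, not a real host protocol:

```python
import zlib

def validate(txn: dict) -> list:
    """Return a list of edit errors; an empty list means the
    transaction may be sent to the host."""
    errors = []
    if not txn.get("account", "").isdigit():
        errors.append("account must be numeric")
    if txn.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    return errors

def encode_for_host(txn: dict) -> bytes:
    """Flatten the transaction and compress the datastream."""
    record = "|".join(f"{k}={v}" for k, v in sorted(txn.items()))
    return zlib.compress(record.encode("ascii"))
```

Transactions that fail `validate` never leave the workstation, which is what reduces the host transaction rate; the compressed encoding is what reduces line traffic for those that do.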

A more interactive user interface can be provided, with built-in context-sensitive help, extensive prompting, and interfaces that adapt to the user's level of expertise. These options are added through the use of workstation processing power. Such capabilities enable users to operate an existing system with less intensive training and may even make public access to the applications practical.

Electronic data interchange (EDI) is an example of this front-end processing. EDI enables organizations to communicate electronically with their suppliers or customers. Frequently, these systems provide the workstation front end to deal with the EDI link but continue to work with the existing back-end host system applications. Messages are reformatted and responses are handled by the EDI client, but application processing is done by the existing application server. Productivity may be enhanced significantly by capturing information at the source and making it available to all authorized users. Typically, if users employ a multipart form for data capture, the form data is entered into multiple systems. Capturing this information once to a server in a client/server application, and reusing the data for several client applications can reduce errors, lower data entry costs, and speed up the availability of this information.
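The capture-once pattern described above can be sketched as a server that fans each captured form out to every registered client application, so the data is keyed a single time; all of the names here are hypothetical:

```python
# Capture at the source, reuse everywhere: one keyed form feeds
# every downstream application instead of being rekeyed per system.

class CaptureServer:
    def __init__(self):
        self.consumers = []

    def register(self, consumer):
        """Subscribe a downstream application to captured forms."""
        self.consumers.append(consumer)

    def capture(self, form: dict):
        # One capture event is delivered to all consumers.
        for consumer in self.consumers:
            consumer(form)

received = []
server = CaptureServer()
server.register(lambda f: received.append(("billing", f["amount"])))
server.register(lambda f: received.append(("shipping", f["address"])))
server.capture({"amount": 125, "address": "12 Main St"})
```

Each consumer sees the same captured data, which is the mechanism that eliminates the multipart form and its repeated data entry.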

Figure 1.9 illustrates how multiple applications can be integrated in this way. The data is available to authorized users as soon as it is captured. There is no delay while the forms are passed around the organization. This is usually a better technique than forms imaging technology in which the forms are created and distributed internally in an organization. The use of workflow-management technology and techniques, in conjunction with imaging technology, is an effective way of handling this process when forms are filled out by a person who is physically remote from the organization.

Intelligent Character Recognition (ICR) technology can be an extremely effective way to automate the capture of data from a form, without the need to key. Current experience with this technique shows accuracy rates greater than 99.5 percent for typed forms and greater than 98.5 percent for handwritten forms.
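Taking the quoted rates at face value, a back-of-envelope calculation (assuming, for illustration, that the rates apply per form) gives the expected misread counts per 10,000 forms:

```python
# Expected misreads per 10,000 forms at the accuracy rates quoted
# above; assumes the rates apply per form (an assumption here).

forms = 10_000
typed_errors = forms * (1 - 0.995)        # >99.5% accuracy, typed
handwritten_errors = forms * (1 - 0.985)  # >98.5% accuracy, handwritten
# Roughly 50 typed and 150 handwritten forms would need correction.
```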

Figure 1.9. Desktop application integration.

Downsizing and Client/Server Computing

Rightsizing and downsizing are strategies used with the client/server model to take advantage of the lower cost of workstation technology. Rightsizing and upsizing may involve the addition of more diverse or more powerful computing resources to an enterprise computing environment. The benefits of rightsizing are reduction in cost and/or increased functionality, performance, and flexibility in the applications of the enterprise. Significant cost savings usually are obtained from a resulting reduction in employee, hardware, software, and maintenance expenses. Additional savings typically accrue from the improved effectiveness of the user community using client/server technology.

Downsizing is frequently implemented in concert with a flattening of the organizational hierarchy.

Eliminating middle layers of management implies empowering the first level of management with decision-making authority for the whole job. Information provided at the desktop by networked PCs and workstations integrated with existing host (such as mainframe and minicomputer) applications is necessary to facilitate this empowerment. These desktop-host integrated systems house the information required to make decisions quickly. To be effective, the desktop workstation must provide access to this information as part of the normal business practice. Architects and developers must work closely with business decision makers to ensure that new applications and systems are designed to be integrated with effective business processes. Much of the cause of poor return on technology investment is attributable to a lack of understanding by the designers of the day-to-day business impact of their solutions.

Downsizing information systems is more than an attempt to use cheaper workstation technologies to replace existing mainframes and minicomputers. Although some benefit is obtained by this approach, greater benefit is obtained by reengineering the business processes to take full advantage of the capabilities of the desktop environment. Systems solutions are effective only when they are seen by the actual user to add value to the business process.

Client/server technology implemented on low-cost standard hardware will drive downsizing. Client/server computing makes the desktop the users' enterprise. As we move from the machine-centered era of computing into the workgroup era, the desktop workstation is empowering the business user to regain ownership of his or her information resource. Client/server computing combines the best of the old with the new—the reliable multiuser access to shared data and resources with the intuitive, powerful desktop workstation.

Object-oriented development concepts are embodied in the use of a systems development environment (SDE) created for an organization from an architecturally selected set of tools. The SDE provides more effective development and maintenance than companies have experienced with traditional host-based approaches.

Client/server computing is open computing. Mix and match is the rule. Development tools and development environments must be created with both openness and standards in mind.

Mainframe applications rarely can be downsized to a workstation environment without modification. Modifications can be minor, wherein tools are used to port (or rehost) existing mainframe source code, or major, wherein the applications are rewritten using completely new tools. For porting, native COBOL compilers, functional file systems, and emulators for DB2, IMS DB/DC, and CICS are available for workstations. For rewriting, there is a broad array of tools ranging from PowerBuilder, Visual Basic, and Access to larger-scale tools such as Forte and Dynasty.

Preserving Your Mainframe Applications Investment Through Porting

Although the percentage of client/server applications development is rapidly moving away from a mainframe-centric model, it is possible to downsize and still preserve a large portion of the investment in application code. For example, the Micro Focus COBOL/2 Workbench from Micro Focus Company Inc. bundles products from Innovative Solutions Inc., Stingray Software Company Inc., and XDB Systems Inc. to provide the capability to develop systems on a PC LAN for production execution on an IBM mainframe. These products, in conjunction with the ProxMVS product from Proximity Software, enable extensive unit and integration testing to be done on a PC LAN before moving the system to the mainframe for final system and performance testing. Used within a properly structured development environment, these products can dramatically reduce mainframe development costs.

Micro Focus COBOL/2 supports GUI development targeted for implementation with OS/2 Presentation Manager and Microsoft Windows 3.x. Another Micro Focus product, the Dialog System, provides support for GUI and character mode applications that are independent of the underlying COBOL applications.

Micro Focus has added an Object Oriented (OO) option to its workbench to facilitate the creation of reusable components. The OO option supports integration with applications developed under Smalltalk/V PM.

IBM's CICS for OS/2, OS/400, RS/6000, and HP/UX products enable developers to port applications that use standard CICS call interfaces directly from the mainframe to the workstation. These applications can then run under OS/2, AIX, OS/400, HP/UX, or MVS/VSE without modification. This promises to enable developers to create applications for execution in the CICS MVS environment and later to port them to these other environments without modification. Conversely, applications can be designed and built for such environments and subsequently ported to MVS (if this is a logical move). Organizations envisioning such a migration should ensure that their SDE incorporates standards that are consistent across all of these platforms.

To help ensure success in using these products, a COBOL code generator such as Computer Associates' (previously Pansophic) Telon PWS provides the additional advantage of a higher-level syntax for systems development. Telon provides particularly powerful facilities that support the object-oriented development concepts necessary to create a structured development environment and to support code and function reuse. The generated COBOL is input to the Micro Focus Workbench toolkit to support prototyping and rapid application development. Telon applications can be generated to execute in the OS/2, UNIX (AIX), OS/400, IMS DB/DC, CICS DL/I, DB2, IDMS, and Datacom/DB environments. This combination, used in conjunction with a structured development environment that includes appropriate standards, provides the capability to build single-system image applications today. In an environment that requires preservation of existing host-based applications, this product suite is among the most complete for client/server computing.

These products, combined with the cheap processing power available on the workstation, make the workstation LAN an ideal development and maintenance environment for existing host processors. When an organization views mainframe or minicomputer resources as real dollars, developers can usually justify offloading the development in only three to six months. Developers can be effective only when a proper systems development environment is put in place and provided with a suite of tools offering the host capabilities plus enhanced connectivity. Workstation operating systems are still more primitive than the existing host server MVS, VMS, or UNIX operating systems. Therefore, appropriate standards and procedures must be put in place to coordinate shared development. The workstation environment will change. Only projects built with common standards and procedures will be resilient enough to remain viable in the new environment.

The largest savings come from new projects that can establish appropriate standards at the start and do all development using the workstation LAN environment. It is possible to retrofit standards to an existing environment and establish a workstation with a LAN-based maintenance environment. The benefits are less because retrofitting the standards creates some costs. However, these costs are justified when the application is scheduled to undergo significant maintenance or if the application is very critical and there is a desire to reduce the error rate created by changes. The discipline associated with the movement toward client/server-based development, and the transfer of code between the host and client/server will almost certainly result in better testing and fewer errors. The testing facilities and usability of the workstation will make the developer and tester more effective and therefore more accurate.

Business processes use database, communications, and application services. In an ideal world, we pick the best servers available to provide these services, thereby enabling our organizations to enjoy the maximum benefit that current technology provides. Real-world developers make compromises around the existing technology, existing application products, training investments, product support, and a myriad other factors.

Key to the success of full client/server applications is selecting an appropriate application and technical architecture for the organization. Once the technical architecture is defined, the tools are known. The final step is to implement an SDE to define the standards needed to use the tools effectively. This SDE is the collection of hardware, software, standards, standard procedures, interfaces, and training built up to support the organization's particular needs.

The Real World of Client/Server Development Tools

Many construction projects fail because their developers assume that a person with a toolbox full of carpenter's tools is a capable builder. To be a successful builder, a person must be trained to build according to standards. The creation of standards to define interfaces to the sewer, water, electrical utilities, road, school, and community systems is essential for successful, cost-effective building. We do not expect a carpenter to design such interfaces individually for every building. Rather, pragmatism discourages imagination in this regard. By reusing the models previously built to accomplish integration, we all benefit from cost and risk reduction.

Computer systems development using an SDE takes advantage of these same concepts: Let's build on what we've learned. Let's reuse as much as possible to save development costs, reduce risk, and provide the users with a common "look and feel."

Selecting a good set of tools affords an opportunity to be successful. Without the implementation of a comprehensive SDE, developers will not achieve such success.

The introduction of a whole new generation of Object Technology based tools for client/server development demands that proper standards be put in place to support shared development, reusable code, interfaces to existing systems, security, error handling, and an organizational standard "look and feel." As with any new technology, there will be changes. Developers can build application systems closely tied to today's technology or use an SDE and develop applications that can evolve along with the technology platform.

Chapter 6 discusses the software development issues and the SDE, in particular, in greater detail.


1 Robert Orfali and Dan Harkey, Client-Server Programming with OS/2 Extended Edition (Van Nostrand Reinhold, 1991), p. 95.

2 Amdahl Corporation, Globalization, The IT Challenge (Amdahl Corporation, 1950), p. 14.

3 Anonymous.

4 Los Angeles County, RFI for Telecommunications Systems and Services (September 1991).

5 Dertouzos, Lester, and Solow, "Made in America," President's Commission on Industrial Productivity, (MIT, 1989), paperback edition VI, p. 163.

6 Michael L. Dertouzos, "Building the Information Marketplace," Technology Review, No. 94, (January 1991), pp. 30-31.
