Consequently, 2-tier client/server has had a difficult time becoming the mainframe killer it was predicted to be, because it has run into several limitations of the PC platform. It has become an administrative nightmare and too costly to deploy on a widespread basis. The rapid adoption of the Web as an application platform speaks to this. (I sometimes wonder whether we would find ourselves in this same crisis had X11 terminals and Motif taken off just a bit faster.)

Even the current state of the art in automated software distribution systems still requires some amount of installation and periodic maintenance at each workstation, a situation that is completely unacceptable to larger organizations with hundreds or thousands of connected users.

Business managers have recently placed much attention on the total cost of ownership (TCO) for desktop computing. The personal computer industry has responded with a raft of initiatives and products designed to try to lower costs by easing the maintenance burden. However, most of these initiatives involve taking some degree of control away from the operator, which has not proven popular with end users, because they don’t get anything in return.

But even if users would accept them, managed or locked-down PCs are not enough. For users to willingly give up control of their personal mainframes, they will have to be offered a compelling array of services, applications, productivity tools, and personal freedom that exceeds what they have today.

To remain competitive, companies must find a way to build and deploy applications more quickly and at lower cost, integrate them with existing systems, and manage them with a smaller staff while handling a much larger base of users with greater security. And they need to get off the treadmill of constant software upgrades and the continual need for more memory and processor power.

A new architecture for rapidly developing and deploying reliable applications is needed. The computer industry is hard at work on the technologies to make network computing a near-term reality.

Network Centricity and Network Computing

First and foremost, network computing is about standards. Just as plumbers and electricians have developed standard parts, measurements, and construction techniques, so must those who produce computer programs begin to standardize. This is how the information industry, which has become the most important industry in the world, will begin to mature. Object technology and modeling standards, such as the Unified Modeling Language (UML), will help.

Network computing is about universal access to data. By using standards for data and applications content such as HTML, Java, and XML, it is possible to configure a universal client. This client can be in the form of a variety of hardware devices, including personal computers, graphical terminals, Network Computers, or even mobile personal digital assistants. The client may also function as a terminal to a variety of host types, including UNIX, 3270, and Windows servers.

Network computing is also about universal access to applications. The location at which programs execute should be just as transparent to the end user as the location of a Web page is today. It should be easy to integrate all our business systems so that they can be uniformly accessed from the universal client. Standards for distributed processing and middleware services, such as CORBA, are providing, for the first time, a vendor-neutral way for applications to transparently run anywhere on the network. The choice of server operating system is becoming less relevant as the network becomes a single large computer system (witness the number of new commercial sites that are running versions of Linux and FreeBSD).

Network computing is about network services. The intelligence is in the network rather than all at your desk. High-powered, high-value services are centralized, as is access to massive content and computational power. Personal storage is professionally managed on secure, fast RAID arrays. Powerful, reliable, and secure database servers and enterprise-class application platforms guarantee that mission-critical information keeps flowing.

Finally, network computing is about bringing new applications online faster and at lower costs. The buying and selling of companies, acquisitions, mergers, and a high rate of change are the norm. The good news is that standards for component software are rapidly emerging and generating a huge amount of excitement and venture capital in the computer industry. Powerful application servers will take simple objects such as Enterprise JavaBeans and turn them into reliable, scalable, and robust application services.
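As a rough sketch of the idea (the class name and numbers here are illustrative, not taken from any real product), the code a developer hands to such a server can be little more than plain business logic; the server is what wraps it with pooling, transactions, security, and remote access:

```java
// Illustrative sketch only: the kind of simple component logic an
// application server could manage. A real Enterprise JavaBean would also
// implement the javax.ejb.SessionBean lifecycle interface and be deployed
// to the server, which supplies pooling, transactions, security, and
// remote access around this plain logic.
public class PriceQuoteBean {
    // Assumed flat tax rate, purely for the example.
    private static final double TAX_RATE = 0.07;

    // Pure business logic: no networking, threading, or persistence code.
    public double quote(double unitPrice, int quantity) {
        double subtotal = unitPrice * quantity;
        return subtotal * (1.0 + TAX_RATE);
    }
}
```

The point of the architecture is that everything outside this method, such as finding the component on the network, running it in a transaction, and scaling it across machines, is the server's job rather than the programmer's.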

Programmers will not have to constantly reinvent the wheel. And software, or at least common facilities and services, will become plug-and-play, enabling organizations to mix and match best-of-breed components into their information architectures.

The Appeal of Network Computing

For users, network computing promises relief from unpleasant tasks such as system configuration, management, backups, software installation, getting everything to work together, and software upgrades. All of this is possible while still providing excellent personal productivity, individuality, and control.

For IT managers, network computing addresses the integration of disparate systems so that a single universal client can access any application or perform any function that is needed. It lowers costs by placing systems management chores in the hands of professional network administrators who can leverage their time and expertise by providing these services for hundreds of users at a time.

For developers, network computing is about rapidly developing and deploying new solutions or new interfaces to existing systems. The technologies lend themselves to rapid prototyping and iterative development. By creating object-oriented components that run on powerful application servers, developers can concentrate on just the functionality needed to solve the business problem at hand, instead of having to constantly reinvent and reimplement infrastructure.

Emerging Technologies

To understand how network computing will achieve all these benefits, you need to know the underlying technology that makes it possible. In this section, you briefly look at the benefits of objects; then you explore Java, CORBA, and Enterprise JavaBeans.

The Benefits of Object Technology

It is assumed that you understand the basics of object technology; however, here is a reminder of some of the benefits that objects bring to a system. All the other technologies covered are object-based, and as such, share in these benefits.

