modest: based on a 4.77 MHz Intel 8088 processor, it included 64KB of RAM, a monochrome display, and a 5.25-inch floppy disk drive for permanent storage.
One of the first programs written for the PC was a terminal emulator, which allowed the PC to mimic an IBM 3270 or DEC VT100 terminal. Of course, compared with the cost of a dumb terminal, the first PC was quite expensive. However, a number of office automation programs, such as WordStar (a word processor) and VisiCalc (the first spreadsheet program), became very popular. As a result, a PC could serve dual purposes: terminal emulation and office automation.
Also in 1981, Ashton-Tate introduced dBASE II, a single-user database program for the PC. At the time, most DP departments had a significant backlog of new applications awaiting development. In addition, developing a new mainframe-based application, and operating it, was an expensive and complex proposition; the development and operational expenses included sizable charges for computer time and permanent data storage. The PC offered the user an attractive alternative: purchase a PC and the dBASE program, and develop a customized dBASE application that the user controlled.
As the popularity of the PC exploded, organizations with multiple PCs needed a convenient, cost-effective method for the machines to communicate with each other and share expensive resources such as mass storage, modems, and printers. The Local Area Network (LAN), a combination of Network Interface Cards (NICs), cables, and software drivers, filled this requirement. With the acceptance of the LAN by the marketplace, users wanted a database that could be shared by multiple PCs. Multiuser versions of dBASE and other file management systems, such as Paradox and FoxPro, were quickly released.
During this time, many single-user and multiuser applications were developed by both software vendors and in-house programmers. Applications based on these tools offered tremendous benefits, including an excellent return on investment when compared with similar mainframe-based applications. However, there were some negative aspects to this phenomenon:
Software development was new for many of these organizations; because they had little experience with the software development methodology and lifecycle, many important steps, such as documenting requirements and design, were skipped, making a program difficult to maintain when its original developers were no longer available.
As PC-based databases grew during the mid-to-late 1980s, another dramatic shift was occurring in the DP department: a rapid increase in the use of the relational database. For the first few years, relational databases were used primarily to build decision support systems (DSS). For example, a marketing department could load a relational database with information about product-line sales for several sales regions. Using SQL, a marketing analyst could pose queries against the database with little or no programming.
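A decision support query of this kind might look like the following sketch; the table and column names are hypothetical, chosen only to illustrate the marketing scenario described above:

```sql
-- Hypothetical DSS query: total sales by region and product line.
-- The PRODUCT_SALES table and its columns are illustrative only,
-- not part of any particular schema.
SELECT   region, product_line, SUM(sale_amount) AS total_sales
FROM     product_sales
GROUP BY region, product_line
ORDER BY region, total_sales DESC;
```

A query like this requires no procedural code at all; the analyst simply describes the desired result, and the RDBMS determines how to produce it.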
In the mid-1980s, the relational database vendors recognized the value of splitting an application's processing between two machines: a client machine responsible for controlling the user interface and a server machine hosting the relational database management system. To achieve this, a new category of software, called middleware, was invented.
Each RDBMS vendor created two middleware components: a proprietary client driver and a corresponding proprietary server driver. Each combination of network protocol and client or server operating system required a specific implementation by the RDBMS vendor. Still, middleware offered application developers a serious advantage: an application running on client machines, each with a different operating system, could communicate with a database on a server machine running yet another operating system. Oracle's middleware product is named SQL*Net.
A two-tier architecture is a client/server computing architecture that consists of client machines communicating directly with a database server (see Figure 1.1).
Figure 1.1.
Two-tier architecture.
A three-tier architecture is a client/server computing architecture consisting of client machines communicating with an application server. The application server may contain an Oracle database in which stored program units, written in PL/SQL, are invoked by the client application program. These stored program units communicate with the database server, which resides on a separate machine. The three-tier architecture is commonly used to balance the processing load of the server machines (see Figure 1.2).
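In such a configuration, the client application might invoke a stored program unit like the following sketch; the procedure name, table, and columns are hypothetical, intended only to show the shape of a PL/SQL stored procedure called over SQL*Net:

```sql
-- Hypothetical PL/SQL stored procedure on the application server.
-- It performs the data access on behalf of the client, which simply
-- calls Get_Customer_Balance and receives the result in an OUT
-- parameter; the CUSTOMER_ACCOUNT table is illustrative only.
CREATE OR REPLACE PROCEDURE Get_Customer_Balance (
   p_customer_id IN  NUMBER,
   p_balance     OUT NUMBER
) IS
BEGIN
   SELECT balance
   INTO   p_balance
   FROM   customer_account
   WHERE  customer_id = p_customer_id;
END Get_Customer_Balance;
/
```

Because the data access logic lives in the middle tier, the client program never issues SQL directly against the database server, which is one way the three-tier architecture distributes the processing load.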
Figure 1.2.
Three-tier architecture.
At first, most client/server applications were constructed with third-generation languages (3GLs), such as FORTRAN, C, and COBOL, that called RDBMS library routines. However, some RDBMS vendors, such as Oracle, saw the need for a tool that would streamline the development of database applications.
As a result, a new category of development software was created: fourth-generation languages (4GLs), whose name correctly implies that these tools were a higher level of abstraction than 3GLs. Today, 4GLs are commonly referred to as application development environments.
Initially, 4GLs generated a character-based user interface. However, with the acceptance of the Windows platform and the widespread availability of VGA and SVGA graphics display adapters, the 4GL vendors began to support graphical user interfaces (GUIs) such as Windows, Mac, and Motif. In the early 1990s, these tools generated applications that were quite slow and buggy. Since then, improvements in both operating system software (for example, the upgrade from Windows 3.11 to Windows 95) and hardware performance have made it feasible to build reliable, well-performing applications.
By 1996, the development of client/server applications was a mature technology. Organizations understood how to use application development environments, such as Oracle Forms and PowerBuilder, to construct an application. But for a large organization, the administration of a client/server application was not a trivial exercise.
A widely referenced study by the Gartner Group put the average annual cost of administering a PC at $12,000. Many managers viewed these problems as a financial and administrative nightmare, and they looked for a way out.
At the same time, two events were taking place that could not be ignored: