Roadblocks on the Information Highway

By Charles E. Gardner


SECTION ONE:
Understanding the problem

Some agencies and departments are more highly evolved than others in terms of how they have carried out their automation, and some are on the frontier of technology. But if you look at what's connected to the end of all the high-tech stuff, you are likely to find a manager who can't even do a simple spreadsheet.


Author's Note: The terms Data Processing (DP), Information Management (IM) and Information Technology (IT) are used interchangeably throughout this paper. These terms indicate the computer management / automation bureaucracy in general, both inside and outside of government, not any specific DP/IM/IT department or individuals.

IM managers who read this will probably say that the guy who wrote it really doesn't understand the big, important, highly technical issues of Information Technology. They are probably right, but what they don't seem to understand is that in order to get down the highway to Disney World you need to get the car off the blocks and out of the driveway. That's what this paper is all about.

Those supervising Information Managers who read this would be wise to keep in mind that the monorail at Disney's "Tomorrow Land" is state-of-the-art technology, but it runs in circles and really doesn't go anywhere.


Introduction

IM departments have never been able to provide custom-tailored applications to solve the problems of the users. Back when there were only a couple of computers around, that wasn't a problem, because the few people who actually used a computer to solve problems were members of the IM staff; everyone else just entered data. Computers have become ubiquitous, but only relatively few people have learned to use them to solve their own problems. Why? Over the years, as IM departments have evolved into huge bureaucracies bloated with specification writers, contract administrators, and equipment specialists, they have become further and further out of touch with the users and the tools they need to solve their own problems. In the 1980s, while choices for hardware and problem-solving software tools proliferated, IM departments made narrow equipment and software choices based on what was cheapest and easiest to maintain.

On a spreadsheet the traditional IM approach looks great, but adding screens and keyboards does not solve problems. Problems are usually solved by the people who encounter them in the workplace. If a carpenter sees a nail sticking up, he fixes the problem with a quick whack of his hammer. But what if he had been given a screwdriver instead? What is the most common complaint of the people in the office who actually know how to use a computer effectively? Usually that they cannot get IM to approve the hardware or software they need to solve problems. They get screwdrivers instead of hammers because many IM departments have never learned that the people who encounter the problems need the right tools to fix them.

This paper was written to explain the real issues of office automation from the perspective of a knowledgeable end-user. It attempts to explain why, despite all the progress in technology, managers still get the "real" information they use to manage the same way they did 20 years ago: on a printed report which is sorted in the wrong order and delivered a day late to their in box.

Millions of PC owners have proven that people can solve most of their automation problems themselves if they are given user-friendly applications like Microsoft FoxPro or Claris FileMaker Pro and some training on how to solve problems with automation. The latter is the most important, because it is more difficult to work out the operational aspects and convince people that the change is needed than it is to actually sit down and design a database. Teaching managers how to use the computer as a problem-solving tool is the other objective of this paper.

Identifying the Roadblocks

When traffic is backed up for miles, the problem is usually a few slow drivers up at the front. The information highway is no different, and there are three roadblocks that have kept traffic tied up for years.

The people making the most progress on the information highway are the ones riding bicycles on the shoulder. They supply their own power, go as fast as they can pedal, and get to the destination before anyone else because they can dodge roadblocks. They are the "power" users scattered throughout the end-user community. Many learned to use computers as a hobby in the 1970s, when MS-DOS, Macintosh, spreadsheets, and PC databases were still several years in the future and the only software was what you wrote yourself. They know how to use computers productively and create new applications which make work more productive for themselves and others in their offices. More importantly, power-users know why the IM bureaucracy can't provide the solutions users need and, given the chance, could help to fix the problem.

The IM roadblock started 30 years ago

Back in the 1960s, when "Information Technology" was known as "Data Processing," it was an honest profession. Real men programmed in COBOL and the mission of the data processing departments was crystal clear: data was processed and something meaningful was done with it--usually the payroll. In the evolution from processing data to technologizing information, the mission became obscured by smoke and mirrors. OIS systems replaced typewriters and three generations of PCs have replaced OIS. Now PCs talk to each other over LANs and WANs, but like 900-TALK-SEX, the conversation is expensive and for the most part meaningless. Automation, by itself, does only two things: it makes redundancy more efficient and, as a result, increases the need for filing cabinets.

The pursuit of technology to move information has taken precedence over its meaningful use. The "real" information that keeps the wheels of government turning is still largely passed around on printed reports and then put into filing cabinets. IM managers are out of touch with the needs of users and, judging from what is in use today in most government offices, cannot tell the difference between automation and effective automation.

Why not all automation technology is effective

The perfect example of ineffectual automation is the current fad of electronic form generation. Instead of cranking a pre-printed form through a typewriter, it can now be produced on a computer and laser printer. It is then typically printed and sent to the next department, which creates another electronic form to do its part of the job. That form is printed and sent to the next department, and so on, and so on...

How is the process more efficient, or less redundant? All the data is still input manually with the same number of keystrokes and passed from department to department on different paper forms. Where are the cost savings? The net cost per form produced is much greater. The same amount of paper is being used, but xerox bond costs much more than offset book paper, and laser printer toner costs ten times more than printing ink. There is also the initial cost of developing the computer forms and the cost of quarterly upgrades via CD-ROM at $10-$20 per copy, plus the cost of adding a $400 CD-ROM drive to every computer. Despite all this expensive "progress" the typewriter and secretary will still be there--to type the envelope the electronic form is mailed in and file a copy in the filing cabinet.
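
To put rough numbers on the comparison, the sketch below works the arithmetic through in Python. The volume and most of the unit prices are illustrative assumptions--only the $10-$20 CD-ROM upgrades, the $400 drive, and the ten-to-one toner-to-ink ratio come from the figures above--but any plausible numbers tell the same story.

    # Back-of-envelope cost per form: pre-printed versus electronic.
    # Unit costs marked "assumed" are illustrative, not actual figures.

    FORMS_PER_YEAR = 10_000       # assumed volume for one office

    # The old way: a pre-printed form cranked through a typewriter.
    offset_paper = 0.01           # offset book paper, per sheet (assumed)
    preprinted = FORMS_PER_YEAR * offset_paper

    # The new way: an electronic form on a laser printer.
    xerox_bond = 0.03             # bond paper, per sheet (assumed)
    toner = 0.10                  # toner per page, ten times the cost of ink
    development = 2_000           # one-time cost to build the forms (assumed)
    upgrades = 4 * 15             # quarterly CD-ROM updates at $10-$20 per copy
    cdrom_drive = 400             # one $400 drive per computer

    electronic = (FORMS_PER_YEAR * (xerox_bond + toner)
                  + development + upgrades + cdrom_drive)

    print(f"Pre-printed: ${preprinted / FORMS_PER_YEAR:.2f} per form")
    print(f"Electronic:  ${electronic / FORMS_PER_YEAR:.2f} per form")

Under these assumptions the electronic form costs nearly forty times as much per copy, and the data is still keyed in by hand at every stop.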

This kind "smoke and mirrors" office automation has been going on for the past 20 years. The contractors supplying the stuff have been laughing all the way to the bank and the only people inside the bureaucracy that seem to understand the joke are the power-users. Who are these power-users, and how did they get so smart?

The Renaissance of computing

There are several generations of power-users around, but it is the 40-something generation of former micro-computer programmers who have the best insight on the effective use of automation in the 1990s. They learned how to use computers in the 1970s, when interactive computing was a new frontier and the only software available was what you wrote yourself. This engendered a pioneer's sense of independence, freedom and self-reliance. After buying or building a micro-computer, the first question was, "Now what do I do with it?" They learned to use a computer the way a kid learns to ride a bike: wobbling and crashing at first as they learned to program in BASIC on their Heath, Radio Shack or Apple II micro-computers.

Learning to program was a necessity because there were no commercial software packages for the early micro-computers. Fortunately, BASIC was (and still is) a simple and intuitive command language. There were only about 15 commands needed to write most programs, and it was common practice to write a few command lines and then run them to see how they worked. This allowed the programmer to spot and correct mistakes immediately. This interactive feedback made application development fast and efficient.
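
To make the point concrete, here is what one of those few-command programs might have looked like. The original would have been a handful of BASIC lines built on INPUT and PRINT; this sketch uses Python instead, but the write-a-few-lines, run-them, see-what-happens rhythm is the same.

    # A first interactive program, in the spirit of 1970s BASIC.
    # Type it in, run it, spot the mistake, fix the line, run it again.

    rate = float(input("Hourly rate? "))
    hours = float(input("Hours on the job? "))
    materials = float(input("Materials cost? "))

    estimate = rate * hours + materials
    print(f"Job estimate: ${estimate:,.2f}")

A program this size could be written, tested, and corrected in minutes--exactly the immediate feedback that made interactive development fast and efficient.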

Before too long these micro-computer programmers started creating interactive applications to do estimates, projections, budgets and nearly everything else they had been doing on their calculators. They were able to solve problems DP departments never had the time or resources to investigate, let alone automate, and they were doing it themselves.

Interactivity is now completely taken for granted, and most people are not aware that in the 1970s mainframe computers ran jobs the way bakeries make cookies: in batches. There was no video screen or keyboard on a mainframe, just rows of switches and blinking lights like in the sci-fi movies. A mainframe application was submitted to an operator on a stack of machine-readable punched cards which contained the program code, followed by others with the data to be processed. Mainframe jobs were scheduled weeks in advance and turn-around time was measured in days. If someone submitted a job and got a printout the next day they were considered lucky, and it most likely meant someone else's job had "crashed."

The "batch" nature of mainframes limited the range of tasks they were suitable for to predictable, repetitive jobs like payroll checks and cost accounting reports. Programming a mainframe was extremely cumbersome--a programmer might wait several days just for a time slot to test his application. Compared to BASIC, which allowed the micro-programmer to just get on the "bike" and ride, the COBOL mainframe language was like having to build a bicycle blindfolded before you could ride it. As a result, someone with a "toy" micro-computer could produce an application in a few hours that would take a COBOL programmer day or weeks. But since the mainframe had neither keyboard nor screen there wasn't much point in even trying to do an interactive application on a mainframe.

Early microprocessor programmers exploited the interactive nature of this new technology by creating applications where the user provided input and got immediate results. They realized how ineffectual mainframes were for such tasks and had an entirely new outlook on how computers could interact with people to answer questions and solve problems. The short period between the introduction of the first inexpensive microprocessor-based computers and the birth of packaged software was the Renaissance of interactive computing, and those who learned about computing then are all the wiser for it now.

Computing plunges into the Dark Ages--DP starts the smoke machine

As the micro-wizards of the 1970s started to bring their "toy" computers to the office, they created a ground-swell of demand for end-user interactive computing. But instead of recognizing and supporting these innovators, top management turned to the people who ran the computers which gave them their reports: the data processing department.

Data processing managers initially fought the idea of putting computer terminals in the workplace, largely because they didn't have any themselves--most mainframe computers still ran on cards and didn't use terminals for input. The computer "professionals" locked in their batch-mode cookie factories were largely unaware that a new age of interactive computing had been born. They reacted to the demand for computers by supplying their own vision of interactive computing: video display terminals for remote batch input. Their myopia, combined with ever-increasing bureaucratic clout, marked the beginning of both the Digital Dark Ages and the IM roadblock in the passing lane of the information highway.

Mainframe computer hardware manufacturers responded to the demand for video screens by supplying "dumb" terminals which could be connected to tape machines in the data processing center. DP could now put an input terminal in the personnel department to replace the card puncher. While some efficiency was gained in the input process, the computer and the batch-mode programs which processed the data remained largely unchanged; the terminal operators were simply keying the data onto magnetic tape instead of punched cards. It looked modern and impressive. Mainframe computers now had spinning wheels of tape in addition to the banks of flashing lights. DP managers paraded tours of top executives past rows of churning tape machines, and the executives rushed for their checkbooks to buy more. Management never noticed that its reports didn't arrive any faster.

Data processing bureaucrats learned an important lesson during this first phase of automation that it would apply for the next 20 years: adding new glitzy technology increases the budget, regardless of whether or not it is effective.

The reader should note the uncanny similarity between the card/tape "automation" and automation via electronic forms. If you don't see it, go back and read about the forms again.

OIS--Leading the lambs to slaughter

The masterstroke of bureaucratic genius which transformed Data Processing into Information Management was the addition of word processing to its domain. Centralized word processing systems quickly transformed the DP department from an obscure and expensive offshoot of the accounting department into a huge, omnipotent bureaucracy which made typing, the most mundane aspect of office operations, entirely dependent on its technical expertise.

It is quite ironic that typewriters, which created the first revolution in office automation, were replaced by a technology which stopped the second office automation revolution dead in its tracks. The real revolutionaries, the micro-wizards who had just started to make progress in end-user application development on micro-computers, were quickly crushed by the IM-administered word-processing steamroller, which now carried the more exotic moniker "OIS"--Office Information Systems. They should have been called OLC--Only Looks like a Computer--because they allowed users to type a letter to discuss a problem, but couldn't run an interactive program which could actually solve it.

The mini-computer--the power of a mainframe used like a typewriter

A new hybrid, the mini-computer, appeared in the early 1980s and started to replace the OIS systems. It could run COBOL batch programs like a mainframe, interactive programs in BASIC like a micro, and had word processing like an OIS system. The most popular of these mini-systems, the WANG Laboratory VS, was marketed and sold as a turn-key office automation solution. The term "turn-key" describes a package of hardware and end-user applications that an IM department could install, "turn the key," and supply the end-users with word processing and other applications. Because most IM departments did not have any significant in-house programming capability, the VS systems were at first used mainly for OIS-like word processing and a limited number of large-scale applications developed by outside contractors in COBOL.

The WANG VS arrived on the scene just two years before the IBM PC and the Lotus 1-2-3 spreadsheet. The VS lacked the processing power to run spreadsheets for multiple users, and this would ultimately prove to be its downfall. Nevertheless, Information Management departments continued to purchase VS systems into the late 1980s and continue to support them in the early 90s at the expense of more effective solutions, because they still view computing as they always have: a centralized resource made more efficient through economies of scale.

The emergence of the PC

Although the ease of programming in the interactive BASIC language was quite evident by the late 1970s, IM departments, from their bigger-is-better, turn-key perspective, viewed the lack of applications on the early micro-computers as a fatal flaw. From a bureaucratic perspective they had nothing to gain and everything to lose by supporting a computer which could be programmed by the end user. IM showed absolutely no interest in supporting micro-computers until the IBM PC and packaged PC software arrived on the scene and they could once again offer what appeared to be turn-key solutions.

The IBM PC and MS-DOS--What "C:>" really means

IM managers initially viewed PCs as a threat to their centralized empire. But when they actually tried one and discovered that the MS-DOS "C:>" prompt meant "See your Information Management Specialist," they embraced PCs with open arms and helped Bill Gates of Microsoft carry his first billion dollars to the bank. The fact that Gates's company had developed the MS-DOS operating system for IBM carried quite a bit of weight with guys who had been buying IBM mainframes for years. Before too long, MS-DOS replaced COBOL as the IM lingua franca. Legions of mainframe operators were recycled and became experts at assembling and configuring PCs. But IM departments didn't develop the applications the users needed to solve their problems; they supplied hardware, shrink-wrapped software, and classes on how to read the manuals.

The Achilles heel of IM departments has always been the lack of an effective in-house application development process. DP departments lacked in-house application skills and experience because the first mainframe computers were sold as turn-key packages to handle basic functions such as accounting and payroll. Since the manufacturer supplied both hardware and software, the IM staff were typically troubleshooters rather than systems analysts and programmers who created their own applications.

The addition of turn-key OIS and mini-computer systems in the early 1980s increased the need for hardware support personnel and shifted IM's focus still further away from end-user application development. When database development tools became available for the WANG VS, IM departments would finally begin to develop large-scale, enterprise-wide applications in-house, but they had neither the personnel nor the financial resources to respond effectively to individual office or department needs.

The "turn-key" enterprise-wide application computing strategy left IM departments ill-prepared for the PC era. Because users previously had no means to develop their own applications they had little understanding of how computers could make their operations more efficient once they did receive a PC. Because IM had always relied on outside contractors for its applcations it didn't either.

