In 1991 Linux was created by Linus
Torvalds who, at the time, was an engineering student at the University
of Helsinki, Finland. From 1996 to 1997, the Linux operating system was
one of only two operating systems that experienced positive growth in large
enterprises. Linux was cooperatively developed over the Internet
by an army of programmers from all over the world. Its radical development
model, licensing terms, price, and countless features have contributed
to Linux being placed at the heart of both business and science/research
organizations. The objective of this paper is to shed light on the
background of Linux and show how it has emerged as a major contender in
the operating system market. Published literature and magazine articles
were used as sources for this paper.
In 1997 Datapro conducted an extensive survey of operating systems (OS). The results were based on the opinions of over 800 survey participants, most of whom were managers and directors of information systems (IS) in large organizations. The study produced a surprising winner in many critical categories, including growth of operating systems, overall satisfaction, interoperability, cost of ownership, Java support, and availability.
This Datapro study practically became a marketing tool for the Linux operating system. One of the study’s key findings was that from 1996 to 1997 Linux was the only operating system, besides Windows NT, that experienced positive growth in large enterprises. See figure 1 in the list of figures. Many studies that focus on the total number of Linux installations show that installations more than doubled during that time period. The Datapro study takes a different approach and reveals that the total number of enterprises using Linux increased by 27 percent.
Windows NT grew more than any other operating system, but in terms of overall satisfaction, Microsoft’s premier network OS failed to make even the top three in the Datapro survey. In fact, Windows NT placed fifth while Linux took the top spot. See figure 2 in the list of figures. Linux was also the winner in individual areas that made up Datapro’s "overall satisfaction" rankings. See figures 2a, 2b, 2c and 2d in the list of figures.
The Linux operating system was a virtual unknown five years ago.
In just the last three years it has emerged from the underground hacker
community to challenge the Windows juggernaut in the operating systems
market. The main subject areas covered in this paper are (a)
Linux features and capabilities, (b) the roots of the open-source software
(OSS) movement, (c) Linus Torvalds (the creator of Linux), (d) the Linux
development model, (e) licensing, (f) Linux in industry and (g) key Linux
distributors. I will end the paper with a conclusion and closing remarks.
Linux Features and Capabilities
The Linux OS has many outstanding features that companies require and demand in an operating system. Linux is an advanced multi-user, multi-tasking OS originally built for Intel-compatible PCs. It can be used as a Unix workstation or server for tasks ranging from large, reliable Internet web servers to low-cost workstations for client/server networks.
The current feature list of Linux is impressive. For example, it runs on a wider range of PC equipment than any other non-Microsoft OS. When you consider that Linux also runs on Digital Alpha computers, Sun SPARC and now Apple PowerMac hardware, Linux even has Microsoft’s operating systems beat. In addition, it offers a reliable multi-tasking/multi-threading environment on all these platforms, with support for symmetric multiprocessing (SMP), along with hardware drivers for practically all popular hardware.
Today Linux is considered the most formidable challenge to Microsoft’s
dominance in the OS software market. Linux has come a long way in
such a short period of time. This operating system didn’t even exist
until 1991. In August of that year, the creator of Linux posted the
following message to a newsgroup (comp.os.minix) on the Internet:
“Hello everybody out there…I’m doing a (free) operating system (just a
hobby, won’t be big and professional like GNU) [the OS is] for a 386 (486)
[ PC].” To truly understand how Linux came so far, so fast it’s important
to see where and how it all began.
Copyrighting Software Ignites a Movement
If Linux and the open source software (OSS) movement had a single beginning, it was the day in 1979 when Xerox donated one of the first laser printers to the Artificial Intelligence (AI) lab at the Massachusetts Institute of Technology (MIT). The machine crashed a lot, which induced AI lab programmer Richard Stallman to ask Xerox for the code that controlled the printer. Stallman planned to modify the program to respond to breakdowns by flashing a warning on the screen of everyone waiting for a printout, giving them an incentive to fix the printer right away (Mann, 1999).
To make this modification, though, Stallman needed Xerox to give him the source code for the printer program. Stallman was accustomed to the freewheeling academic atmosphere of the AI lab, where programmers worked communally, constantly borrowing and tinkering with one another’s code. So, to him, this request to Xerox was not out of the ordinary. Xerox had granted Stallman a similar request made earlier for an equally trouble-prone printer. But on this occasion Xerox refused Stallman’s request on the grounds that it was copyrighting the software. An irate Stallman argued that “Xerox was hoarding software” (Mann, 1999).
Xerox was not alone. Software was becoming big business, and many companies felt compelled to protect their intellectual property rights. To further antagonize Stallman, Silicon Valley lured away some of the AI lab’s best and brightest. When these programmers worked for software companies, Stallman discovered, their code was proprietary. It could not be shared and built upon. Stallman’s conclusion was that copyrighting was destroying the programming community (Mann, 1999).
In 1984, Stallman founded the Free Software Foundation. Its chief goal was to develop an improved operating system that looked like, but did not use the source code of, Unix – the most common operating system on big computer networks. Invented by two researchers at Bell Labs, Unix is now available in a dozen different versions from companies like IBM, Compaq, and Sun Microsystems. Stallman’s version was called GNU, a recursive acronym for “GNU’s Not Unix” (Mann, 1999).
The challenge of GNU was enormous. An operating system defines what services programs can ask of a computer (adding two numbers, moving information onto a hard disk, and so on) and directs requests for those services to the hardware (keyboard, monitor, microprocessor, and so on). But the system is useless without hundreds of subsidiary programs to perform specific tasks such as managing windows and communicating with printers and other peripherals. To produce a functional system, the GNU project had to create all these programs. “It’s like building a jet plane from scratch in your garage,” says Bruce Perens, a free-software programmer who works at Pixar Animation Studios in Richmond, California (Mann, 1999).
Though Stallman’s project had created scores of programs that were used all over the world, the heart, or “kernel,” of the GNU operating system was never completed. An operating system can be described as a layered set of software. The kernel is the layer that provides the most basic functions, including resource allocation and direct control over most hardware resources. Part of the reason Stallman couldn’t complete the kernel is that he chose not to duplicate the tried-and-true Unix kernel but to base the GNU system on an advanced, experimental kernel developed at Carnegie Mellon University (Mann, 1999).
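The division of labor between user programs and the kernel can be illustrated with a small sketch. The following Python fragment (a modern illustration, not drawn from the sources cited in this paper) requests two basic kernel services – process identification and file I/O – through ordinary library calls, which are thin wrappers over system calls:

```python
import os
import tempfile

# Process management: the kernel assigns and tracks process IDs.
pid = os.getpid()

# Resource allocation and device control: opening a file asks the
# kernel for a file descriptor; writing hands the bytes to the kernel,
# which drives the disk on the program's behalf.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello, kernel")
os.close(fd)

with open(path, "rb") as f:
    data = f.read()
os.remove(path)

print(pid, data)
```

The program never touches the disk hardware itself; the kernel allocates the file descriptor and performs the actual device operations on the program’s behalf.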
Stallman was regarded as one of the few people in the world up to the task of developing a radically new kernel – and possibly the only one who could think of doing it almost single-handedly. After spending endless nights typing code for the GNU project, Stallman’s hands literally gave out on him. He actually tried to continue by using MIT students as transcribers (Mann, 1999). This approach proved unsuccessful as well.
No one stepped in to replace the sidelined Stallman, mainly because his
ideas were fundamentally antagonistic to business. At this point,
the free software movement needed a new champion. It wasn’t too long
thereafter that a new champion appeared to carry the torch for the cause
of free software.
Linus Torvalds and the Birth of Linux
While Stallman’s efforts had come to a swift and unexpected halt, half a world away the free software race inherited a fresh set of ideas and motives from Linus Torvalds. In 1991, Torvalds was an undergraduate student at the University of Helsinki, Finland. He was far from an expert programmer like Stallman. Concerning his programming skills, Torvalds admits, “I didn’t even know what I didn’t know.” But he knew the power and stability of the Unix operating system well enough to regard Microsoft’s MS-DOS as no match. To Torvalds, MS-DOS seemed to be the digital equivalent of being forced to write with a leaky pen (Mann, 1999).
Refusing to stand in long lines waiting to program on Unix and adamantly refusing to capitulate to the weaker MS-DOS, Torvalds set out to build a Unix-like OS to run on his newly purchased 386, which had only 4 megabytes of memory. The new computer was far too underpowered to host a full Unix OS, but he still refused to subject himself to bad software. Ignoring DOS, Torvalds mashed together chunks of code from his instructor’s work and his own (Mann, 1999).
Somewhat unexpectedly, Torvalds ended up with something like a Unix kernel.
Because Stallman’s GNU project had created the necessary subsidiary programs,
he tweaked the kernel to fit them. Torvalds’ work resulted in the
creation of a complete operating system. For the first time, the
flexibility, stability and power of Unix were available on a small computer.
In an attempt to give the operating system a name that reflected its free
nature, Torvalds called it “Freax.” His friends
thought the name was dumb and changed it to what the world has come to
know as Linux.
Linux’s Bazaar Development Style
No one will dispute the integral role Linus Torvalds and Richard Stallman played in introducing Linux to the world. But it was a world of ideas that subsequently pushed Linux to the top. Over the next several years the Linux operating system was developed in an unprecedented manner. Torvalds took contributions of software code from individual developers all over the world. Software programmers/engineers collaborated successfully over the Internet to improve the Linux kernel and subsidiary programs. This style of software development is best described in an essay titled “The Cathedral and the Bazaar,” written by Eric S. Raymond. Raymond was a frequent contributor in the early days of Linux development.
Raymond’s essay has become gospel in the Linux community. Raymond is a veteran Unix programmer and free-software advocate. His essay highlights the major difference between traditional software development and that used to develop Linux. Raymond argues that software before Linux always had been produced in (what he referred to as) a “cathedral,” by an isolated team of programmers, who worked on the code until releasing a final, finished version. Linux on the other hand was assembled in a “bazaar” by a cacophonous scatter of independent programmers (Raymond, 1998).
Raymond became aware of Linux in 1993. By that time he had already been involved in Unix and free-software development for ten years. In his essay, he states, “I was one of the first GNU contributors in the mid-1980s. I had released a good deal of free software onto the net, developing or co-developing several programs…that are in wide use today. I thought I knew how it was done.” (Raymond, 1998).
Linux overturned much of what Raymond thought he knew. He had been preaching the Unix gospel of small tools, rapid prototyping and evolutionary programming for years. Raymond believed that “…there was a certain critical complexity above which a more centralized, a priori approach was required.” In his view, the most important software (operating systems and really large tools like Emacs) needed to be built like cathedrals, carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time (Raymond, 1998).
“Linus Torvalds’ style of development – release early and often, delegate everything you can, be open to the point of promiscuity – came as a surprise. No quiet, reverent cathedral-building here – rather, the Linux community seemed to resemble a great babbling bazaar of differing agendas and approaches…out of which a coherent and stable system could seemingly emerge only by a succession of miracles,” exclaims Raymond (1998).
The fact that the bazaar style seemed to work extremely well was shocking to Raymond. As he learned his way around the Linux community, he worked hard not just at individual projects but at trying to understand why the Linux world didn’t fly apart in confusion. Instead, it “seemed to go from strength to strength at a speed barely imaginable” to those accustomed to traditional cathedral-style software development (Raymond, 1998). The Linux community only wanted to improve the quality of an operating system, and by accomplishing that feat, it revolutionized software development. Raymond was convinced that what the Linux community was accomplishing was not merely by chance.
In 1996, chance handed Raymond the opportunity to run his own free-software project using the bazaar Linux development style. The project revolved around developing an e-mail program called fetchmail. Fetchmail is a remote-mail retrieval utility that is widely used today to pull e-mail from servers across the Internet. While leading this project, Raymond surmised that Linus Torvalds’ monumental achievement of constructing the Linux kernel was less consequential than his invention of the Linux development model (Raymond, 1998). Today, the Linux development model is more commonly referred to as the open-source software (OSS) development model.
The OSS development model has many distinguishing characteristics that set it apart from traditional software development. One of the primary differences, according to Raymond, is that “In the cathedral-builder view of programming, bugs and development problems are tricky, insidious, deep phenomena. It takes months of scrutiny by a dedicated few to develop confidence that you’ve winkled them all out. Thus the long release intervals” (Raymond, 1998).
Microsoft’s Windows 2000 (formerly NT 5.0) is a prime example of the awful delays that traditional software developers incur. At the present time (Aug 99), Windows 2000 is 18 months behind its initially projected date of release. It is tentatively scheduled for a Jan 2000 release. That puts it almost two years behind schedule. Even when it’s finally released, it’s not guaranteed to be perfect. This too will cause a great deal of disappointment and anxiety for many organizations.
“In the bazaar view, on the other hand,” Raymond (1998) notes, “you assume that bugs are generally shallow phenomena – or, at least, that they turn shallow pretty quick when exposed to a thousand eager co-developers pounding on every single new release. Accordingly, you release often to get more corrections, and as a beneficial side effect you have less to lose if an occasional botch gets out the door.”
Release Early, Release Often
Early and frequent releases were a critical part of the Linux development model in the early years (Raymond, 1998). There’s no question that frequent software releases can wear out the patience of the average computer user. But considering that Linux’s early users were mainly computer programmers, software engineers and various other technical specialists (many of whom had Unix backgrounds), they could hardly be considered average users. These early co-developers were probably passionately driven by (rather than frustrated with) frequent releases. It’s also likely that they anticipated each new release in the hope that their very own code was good enough to “make the cut” and appear in subsequent releases of Linux.
Raymond believes that Torvalds’ innovation wasn’t so much his release-early-and-often concept (something like it had been Unix-world tradition for a long time), but his scaling it up to a level of intensity that matched the complexity of what he was developing. Add to this the fact that he was able to successfully leverage the Internet as a development platform while simultaneously maximizing the number of person-hours thrown at debugging and development, and it becomes clear why a veteran programmer like Raymond considers the Linux development model an astonishing feat.
Linux has an incredible list of features. Entire articles can be (and
have been) devoted to some of its most prominent features. The multi-tasking,
multi-threading and symmetric multiprocessing features of Linux have been
instrumental in its early success. Despite the fanfare over Linux’s
features, the factors that will ensure the long-term success of Linux have
little to do with the current list of features, and much more to do with
how it is licensed.
Linux and the General Public License (GPL)
One of the most important things to know about any software product is how it is licensed. Practically all software is covered by a license agreement of some form. The license sets out the rules under which users are allowed to use the product. It is a way of protecting both the manufacturer and the user of the product.
All of Linux’s features seem even more remarkable when you consider that it is “freely distributable” under the terms of the GNU General Public License (GPL). Linux and most other open-source software products are licensed under the GPL’s terms and conditions. The GPL allows users to copy, modify and redistribute the software. The flexibility afforded by the GPL allows users to do just about anything they want with their copy of the software. This “do as you please” notion of licensing software is unprecedented in commercial software distributions.
The licenses for most software are designed to take away a user’s freedom to share and change it. By contrast, the General Public License is intended to guarantee the user’s freedom to share and change free software – to make sure the software is free for all its users. The GPL’s reference to “free software” is not about the price of software. Rather, it refers to the freedom to distribute copies of free software (and charge for this service if you wish). This freedom also gives users access to valuable source code. Users are allowed to change the software or use pieces of it in new free programs (Red Hat Software Inc., 1997, p. 285).
Commercial software licenses tend to be much more restrictive in nature. The typical license from a software manufacturer/distributor strictly forbids users from legally copying, modifying or redistributing the software. The advent of the CD writer/re-writer makes license agreements very difficult to enforce if users choose to ignore them. Unless stricter enforcement comes along, software companies can only rely on the honor and integrity of businesses and individual users.
In the case of the GPL, users have the freedom to make as many copies as they deem necessary. The only real restriction of the GPL is that modifications or enhancements made to the software must be passed on for others to use. This way, the software continues to evolve to the equal benefit of all of its users.
Software manufacturers have made licensing a very profitable venture, and licenses come in many forms.
An InfoWorld article (Vizard, 1999) titled “IT managers caught in budget imbroglio” explains that, “When it comes to budgets, most IT managers constantly live between a rock and a hard place. The rock is usually the chief financial officer (CFO) trying to drive overhead costs out of the business. And what most CFOs can’t really understand is why, if they can drive cost out of every other aspect of their business, the cost of IT as a percentage of revenues still remains stubbornly high.”
One of the main reasons is that key vendors such as Microsoft and Oracle are notorious for their unwillingness to negotiate pricing, even if the cost of creating an additional copy of software is almost nil. “This means that a lot of IT people have a lot of pent up ill will toward both of these companies because they both serve as the hard place that the rocks in their organizations are hammering them on.” (Vizard, 1999).
Another InfoWorld article (Scannel & Trott, 1999) titled “Chipping away at Microsoft” reveals how corporate America feels about the profit-driven licensing policy of Microsoft. The article explains how competitive pressures will help companies get the upper hand when negotiating with Microsoft Corporation. With each successful installation of Linux and every antitrust accusation from government lawyers, corporate users are getting bolder about tossing aside Microsoft’s standard volume licensing plan and pushing for better deals on the company’s core operating system and application products.
IT managers now feel they have the strongest negotiating position they have had in years as Microsoft fights off invaders from every angle. This break is well overdue. For years, many IT executives have said they could not wait for the day when they could get Microsoft over the negotiating barrel. They have long felt that the Redmond, Wash.-based company’s lavish profits from Windows and its applications have often come at the expense of their own profits (Scannel & Trott, 1999).
“Some weeks it looks like [Microsoft] feels entitled to capture not just part of what we save, but all of it. That just isn’t going to fly with corporate America forever,” says John Chapman, senior technology executive at Amoco’s architectures and planning group, in Chicago. “When your margins are more sensitive to Bill Gates’ pricing whims than they are [to] the price of oil, that’s an untenable position for a large company to be in” (Scannel & Trott, 1999).
Large companies’ ability to bargain with Microsoft has never been more critical. Microsoft’s software prices have remained constant during the past few years, while competitors such as Lotus and WordPerfect have been forced to reduce their prices. Microsoft is believed to have used subtle licensing changes to boost revenue. Since 1996, Microsoft has eliminated prorated maintenance terms; added extra charges for home use; eliminated concurrent licensing on Exchange, Office, and Windows Terminal Server; and rolled out Enterprise Editions as separate products, particularly NT 4.0 (Scannel & Trott, 1999).
Michael Gartenberg, vice president of Stamford, Conn.-based Gartner Group, stated, “Licensing loopholes like concurrent licensing are going away. Look for the situation to get worse before it gets any better.” Another Gartner Group analyst, Alexa Bona, predicts that this trend will continue in coming years as Microsoft institutes non-perpetual licenses, splits Internet Information Server from Windows NT/2000, requires BackOffice licenses to run Office, and moves to capacity-based server pricing in addition to client-access licenses (Scannel & Trott, 1999).
Software licensing is an important issue for companies. The complexity
and unpredictability of software licensing interfere with a company’s ability
to get a firm handle on future IT costs. The two InfoWorld (May 24,
1999) articles strongly indicate that companies yearn for software licensing
that isn’t priced to cut deep into their revenues. GPL licensing
delivers just that. The GPL allows Linux and other open source software
products to offer simplicity, flexibility and affordability in licensing
that corporate America and the rest of the business world desperately wants.
Linux in Industry
Not all businesses need the long list of capabilities offered by the Linux OS. But some of the early converts have some of the most demanding operations to support, and choosing an unreliable, crash-prone OS is not an option for these organizations. Here’s a look at a couple of organizations using Linux: the first is a scientific/research operation; the second is a retail operation.
Fermilab operates the world’s highest-energy particle accelerator, the Tevatron. More than 2,200 scientists from 98 U.S. institutions in 36 states and 90 foreign universities in 20 countries use Fermilab’s facilities to carry out research at the frontiers of particle physics. Nearly all aspects of this research are computationally intensive, from data acquisition to data storage and analysis.
Linux joins IRIX, AIX, Digital UNIX, and Solaris as flavors of UNIX for which Fermilab provides core utilities and applications to the high-energy physics community. The fact that Linux can easily be placed in such a demanding computing operation alongside other well-established Unix-based operating systems speaks volumes about the reliability organizations are trusting Linux to deliver (“Fermi Lab,” 1998).
“Our investigations of Linux farms and the work done at other high-energy particle labs have shown that the ability of PCs to provide massive computing power at low cost offers significant advantages,” said Steve Wolbers, Deputy Head of the lab’s Computer Division. “We feel confident that Linux farms can be built and run successfully” (“Fermi Lab,” 1998). Linux has won the confidence of many major scientific and research-oriented organizations. Even NASA uses Linux.
The case of Fermilab points out two key features companies look for when purchasing any software. Price and performance are major considerations when buying any kind of software, particularly operating systems. Unfortunately, due to budget constraints, companies sometimes have to compromise one for the other. With Linux, no compromise is required. Companies get all the power and stability they need, along with a host of other exceptional features Linux offers. All of this and more is available from Linux distributors for under $60. If that’s still too much, the Linux OS can be downloaded from the Internet for free. Many companies are steering towards Linux because it has the best price and performance on the market.
Take the case of Burlington Coat Factory Warehouse Corp. The $1 million-plus deployment at the $1.8 billion Burlington, N.J., discounter is the largest Linux retail installation announced by a U.S. company. “We’re pumped,” said CIO Mike Prince. Prince stated that when the business decision to deploy Linux was finally made, “…all the developers cheered” (Orenstein, 1999).
In February 1999, Burlington began installing Linux on 1,550 computers in its 250 stores. Prince said, “Linux has come along so strongly, and the price of Intel PCs has dropped so much…[that it] is attractive from both a price and performance standpoint.” Prince adds, “It’s free, and it runs like the wind” (Orenstein, 1999).
Prince said he also expects Linux to be less costly to support and maintain than Microsoft’s Windows NT, which he said is less stable. Burlington has a strong Unix history and has been using Linux on development workstations since January 1998 (Orenstein, 1999).
Burlington is unusual, however. It is known as a company comfortable with technical risk-taking, having previously embraced thin-client and Java technologies and now Linux. Many other retailers are only beginning to think about those technologies. Sandra Potter, an analyst at Aberdeen Group Inc. in Boston, believes that despite Burlington’s adventurous reputation, such a large company’s willingness to base a substantial amount of its operations on Linux could send a strong signal that Linux is a low-cost option other companies should consider. “I think it may turn out to be groundbreaking,” she said (Orenstein, 1999).
Burlington’s current in-store systems are old; they’re based on Sun Microsystems Inc. SPARC machines running SunOS 4.1. Prince has the option of scrapping these old machines or installing Linux on them. Given Linux’s ability to support operations at the cutting edge of particle science (“Fermi Lab,” 1998), it’s easy to forget that Linux needs only minimal hardware resources on which to operate. Amazingly, it can still run on the original Intel 386 platform for which it was created.
Since Linux can be installed on low-end PCs, Prince and countless other IT specialists (LAN and system administrators) can take advantage of this significant opportunity for cost savings. Not having to upgrade existing PCs can save businesses thousands of dollars in hardware replacement costs. These savings add up and can be funneled into more useful areas of the business. In all, Burlington’s hardware will cost $1.15 million to $1.8 million, depending on the power of the machines, Prince said. This amounts to an enormous savings for Burlington. The company uses leading Linux distributor Red Hat Software Inc.’s version of Linux, the cost of which will be only a few hundred dollars.
Burlington will save thousands of dollars in each of its stores by not
buying a commercial operating system. Today’s businesses face competitive
pressures from many angles. This competitive environment puts a squeeze
on profits. Many companies have to resort to cutting operating costs
to stay alive. So, it seems imperative that companies take advantage
of any opportunity to cut costs. Linux offers the cost savings without
compromising quality. This point is increasingly garnering the attention
of the countless businesses looking for a competitive edge.
Though Linux has many outstanding features, many in the software industry think it’s not yet ready for the enterprise. Linux has come a long way since its inception in 1991. But the journey to the top is exactly that…a journey. The Linux development community needs to keep its focus on the needs of users, particularly large enterprises. The community is working hard to optimize the OS for a broad range of functionality and capability. It’s an uphill struggle that Linux distributors will have to win to make serious headway in the software industry.
Corporations and IT managers worldwide will keep an eye on the progress of Linux. Their main point of focus will be the top two Linux distributors, Red Hat Software Inc. and Caldera Inc. Both of these companies are building a solid reputation with the business world. They’ll be counted on to deliver new and improved software products, services and support for Linux users.
An InfoWorld article (May 17, 1999) discussed the strengths and weaknesses of both distributors. Red Hat 6.0 and Caldera OpenLinux 2.2 are the latest releases from these two distributors. According to InfoWorld, the two releases “…take Linux one step closer to broad acceptance on corporate desktops and servers. Both network operating systems introduce new features, such as enhanced administration systems, designed to raise their appeal to the corporate environment, offering benefits in installation, administration, and ease of use” (Pace, 1999).
Red Hat and Caldera have taken decidedly different approaches to the problems involved with Linux, including installation, administration, desktop viability, and support. Caldera OpenLinux is a winner on the desktop, while Red Hat keeps a firm hold on its place as the premier Linux server distribution (Pace, 1999). Although both releases are second-generation products and show real improvements, there’s still a ways to go before Red Hat or Caldera is completely ready for the enterprise or the corporate desktop. The InfoWorld article outlines specific areas where Linux needs to improve.
For Linux to move beyond the workgroup, the department, or the Web server, it must offer strong directory services (similar to Novell Directory Services). To take on the desktop market, Linux must offer a healthy set of easy-to-use applications. And, most importantly, the support offering needs to earn a strong reputation. The additions in Red Hat 6.0 and OpenLinux 2.2 show dedicated efforts to expand Linux’s corporate presence. Even though Linux still suffers from some fundamental problems, its core system is still one of the best developments yet (Pace, 1999).
Of course, Red Hat and Caldera are not your only Linux choices. Other popular distributions, including Debian, Slackware, and SuSE, have recently revised their products to include the latest release of the Linux kernel. But when it comes to corporate acceptance, Red Hat and Caldera are far ahead of other distributions because of their strategic partnerships, marketing abilities, and recognition of the need for support (Pace, 1999).
Many significant improvements have been made since Red Hat’s and Caldera’s last releases. New to Red Hat 6.0 and OpenLinux 2.2 is the 2.2.x kernel, which introduces many features that, until now, were available only as difficult-to-install patches. One of the most exciting new features is symmetric multiprocessing (SMP) support: with the 2.2.x kernel, you can run as many as sixteen processors in one machine. Also new is software redundant array of independent disks (RAID) support (Pace, 1999).
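The kernel-level additions above can be verified from a shell on any Linux system. The following is a minimal sketch, assuming a Linux machine with the standard /proc filesystem mounted (a convention that predates the 2.2.x series):

```shell
# Print the running kernel release; a 2.2.x version string indicates
# a kernel with in-tree SMP and software RAID support rather than
# add-on patches.
uname -r

# Count the processors the kernel has brought online; a 2.2.x SMP
# kernel can drive as many as sixteen.
grep -c '^processor' /proc/cpuinfo
```

On a uniprocessor desktop of the era the second command would simply report 1; the point is that the same stock kernel now scales to multiprocessor servers without patching.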
On the desktop side, the enhanced support for sound and video in these releases is welcome. Both distributions support Peripheral Component Interconnect (PCI) sound cards for the first time. Because sound drivers have been fully modularized, writing new drivers should be easier work for Linux developers. Also included with the Linux 2.2.x kernel is Video4Linux. Similar to the modular sound drivers, the Video4Linux drivers provide a common framework through which video capture devices can plug into applications (Pace, 1999).
Before these new releases, the Linux installation process was rather daunting. Fortunately, the days of a difficult Linux installation are largely gone. OpenLinux’s installation is as easy as a typical Windows 9x installation; Red Hat’s installation, while it does a nice job of auto-probing device settings, is still rough around the edges.
Caldera’s installation wizard, Lizard, is one of OpenLinux 2.2’s strongest features. Using Lizard, users can configure the initial parameters and install OpenLinux through a nicely designed graphical interface. Equally exciting is the portion of the wizard that lets you configure your system to dual-boot with Windows 9x. The CD-ROM guides users through the included Partition Magic software to resize the Windows partition and then install OpenLinux 2.2. It takes approximately forty minutes to configure a desktop system to run both Caldera and Windows 98 (Pace, 1999).
Red Hat 6.0’s installation program is a simple text-based program. Its auto-probing feature eliminates the need for users to pick certain system settings from a text menu. Red Hat uses a hard-disk partitioning tool called Disk Druid. The tool can be difficult to navigate and doesn’t provide much help in determining how much space is left on your hard drive; its arithmetic is sometimes incorrect, and it reports less space available on the drive than there really is (Pace, 1999). Red Hat could take a few pointers from Caldera’s installation system, primarily for the dual-boot (Linux and Windows) configuration.
Red Hat 6.0 has a program called “linuxconf,” a tool from which users can administer all the features of their system. Linuxconf offers all the basic administration tools you could need in one place, including tools for network, system, and user administration. It also lets you administer everything from a single user interface – something NT, reportedly, won’t offer until Windows 2000 is released (Pace, 1999).
Caldera has taken a different approach to administration, piloting the Caldera Open Administration System (COAS) architecture. Similar to Microsoft’s Management Console (MMC), COAS provides an open standard that lets developers write simple plug-ins. Unfortunately, the standard does not yet include a user interface specification (Pace, 1999).
Neither Caldera nor Red Hat is in charge of the graphical user interfaces (GUIs) they load on their systems, but they are in charge of which ones are included with their distributions. Caldera uses the KDE (K Desktop Environment), while Red Hat has decided to straddle the fence with both KDE and GNOME (GNU Network Object Model Environment). Both GUIs have begun offering what has been seriously missing on the Linux desktop – a unified drag-and-drop architecture that lets you interact with your files, applications, and system without having to type commands at a command prompt. Both GNOME and KDE offer that kind of functionality, plus more than enough customization to be productive on the desktop. Even the Macintosh pales in comparison with some of the options now available (Pace, 1999).
Vendor Support for Linux
While Linux distributors work hard to make needed improvements to their products, hardware and software vendors have pledged support for the Linux operating system. IBM, Dell, Compaq, and Hewlett-Packard have decided to sell server and/or desktop computers with Linux installed, and they offer various levels of support for Linux as well. The list of software vendors joining the Linux bandwagon is constantly growing. Among the more recognized names are Netscape, Oracle, Sybase, Corel Computer, and Lotus.
Vendor support is a key to the future success of Linux. The aforementioned
Datapro study showed that from 1996 to 1997, Linux grew 27 percent in large
enterprises. It’s worth noting that this feat was accomplished without
the support of major hardware and software vendors. While these vendors
have had little or nothing to do with Linux’s past success, they’ll play
a more integral role in helping Linux gain widespread market acceptance.
The Linux OS has come a long way in a relatively short period of time. Its impressive list of features makes it a logical choice for companies looking for a reliable operating system to support business operations. The terms and conditions of the GPL license give users the freedom to do practically anything they want with the software. All of these are definite selling points that will compel many users to at least try out the operating system.
Linux will continue to emerge because it is a viable alternative to Windows
and other Unix based operating systems. It’s an alternative that
businesses seem to be yearning for.
The huge (and growing) development effort behind Linux will keep the technology level with or ahead of any commercial OS project. An illustrative example is security.
Because of the wide-open nature of Linux and its available sources, security issues are identified, debated, and repaired in real time. The problems are discussed openly, the patches are tested widely, and each problem is worked on until it is resolved to everyone’s satisfaction. Open discussion of security issues in Linux might confuse users of traditional operating systems into thinking that Linux has security problems. While security problems do exist, the fact is that all operating systems have their share of them. Linux simply identifies and solves them faster than can be expected of traditional commercial operating systems.
Linux is poised to deliver a blow to the Microsoft empire. The Linux development community will continue to make the improvements necessary for broader market acceptance. The emergence of Linux has presented Microsoft with a legitimate threat. The rules of the game have changed in the OS market. Buggy software is no longer the standard; consumers don’t have to wait nearly two years for the release of a new product; licensing is no longer cost-prohibitive and restrictive. Linux offers an unbeatable alternative.
Fermi Lab Supports Red Hat Linux. (1998, Oct). [On-line]. Available: http://www.redhat.com/news/news-details.phtml
Mann, C. (1999, Jan – Feb). Programs to the People. Technology Review, p. 36.
Orenstein, D. (1999, Feb 15). Retailer commits to Linux in 250 stores. Computerworld. [On-line]. Available: http://www.computerworld.com/home/print.nsf/all/9902159106.
Pace, M. (1999, May 17). Linux leaders stab at enterprise. InfoWorld Magazine, p. 35.
Raymond, E. S. (1998, Feb 4). The Cathedral and the Bazaar. [On-line]. Available: http://www.redhat.com/redhat/cathedral-bazaar/cathedral-bazaar.html
Red Hat Software, Inc. (1997, October). The Official Red Hat Linux Installation Guide. Research Triangle Park, North Carolina.
Scannel, E., & Trott, B. (1999, May 24). Chipping away at Microsoft. InfoWorld Magazine, p. 1.
Vizard, M. (1999, May 24). IT managers caught in budget imbroglio. InfoWorld Magazine, p. 5.