Issue of July 2003

Focus: Linux

When the Penguins win

A look at why enterprises opt for Linux and the issues they have to deal with when adopting this platform. by Graeme K. Le Roux

Hype from the IT industry notwithstanding, there are only five platforms you can put in a data center: traditional mainframes and minis; Unix boxes (including Sun, HP and IBM systems); Windows boxes; NetWare boxes; and Apple boxes.

In practical terms, Apple's server offerings are limited and traditional platforms are typically accessed via PCs rather than terminals. If you go with Apple, you end up having to deal with file serving, printing, e-mail, and so on, for PC clients.

If you choose either NetWare or Windows servers, you are going to end up with a minimum of two servers per 100 users. That does not mean 50 users per server; you just don't mix the functions of a mail and print server on the same platform in a large network.

By contrast, in a Unix environment you may mix more functions and thus have fewer boxes. This difference exists because with Unix you can choose which OS features to run at boot time, and hence use system resources more efficiently than you can with Windows' "load everything, whether it gets used or not" approach.
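To make that concrete: on a SysV-init Linux system, what starts in a given runlevel is just a set of symlinks in a directory, so trimming what loads at boot is a one-line change. The sketch below mimics that layout in a scratch directory; the paths and service names are illustrative, not taken from any particular distribution.

```shell
#!/bin/sh
# Mimic /etc/rc3.d in a scratch directory: S* scripts start a service
# at boot, K* scripts stop it. (Real path would be /etc/rc3.d.)
RC=./demo-rc3.d
mkdir -p "$RC"
touch "$RC/S80sendmail" "$RC/S85httpd" "$RC/K20nfs"

# Decide Apache isn't needed on this box: remove its start link and
# it simply never loads, freeing the resources it would have used.
rm "$RC/S85httpd"

ls "$RC"
```

On Red Hat-style systems the `chkconfig` tool manages these links for you (e.g. `chkconfig --level 345 httpd off`), but the underlying mechanism is exactly this.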

The Linux Option

Until the advent of Linux, using Unix meant using a particular vendor's version of Unix: Sun's, HP's, IBM's, BSD, etc. Each vendor's Unix implementation has its own pros, cons and minor variations. They also run on relatively expensive proprietary hardware, with BSD being the exception.

Linux on the other hand runs on ordinary Intel hardware, which means that anything that can run Windows can run Linux. In addition, most of the software available for Linux is a lot cheaper than its Unix equivalents.

Until recently, the downside to Linux was that you had to buy a distribution and "supported" hardware, and you were on your own putting the two together. That's fine for someone who is a Linux guru, or is learning to be one. But if you are a commercial user who just wants to set up a mail server, spending a day or so installing a server's OS and then checking it for stability is less attractive than an alternative platform with a pre-installed OS.

Now that a number of major vendors are shipping pre-installed Linux server platforms, Linux has become a serious option for the cost-conscious administrator. Curiously enough, no major vendor to date has chosen to offer Linux for a desktop (the odd thin client aside) or laptop platform, even as an option, so for now the penguins are trapped in the glass house.

Show Me the Apps

You'll get a solid OS environment by buying from a major vendor, but where do you find the applications?

There are many commercial software packages for Linux, and most of them are pretty clear about the releases and versions of Linux that they support. However, they may not have been tested in many out-of-the-ordinary system configurations, and the same goes for many of the "freeware" packages, which are often well worth evaluating.

The simplest way to get over such problems is to avoid complex server configurations—just keep things simple.

Since one of the first groups to deploy Linux on a large scale in a commercial environment were Internet Service Providers (ISPs), it is not too surprising that the most reliable software available for Linux deals with basic Internet applications like DNS, mail and Web services.

In Australia, most ISPs are Linux shops; this was simply a matter of economics. Linux allows an ISP to provide all basic services for a few hundred dollars per host, and the hosts are simply low-cost AMD- or Intel-powered servers with less than a gigabyte of RAM and ordinary SCSI, or even EIDE, disks.

The fact that Linux is so popular with ISPs is probably the reason that most of the first crop of blade servers released by major vendors run Linux. And the fact that Linux, like all forms of Unix, does not assume that the system "console" is built into the host didn't hurt either.

Hype aside, blade servers are simply a practical way of stacking more processing power into a smaller space. In most cases each blade has, at most, a hard disk built on to it to accommodate the blade's OS. It is always assumed that storage will be contained in an external disk array.

Blades can be hot swapped, and the vendor usually provides some way of quickly cloning an OS image to a newly installed blade. In Linux terms this amounts to copying a built kernel image—a single file—to a new blade along with the Linux Loader (LILO); all other OS files can be stored elsewhere.
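As a rough sketch of that cloning step: the kernel really is a single file, and the boot loader configuration is a short text file beside it. The paths below are illustrative, and the actual `lilo` invocation is left commented out since it writes a boot sector on real hardware; a scratch directory stands in for the new blade's disk.

```shell
#!/bin/sh
# Sketch of cloning a Linux image to a new blade. TARGET stands in for
# the new blade's root filesystem, which a real chassis would expose via
# its management tools; the "kernel" here is an empty stand-in file.
KERNEL=./demo/vmlinuz-2.4.20
TARGET=./demo/newblade

mkdir -p ./demo "$TARGET/boot" "$TARGET/etc"
: > "$KERNEL"                    # stand-in for the single built kernel image

cp "$KERNEL" "$TARGET/boot/"     # copy the one-file kernel to the new blade

# Minimal LILO configuration for the blade (illustrative device name).
printf 'boot=/dev/hda\nimage=/boot/vmlinuz-2.4.20\nlabel=linux\n' \
    > "$TARGET/etc/lilo.conf"

# lilo -r "$TARGET"              # would install LILO into the blade's boot
                                 # sector; needs the real disk, so omitted here
```

Everything else the blade needs can live on the external array and be mounted over the network at boot.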

Windows on the other hand tends to assume that the entire OS resides on a local disk, which makes cloning a Windows "host" to a new blade more work than it would be under Linux.

Of course Linux has some way to go in comparison to some other Unix environments. The Sun Fire family, for example, stores hardware configuration information, including Ethernet MAC addresses, on a removable card. This makes replacing a system simply a matter of pulling one or two hot swap hard disks and the configuration card out of one unit and inserting them in a replacement.

In fact, some Linux blade servers in Sun's Sun Fire family—and many more traditional Unix and non-Unix hosts—do not need the boot disks to be physically installed in a host at all. Boot disks can be installed in an external array or cluster.

This can be done in a Windows environment, but it is far less common than it is in a Unix—or even a traditional mini- or mainframe-environment. Cost conscious buyers like ISPs want such reliability and flexibility, but they don't want to pay either Windows or "traditional platform" prices to get it. This is where Linux comes in.

Cost Conscious

In a corporate environment, the impetus for looking at Linux is almost always cost. Either the company needs to put a new service in place and doesn't have the budget for a Windows-based alternative, or it finds, say, Microsoft's Exchange Server overly complex and expensive for simple tasks such as providing enterprise e-mail.

Exchange was never meant to be used for just e-mail, so it contains a lot more code and uses more system resources than a simple mail server. As such, using Exchange for several hundred users means having at least two or three servers, each of which has to have significant system resources available.

If all you want is dependable e-mail, Linux running sendmail, postfix or qmail can do the job for a few thousand dollars. In fact, you can buy Linux-based appliances such as Sun's Cobalt RaQ 550 for A$4,300 (US$2,780), which are quite capable of acting as a corporate mail server for several hundred users.
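To give a feel for how little a plain mail server needs, a minimal Postfix configuration for a single domain amounts to a handful of lines in its main configuration file. The hostnames and network below are made up for illustration:

```
# /etc/postfix/main.cf -- minimal single-domain mail server (illustrative)
myhostname      = mail.example.com
mydomain        = example.com
myorigin        = $mydomain
inet_interfaces = all
mydestination   = $myhostname, localhost.$mydomain, $mydomain
mynetworks      = 127.0.0.0/8, 192.168.1.0/24   # hosts allowed to relay
```

Compare that with the multi-server planning exercise an Exchange deployment entails, and the economics of the Linux option become obvious.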

The RaQ 550 takes a few minutes to configure and is managed via a simple Web interface, which is all you need to set up and manage virtual sites, add users, etc.

If you want to do something that the RaQ 550's Web interface does not support, you can simply telnet to the unit and work at the bash command line. You will also find a full implementation of the 2.4 Linux Kernel, all the usual Linux utilities and a variety of development tools. There is no way you can deliver this sort of cost efficiency on a Windows platform or a more traditional Unix environment.

And that, in a nutshell, is the point of using Linux. It offers power and flexibility at a price nothing else can match.

VPNs and Remote Management

One of the things that Windows people find hard to come to terms with in a Linux (or any Unix) environment is that, other than turning a server on and giving it an IP address and an initial administrator's UID and password, you don't configure or manage a server from a screen and keyboard attached directly to it.

While Windows can now be used to manage the server remotely, many Windows network administrators still prefer to work sitting next to the box, especially if they are doing something which will require rebooting the server several times.

The reasons why the administrator generally doesn't work on a Linux host from a local console have their roots in the environment where Unix was developed.

Unix is a multi-user system developed for a data center environment. It is not practical to simply toss dozens of users off a system during business hours every time a relatively trivial configuration change requires a reboot.

Nor do you typically have the option of tinkering with a system after office hours because rebooting it then will interfere with the end-of-day processing. Furthermore, data centers are typically cold, noisy and usually don't come equipped with desk space, so working long hours in them is not comfortable or efficient.

If you don't have to be sitting next to a host to administer it, then you don't have to be in the same place as the host. And if you aren't in the same place as the host, you have to work out a secure way of administering that host remotely. Not surprisingly, ISPs are a classic example of such a situation.

ISPs are big telecommunications users, and telcos haven't been slow to realize that having such users scattered all over the countryside would result in an inefficient use of their network. The result is that telcos tend to price their services to ISPs such that all but the largest ISPs would choose to use co-location centers—and therefore these ISPs have to run their servers remotely.

An excellent example of this would be the ISP used by the author's company. Ace Internet Services (www.acenet.com.au) is based in Bowral, New South Wales, Australia, while their servers are located in Optus' co-location center in Sydney, about 150 km from Ace's offices. Ace buys customer access services from both Optus and Telstra, which enables them to operate nationally without having to worry about establishing points of presence or modem racks, DSLAMs, etc.

For example, dial-up access is supplied by Optus, which delivers individual users' dial-up sessions to Ace's routers (physically located with their servers) as L2TP sessions. Ace uses secure management tools like PuTTY, a free client which provides, in effect, a telnet-style terminal session over an SSH link, to manage their servers.
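An entry like the following in an administrator's SSH client configuration is all it takes to make a co-located box reachable securely from the office; the host names and paths here are invented for illustration:

```
# ~/.ssh/config -- illustrative entry for a co-located server
Host colo-mail
    HostName mail.example.net       # server sitting in the co-location center
    User admin
    IdentityFile ~/.ssh/id_rsa      # key-based login: no password on the wire
```

After that, `ssh colo-mail` drops the administrator at a shell on the remote host, with everything encrypted in transit.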

Linux-based appliances like Sun's Cobalt family support management either via a Cobalt management station (which uses a VPN) or through a Web-based VPN using SSL. There are also a number of IPSec implementations that support Linux.

You can also use hardware VPN gateways from the likes of WatchGuard or SonicWall. But where Linux is concerned, solutions like PuTTY are often just as effective and secure, and much cheaper.

Graeme K. Le Roux is the director of Moresdawn (Australia), a company which specializes in network design and consultancy. For comments on this article write to editor@networkmagazineindia.com

 
     

© Copyright 2001: Indian Express Newspapers (Bombay) Limited (Mumbai, India). All rights reserved throughout the world.
This entire site is compiled in Mumbai by the Business Publications Division (BPD) of the Indian Express Newspapers (Bombay) Limited. Site managed by BPD.