IT Trends: Digital Deployment
Sep 1, 2006 12:00 PM, By Brian Glass
IT managers have more choices than ever to give their employees valuable software tools.
Software deployment has been a problem since the dawn of the personal computer. In the prehistoric era before PCs, deployment was not an issue: software was simply installed on the central computer, and everyone automatically had access to the same set of tools. In the PC era, by contrast, there are virtually endless ways to configure individual and group systems, so getting the same applications, behaving the same way, onto every system in an enterprise is often a daunting task.
There are a surprising number of approaches IT managers can take to deploying software onto PCs. This article surveys some popular methods and delineates their pluses and minuses.
WEB-BASED DEPLOYMENT
The most fashionable method for deploying software tools to a group, and my personal favorite, is over the Web. Everybody has a web browser, so all you have to do is give somebody the URL for accessing the tools, and off they go.
When I talk about deploying software over the Web, I am not, of course, talking about making a software package available for download, but about making the actual application available to use in real time over the Web. The software runs, for the most part, on the server, and users access it remotely through a web browser.
Certainly, not all applications are appropriate for web deployment. You probably wouldn't want to edit video over the Web, for instance. But a surprising number of applications work extremely well this way, and since many church staff members and volunteers frequently work from home, the approach suits them nicely.
For example, my church contracts with Fellowship Technologies to provide us with Fellowship One (fellowshiptech.com) as our church management tool. Fellowship One provides users with a membership database, contact management, scheduling, and numerous other services. Volunteers and staff from all over the county can access this database from their home computers, using web browsers.
One of the biggest benefits to web-based software is that it doesn't matter where the software is located. As a contractor providing software to clients, you could locate the server at your own facility and make the applications available to the client over the Internet from that server.
FILE SERVERS
For many years, a common method of deployment has been to store applications on a central file server and share access to them over the network. With this approach, PCs run the software themselves, but they load it over the network rather than from a local hard drive.
A variation on this theme is thin clients or diskless workstations. Thin clients are computers without disk drives that simply boot from a file server over a network. The entire operating system is sent over the network from the server to the thin client. It's easy to deploy new software to these systems, since it can all be done from the boot server.
For this type of system, all software, including the operating system, is installed in a special location on the file server. Using special protocols such as PXE (Preboot Execution Environment), the client computer broadcasts a request for boot services on the network. The file server listens for these requests and, when it receives one, sends the operating system over the network to the client. Once booted, the client connects to the server again to access the volumes that contain its application software. Everything is conveniently stored in one place and only needs to be upgraded once for all diskless systems.
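As a sketch of what the server side of this arrangement can look like, here is a minimal DHCP configuration fragment for PXE booting, assuming an ISC dhcpd server and the PXELINUX bootloader (the addresses shown are hypothetical):

```
# /etc/dhcpd.conf on the boot server (hypothetical addresses)
subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;
  next-server 192.168.1.5;   # TFTP server that holds the boot files
  filename "pxelinux.0";     # bootloader sent to the diskless client
}
```

The "next-server" and "filename" options are what tell a PXE-capable client where to fetch its bootloader; everything after that point is served from the central machine.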
With Windows applications, this can theoretically be done with Microsoft's RIS, or with numerous third-party packages. With a Mac, this can be done using Apple's NetBoot or a Linux server. With Linux, this is commonly done using the Linux Terminal Server Project (LTSP).
TERMINAL SERVERS
A modern terminal server is similar in concept to an old-fashioned mainframe with “dumb” terminals, but considerably more sophisticated. Instead of simply passing text back and forth, it can pass all kinds of information, from graphics to mouse clicks. The key is that the software actually runs on the server, which is consequently where it is physically installed.
The X Window System (X11), developed at MIT in the mid-to-late '80s, was one of the earliest graphical terminal servers. It's still in use on most Unix and Linux systems, and it is also available for Mac and Windows.
With X11, applications communicate with an X display server, which controls the screen and displays the applications. If you are running X-based applications (as nearly all Linux/Unix systems do), you can have them run on one machine and display on any other. However, you cannot use X11 to display native Windows or Mac applications, since they are not built on the X system.
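In practice, the easiest way to try this today is X11 forwarding over ssh. The session below is only a sketch (the user and host names are hypothetical, and it assumes an X server running locally and OpenSSH on both ends):

```
# Connect to the remote machine, asking ssh to tunnel the X protocol
ssh -X user@appserver

# Then, on the remote host, launch any X application; it runs on
# appserver, but its window appears on your local display
xclock &
```

The application's code executes entirely on the remote machine; only drawing commands and input events cross the network.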
If you are working in a purely Windows environment, you will probably choose either Windows Terminal Server or Citrix. Terminal Server and Citrix provide similar functionality to X11, but allow native Windows applications to be shared across a network. I know of at least one large church that uses Citrix as a primary method of application deployment.
To my knowledge, at press time, Apple had no terminal server product for sharing native Macintosh applications, but a Mac can share standard X11 applications.
DISK IMAGING
One way of performing mass rollouts of software is disk imaging. With this technique, a basic system is configured with all the software and tools that users in the enterprise will need. This prototype unit can be thoroughly tested to make sure the complete package works well. Once the system is stable and ready to go, the entire system disk is copied onto a network storage area using disk-imaging tools.
Once you have an exact image of the system disk, it is easy to make rapid copies of the complete system without having to go through the pain of installing each application separately. New systems can be installed in a fraction of the time it would take to configure one from scratch.
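The core idea can be sketched with ordinary command-line tools. This toy example uses a small file to stand in for a system disk; a real rollout would point a tool like Ghost or SystemImager at an actual disk device rather than a file:

```shell
# Create a small file standing in for the prototype's "system disk"
dd if=/dev/urandom of=prototype.disk bs=1024 count=64 2>/dev/null

# Capture an image of the prototype disk onto "network storage"
dd if=prototype.disk of=master.img bs=1024 2>/dev/null

# Deploy that image onto a "new machine's" disk
dd if=master.img of=newsystem.disk bs=1024 2>/dev/null

# Verify the clone is bit-for-bit identical to the prototype
cmp prototype.disk newsystem.disk && echo "image verified"
```

Because the copy is an exact bit-for-bit duplicate, every application, setting, and patch on the prototype arrives on the new system in one step.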
In the Windows world, a common way to do this is using Norton Ghost. For Macintosh users, there is Apple Remote Desktop. With Linux, there are numerous solutions, one of which is SystemImager (systemimager.org).
REMOTE INSTALLATION SERVICES
Disk imaging usually goes hand-in-hand with remote installation services. A remote installation service allows administrators to install new software or updates on a desktop computer without having to be at the actual physical machine. This type of system usually stores software updates on a central server, and then pushes them through to the client computers.
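The push model can be illustrated with a toy shell sketch. Here "server/" holds the master copy of an application and each directory under "clients/" stands in for a managed desktop; the names are hypothetical, and a real deployment would push over the network (for example, with rsync over ssh or one of the tools named below) rather than between local directories:

```shell
# Central master copy of the application, maintained in one place
mkdir -p server/app clients/pc1 clients/pc2
echo "version 2.0" > server/app/VERSION

# Push the update from the central server to every client
for client in clients/*; do
  cp -a server/app "$client/"
done

# Each client now has the updated files
cat clients/pc1/app/VERSION    # prints "version 2.0"
```

The administrator touches only the central copy; the loop does the rounds of the individual machines.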
For Windows, you could use Microsoft's RIS or the aforementioned Norton Ghost for this purpose. Apple Remote Desktop also provides remote installation, as does Linux SystemImager.
No solution described here covers 100 percent of cases. You may decide to mix techniques, or to focus on one and fall back to manual installation for machines that don't fit the mold. Notebook computers, in particular, can be difficult to fit into any of these models, and highly specialized workstations may not fit the standard model either, since they need software that most people on the network will never use.
This has been only a brief survey of some primary software deployment methods. Whichever you choose, if you are providing general IT services to clients, you will definitely want to establish procedures for keeping systems up to date. Without such procedures in place, you will waste countless hours managing individual systems one by one.
Spend some time evaluating and experimenting with some of these methods — you won't be sorry you did.
Brian Glass can be reached at firstname.lastname@example.org.