Workstation Watch
Volume 27, Issue 11 (November 2004)

Last month, Computer Graphics World presented the latest installment of its Webcast Technology Series on Next-Generation Workstations. Our expert panel featured Alex Herrera and Jon Peddie of Jon Peddie Research, who cited data from their new “Workstation Report” as they examined the current landscape and peered over the horizon at tomorrow’s market and technologies. In case you missed it, here’s an annotated summary of the presentation and the Q&A that followed, which you can access in their entirety at www.cgw.com.

Traditional Workstations: It used to be that workstations could be distinguished from PCs by just about every metric—the vendor, the operating system, the chips, boards, and buses. SGI, Sun, HP, IBM, and DEC were basically the only shops in town; Unix was the only operating-system option; Windows was not yet viable a decade ago; and Linux was still in its infancy. That market of nothing but proprietary architectures running proprietary operating systems is no more. Of the 1.6 million workstations sold in 2003, nearly 90 percent were not proprietary but PC-derived, that is, based on technology commonly found in PCs, namely IA-32 Pentium 4, Xeon, or mobile Pentium processors and their supporting chipsets. Today, the remaining traditional proprietary workstations have been relegated mainly to vertical niche applications, a segment that continues to decline year to year.

Is it a Workstation or PC? Given this evolution toward PC-derived workstations, it is becoming increasingly difficult to distinguish between workstations and PCs. One way to differentiate the two is to designate something a workstation if it offers added value for a premium price. That value could be in the form of hardware—such as a professionally branded graphics card or ECC (error-correcting code) memory—or it could mean that the vendor simply sank more hours into certifying the box for specific professional ISV (independent software vendor) applications. What's interesting is that for added benefits such as these, the price difference can be less than $200.

Why Buy a Workstation? If a professional user can get a measure of improved productivity out of a workstation, whether by increasing throughput or quality or by reducing the chances of downtime, then the premium paid is certainly worth it. For instance, if you're working on a mission-critical project, then having ECC memory to protect against data corruption can be invaluable. On the other hand, if you use your machine to play games, then ECC memory may not be worth the expense—or the accompanying sacrifice in system performance—unless, of course, you're a serious gamer and have a really good score going.

Midrange Machines: At the midrange level of the workstation market and above, separating workstations from PCs is more straightforward. Dual processors, the width of the PCI Express interface, other specialized or high-bandwidth I/O options, and the amount of storage and memory are key separators. Midrange workstations may also build in more reliability, with features such as redundant RAID memory, multiple fans, multiple power supplies, and, at the higher end, chipkill, which provides redundancy across the DRAM chips so that if one chip fails, the memory as a whole won't be corrupted. Alas, in a relatively short amount of time, several of these features will trickle down to the PC.

The Big Three: Dell is clearly the workstation market leader, shipping about 45 percent of the units in 2003 and an even larger share in the first half of 2004. Because of Dell's dominance, the other vendors tend to avoid head-to-head competition with the market leader and differentiate themselves on something other than just price. For example, Hewlett-Packard, which accounted for 23 percent of the units sold last year, offers Intel-based workstations, as does Dell, but HP also markets proprietary PA-RISC-based machines. And IBM, third in unit share with 14 percent, is the only one of the "big three" vendors that currently markets an AMD Opteron-based workstation.

Other Top-Tier Suppliers: For Sun, which ranks fourth among workstation vendors with an 8-percent share of units sold, the good news is that it continues to gain ground in the market for traditional proprietary machines. The bad news is that this segment is declining faster than Sun can take in new business. Still, there's a lot going on at Sun that's encouraging: a deal with Fujitsu to share SPARC development costs, a new line of Java Workstations based on the Opteron CPU, and the ability to run not only its proprietary Solaris operating system but also Linux. Ranking fifth, Fujitsu-Siemens, with a 4-percent share, differentiates itself from Dell and the other vendors primarily by geography, with a strong presence in Europe as well as in Asia through its sales partner Fujitsu Japan. Finally, SGI, with a 0.4-percent share, still markets workstations, based on MIPS CPUs and the company's Irix OS, that tend to sit at the very high end of the spectrum. But in many ways, SGI is not so much a workstation vendor as a "solution provider," searching out vertical niches, such as national labs and government accounts, that need a lot of computing power, bandwidth, and storage. SGI offers its workstations as one of the tools to address those applications.

Operating Systems: Windows is the overwhelming operating-system choice of the big two professional application bases, DCC (digital content creation) and CAD (computer-aided design), and it promises to remain number one for the foreseeable future. In fact, the next version of Windows, code-named Longhorn, is not expected to be ready until sometime in 2006. Elsewhere, Linux, the once-rising star, seems to be losing some of its luster as the "savior" from Microsoft's Windows, and its roughly 10-percent market share doesn't appear to be growing very quickly. Indeed, you could argue that Linux is doing more to erode the Unix base than the Windows base, since customers who have been tied to proprietary platforms see it as a viable route for migrating away. As for proprietary operating systems, Sun regards Solaris as a strategic part of its workstation future, but the same doesn't hold, at least not to the same degree, for how HP, IBM, and SGI view theirs.

PCI Express: The most disruptive new technology is PCI Express, which replaces the long-standard parallel PCI bus and relies on serial communications to increase the bandwidth between PC motherboards and peripherals. In June, Intel introduced chipsets that bring PCI Express support to its Pentium 4 and Xeon processors. Other suppliers, such as AMD and Apple, will have to move quickly from the AGP standard to PCI Express to ensure they have access to the latest graphics hardware from the likes of 3Dlabs, ATI, and Nvidia, all of which have announced support for the new standard.

The Gigahertz Race: The race to speed up the processor clock rate is over, and it was the vendors that said so. AMD started with a numbering scheme that dropped the clock frequency from the model number, and Intel did the same earlier this year. The move now is toward multicore processing. Rather than trying to push clock rates and extend architectures to squeeze that last bit of parallelism out of a single thread of execution, the industry is focusing on implementing multiple CPU cores on a single die to process multiple threads of instructions simultaneously.

Dual-Core Processors: Mainstream hardware vendors took the first step toward multicore processing by introducing dual-core processors. At the end of 2002, IBM was the first to market a dual-core processor, its 130-nanometer (nm) POWER4 chip. Early this year, HP and Sun both introduced 130nm dual-core parts, the PA-RISC and UltraSPARC IV, respectively. AMD and Intel waited for 90nm technology to arrive before making the transition: AMD introduced a dual-core Opteron chip a month or so ago, while Intel just recently demonstrated a dual-core processor of its own. Even though Intel entered the market last, it should soon outship everyone else's dual-core processors, and by a wide margin. In fact, Intel says that it's ramping up dual-core production and that, by the end of next year, it expects dual-core chips to account for at least 40 percent of its desktop processor shipments.

Chicken-and-Egg Cycle: In terms of software for multiprocessor systems, the problem with ISVs has always been that they won't develop an application for a technology, in this case multiprocessor systems, unless there's a large enough installed base of potential users. But this time the hardware companies—in particular, Intel, with its "virtual dual-core" Hyper-Threading technology—have broken that chicken-and-egg cycle and have started to populate the market with multiprocessor systems. Now they can point to an installed base to induce ISVs to come forward with new multi-threaded applications.
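As a small developer's-eye illustration of that installed base, here is a minimal sketch, assuming C++11 (nothing the panel presented), of how an application can ask the platform how many hardware threads it exposes; on a Hyper-Threading machine, the count includes the logical processors, not just the physical cores.

    // Minimal sketch, assuming C++11: query how many hardware threads the
    // platform exposes. On a Hyper-Threading system this count includes the
    // "virtual" logical processors as well as the physical cores.
    #include <iostream>
    #include <thread>

    int main() {
        unsigned n = std::thread::hardware_concurrency(); // may return 0 if unknown
        std::cout << "Hardware threads reported: " << (n ? n : 1) << "\n";
        return 0;
    }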

Multi-Threaded Software: The task of creating multi-threaded software is non-trivial. ISVs have to reexamine code that, in some cases, is a decade old or older; find the places where linear code can be split into parallel threads; spread the workload over multiple processors; and then recompile, re-issue, and remarket the application. Unfortunately, ISVs are not rolling in cash right now, so these are judicious decisions they have to make. It will therefore still take time, perhaps three to five years, before we see a full spectrum of multi-threaded professional software applications. In any case, we have to give credit to the platform builders who held their noses, jumped in the water, and said, "C'mon, guys, we're going multiprocessing."
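To make the scale of that rewrite concrete, here is a minimal sketch, not drawn from the webcast, of the kind of transformation involved: a linear loop over independent work items split across two threads. The process() routine, the data, and the fixed two-way split are hypothetical stand-ins, and the sketch assumes C++11; real applications have to hunt for such opportunities throughout far larger and older code bases.

    // Hypothetical sketch: a linear loop rewritten so its workload is split
    // across two threads. Assumes C++11 and per-item work that is independent.
    #include <cmath>
    #include <thread>
    #include <vector>

    // Stand-in for real per-item work (a filter pass, a solver step, and so on).
    static void process(std::vector<double>& data, std::size_t begin, std::size_t end) {
        for (std::size_t i = begin; i < end; ++i)
            data[i] = std::sqrt(data[i]) * 0.5;
    }

    int main() {
        std::vector<double> data(1000000, 2.0);

        // Single-threaded original: process(data, 0, data.size());
        // Multi-threaded version: split the range and run both halves at once.
        const std::size_t mid = data.size() / 2;
        std::thread worker(process, std::ref(data), std::size_t(0), mid); // first half
        process(data, mid, data.size());                                  // second half, on this thread
        worker.join();                                                    // wait for the worker to finish
        return 0;
    }

Even in this toy case, the threaded version has to decide how the data is partitioned and when the worker is joined, which hints at why retrofitting decade-old linear code is a multi-year effort.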

Phil LoPiccolo
Editor-in-Chief