
Talk:Workstation/Archive 1

Archive 1

Unix OS

The article is a bit confused; it would be better to logically separate the random historical bits from the general concept. "Workstation" also implies a Unix OS, or at least something with a compiler, as opposed to a Windows desktop.

This Article is All Wrong

A workstation is a location, usually containing furniture of table height, where a worker performs a specific task, e.g. a jeweller making jewellery (UK spelling there, btw), or a factory worker at an assembly point.

It is NOT a Computer, and I think this distinction should be made by this article being renamed to 'Computer Workstation'. Jaruzel (talk) 09:38, 11 September 2013 (UTC)

Cleanup

This article needs cleanup. The article needs to clarify the following:

  • General non-OS specific definition of the term.
  • Use of the term among Unix/Linux computers.
  • Use of the term in the Windows and Mac OS X worlds, as in "Windows NT Workstation".
  • Workstations as computers used for specific tasks, such as imaging, video creation, local or remote control or monitoring of machines, devices, etc., (see definition: [1]).

--Cab88 23:30, 16 February 2006 (UTC)

But... It helps to have general agreement on what a workstation is, or is not

I agree. But keep in mind that this is a relatively complex topic, and that a lot of meaning is bound up in the word "workstation". All at once, it's a hardware specification and/or performance criterion, an operating system choice, and a specific application software deployment, all on one machine.

Also, Microsoft helped pollute things by tacking on "Workstation" to their operating systems, as if to suggest that NT would turn a PC into a workstation, which it most definitely does not. They have also done this with their use of "Engineer" in their MCSE (Microsoft Certified Systems Engineer) diploma. They only stopped when professional engineering regulatory organizations told them to cease and desist. Just about anything that company does mars the technical purity of things.

More to the point, with hardware improvements in PCs to support better multitasking, with multiple processors, and with the rise of Linux on the desktop, many people feel that Linux on powerful PC hardware constitutes a workstation; some people would go along with that, and others would not.

So you see, it's hard to do an article like this when there is no clear consensus on what a workstation is in 2006, vs. what it was in 1986, when the first Apollo workstations came out, based on my reading of the history of technical workstations. I have used Sun workstations for years, and love them for doing any serious technical work, but I also know I could do most of it on a Linux PC if I had to, as long as I don't push it too hard and cause something to crash.

When time permits, I will try to re-organize this article. But you know, I have been searching the net for a while, trying to find an authoritative definition of what exactly a workstation is and is not, how it compares to a fast PC, etc. It's ironic that I may end up being one of the authors of such a document. I find it hard to believe there are not others who would have already done a bang-up job of this, better than I could do...

--SanjaySingh 05:11, 17 February 2006 (UTC)

I would suggest the following, then. Start with the dictionary definition and show how this derived from the 1980s concept of a workstation computer (unless there were "workstation" computers before then that I am not aware of). Then discuss how the workstation hardware of the past was generally more powerful than the average PC, and how modern powerful PCs have blurred the distinction. The way the label "workstation" is used in the modern day should be discussed. The Microsoft use of the term in the "NT Workstation" OS can be discussed separately, as it seems not to relate to the dictionary and hardware-based definitions but is more about the OS capabilities and intended audience. --Cab88 22:53, 18 February 2006 (UTC)

This definitely needs to be cleaned up. I don't understand the difference between a PC and a workstation.

The reason you can't see the difference is that, today, there really isn't one. It's a distinction without a real difference. Frankly, it seems to me that the only people who make the distinction are really more interested in making a prestige-oriented statement about themselves. The equipment/platform is only a symbol. A modern PC is just as powerful and capable, in principle, as any of the "workstations" that Sanjay is attempting to differentiate it from. A PC can run some version of Unix (Linux or FreeBSD), it has a powerful processor, lots of memory, advanced graphics, etc., etc.

The only thing that makes sense for this article is to express it all in the past tense. The distinctions that remain, such as they are, have more to do with software than with hardware, and those distinctions seem mostly of the snobbish sort, not any real technical issues. The definition of words like "engineer" has nothing whatever to do with this question; it's just a slam against Microsoft. (The person who picks up your garbage is probably called a "sanitation engineer"... so what?) Windows can multi-thread just as well as Unix, perhaps even better. The "workstation" has become a commodity. Deal with it. The word has as much currency as the old term "mini-computer", which today is a meaningless anachronism. -- RussHolsclaw 00:50, 15 March 2006 (UTC)

Many industries provide their workers with workstations, regardless of whether or not they employ computers in their work. An article on computer workstations should, IMO, specify computers. It's elitist, again IMO, and misleading to insist in an encyclopedia article that the term "workstation" applies not just to computers, but to a specific subset of computers. Note that the office furniture industry is also marketing specialized furniture for computers as computer workstations. Georgia Yankee (talk) 19:37, 6 June 2011 (UTC)

Does anyone in this discussion (other than me) actually USE PCs or workstations for technical work?

Some comments for Russ. Call me an elitist if you like, that's perfectly fine. But I am within my rights to suggest that you don't see a distinction between PCs and workstations because you don't push the machine hard enough to see any differences.

The system level architecture of a typical PC is designed for low cost, NOT multitasking. I know from experience. It's possible to have computational jobs running in the background on a Sun workstation and the machine still remains usable. On a PC, it bogs down and becomes unwieldy to use. It's not about the CPU, it's about the design of everything else. One PCI chipset for 6 slots; small caches; IDE drives... everything made to run one thing at a time. No crossbar switch architecture, no SCSI, no balanced I/O for multiple tasks moving information in multiple directions at once.

This is a mix of facts and rather odd conjecture. Yes, PCs are cost-driven. So are workstations. The specific technical solutions used in PCs differ from those used in workstations, but PCs multitask just fine (OS software allowing) and can do plenty of real-time rendering jobs workstations would have been very hard-pressed to match. (FPSes with respectable framerates, for example.)

Benchmarks tell you performance in ideal conditions, but PCs under multitasking loads do not perform anywhere near what the benchmarks would have you expect. There are millions of armchair analysts who are duped by benchmarks, and who think that dividing clock rates by the cost of the machine makes them smart, when it proves nothing.

People who buy workstations need a machine to do work that scales up to big and complex models. I am not talking about dinky little AutoCAD models. I am talking about things like fighter planes, or spacecraft, or DNA research, or chip design, or scientific simulation.

This is just a non-sequitur. It makes a workstation sound like a small, cheap supercomputer, which it never was. Workstations were networked machines (largely Ethernet, as opposed to the IBM-oriented SNA or the microcomputer-oriented RS-232) well before non-dialup networking was mainstream. That (hardware and software support for serious networking) was a big part of the workstation world as well.

If you think a 32-bit Pentium 4 can even load the human genome into memory for speedy sequence analysis, I invite you to try it. The Pentium 4 is a marketing-driven processor that only achieves its high clock rates by having a ridiculously long 20-30 stage pipeline, which means that it suffers from pipeline stalls if branch predictions are wrong, and its instructions per cycle are lower than its competition's. When you start to push things where the memory access pattern exceeds what the caches will support, things will bog right down, and stay down.
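
(As a rough, illustrative sanity check of the address-space point above, here is a small Python sketch; the genome size, per-base overheads, and per-process limit are approximate assumptions for illustration, not measurements.)

    # Back-of-the-envelope estimate: can a 32-bit process hold the human genome?
    GENOME_BASES = 3.2e9          # roughly 3.2 billion base pairs (approximate)
    BYTES_PER_BASE = 1            # naive one-byte-per-base encoding
    INDEX_BYTES_PER_BASE = 4      # hypothetical 32-bit index entry per base

    raw_gib = GENOME_BASES * BYTES_PER_BASE / 2**30
    indexed_gib = raw_gib + GENOME_BASES * INDEX_BYTES_PER_BASE / 2**30
    user_space_gib = 2.0          # typical usable address space of a 32-bit process

    print(f"raw sequence:       {raw_gib:.1f} GiB")
    print(f"with a naive index: {indexed_gib:.1f} GiB")
    print(f"32-bit user space:  {user_space_gib:.1f} GiB")
    # Even the raw sequence (~3 GiB) exceeds what a 32-bit process can address.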

I won't go after this for being dated, but it is, and people reading it should realize that. Four gigs of RAM and a quad-core 64-bit CPU are commodity parts now. It's a rich kid's gaming rig.

Now let's talk about software and related things.

The only reason I mention Microsoft at all is because their use of "workstation" has confused the definition of what workstation hardware is to people who are not familiar with Unix or RISC processors... Microsoft markets inferior operating systems on commodity hardware to the masses, and tells them it's a workstation, when it's not. I have no idea where you got the notion that Windows multithreading is as efficient as Unix. RISC processors are designed around the idea of multitasking (Sun's register windows), and floating point (DEC Alpha), and multiple processors in one machine. Windows is most certainly not designed from its foundations for these things, because the average Windows user does not do jobs that require hours of number crunching while still acting as a server on a network, and also a graphics console. You CANNOT use Windows for multitasking efficiently, or safely. It will either slow to molasses, or crash. You do ONE thing at a time on Windows, or else...

Register windows were abandoned in the SPARC line after it was proven they didn't help nearly as much as those massive register spills hurt. Floating-point hardware has been available on x86 since the 80386 era (as the 80387 coprocessor) and on-die since the 486DX, and I don't know why he picked out the Alpha as a particular example of it. Windows has always had its problems, mainly due to Microsoft refusing to break older programs by enforcing a coherent security model, but the NT-derived OSes are not especially bad at multitasking. (Windows 2000, XP, and Vista are all NT-derived. Windows 1, 2, 3, 95, 98, 98SE, and ME are MS-DOS-derived.)

Microsoft tries to tell people they are "systems engineers" when they are not engineers at all. I work in an engineering department, where some academic standards apply. Microsoft software is suitable for entertainment. I like playing Team Fortress. But I would not use Windows for anything important like my neural network simulations. And I stand by my contention that Microsoft has polluted the term "workstation" by their frivolous use of that term in their marketing efforts, in the same way they pervert the word "engineer."

We all know the only real engineers drive trains. Whoo whoo! ;)

Linux is very cool, but it's a 32-bit hobbyist's operating system for the most part. Since it is community-driven in terms of its development, it largely follows the demographics of the majority of its adherents, which means 32-bit x86 single-processor hardware remains the most common configuration, and therefore most software will be designed around this assumption. The volatile nature of the kernel means that software often requires recompilation or breaks, and Linux is not yet ready to scale to 64-bit: the commodity hardware infrastructure is only now going 64-bit, while Sun's UltraSPARC has been 64-bit since 1995. There are also few commercial engineering applications that run on Linux, due to its open-source nature and business viability issues.

This is to laugh. Really. Look at how many companies use Linux and how many architectures it has been ported to. For your information, Linux has been 64-bit since it was ported to the Alpha in the mid-1990s, way back when the Alpha was still relevant. Linux works on computers from Blue Gene to the TiVo and palmtop systems without a hitch. UltraSPARC, on the other hand, is just as obsolete as the SPARC workstations it was built for. Yes, 1980s-era workstation OSes were there first. Linux, on the other hand, is still here today.

Apple's Mac OS X is BSD Unix under the hood, with a very polished, usable interface on top. It's the best consumer-level operating system out there. Running on 64-bit hardware, multitasking, with multiple processors, it lacks only scalability to large-scale server hardware, and technical applications to run on it.

Apple has no real reason to market MacOS X as a high-end server OS because I don't think Apple makes high-end servers. If Apple did, MacOS X (which is based on Mach, by the way) would acquit itself well in that role, I have no doubt.

Solaris 10 is the world's most advanced and powerful 64-bit operating system, though it might not be the most endearing. It is the most scalable, to hundreds of CPUs and hundreds of gigabytes of RAM, and the highest performing, beating even Linux on many standard tests. AND it has hundreds of high-end applications, ranging from Pro/ENGINEER and CATIA for 3D mechanical design, to Cadence, Synopsys, and Mentor Graphics for electronic design automation. Also, lots of open-source packages can be compiled easily on this hardware and software platform to run on multiple processors at once seamlessly.

Solaris is a cool OS but it lacks the application software support Linux has. Sun is only now getting the open source world and I have high hopes for the software, but it has a long way to go to catch up.

Anyway, despite the less than diplomatic words that have been exchanged thus far, I am glad for the discussion. It means people DO care about this article, and it probably means that when all is said and done, this article will be an authoritative one for people to consult. --SanjaySingh 09:21, 5 April 2006 (UTC)

I don't mind a lack of diplomacy. I do mind a lack of accuracy.

You make some very good points here, Sanjay, along with interesting information concerning bus architectures, etc. I stand corrected. However, I take exception to the invective you posted on my personal "talk" page. I don't think it would generally be regarded as "diplomatic".

Notwithstanding all of that, I was mostly objecting, albeit clumsily, to the irrelevant invective unleashed at Microsoft for "polluting" the word "workstation" here on the discussion page. The fact is that the word has long been used in a much broader sense than the one you attach to it.

In particular, I recall that during the late 1980s, IBM (where I worked for 26 years, starting in 1966) made frequent use of the word in a much broader sense. The high-performance type of machine you mention was described as an Engineering/Scientific Workstation, not just a workstation. At the same time, IBM documents made frequent use of the term in its more literal sense, i.e. a machine one stationed oneself at to do work... work of any sort, that is, including spreadsheets and word processing.

In fact, IBM even used the term fixed-function workstation in reference to what is colloquially called a dumb terminal. (No device carrying an IBM trademark could be called "dumb", of course! :-) )

So, if I were to take issue with anything in the Workstation article itself, I suppose it would be the title, and the first sentence, which lays exclusive claim to the general term "workstation" while characterizing other, more specific, appellations as being the "colloquial" ones.

--RussHolsclaw 13:41, 12 April 2006 (UTC)

OK, let's work together on this, and move it forward

I was reading the supercomputer page recently, and they have an interesting and perhaps usable template for discussing their technology, and I think their structure can be of use in this article. Supercomputers, like workstations, are also a moving target, in that the state of the art of 10 years ago is today's run-of-the-mill machine, or even (a pity) a paperweight. I am a big fan of Seymour Cray. He rocked.

I have written up an article for my department, intended to be like a white paper, that discusses in some detail many of the issues we have been going over in this page.

I am glad you mention the "dumb terminal" as a workstation, because technically it is one end of a continuum of computer power. At the other end would likely sit a deskside machine, whether it's an old Apollo workstation or a Sun Ultra 450 quad-processor machine, which just happens to be the entry level of their workgroup servers as well. It would offer additional context in which to consider where desktop machines were, where they are now, and where they are going.

RISC microprocessors were considered a disruptive technology that enabled a significant advance in desktop machines at the time they were introduced into systems. A good article on workstations would need to track the evolution of networked desktop computing from the earliest days of terminals, to the present day distributed networks of workstations.

I know your experience would help with this, because you have seen the evolution of high-end hardware... When I read your bio, I was a little shocked to see 26 years of IBM experience. I was 11 or so at the time you began your IT career. I started trying to learn BASIC on the Radio Shack TRS-80, and saved my paper-route money when I was 12 or 13 to buy my own computer. Eventually I ended up at U. Waterloo, and picked up some computer architecture along the way. I believe that if we pool our knowledge, we can have a bang-up article that will be authoritative, easy to read, and will be referred to by many people who are seeking to understand this class of machine.

Regards,

--SanjaySingh 03:20, 13 April 2006 (UTC)


Actually, I don't think you read my bio right. I worked for IBM for 26 years, ending 13 years ago. I've been in the business 39 years. I just turned 60 in January. There were no TRS-80s when I started with IBM, just System/360 computers, which I provided support for in those early days. In '76, I built one of the early S-100 bus machines, an IMSAI 8080. Eventually, I got it running CP/M, as soon as I could afford a floppy disk drive for it. They were a bit expensive in those days.

My career with IBM mostly had to do with mainframes, though. I had no contact with Unix until more recent times. I've done a little Unix work in the past few years, but mostly modifying programs written by others. In my mainframe work, I worked on systems that provided remote support to customers. This included devising a protocol for downloading software fixes to System/370 systems, and providing the ability for remote viewing of memory dumps and other software diagnostic data. In fact, this application involved a very early version of base-64 encoding, similar to the type used today to handle email file attachments. That was in about 1973.
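
(For readers unfamiliar with the encoding mentioned above: base64 maps arbitrary binary data onto a printable character set so it can travel over text-only channels such as email attachments. A minimal modern Python sketch, with a made-up payload, purely for illustration:)

    import base64

    payload = b"\x00\x01\x02 binary software fix \xff"   # hypothetical patch data
    encoded = base64.b64encode(payload)                   # printable ASCII bytes

    print(encoded.decode("ascii"))
    assert base64.b64decode(encoded) == payload           # round-trips losslessly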

One thing I can do is take your "early history" part back a bit further. IBM had an early system of the type that could be called a "scientific workstation", introduced in 1959. That was 7 years before I started working for IBM, but the machine was still being sold. It was the IBM 1620. There was another system in 1965, just the year before I joined IBM, called the IBM 1130. The latter machine was built using electronics technology borrowed from the System/360: hybrid integrated-circuit devices that IBM called Solid Logic Technology, or SLT. --RussHolsclaw 05:14, 15 April 2006 (UTC)