Talk:Computer/Archive 5


US bias in table

The use of red and green colours in table "Defining characteristics of five early digital computers" seems to be intended to suggest that the American ENIAC was the first "real" computer, which it was not.

Zipdude (talk) 08:19, 19 May 2008 (UTC)

VMS

VMS is an important enough operating system to be listed under operating systems. It had significant impact on the development of NT, it is still widely used today, has design features (logicals) found in few other systems, and has been supported across a number of platforms, including DEC Alphas, HP Integrity, a couple of Sun devices, as well as x86. —Preceding unsigned comment added by 69.126.40.68 (talk) 16:29, 23 February 2008 (UTC)

Shannon's involvement

The statement "largely invented by Claude Shannon in 1937" (in reference to digital electronics) is inaccurate and should be changed to something more like "enabled by the theoretical framework set forth by Claude Shannon in 1937" 67.177.184.127 (talk) 09:29, 19 February 2008 (UTC)


Heron

Shouldn't Heron's mechanical play be mentioned in the history section? Was it not one of the first programmable devices?

Per my understanding, the device to which you refer, while extremely remarkable, was not programmable. -- mattb @ 2007-04-12T16:25Z
Well my understanding is certainly limited. I saw it on the history channel some months ago. I suppose the 'programming' was rather built into the device and not really modifiable.
The trouble with Heron's mechanical theatre is that we just don't know enough about it. It's hard to say to what degree it was programmable. Certainly it wouldn't count as a "computer" because it didn't manipulate data - but perhaps there is a reasonable claim to be the first programmable machine. I wish there were some really clear explanations of how it worked - but I haven't seen any. I'll have a bash at improving the history section with this information. SteveBaker (talk) 02:10, 16 January 2008 (UTC)


Computer components

Yesterday I wrote a section on computer components, which got reverted, then re-reverted, and then reverted again. I know that it didn't add much info, but it did add another perspective. At the moment, the article focuses almost entirely on the logical side of a computer - an important aspect - but it doesn't explain what you see when you open a computer case. Such down-to-earth information should also be present. At the moment, there is not even a link to the computer hardware article. This is (or should be) an umbrella article, serving different readers, giving an overview of all computer-related aspects and pointing to other articles. I suppose the biggest problem (as usual) is that it is written by experts, which is good, but also usually means the style is rather inaccessible to most encyclopedia readers (this is not a tech corner). Specialised info can go into specialised articles, but this one should also give a grassroots explanation.

As an illustration of what I mean, I was making a 'roadmap' for my future computer requirements (I build them myself), for which I wanted to make a list of more and less vital components (as a visual aid), but decided it was easier to look it up in Wikipedia. To my surprise, I found no such list, so I made it myself. It should not necessarily be here, but it should be somewhere, and then there should be a link to it in this article (preferably accompanied by a very short overview - the Wikipedia way) because this will be a first stop for people looking for such info. DirkvdM 09:34, 19 June 2007 (UTC)

As for the structure, I worked from the inside out, explaining that the mb connects everything together and then 'hook everything up' to that - can it be part of the mb, does it fit directly onto it, is a cable used and is or can it be inside the computer case - all stuff that isn't evident from the article right now. And also, which components are essential (a graphics card is, unless the mb has that functionality) and which aren't (a sound card isn't). Let me put it this way - such info should be in Wikipedia. Where should it go? DirkvdM 09:34, 19 June 2007 (UTC)

I was rereverting because User:Matt Britt is an established contributor and shouldn't have his changes reverted by a bot. As for the merits of one version over the other, I will side with him; Computers extend beyond PCs, and in my opinion, the new section was too list-y. —Disavian (talk/contribs) 13:23, 19 June 2007 (UTC)
Alright then, my last question was where it should go, and you say personal computer, which makes a lot of sense, so I'll put it there. And I'll add a link to that article (and computer hardware) in the 'see also' section, because you may know the distinction between the two, but people look something up in an encyclopedia because they don't know much about it. It's in the intro, but somewhat inconspicuously. Maybe there should be a listing of the types of computer in a table next to the intro. Also, you say the list is too 'listy', but that's what it's supposed to be. What's wrong with lists? DirkvdM 07:04, 20 June 2007 (UTC)
About the list-y bit: according to Wikipedia:Embedded list,
Of course, there are exceptions; ex: Georgia Institute of Technology#Colleges. —Disavian (talk/contribs) 01:00, 22 June 2007 (UTC)
Note that the first word is 'most' and that it speaks of entire articles consisting of a list. This was (or rather is, because it's now in the personal computer article) just a list inside an article. And it isn't even a list, it just has some lists in it. The alternative would be to enumerate them after each other (in-line), which is much less clear. That's a personal preference, I suppose (I like things to be as ordered as possible - blame it on my German background :) ), but I certainly won't be the only person who feels like this. I doubt if that text is meant to be applicable here. DirkvdM 10:24, 24 June 2007 (UTC)

Removal of text

I removed this text from the article:

Five Generations of Computers: Over the 20th century, there have been 5 different generations of computers. These include:

  1. 1st Gen => 1940–1956: Vacuum Tubes
  2. 2nd Gen => 1956–1963: Transistors
  3. 3rd Gen => 1964–1971: Integrated Circuits
  4. 4th Gen => 1971–Present: Microprocessors
  5. 5th Gen => Present & Beyond: AI

Because it appears to be original research. If someone has a source, there shouldn't be much of a problem with re-adding it, so long as it is communicated who proposed this model. GracenotesT § 18:39, 16 July 2007 (UTC)

I've just re-removed this text. I agree on the OR claim, but also disagree to an extent with the classification, especially with "Present & Beyond: AI". Firstly, this speculates on what will drive the industry in the future, and secondly, AI has been 'the next big thing' since the 1960s. Angus Lepper(T, C, D) 21:19, 16 July 2007 (UTC)
I concur with Gracenotes and Angus Lepper that the text is original research, speculation, and factually incorrect, all of which violate numerous Wikipedia policies and guidelines, and should stay out of the article. --Coolcaesar 19:36, 18 July 2007 (UTC)
The list is correct for the first four generations and is discussed in History of computing hardware, with more detail on the third and fourth generations in History of computing hardware (1960s-present). However, there is to date no consensus as to what the fifth generation is (or will be), but for one possibility see Fifth generation computer. --Nibios 04:06, 30 July 2007 (UTC)

Image Improvement

I believe a better picture of the NASA supercomputer should be shown, as this one is quite misleading. —The preceding unsigned comment was added by Unknown Interval (talkcontribs) 00:40, August 20, 2007 (UTC).

In this topic I have to point out that this article will not be complete without an image of what is, I think, today's most widely used computer: the simple PC. I scrolled down the article hoping to find one, but there isn't one. Is that not illogical? For many people around the world, mention of the word "computer" first brings to mind a PC. But they will not find one here. --Čikić Dragan (talk) 20:27, 13 January 2008 (UTC)

This is an encyclopedia - not a picture gallery. Ask yourself this: what additional meaning could come to the article by showing computers that are essentially the same as the machine the person is sitting in front of as they read Wikipedia? We need to show images that ADD information. Most people think of a computer as a laptop or a desktop PC or something - we can expand on their perceptions by showing images of massive supercomputers, computers made of gearwheels, tiny computers that fit into a wristwatch. To try to keep this article down to some kind of reasonable size, we have to use our available screen space wisely - a photo of a common kind of PC is really largely irrelevant. There would be a case for showing (say) an absolutely original early-model IBM PC - that's of historical value...but a common modern desk-side PC is just pointless. SteveBaker (talk) 01:52, 16 January 2008 (UTC)

Homebrew computer

Perhaps the article could add a section about homebrew (or DIY) computers. People such as Dennis Kuschel[1] are starting to make complete computers almost from scratch (by combining electronic components). A Dutchman called Henk van de Kamer is, for example, building a complete CPU from basic transistors[2].

People have done this for years; it's almost a homage to the days of discrete component and SSI/MSI IC-based computers. Anyway, this information is more of the trivial sort than of fundamental importance to computers in general. 74.160.109.8 00:04, 6 November 2007 (UTC)

Yes, but I believe that powerful computers (with up to 1200+ CPUs) may now be created while avoiding the spying of the trusted computing alliance. Further searching led me to the OpenCores RISC 1200 CPU, and the AVR Webserver project (DIY-made) from Ulrich Radig (see the Elektor article) based on the ATMega644 AVR, and Ethernet-based appliance control (see Ethernet appliances). —Preceding unsigned comment added by 81.246.178.53 (talk) 12:34, 23 November 2007 (UTC)

Update: it seems it is already possible and available on the wiki! See ECB_AT91 —Preceding unsigned comment added by 81.246.158.33 (talk) 12:57, 23 November 2007 (UTC)


Deletion and redirect of Computer system

The computer system article was recently deleted and redirected to computer. A problem with this is that in many of the articles linking to computer system (see here), the term "computer system" is used to mean a "combination of hardware and software", rather than a "computer". The links in those articles need tidying up if WP is not going to have a "computer system" article. Maybe the instances that mean hardware and software could link to computing instead, if there is not going to be a "computer system" article. Nurg 03:49, 11 November 2007 (UTC)

I realize that I may be horribly late to the party, but I would also suggest that these two terms lead to two entirely different discussions, and warrant separate articles. Mjquin_id (talk) 00:01, 18 July 2008 (UTC)

CPU section

Very good article, but just from the perspective of a beginner reading the article such as myself, it would've been easier to understand if the CPU had been given an overall paragraph of information on how it functions as a whole first, instead of just talking about the ALU and control unit straight away individually. —Preceding unsigned comment added by 60.234.157.64 (talk) 05:33, 11 November 2007 (UTC)

Input/Output

Inside the Input/Output topic, the author mentions that there are fifty or so computers within a single computer. I considered this highly inaccurate and changed it to something more simplified, but I was reverted, with the editor saying the original was better written and that its mention of the processor and VRAM made it more logical to keep. I don't agree with that decision, simply because a computer really is just the processor and memory working together to produce an output, and even though there are embedded systems with various configurations and limited programmability, a computer wouldn't have fifty boards. I'll try to clean up the article a little again, and if reverted once more I will try to find a consensus, because this simply doesn't make sense. --Bookinvestor 01:41, 2 December 2007 (UTC)

I concur with Bookinvestor that the sentence is highly inaccurate and should be modified or deleted. Whoever wrote that has either never read a basic textbook on computer architecture (e.g., John L. Hennessy's classic textbook, which I've read twice over the years) or has not completed a freshman course in formal written English at a decent university. --Coolcaesar 05:57, 2 December 2007 (UTC)
The sentence doesn't seem all that misleading to me. GPUs do much parallel processing and have many ALUs operating in parallel. This article, for instance, describes a GPU using 128 processors. Of course this doesn't mean 128 boards; the processors are sections of a single IC chip. -R. S. Shaw 06:49, 2 December 2007 (UTC)
I've read the article and have also found the Wikipedia article stream processing. Although I really want to delete the whole paragraph, perhaps I should study a little more of what the author's trying to say, because it opens a whole new realm of learning. I'll add this link to the paragraph to show exactly what the author's talking about with GPUs, and I hope someone will research it with me. Thank you for your contribution, Shaw. --Bookinvestor 19:24, 3 December 2007 (UTC)

what is a computer?

The first sentence should be something like: "A computer is an information processing machine which is capable of simulating any other information processing machine that can fit into its memory." There are three salient parts to the definition:

 -information processing machine
 -capable of simulating any other ( like a Universal Turing Machine )
 -can be built in real life, so can't have infinite memory ( unlike a Universal Turing Machine )

Currently the first sentence of the article says "A computer is a machine that manipulates data according to a list of instructions", which is only one particular (but dominant) category of computer architecture -- Instruction Set Architecture. Other real life categories which are actually sold are:

 parallel/multi-core -- one computer processes many lists of instructions in parallel
 reconfigurable/FPGA -- the computer is specialized for simulating any logic network

And some other classes which have only had prototypes:

 stream processors
 cellular automata simulators (probably only one of these was ever made (Margolus))
 DNA computers  —Preceding unsigned comment added by 70.20.219.7 (talk) 03:31, 25 January 2008 (UTC) 
While your opening would probably stand up better as a formal definition, I'm not sure that's the best route to follow for an article which is intended to be an accessible introduction to computers. Remember, this article isn't about theories of computation or computability, but about "computers". The term "computer" has a vernacular meaning to society which is every bit as significant as the formal basis. The writing of the article tries to balance these two worlds by introducing computers in fairly familiar terms and then branching out slightly into instruction set (or Von Neumann) architecture. I believe this is a good approach because, as you state, this is the dominant realization of computers and will cover most anything that a typical person will think of as a computer (thread-level parallelism really doesn't break significantly from the "list of instructions" concept, either). With some of your examples (stream processors, special-purpose logic and DSP, DNA computing, etc), there's probably a good bit that can be said about whether these even are computers (the classical "where is the line between a computer and a calculator" question). We run into the issue of a heavily overloaded term in "computer", so we took the tack of making an article as accessible yet informative as possible to a general reader. I really think it would only complicate matters to try to introduce much computing theory in this article, especially since there are a lot of other articles dedicated to the theoretical aspects of computers. -- 74.160.99.252 (talk) 19:07, 11 February 2008 (UTC)

A computer can't be a machine, because the Wikipedia entry for machine excludes computers.Heikediguoren (talk) 21:14, 10 July 2008 (UTC)

The Wiki article for machine is in grave error and should be corrected; analog and digital computers both fall under the broader category of "machines". See Association for Computing Machinery, "the world's first scientific and educational computing society" (and still going strong).
The 3rd edition of the Concise Encyclopedia of Science and Technology (Sybil P. Parker ed., 1994, McGraw-Hill) defines "computer" as "A device that receives, processes and presents information. The two basic types of computers are analog and digital." (Note: A computer doesn't require "a list of instructions".) A broad definition indeed, but the introductory paragraph can quickly set aside the analog variety, which are still in heavy use, although we don't commonly think of them as "computers". Any analog meter, gauge, motor-driven clock, speedometer, odometer, etc. qualifies as a computer.
I'm not sure "Modern computers are based on tiny integrated circuits…" is the proper phrasing. How about, "Modern electronic computers rely on integrated circuits…" The CPUs in PCs and Macs aren't so "tiny" – it's a relative term anyway.
Btw, the caption that reads, "Microprocessors are miniaturized devices that often implement stored program CPUs" is all wrong; it doesn't even make sense. Microprocessors are literally integrated circuit (as opposed to discrete circuit) electronic computers – computers on a chip. And "central processing unit" should appear earlier in the text. As is, we see "CPU" before we're told what the letters stand for.
Otherwise, nice work guys!
Cheers, Rico402 (talk) 10:16, 11 July 2008 (UTC)
I STRONGLY agree. In the article itself, analog computers are mentioned. By the given first sentence definition, they don't belong in the article... In general, this article is amazingly shallow for Wikipedia (which is usually very strong on computing subjects). 213.112.81.33 (talk) 23:47, 15 October 2008

Wristwatch computer

I removed the image - from the image description page, it appears it is a painted image, not a real product. If the image is re-inserted, we need a reliable reference. --Janke | Talk 11:05, 3 February 2008 (UTC)

Defining trait(s) of a computer

The article states:

"Nearly all modern computers implement some form of the stored program architecture, making it the single trait by which the word "computer" is now defined."

This is unreferenced, and to me this seems wrong -- I would say that Turing Completeness is also an essential trait of a "real computer". Is there some basis for saying that stored program control is sufficient?

The article also uses the term "Turing Complete", without defining it. I would suggest a brief explanation of this important concept, and then include it in the "trait by which the word "computer" is now defined." bit. What do people think? — Johan the Ghost seance 16:55, 10 February 2008 (UTC)

Gottfried Wilhelm Leibniz should be mentioned as he invented the binary system and built mechanical calculators as well. —Preceding unsigned comment added by 84.151.59.187 (talk) 02:19, 17 April 2008 (UTC)

The expression "form factor" is used in many Wikipedia articles on computers, and linked to Form factor (which is a disambiguation page though not labelled as such). This term apparently has several different meanings. One usage seems to be the "footprint" or overall physical size of the computer. This usage seems to be implied in many of our articles, however it is not given on the page Form factor (In other words, where this usage is intended the link is wrong).
Could we please:

  • (A) Provide this definition at Form factor, and an article for this meaning if appropriate
  • (B) In all articles that mention "form factor", clarify which meaning is intended, as appropriate

-- Writtenonsand (talk) 05:24, 21 May 2008 (UTC)

Computer <> Digital computer

An internal link could profitably, for the curious, be made to Analog computer, and perhaps - for completeness - to Stochastic computer[3] 89.224.147.179 (talk) 09:56, 22 June 2008 (UTC)


Not all stored program computers have a program counter.

The control unit section of this article contains this text: "A key component common to all CPUs is the program counter". This is almost correct, but I think it should be changed to say "A key component common to almost all CPUs is the program counter". The one exception I know of is the ICT 1300 series, which has three control registers, CR1, CR2 and CR3. After a normal single-length instruction is executed in CR1, CR3 is moved into CR2, CR2 is moved to CR1, and CR1 has one added to it and is moved to CR3. In normal operation (though nothing in hardware forces this) there is at least one unconditional jump instruction in one of the control registers. When one is executed in CR1, CR1 and CR2 get loaded from core memory and CR3 receives a copy of the original jump instruction with one added. The previous contents of CR2 and CR3 are stored in another register, where they are available as a return link for subroutines. When a conditional jump is executed, much the same happens, except CR3 receives an incremented version of the jump but with the condition made always-true; that is, it becomes an unconditional jump.

When executing a program with two subroutine jumps immediately following one another, all three control registers contain unconditional jump instructions. In this state, it is impossible to point at one of the control registers and say that it is the program counter. The machine sort of has a program counter, but it flits between registers, and I suppose sometimes you could say there are three program counters. This scheme avoids having any separate logic to implement the fetch cycle, and also avoids needing an extra bit in a program counter to allow for 24-bit instructions in a 48-bit-word machine which has no addressable portions of a word.
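
To make the scheme concrete, here is a loose Python sketch of the rotation described above (the instruction encoding and the bootstrap are invented for illustration; this is in no way a faithful ICT 1301 emulation). The point it shows is that fetching happens as a side effect of executing jump instructions, so no single register can be singled out as "the" program counter:

  # Loose sketch only: encodings are hypothetical, not ICT 1301 machine code.
  def run(memory, steps=10):
      cr1, cr2, cr3 = ("jump", 0), None, None   # bootstrap with a jump to 0
      for _ in range(steps):
          if cr1[0] == "jump":
              target = cr1[1]
              cr1 = memory[target]              # CR1 and CR2 refill from core
              cr2 = memory[target + 1]
              cr3 = ("jump", target + 2)        # the jump "with one added"
          else:
              print("execute", cr1)
              cr1, cr2, cr3 = cr2, cr3, None    # rotate the control registers

  memory = [("print", "A"), ("print", "B"), ("jump", 0), ("nop",)]
  run(memory)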

If you prefer, an alternative wording would be "A key component common to all modern CPUs is the program counter", as the five-ton ICT 1301s were built in 1962–65 and are in no way modern, though as over 155 were built, they were quite a large proportion of the computers of their day, certainly in the UK. I should add that the number 155 is because I have part of machine number 155, as well as all of machines numbered 6 and 75, so there could be considerably more than 155.

86.146.160.196 (talk) 18:48, 27 August 2008 (UTC)Roger Holmes


Example Mistake?

{{editsemiprotected}}

The text currently in the article says:

Suppose a computer is being employed to drive a traffic light. A simple stored program might say:

  1. Turn off all of the lights
  2. Turn on the red light
  3. Wait for sixty seconds
  4. Turn off the red light
  5. Turn on the green light
  6. Wait for sixty seconds
  7. Turn off the green light
  8. Turn on the yellow light
  9. Wait for two seconds
 10. Turn off the yellow light
 11. Jump to instruction number (2)

With this set of instructions, the computer would cycle the light continually through red, green, yellow and back to red again until told to stop running the program.

However, suppose there is a simple on/off switch connected to the computer that is intended to be used to make the light flash red while some maintenance operation is being performed. The program might then instruct the computer to:

  1. Turn off all of the lights
  2. Turn on the red light
  3. Wait for sixty seconds
  4. Turn off the red light
  5. Turn on the green light
  6. Wait for sixty seconds
  7. Turn off the green light
  8. Turn on the yellow light
  9. Wait for two seconds
 10. Turn off the yellow light
 11. If the maintenance switch is NOT turned on then jump to instruction number 2
 12. Turn on the red light
 13. Wait for one second
 14. Turn off the red light
 15. Wait for one second
 16. Jump to instruction number 11

In this manner, the computer is either running the instructions from number (2) to (11) over and over, or it's running the instructions from (11) down to (16) over and over, depending on the position of the switch.

However, I think it should be as follows (the changed lines are instructions 16 and 17):

Suppose a computer is being employed to drive a traffic light. A simple stored program might say:
  1. Turn off all of the lights
  2. Turn on the red light
  3. Wait for sixty seconds
  4. Turn off the red light
  5. Turn on the green light
  6. Wait for sixty seconds
  7. Turn off the green light
  8. Turn on the yellow light
  9. Wait for two seconds
 10. Turn off the yellow light
 11. Jump to instruction number (2)

With this set of instructions, the computer would cycle the light continually through red, green, yellow and back to red again until told to stop running the program.

However, suppose there is a simple on/off switch connected to the computer that is intended to be used to make the light flash red while some maintenance operation is being performed. The program might then instruct the computer to:

  1. Turn off all of the lights
  2. Turn on the red light
  3. Wait for sixty seconds
  4. Turn off the red light
  5. Turn on the green light
  6. Wait for sixty seconds
  7. Turn off the green light
  8. Turn on the yellow light
  9. Wait for two seconds
 10. Turn off the yellow light
 11. If the maintenance switch is NOT turned on then jump to instruction number 2
 12. Turn on the red light
 13. Wait for one second
 14. Turn off the red light
 15. Wait for one second
 16. If the maintenance switch is NOT turned on then jump to instruction number 2
 17. Jump to instruction number 11

In this manner, the computer is either running the instructions from number (2) to (11) over and over, or it's running the instructions from (11) down to (16) over and over, depending on the position of the switch.

This is because if the original one was used (and the switch was turned on), the program would loop from 11 through 16 continuously regardless of the switch's position. starwarsfanaticnumberone (talk) 22:49, 2 October 2008 (UTC)

Not done. The original example executes just fine: if the switch is on, the instructions loop continuously from #11 to #16. Once the switch is turned off, the instructions continue until #16, then jump to #11, and then jump to #2, starting the loop again from #2 to #11. The suggested change merely adds a redundant instruction. --Aervanath lives in the Orphanage 23:12, 2 October 2008 (UTC)
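
To make the control flow concrete, here is a minimal Python rendering of the second program (set_light and maintenance_switch are hypothetical stand-ins, not names from the article; like the pseudocode, it runs until told to stop). The inner loop is the instruction 11-16 cycle: because instruction 16 jumps back to the test at 11, the switch is re-examined on every flash, which is why the proposed extra test would be redundant:

  import time

  def traffic_light(maintenance_switch, set_light):
      set_light("all off")                           # instruction 1
      while True:
          set_light("red");    time.sleep(60)        # instructions 2-3
          set_light("green");  time.sleep(60)        # instructions 4-6
          set_light("yellow"); time.sleep(2)         # instructions 7-9
          set_light("all off")                       # instruction 10
          while maintenance_switch():                # instruction 11
              set_light("red");     time.sleep(1)    # instructions 12-13
              set_light("all off"); time.sleep(1)    # instructions 14-15
              # instruction 16 jumps back to 11; once the switch goes off,
              # this inner loop exits and the normal cycle resumes at 2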

Ask

  • Who created the first computer in the world?
  • Who created the first internet connection in the world?
  • What was the first computer program?

Unable to edit

I wanted to correct a mistake, but I cannot edit the article. Why? Tohuvabohuo (talk) 10:45, 1 December 2008 (UTC)

Probably because the article is protected.

--Sci-Fi Dude (talk) 18:40, 5 June 2009 (UTC)


Nokia overtaking HP

Following the statement about Nokia overtaking HP in sales of computers: if we consider smartphones to be computers, it is true that Nokia overtook HP in sales in Q1 2008: Nokia sold 14,588,600 smartphones while HP sold 12,979,000 computers. Source: Gartner http://www.gartner.com/it/page.jsp?id=688116 and http://www.gartner.com/it/page.jsp?id=648619. —Preceding unsigned comment added by Thecouze (talkcontribs) 20:38, 9 January 2009 (UTC)

Thank you for the references, Thecouze. I think we should also make a more general statement about mobile phones, smart phones, and PCs as a whole (rather than specific companies).

Nearly all mobile phones (not just smartphones) use at least one general-purpose CPU (possibly in addition to a DSP): reference: "Intel has ARM in its crosshairs" by Tom Krazit 2007.

If we consider mobile phones as computers, mobile phones ("Mobile Terminals") overtook PCs a long time ago:

  • 294,283,000 Worldwide Mobile Terminal Sales to End-Users in 1Q08 -- Gartner [1]
  • 71,057,000 Worldwide PC Vendor Unit Shipment Estimates for 1Q08 -- Gartner [2]
  • 32,249,904 Worldwide Smartphone Sales to End-Users in 1Q08 -- Gartner [3]

--68.0.124.33 (talk) 06:02, 15 March 2009 (UTC)

Error in computer logic in first computer program.

It would appear that the first computer program example (counting to 1000) has an error, introduced when the increment value is itself also incremented on each loop. Thus the values held by the increment variable are 1, 2, 3, 4, 5, ..., n, and the result counter does not increase by one each time but follows the pattern 1, 3, 6, 10, ... (the triangular numbers).

The value of the increment should stay steady during the course of the program's execution. —Preceding unsigned comment added by Campej (talkcontribs) 04:59, 14 January 2009 (UTC)
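
A quick sketch of the bug being described (the article's original example isn't reproduced in this archive, so the variable names here are illustrative only):

  # Buggy version: the increment is itself incremented each pass,
  # so the totals run 1, 3, 6, 10, ... (the triangular numbers).
  total, step = 0, 0
  while total < 1000:
      step = step + 1        # erroneous: step should stay constant
      total = total + step

  # Intended version: count to 1000 by a fixed increment of one.
  count = 0
  while count < 1000:
      count = count + 1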

Further topics section

The section is not at all encyclopedic - it should be converted to prose or deleted.--Kozuch (talk) 19:13, 22 March 2009 (UTC)


Operating System/Application Software

Why is it necessary to have an operating system installed on a computer before installing application software? —Preceding unsigned comment added by 219.89.124.31 (talk) 23:29, 14 April 2009 (UTC)

To install and execute a specific software application your system must meet the requirements (both hardware and software) stated for that application. If the application's requirements state that operating system XYZ is required then the application is in some way dependent on XYZ and will not work without it. Most, but not all, applications require an operating system. 71.135.172.126 (talk) 15:21, 15 April 2009 (UTC)
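
As a small illustration of that dependency (conceptual only - Python itself already presumes an OS): ordinary application I/O is routed through operating-system services, so even a one-line program leans on the OS underneath. The write below ultimately reaches an OS system call (write() on POSIX systems); on bare hardware with no operating system, the program would have to drive the display device itself.

  import sys

  sys.stdout.write("hello, world\n")   # delivered via OS-provided I/O services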

Need to add CP/M to the operating systems. —Preceding unsigned comment added by Mycroft 514 (talkcontribs) 20:11, 18 November 2010 (UTC) Under scripting languages, REXX is one of the most widely used. TEX is another one that springs to mind, and if you want to continue to the most common, then you need to add .BAT files on PCs. —Preceding unsigned comment added by Mycroft 514 (talkcontribs) 20:14, 18 November 2010 (UTC)

Somebody Please Fix: Small Spelling Error

From the 1st paragraph in "History of Computing":

"From the end of the 19th century onwards though, the word began to take on its more familiar meaning, decribing a machine that carries out computations."

- describing is spelled as decribing in this paragraph. It's semi-protected so I cannot edit this small error as a new user.

Done. Dan D. Ric (talk) 01:21, 26 April 2009 (UTC)

Copyvio?

At a glance, this example looks suspiciously like one of the first exercises in Uplink (video game). It should be checked out, and credit given if that's its source. Дигвурен ДигвуровичАллё? 21:11, 16 June 2009 (UTC)

Intro needs editing

The intro to this important article contains a number of sentences and phrases which could benefit from further editing. The prime candidate here is "Personal computers in their various forms are icons of the Information Age, what most people think of as a "computer", but the embedded computers found in devices ranging from fighter aircraft to industrial robots, digital cameras, and toys are the most numerous." The clause 'what most people think of as a "computer"' does not fit at all well into the sentence and could, for instance, be accommodated by rephrasing as follows:


"Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". The embedded computers found in devices ranging from fighter aircraft to industrial robots, digital cameras, and toys are however the most numerous."

Perhaps someone could come up with something even better.-Ipigott (talk) 15:33, 18 June 2009 (UTC)

I've just done a general update on the lead section, using your suggested wording. I hope it's a bit better now. --Nigelj (talk) 14:23, 20 June 2009 (UTC)
The first sentence of the lead reads 'A computer is a machine that manipulates data according to a set of instructions.' Would it not be appropriate to change this to read 'A computer is a machine that manipulates data according to a set of modifiable instructions'? Purely mechanical devices can be regarded as containing instructions that lead to manipulations of data. Surely it is the fact that the instructions - the program - can be modified that is a defining characteristic of a computer.--TedColes (talk) 17:40, 20 June 2009 (UTC)
I don't think so. A single-chip microprocessor with a program in read-only memory is still a computer, even though its instructions cannot be modified. Early computers were programmed with patch panels etc.; later on we got stored-program computers, which were a big step forward, but to rule the early machines out goes against accepted history. —Preceding unsigned comment added by RogerHolmes (talkcontribs) 18:19, 16 July 2009 (UTC)
Has anybody checked what appears in the intro? There seems to be a troll around who put "Computers are awesome! (especially Macs)" in the intro of the article. Can anybody check?


Definition of computer

The definition of computer given in this article might be better if it ended with "... instructions and give output in a useful format" instead of "A computer is a machine that manipulates data according to a set of instructions." - --nafSadh (talk) 05:52, 13 July 2009 (UTC)

I agree that the current definition is too vague. Slide rules and calculators are machines that manipulate data according to instructions, but most people would not refer to them as computers. --Jcbutler (talk) 19:54, 9 February 2010 (UTC)
Wouldn't there also be a case for removing the word 'programmable' from the definition, as I would say it's rather dubious whether many of the examples of computers have been programmed at all, rather than having just 'happened' - life being the obvious example. StijnX 11:01, 25 July 2010 (UTC)
I agree with StijnX that the word computer is unduly associated with the word "program". A device does not need a program to be a computer. A program is a list of instructions or steps - yes, computers have to change inputs into outputs in a meaningful way, but that specific method of transforming inputs can be inherent to the computer itself, like individual neurons or AND gates, both of which can reasonably be considered rudimentary computers. I'd advocate changing the definition from "A computer is a programmable machine that receives input, stores and manipulates data, and provides output in a useful format." to "A computer is a machine that receives input and manipulates it into a meaningful or useful output. It may also store data, and it may be programmable so that it can manipulate inputs or data in multiple ways." rablanken (talk) 01:38, 25 December 2011 (UTC)
I'm sorry, I meant to post my last comment on the "disambiguation" talk page. I'm a little new. But I think it's still applicable to the introductory paragraphs of this article. rablanken (talk) 01:46, 25 December 2011 (UTC)

Definition of Computer

I deleted the IBM 608 Transistorized Calculator from the List of transistorized computers, as Bashe, in IBM's Early Computers p. 386, had noted that it is "... properly called a calculator, not a computer in today's terminology because its small ferrite-core memory was not used for stored-program operation." That edit was reverted by Wtshymanski (talk | contribs) with the edit summary "and ENIAC was a stored program computer? Revert" (I see, above, that Wtshymanski is an editor of this article!). Not agreeing with that revert, I accessed the Computer article, looking for a definition of computer (I was interested in the 608; someone else can worry about the unrelated ENIAC).

I found: "A computer is a programmable machine designed to sequentially and automatically carry out a sequence of arithmetic or logical operations. The particular sequence of operations can be changed readily, allowing the computer to solve more than one kind of problem. Conventionally ..."

The definition must stop at "conventionally", as that word admits of non-conventional computers. The second sentence, "The particular sequence of operations can be changed readily ...", is wrong; see Embedded system: "An embedded system is a computer system designed to do one or a few ..."

The Computer article's definition is, then: "A computer is a programmable machine designed to sequentially and automatically carry out a sequence of arithmetic or logical operations." Compare that to Programmable calculator: "Programmable calculators are calculators that can automatically carry out a sequence of operations under control of a stored program, much like a computer."

Somewhat (a lot!) to my surprise, Wtshymanski's revert was correct: calculators are computers! At least so far as these Wikipedia definitions are concerned. (Even considering "the ... sequence of operations can be changed readily" doesn't help, as programmable calculators meet that criterion as well.)

The obvious suggestion: The Computer article should have a definition of computer sufficient to distinguish computers from programmable calculators. 69.106.237.145 (talk) 09:07, 21 June 2011 (UTC)

Is this a useful distinction? Is our hypothetical knowledge-seeking reader (Jimbo help him!) going to be confused between the racks of blinking lights filling a dinosaur pen and that thing he uses at tax time to tot up his deductions? Is not a pocket calculator implemented as a microprocessor with a stored program? Was the IBM 608 Turing-complete? What is Turing-complete anyway, and must a computer be Turing-complete to be a computer? Myself, I think it's useful to distinguish between machines used for doing lots of arithmetic vs. more general-purpose bit-twiddling (no-one did text editing on ENIAC, for instance) but it's hardly a crisp - or vital - distinction. ENIAC crunched numbers only and is plainly a computer, an HP 48 can spell out text prompts to you and is plainly a calculator - it's a complex world. --Wtshymanski (talk) 13:40, 21 June 2011 (UTC)
I believe that "Turing Complete" is the definition of a "Computer" that we should be using here. It's more than just a statement of what the machine does. The "Church-Turing" thesis points out that all machines that are Turing complete are functionally equivalent (given sufficient time and memory) to all other Turing-complete machines.
A calculator may have a computer inside it - but when used from the user interface, it's not a computer. My car has a computer inside - but that doesn't make my car be a computer. A programmable calculator, however, exposes its Turing-completeness to the user and most certainly is a computer under this definition.
The Turing completeness thing is important - any machine that is Turing complete could (with enough memory and time) emulate any machine that we would most certainly describe as "a computer" (e.g. an Apple II)...so how can we deny a machine that could emulate an Apple II the title of "computer"? We obviously cannot.
So it is certainly the case that any machine that's Turing complete must be called a computer. The only point of contention is whether there are other machines that while not being Turing complete also warrant being called "a computer"? The difficulty is that most of those machines are old and no longer in use - and whether they were once called "computer" or not doesn't matter because our language has moved on. After all, the original meaning of the word referred to human beings who did menial calculations for mathematicians, scientists, etc. The meaning of the word has definitely shifted - and we're supposed to be using the modern meaning.
So I strongly believe that this article should use Turing-completeness as the touchstone for what is and what is not a computer. However, I think we'd be doing a disservice to our readership to use those words to describe what a computer is - because Turing completeness is a tough concept to get one's head around - and the people who come to read this 'top level' article should probably be given a more approachable definition.
SteveBaker (talk) 14:12, 21 June 2011 (UTC)
Was an IBM 608 programmable calculator "Turing-complete"? I've never found an intelligible definition of Turing-completeness aside from the one that goes "A machine is Turing-complete if it can emulate a universal Turing machine (except for finite memory and speed)". I have a vague picture of jamming tape into my TI-58 calculator and seeing if it can read marks off it and make marks on it...and then the utility of the comparison vanishes. Turing-completeness seems to have something to do with "conditional branching" - from what I've read, you must be very clever to come up with a Turing machine that doesn't have conditional branches. A touchstone is only useful if you can tell whether something is touching it, and the "Turing complete" phrase for me has obscured more than it explains.
I don't think the categories of "computer" and "calculator" are disjoint - my old TI-58 could automatically execute stored instructions and could change the order of execution depending on the results of operations on data, and so by any reasonable definition was a "computer". Just as plainly, it was specialized for doing arithmetic, so it was a "calculator". --Wtshymanski (talk) 15:20, 21 June 2011 (UTC)
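
One way to see the "conditional branching" point concretely: a toy machine with nothing but increment and decrement-and-branch is already enough to compute things like addition, which is the essence of why conditional branches matter for Turing completeness (finite memory aside). A small Python sketch, in the spirit of a Minsky counter machine - the instruction set here is invented for illustration, not taken from any real calculator:

  def run(program, regs):
      pc = 0
      while pc < len(program):
          op, *args = program[pc]
          if op == "inc":
              regs[args[0]] += 1
              pc += 1
          elif op == "djnz":                 # decrement, jump if nonzero
              r, target = args
              regs[r] -= 1
              pc = target if regs[r] != 0 else pc + 1
          else:
              raise ValueError(op)
      return regs

  # Computes a + b by moving b into a, one unit at a time.
  print(run([("inc", "a"), ("djnz", "b", 0)], {"a": 3, "b": 4}))   # a ends at 7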

Requested edit

While you are thinking about the definition of computer (the discussion topic immediately above), please delete the article's second sentence, "The particular sequence of operations can be changed readily, allowing the computer to solve more than one kind of problem." This edit is requested because 1) many (most?) computers are in embedded systems and 2) some computers have their particular sequence of operations burned into ROM (the Dulmont Magnum was an early example); thus "changed readily" is not, in fact, a defining characteristic of a computer. Thanks, 69.106.237.145 (talk) 20:32, 26 June 2011 (UTC)

Depends on what "readily" means. If you built a machine from logic gates or Tinkertoy to solve the knight's tour problem, it would be completely useless at solving anything else without redesigning it. But any computer system solves problems with the same hardware, only changing the software. The same processor that controls ignition timing in a car might equally well be used in a printer - same hardware, different firmware - though you probably could not alter the mask ROM in either application. "Readily" is shorthand here which could be expanded in the article. --Wtshymanski (talk) 02:00, 27 June 2011 (UTC)

Hardware table limitations

It is notable that the table's entries for third-generation computers are limited to those from US companies. Clearly the list can't be comprehensive, but it should surely aim to be reasonably representative of important advances? I am particularly thinking of Manchester University's Atlas, with its introduction of virtual memory by paging. --TedColes (talk) 06:51, 7 August 2009 (UTC)

See also section

Some of the links in this section of the article have questionable relevance. For example, List of fictional computers; Electronic waste; Living computer theory, a defunct link; and especially The Secret Guide to Computers, which seems to be nothing more than an endorsement/promotion article. Someone with editing ability please cull this section. —Preceding unsigned comment added by 98.26.191.8 (talk) 01:51, 20 August 2009 (UTC)

Why was the date 2012 changed back to 1613? It is correct.--Laptop65 (talk) 12:03, 1 December 2009 (UTC)

Transistor

Shouldn't the invention of the semiconductor transistor be given a lot more attention? Although I am not an expert on computers, I thought that developing a semiconductor was crucial in moving away from hefty vacuum tubes. 77.250.86.100 (talk) 20:55, 15 March 2010 (UTC)

At the time this comment was posted, this was in the computer article, and it's still there today:

Semiconductors and microprocessors

"Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953. In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased size and cost and further increased speed and reliability of computers."
That seems sufficient for this article. Readers desiring further information can click on the transistor link, and there they can go to history of the transistor for even more detail. Wbm1058 (talk) 01:16, 21 January 2012 (UTC)
On second thought, semiconductor transistor should be the lead of that section Wbm1058 (talk) 01:43, 21 January 2012 (UTC)

about computer

I want to know why computers came to be so powerful —Preceding unsigned comment added by 41.218.229.207 (talk) 19:31, 5 April 2010 (UTC)

This may be a pointer to a real opportunity to improve the article. By the 21st of 22 paragraphs under 'History of computing', we are only up to vacuum tubes; the 1980s go by in one sentence and we never mention the 90s, the 00s, or the 10s. Maybe we can shift our field of view forward just a little to mention the last 20 - 30 years in passing? People have argued strongly in the past that we should not focus on the modern PC in the lede here, but it seems that now we hardly mention its existence, let alone its importance in the modern world. Under software, the assembler and traffic-light examples would be familiar to readers beamed forward to now from the mid 1970s.
If I had to answer the anon's question above, I would talk about the increase in CPU complexity and word width, the increases in typical RAM and disk space, and find the points where modern high-street laptops overtook yesteryear's military secret supercomputers. I would talk about the dramatic increases in software complexity made possible by high-level programming languages, shared libraries, specialist domain-specific languages, object-orientation, test-driven development, scripting languages etc. I would talk about the internet and the way it has led to collaborative open-source development, the way that examples, answers and code tricks previously buried in books and manuals are now just a Google search away without the developer leaving their desk. I would, but I don't have sources to hand, and anyway there's no room unless we shift some of the ancient history out to sub-articles too. --Nigelj (talk) 20:08, 5 April 2010 (UTC)

Could wiki admin please add the topic of ....

Medical devices, which are related to compliance issues --124.78.215.0 (talk) 07:18, 18 April 2010 (UTC)

An Hungarian invention?

What sources are listed showing that the computer is, in fact, an Hungarian invention? —Preceding unsigned comment added by 86.101.12.46 (talk) 16:45, 4 May 2010 (UTC)

Noted; I took out both national categories. Everyone knows Tesla invented the computer anyway, so it should be a Serbian Croatian Serbian Croatian Scottish British....duck season...rabbit season... --Wtshymanski (talk) 18:15, 4 May 2010 (UTC)

Where the word computer came from

Some people say that the word COMPUTER has a full form, with each letter having its own meaning, but computer has no full form. Before the computer, the calculator came onto the market; it could accept the numbers 0-9 and carry out processes like addition, subtraction, multiplication, etc. After some years, a new calculator came that could accept the numbers 0-9 as well as characters; it was called a compute. After some more years, a new, advanced computing machine came that could accept numbers, characters and all data types, with the extra ability to store data for a long time; it is called the computer. —Preceding unsigned comment added by 117.96.82.247 (talk) 02:57, 18 May 2010 (UTC)

Edit request from Bmgross, 6 June 2010

{{editsemiprotected}} For the example of some code that might control traffic lights, the made-up keyword "THEN" is spelled in both lowercase and uppercase. It appears that the text was meant to be typed in uppercase.

P.S. This would be the second of the code examples, where they have decided to add an IF statement.

Bmgross (talk) 03:20, 6 June 2010 (UTC)

 Done CTJF83 pride 03:34, 6 June 2010 (UTC)

by:joshua domingo —Preceding unsigned comment added by 110.55.182.2 (talk) 08:41, 26 July 2010 (UTC)

This article is in need of major reorganization

I can’t believe this article was once a featured article; for such a basic, fundamental article, it is terribly organized. The individual items of content seem solid, but what’s really lacking is any overall structure. If one takes a look at, say, the Russian or Hebrew versions of this article, one gets some idea of what a proper article would look like. Does anyone have the initiative to do a major reorganization of the article? As a first crack, I would suggest a very rough structure such as the following:

  • Lead should be rewritten; suggest using Google Translate on the Russian or Hebrew or other versions to get some idea how this should be done.
  1. History
    1. Computers in antiquity
    2. Calculating machines
    3. Mechanical/electromechanical computers
    4. First general-purpose computers
    5. Stored-program computers
    6. The integrated circuit (IC)
    7. Semiconductors and transistors
    8. Microprocessors and microcomputers
    9. Parallel computing and multiprocessing
    10. Multicore processors
  2. Types of computers (in lieu of “Misconceptions”)
    1. Personal computers
    2. Workstations
    3. Servers
    4. Supercomputers
    5. Mobile computing [e. g. cell phones, tablet computers]
    6. Embedded computing
  3. Theory of computing (?)
    1. Boolean logic and binary computing
    2. Turing machine
  4. Computer architecture
    1. Von Neumann architecture
    2. Harvard architecture
  5. Typical computer organization and hardware
    1. Central processing unit (CPU)
    2. Arithmetic logic unit (ALU)
    3. Memory
    4. Input/output (I/O)
    5. Storage
    6. Computer networking
      1. The Internet
      2. World-Wide Web
  6. Computer programming and software
    1. Machine and assembly language
    2. High-level programming languages
    3. Firmware
    4. Operating systems
    5. Multitasking and multiprogramming
    6. Protocols and/or platforms
    7. Applications
  7. Economics/industry
  8. Future trends
    1. Artificial intelligence [but should be very brief]
      1. [Also mention notable milestones such as Deep Blue vs. Kasparov]
      2. The “Singularity”
    2. Networked computing
    3. Embedded and pervasive computing
    4. Quantum computing
    5. Biological and molecular computing
  9. Computers and society [or some b.s. section like that]
    1. Computer literacy
    2. Technological divide/technology gap
    3. Societal implications (?)
  10. See also, notes, references, external links, etc.

Btw, there is so much room for improvement here that shouldn’t the article be unprotected to give more editors the opportunity to contribute to/fix/reorganize it? This current “frozen” version is just awful. The article could be so vastly improved, without adding to its length (in fact, it could cover everything above and still be shorter). Any takers? —Technion (talk) 13:06, 27 July 2010 (UTC)

If you think that it is in need of a revamp, why not be bold? Sir Stupidity (talk) 02:26, 29 July 2010 (UTC)

One word:  finals.  Btw, §1.6 and §1.7 above should obviously be swapped.  My bad. Technion (talk) 06:30, 29 July 2010 (UTC)

The number of sections/subsections as it stands is fine, and is akin to other similar articles in Wikipedia. The History section (as in the Television, Electricity and Telephone articles) is a general overview and chronologically encompasses the major advances/innovations/people that led to the modern computer. Subarticles for less significant content would be ideal. Michael Jones jnr (talk) 22:30, 31 July 2010 (UTC)

Overpopulation of the internet

In the internet section, shouldn't it be noted that the number of available IPv4 addresses is diminishing rapidly due to overpopulation, and that without widespread use of IPv6, we will run out of IP addresses? Jhugh95 (talk) 17:34, 16 August 2010 (UTC)

Read IPv4 address exhaustion and then reply again.Jasper Deng (talk) 06:17, 17 February 2011 (UTC)

Misuse of sources

This article has been edited by a user who is known to have misused sources to unduly promote certain views (see WP:Jagged 85 cleanup). Examination of the sources used by this editor often reveals that the sources have been selectively interpreted or blatantly misrepresented, going beyond any reasonable interpretation of the authors' intent.

Please help by viewing the entry for this article shown at the cleanup page, and check the edits to ensure that any claims are valid, and that any references do in fact verify what is claimed. Tobby72 (talk) 16:57, 6 September 2010 (UTC)

Typo (extra "a")

In the “Limited-function computers” section, the article says “…designed to make a copies of themselves…”

Can someone take out the extra article? --91.156.254.185 (talk) 18:43, 8 September 2010 (UTC)

Done! Thanks for pointing it out. Franamax (talk) 18:53, 8 September 2010 (UTC)

the Z3

It is marked as the first fully functional computer in 1998, but 1998 is the date of the book in which this was published; the computer itself was from 1941. —Preceding unsigned comment added by Cpubuilder (talkcontribs) 03:00, 24 November 2010 (UTC)

Changing priority

I'm curious, in the entire history of the Wikipedia, has changing the "importance" setting in a talk-page banner ever made a detectable difference in the article quality? Or is it like those extra thermostats on the walls of office buildings - a placebo you can fiddle with instead of doing any work? --Wtshymanski (talk) 05:26, 5 December 2010 (UTC)

I can't answer your question directly - it's hard to know what motivates editors to work on particular things - and even harder to guess whether more editors makes for a better quality article or a worse one! However, there are some things that it really does have an effect on. For example, when people talk about making a "hard copy" Wikipedia - or a DVD version for shipping to 3rd world schools who may not have an internet connection...they start off by picking the high-importance articles and work their way down the importance scale until they run out of space (or whatever). There are also people who monitor the importance-versus-quality indicators on these articles. If all important articles have high quality ratings - then we have a good encyclopedia...if too many important articles are crap - then we don't. So the fact that this article is off-the-charts-important does actually have some effect...just not (necessarily) in improving it. SteveBaker (talk) 03:08, 6 December 2010 (UTC)

Hi... welcome to the computer world... — Preceding unsigned comment added by Iniyanit (talkcontribs) 07:03, 2 January 2011 (UTC)

Edit request from 87.228.229.206, 6 January 2011

{{edit semi-protected}}

please change HPLaptopzv6000series.jpg to Acer Aspire 8920 Gemstone by Georgy.JPG, as the first picture shows an obsolete PC with Windows XP, which is no longer used as often as Windows 7, shown in the last picture. 87.228.229.206 (talk) 15:32, 6 January 2011 (UTC)

Not done: too much copyrighted content on screen. How about File:2009 Taipei IT Month Day1 Viewsonic Viewbook 140.jpg, File:Computer-aj aj ashton 01.svg, File:Computer-blue.svg, File:Computer n screen.svg, or File:Personal Computer Pentium I 586.JPG?   — Jeff G.  ツ 16:38, 6 January 2011 (UTC)

Programming Languages

There shouldn't be so much detail describing and listing programming languages in this article. And, according to the reviews of this article, others agree with me. It's fine to mention them, but not in this much detail. — Preceding unsigned comment added by GuitarWizard90 (talkcontribs) 23:34, 7 January 2011 (UTC)

Data integrity

> A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation.

Hardware is nowhere near that reliable. Hence

  • Memtest86
  • RAID
  • SMART for HDDs
  • ECC RAM
  • HDD remapping

and so on and so on.


> The 1980s witnessed home computers and the now ubiquitous personal computer.

Apple II, 1977 Tabby (talk) 09:52, 1 March 2011 (UTC)

Regarding the first item: it is talking about processing operations (I think), and in that it is accurate, but somewhat pointless. --Errant (chat!) 10:30, 1 March 2011 (UTC)
Indeed. You can't say "The computer makes mistakes - and that's why it needs ECC RAM" (or whatever). The ECC RAM is a part of the computer - so the best you can say is "The computer doesn't often make mistakes - in part because it has ECC RAM". I think it's worth mentioning this because computers are (by far) the most reliable machines humans have ever devised - when you measure average error rate per cycle of operation. My car's engine has four cylinders and cycles at (let's say) 4,000 RPM making about a million "operations" per hour. It's unlikely to run for more than 100,000 miles - a few thousand hours - without catastrophic hardware failure - making perhaps a few billion cycles in the process. My computer does that many operations in a second! Actually, ECC RAM doesn't do a whole lot to save that.
It's a SERIOUS mistake to think "PC" when we say "Computer". The little Arduino computer that I've programmed to drive my robotic milling machine doesn't have ECC RAM, RAID, SMART or remapping software...and the fault rate is still zero over uncountable trillions of operations. When was the last time your wristwatch crashed?
We need to say this in the article because there is a perception that computers crash and their programs fail quite often - however, 99.9% of the time (at least), that's because of human error - the software has bugs. The computer hardware is doing precisely what the stupid human told it to do. I doubt a day goes by without some piece of software misbehaving - but unrecovered hardware failure happens maybe once every few years.
SteveBaker (talk) 05:25, 7 March 2011 (UTC)
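
For readers curious how ECC memory catches the rare hardware bit flips discussed above, here is a minimal sketch of the idea - a Hamming(7,4) single-error-correcting code in Python. It is illustrative only and not drawn from any source cited here; real ECC modules apply wider SECDED codes over 64-bit words, but the principle is the same.

  # Hamming(7,4): four data bits are stored with three parity bits, and any
  # single flipped bit can be located and silently corrected on read.
  def encode(d):                       # d: four data bits, e.g. [1, 0, 1, 1]
      p1 = d[0] ^ d[1] ^ d[3]          # parity over codeword positions 1,3,5,7
      p2 = d[0] ^ d[2] ^ d[3]          # parity over positions 2,3,6,7
      p4 = d[1] ^ d[2] ^ d[3]          # parity over positions 4,5,6,7
      return [p1, p2, d[0], p4, d[1], d[2], d[3]]

  def read_corrected(c):               # c: seven stored bits, possibly corrupted
      s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
      s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
      s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
      syndrome = 4 * s4 + 2 * s2 + s1  # 0 = clean, otherwise 1-based error position
      if syndrome:
          c[syndrome - 1] ^= 1         # flip the bad bit back
      return [c[2], c[4], c[5], c[6]]  # the four data bits

  word = encode([1, 0, 1, 1])
  word[4] ^= 1                         # simulate a stray bit flip in storage
  assert read_corrected(word) == [1, 0, 1, 1]   # reads back as if nothing happened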

ALU Operations

An arithmetic and logic unit does not perform multiplication or division, or any trigonometric functions; this is incorrectly stated in the article. ALUs can do basic arithmetic operations (addition and subtraction) and logic operations (AND, OR, NOT, XOR, NAND, NOR, XNOR) (p. 114, Digital Logic and Microprocessor Design with VHDL by Enoch O. Hwang). Integer multiplication and division are handled by separate functional units within the pipeline. Computers generally do not support trigonometric functions on integers but rather on floating-point numbers, so these are handled in the FPU. Multiplication and division of floating-point numbers are also handled in the FPU.

If a circuit is built to perform an arithmetic operation such as division or multiplication, it is called a datapath, not an ALU.

Chrisfeilbach (talk) 05:28, 19 April 2011 (UTC)
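
To illustrate the division of labour described above, here is a minimal behavioural sketch in Python - an assumption-laden toy, not a model of any real processor: an 8-bit ALU offering only basic arithmetic and bitwise logic, with multiply, divide and trigonometry deliberately absent because, as noted above, they belong to separate functional units and the FPU.

  MASK = 0xFF                              # 8-bit word

  def alu(op, a, b=0):
      a, b = a & MASK, b & MASK
      if op == "ADD": return (a + b) & MASK
      if op == "SUB": return (a + ((~b + 1) & MASK)) & MASK   # add the two's complement
      if op == "AND": return a & b
      if op == "OR":  return a | b
      if op == "XOR": return a ^ b
      if op == "NOT": return ~a & MASK     # unary; b is ignored
      raise ValueError("this ALU has no such operation: " + op)

  assert alu("ADD", 200, 100) == 44        # 300 wraps around modulo 256
  assert alu("SUB", 5, 7) == 254           # i.e. -2 in 8-bit two's complement
  assert alu("XOR", 0b1010, 0b0110) == 0b1100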

Unsubstantiated statement about the castle clock

The statement that the castle clock is considered to be the earliest programmable analog computer is unsubstantiated, and therefore this paragraph should be removed until a reference is found.
The heading of this very talk page requests: No original research and verifiability. In Wikipedia:Referencing for beginners the introduction states that any editor can remove unreferenced material; this is all I did. Furthermore, there is no such thing as a programmable analog computer. This paragraph needs to go until a reference is found.
--Ezrdr (talk) 13:23, 22 April 2011 (UTC)

First of all, there are cited references, and the content has been kept in the article for a long time, so simply removing it is not wise; it amounts to "I'm doing it because I can." Although I know nothing about the Castle clock, it did not seem like false information. It is not a programmable analog computer in the code-based sense of programming, but it is said that it can be programmed (calibrated) to adjust the day length, so calling it programmable is, hypothetically, not incorrect. So, it is better to have a discussion.
I suggest that rewriting the paragraph as a much shorter note (1 or 2 sentences) would be a better choice of edit. --নাফী ম. সাধ nafSadhtalk | contribs 18:07, 22 April 2011 (UTC)
The references are pretty bad for that section. But the claim it makes is valid & accurate. A "programmable analog computer" is a reasonable description (although I appreciate it is a fine line between purely adjustable and programmable), of course a narrow definition of "computer" might make it an inaccurate description. But we use the broad definition here :) -Errant (chat!) 19:42, 22 April 2011 (UTC)
In the five pillars of Wikipedia, the second pillar says that all articles must strive for verifiable accuracy: unreferenced material may be removed, so please provide references. The sentence the castle clock is considered to be the earliest programmable analog computer is unreferenced, and a programmable analog computer is yet to be found; this paragraph is in the computer article only because it has the word computer in it. I'm sorry ErrantX, but only a reference can decide if something is valid & accurate or NOT! Do the right thing, add a verifiable reference, that's all!--Ezrdr (talk) 20:21, 22 April 2011 (UTC)
I cannot verify the offline references, but the section and citation are convincing. Although, is considered to be the earliest programmable analog computer might be replaced with is claimed to be the earliest programmable analog computer. Wikipedia cares about verifiability, not truth, though. Your argument that a "programmable analog computer is yet to be found" is denied by the very paragraph & citations we are discussing! --নাফী ম. সাধ nafSadhtalk | contribs 20:55, 22 April 2011 (UTC)
-nafSadh: I can not verify the offline references. Cool, then we agree: this has to be removed until proven right!
-nafSadh: Wikipedia cares about verifiability not truth though: You've got to be kidding, right?
--Ezrdr (talk) 21:11, 22 April 2011 (UTC)
Plz, read Wikipedia:Verifiability; also read Wikipedia:Citing_sources. "I can not verify the offline references" means I do not have the means to reach the reference. It does not mean the source is unverifiable. Wikipedia cites a lot of offline materials - we do not question the verifiability of each and every source just because we cannot reach it. If editors have questions about some content which is uncited, then the content shall be removed; but when it is cited, the editor must first consult the reference. Only if the content deviates from the cited source, or the source is not citable, must the content be removed. Removal of cited content is not what we shall do. I tried to verify the source; the source itself seems authentic, but not all pages are available online. This source is also cited in many Wikipedia articles and other articles. So, if someone obtains the source and finds that the content is not on its p. 184, then it can be removed. You removed the content saying "according to discussion", but we had not yet agreed to remove it. Please revert the removal and wait for some days until we can come to an agreement.
In addition, the content you added does not match your citations (cite 6). In cite 7, no page number is given. --নাফী ম. সাধ nafSadhtalk | contribs 05:41, 23 April 2011 (UTC)
So. The information is not unreferenced. It is simply an "offline" source that is provided; please avoid treating this as a worse source than online ones (the contempt people sometimes show for books as sources is... mad! :D). So there is no need to remove the text outright (though I support cutting it down). If you have to go to some effort to verify the reference, then you simply have to go to that effort! I'll look for some sources tonight; I found a couple yesterday, and I need to dig into my bookshelves to verify something. It is generally considered bad manners to simply remove referenced material outright - instead the proper approach is to ask for verification and discuss the material. I agree that if this cannot be verified then it needs to go. However, I think we can manage it without too much difficulty :) --Errant (chat!) 09:30, 23 April 2011 (UTC)
Please re-include the reference for the Castle clock. It'll be better to keep citations. As the source is verifiable, it is ridiculous to think that some editor added the citation without verifying it. --নাফী ম. সাধ nafSadhtalk | contribs 19:25, 23 April 2011 (UTC)

I too think the castle-clock-as-computer is too far-fetched to be plausible William M. Connolley (talk) 21:00, 29 May 2011 (UTC)

I note also the above assertions that there is a reference, but no-one has read it. Please be aware (N certainly should be, since he has participated in the debates) that this bears all the hallmarks of Jagged85-ism, who was known to misrepresent references. See Wikipedia talk:Requests for comment/Jagged 85/Cleanup William M. Connolley (talk) 21:06, 29 May 2011 (UTC)

I'm not sure if it's in this discussion or not (I got rather lost TBH) but I did track down the source and check it out - the result was really "meh" and it doesn't really support the claim that was made. The current content (Ezrdr tweaked it, I think) is better, accurate and cited. --Errant (chat!) 21:23, 29 May 2011 (UTC)
Thanks; "meh" sounds like about what one would expect when checking up on the Jagged stuff. If the current text really is verified you may or may not want to revert my complete removal of the ref to the Castle clock wot I just did, though my feeling is still that the claim is quite tenuous William M. Connolley (talk) 21:30, 29 May 2011 (UTC)

Unreferenced

However, talking of unreferenced: the paragraph about mechanical calculators (the part that begins In 1642, the Renaissance saw the invention of the mechanical calculator, a device...) is completely without sources :S Ezrdr, do you have a source to support the assertions made in that paragraph? --Errant (chat!) 09:38, 23 April 2011 (UTC)

Hi Errant, A couple of online references so that you can quickly double check them for accuracy. First, a reference from Dorr E. Felt the inventor of the Comptometer that you can read on page 10 of his book (Mechanical arithmetic, or The history of the counting machine, Dorr E. Felt, Washington Institute, Chicago, 1916): "The most famous was by Pascal in 1642. He is credited with being the first to make a mechanical calculator. But I do not think he fully deserved it. He didn't make one that would calculate accurately, even if you handled it with the greatest care and took hold of the wheels and cogs after taking the top off the machine, trying to help them along. I have tried them myself on several of his machines which are preserved, and making due allowance for age they never could have been in any sense accurate mechanical calculators". Here is a quote from The Gentleman's magazine, Volume 202, p.100: "Pascal and Leibnitz, in the seventeenth century, and Diderot at a later period, endeavored to construct a machine which might serve as a substitute for human intelligence in the combination of figures;".
It is the duty of any Wikipedian to remove unsound and unreferenced statements.
--Ezrdr (talk) 12:56, 23 April 2011 (UTC)
Cool. However I am more concerned with the second part of the paragraph which makes assertions about this being the basis for various modern computing parts. Especially the part about the Microprocessor & Intel; is there a source that states this link? --Errant (chat!) 12:59, 23 April 2011 (UTC)
No problem: Intel Museum - The 4004, Big deal then, Big deal now
--Ezrdr (talk) 13:06, 23 April 2011 (UTC)
or that: [Microprocessors: From Desktops to Supercomputers]
--Ezrdr (talk) 13:09, 23 April 2011 (UTC)
Am I missing something on the first link? I can't see anything to support that the micro-processor was invented serendipitously by Intel during the development of an electronic calculator, a direct descendant to the mechanical calculator. (i.e. I don't see mention of mechanical calculators :S). I'll have to download the other PDF when I get home; I don't have my Athens login here. Also, do we have a source for: initially, it is in trying to develop more powerful and more flexible calculators that the computer was first theorized? That seems fairly logical, but it's best to tie it up with a source (as you pointed out above) --Errant (chat!) 13:28, 23 April 2011 (UTC)
Inclusion of the word serendipitously is disputable. I strongly prefer removing it. --নাফী ম. সাধ nafSadhtalk | contribs 19:25, 23 April 2011 (UTC)

From sublime to ridiculous

to ErrantX,
First of all I want to recount the facts:

  • I removed an unreferenced edit (I mentioned it in the edit summary)
  • My edit was revoked and I was told to explain (instead of a reference being provided)
  • I explained that what I had removed didn't make sense, but that a valid reference would prove me wrong
  • I rewrote the paragraph, still mentioning the castle clock, just removing the claim to first ...
  • And now I have to reference every single word of my edits, I have to explain serendipity, I have to explain that the electronic calculator comes from the mechanical calculator!

Serendipity is when you find something that you were not expecting to find. This is 1969: Intel, a young startup at the time, was developing some chips for an electronic calculator for the Japanese firm Busicom. Intel simplified the 12-chip set requested down to 4 and delivered the calculator in early 1971. But very quickly they realized that they had created a lot more than a set of calculator chips (this is the "not expecting to find" part of serendipity), yet they did not own the design; fortunately Busicom renegotiated the deal for cheaper prices and agreed to share the ownership of the chip set (MCS-4) with Intel as long as Intel did not use it for electronic calculators (another serendipity). People used it for PROM programmers, printers, teletype machines... Intel marketed the 4004 a few months later. The 4004 was never patented just because of that. 4 years later the first PC, the Altair 8800, was released using a fourth-generation Intel design (the 8080, after the 8008, the 4040 and the 4004)
Now, before I continue, you say that you need to go home in order to read some of my references, so please read them and we'll talk about them tomorrow. --Ezrdr (talk) 16:06, 23 April 2011 (UTC)

Yes... I am with you, and have knowledge of everything you have written above. However, whilst that is sourceable information that we can deal with in the context of a reasonable timeline... do you have a source that deliberately deals with the relationship to mechanical calculators? I mean, sure, there are obvious ties in terms of the historical development of computing. But you've gone beyond that with the strong claim that these early mechanical calculators influenced modern computers. It's no longer just a timeline of development, it is analysis/opinion/commentary. That definitely needs a specific source :) I might be missing something, again, in that other source (I will have to read it when more awake) but I, again, do not see the part that supports the assertion being made. --Errant (chat!) 18:19, 23 April 2011 (UTC)
Ezrdr, discussing some of your points (please don't feel ridiculed):
  • You removed an unreferenced edit.
  • Your edit was revoked because removal of cited content should be explained (it is unnecessary to provide a reference for already cited text)
  • You explained that what you had removed didn't make sense, but all we can see is that the references speak against what you sense. I might not have mentioned the castle clock myself, but as it has been added, I disagree with removing it.
  • You rewrote the paragraph, still mentioning the castle clock, but including new claims
  • Adding references is good for keeping things unchallenged. ON THE CONTRARY, you have already challenged cited sources and content. Wikipedia is not what you or I think; Wikipedia is what books, texts, pages, sources etc. say.
I feel that removing cited text might ridicule the editor who added it. --নাফী ম. সাধ nafSadhtalk | contribs 19:25, 23 April 2011 (UTC)

Reference showing that Electronic calculators come from Mechanical calculators

to ErrantX,
Please read the Wikipedia article Sumlock ANITA calculator#History of ANITA calculators. Just stay clear of unreferenced claims though. ;-)--Ezrdr (talk) 19:32, 23 April 2011 (UTC)

I'm sorry, but it would be a lot easier if you could just cite the claim. So far I have been unable to verify the analysis in that paragraph. I suspect we'll have to move it down a bit. Look; my problem is that the section is really making a big deal of the influence. There is no obvious connection between Intel's creation of the Microprocessor and the older mechanical calculators; it was part of the historical development of computers. Do you see why I think it is necessary to cite this piece of analysis of the history? At the moment my feeling is that the material is best moved to the correct place in the time-line. To be clear: at the moment it looks like OR and SYNTH.
Also, the first part of that analysis is still uncited (that being a more direct influence, we can probably cite it and leave it be).
FYI I can't find a decent source for the claim you removed; looks like you might be right. Though I am trying to get hold of the Discovery documentary to see what it is like.
You probably don't need to create more and more section headers. --Errant (chat!) 20:53, 23 April 2011 (UTC)
I agree with Errant. --নাফী ম. সাধ nafSadhtalk | contribs 20:58, 23 April 2011 (UTC)

to ErrantX,
First of all, thank you for doing the research on the castle clock; I came to the same conclusion you did so far, but please bring it back if you find a verifiable reference.
At this point I don't understand what you don't understand about the paragraph on mechanical calculators. It is in the chapter Limited-function early computers because there is no other place for the description of something that is not a computer but which is one of its most direct ancestors ... in two ways. Please cite what doesn't make sense and I'll make sure it's referenced. I have about 50 books on the subject, in French and in English, I taught microprocessor classes, as a student, in the late 70s, and I have an MSEE. I've programmed on mainframes, minis and PCs and designed disk drive firmware from scratch. So please, shoot, I'm ready for you ;-)
--Ezrdr (talk) 22:40, 23 April 2011 (UTC)

Um, well, I thought I had explained the issue above :S I am trying to! But to put it another way, that section says several things (to paraphrase slightly):
  • Mechanical calculators evolved into digital calculators
  • While working on digital calculators Intel discovered the microprocessor
  • Therefore mechanical calculators influenced the discovery/development of the microprocessor
The first two are facts. The last is editorial opinion/analysis for which we must have a source; otherwise it is something of OR. The problem is not that you have stated these two facts, but that you have tied them together, across hundreds of years of development, without a source to back it up. This is the principle of OR and SYNTH.
There is also another piece of analysis: in trying to develop more powerful and more flexible calculators that the computer was first theorized. Right now I cannot find a good source for it, so I would really appreciate a clear source for this analysis. The reason I am questioning it is because it lists Babbage as one person who was influenced in this way. From my knowledge of Babbage, his work was not related to developing existing calculators but related to his mathematical work (well, obsession really :)). Turing's main work on computers was primarily theoretical, and so the link is even more tenuous! Hence, I am not sure that the assertion holds... they have influence, for sure. But I think that it is incorrect to say that it was due to people working on developing better calculators that the field progressed. I think you'd struggle to find a source that makes such a broad claim! --Errant (chat!) 23:07, 23 April 2011 (UTC)
I asked for the reference for the Castle clock to be re-included, in the original section or on the talk page, yet did not get any reply. --নাফী ম. সাধ nafSadhtalk | contribs
Ezrdr said, "please bring it back if you find a verifiable reference", but the reference is already verifiable; you just could not reach the source :S --নাফী ম. সাধ nafSadhtalk | contribs 03:48, 24 April 2011 (UTC)
Do we really need this content, "leading to the development of mainframe computers in the 1960s, but also the microprocessor, which started the personal computer revolution, and which is now at the heart of all computer systems regardless of size or purpose[16], was invented serendipitously by Intel[17] during the development of an electronic calculator, a direct descendant to the mechanical calculator[18]." in the Limited-function early computers section? It is an opinion or an opinionated description. We do NOT need to place OPINIONS in Wikipedia. --নাফী ম. সাধ nafSadhtalk | contribs 11:55, 24 April 2011 (UTC)

Definition of Harassment

Now, before any more is written, both of you need to check the Wikipedia definition of harassment.
To refresh your memory:
I removed a statement which didn't make sense (there is no such thing as a programmable analog computer) and whose reference pointed nowhere, and I am now subject to the nonsense displayed above this paragraph, having to explain every single word of my edits, being ordered around, ridiculed!
Errant, you are overstepping your administrator's authority and ruining the spirit of Wikipedia:

  • No, I do not have to seek your approval for the edits I make.
  • Edits don't have to make sense to you in order to be kept in this article.

Errant, as the overseer of the Computer article (or at least you are behaving like one), I cannot believe the ignorance you showed about Babbage's work and his contribution to our modern digital society.
As a Wikipedian I must be able to participate without being subjected to this kind of virtual verbal mugging.
--Ezrdr (talk) 07:38, 25 April 2011 (UTC)

No. Now I also feel harassed!
Instead of going to an edit war, we tried to resolve the dispute in talk.
  • No, you do not have to seek someone's approval for the edits you make.
  • Edits don't have to make sense to someone in order to be kept in this article.
likewise,
  • I can delete/undo/revert/edit any of your or someone's edits. Undoing and redoing edits causes an edit war.
  • You removed a statement which didn't make sense to you, but made sense to many editors.
I felt offended by some of your talk about your own level of expertise, which seemed like you were looking down on us.
--নাফী ম. সাধ nafSadhtalk | contribs 09:02, 25 April 2011 (UTC)
I am not here as an admin, but as an editor; sorry if there has been confusion on that - but I don't see where I implied as much, or used the tools :S I appreciate this may well be your field of specialism; however, it is important to be able to verify all material added to the article. I am sorry if you feel ridiculed, but I cannot yet see an accurate source for the material you have added to the article. As you pointed out above, verification is crucial! Multiple problems exist with that paragraph that you simply have not been able to address. (I wrote a whole section on the problems... but am not posting it because I went into detail on the problems, which you mostly ignored, above).
You correctly questioned the other material, but you could have done it better :S Now I am questioning the material you have added, which you added without a source. Perhaps I could have done it better. But it still needs a source.
Programmable analog computers: please explain what is wrong with such a concept? Heathkits were programmable, for example.
Finally; perhaps it is a good idea to get a WP:3O on this. --Errant (chat!) 10:20, 25 April 2011 (UTC)
I support the idea of getting a WP:3O. --নাফী ম. সাধ nafSadhtalk | contribs 10:48, 25 April 2011 (UTC)
There are too many discussions in that, well, discussion. So I have created new discussions for anything not related to the one described by this heading. Babbage has nothing to do with the use of a reference that points to a dead end.

--Ezrdr (talk) 13:41, 25 April 2011 (UTC)

Was the computer first theorized by Babbage while trying to develop more powerful mechanical calculators?

My understanding, from everything I have read, is that it is in trying to develop more powerful and more flexible mechanical calculators (difference and analytical engines) that the computer was first theorized. I have two references to defend that position; I've used them both in my contribution to the Computer article:

  • "It is reasonable to inquire, therefore, whether it is possible to devise a machine which will do for mathematical computation what the automatic lathe has done for engineering. The first suggestion that such a machine could be made came more than a hundred years ago from the mathematician Charles Babbage. Babbage's ideas have only been properly appreciated in the last ten years, but we now realize that he understood clearly all the fundamental principles which are embodied in modern digital computers" Faster than thought, edited by B. V. Bowden, 1953, Pitman publishing corporation
  • "...Among this extraordinary galaxy of talent Charles Babbage appears to be one of the most remarkable of all. Most of his life he spent in an entirely unsuccessful attempt to make a machine which was regarded by his contemporaries as utterly preposterous, and his efforts were regarded as futile, time-consuming and absurd. In the last decade or so we have learnt how his ideas can be embodied in a modern digital computer. He understood more about the logic of these machines than anyone else in the world had learned until after the end of the last war" Foreword, Irascible Genius, Charles Babbage, inventor by Maboth Moseley, 1964, London, Hutchinson

--Ezrdr (talk) 13:26, 25 April 2011 (UTC)

Well, quote one is an excellent reference regarding Babbage's vision. Second quote is much the same, as well as noting that his work was laughed at by his contemporaries. However, neither appears to support the statement that Babbage was developing more powerful mechanical calculators... :S Unless I am missing something obvious --Errant (chat!) 13:58, 25 April 2011 (UTC)

Since ErrantX, the main protagonist, has left this discussion (in this sub-section), I would like to bring it to a close with a quote from the Charles Babbage Institute: "The calculating engines of English mathematician Charles Babbage (1791-1871) are among the most celebrated icons in the prehistory of computing. Babbage’s Difference Engine No.1 was the first successful automatic calculator and remains one of the finest examples of precision engineering of the time. Babbage is sometimes referred to as "father of computing." The International Charles Babbage Society (later the Charles Babbage Institute) took his name to honor his intellectual contributions and their relation to modern computers."
--Ezrdr (talk) 11:12, 26 April 2011 (UTC)

Was Babbage developing more powerful mechanical calculators?

It is important to separate all new questions from already answered ones, hence this sub-heading.
In the proposal that Howard Aiken gave IBM in 1937, while requesting funding for the Harvard Mark I, which became IBM's entry machine in the computer industry, we can read: "Few calculating machines have been designed strictly for application to scientific investigations, the notable exceptions being those of Charles Babbage and others who followed him. In 1812 Babbage conceived the idea of a calculating machine of a higher type than those previously constructed to be used for calculating and printing tables of mathematical functions. ....After abandoning the difference engine, Babbage devoted his energy to the design and construction of an analytical engine of far higher powers than the difference engine..." Howard Aiken, Proposed automatic calculating machine, reprinted in: The origins of Digital computers, Selected Papers, Edited by Brian Randell, 1973, ISBN 3-540-06169-X --Ezrdr (talk) 14:27, 25 April 2011 (UTC)

Can the invention of the microprocessor by Intel while developing a calculator engine be called serendipity?

Serendipity comes to mind when reading the content of the Intel museum web site: The 4004, Big deal then, Big deal now
Serendipity is when you find something that you were not expecting to find. This is 1969: Intel, a young startup at the time, was developing some chips for an electronic calculator for the Japanese firm Busicom. Intel simplified the 12-chip set requested down to 4 and delivered the calculator in early 1971. But very quickly they realized that they had created a lot more than a set of calculator chips (this is the "not expecting to find" part of serendipity), yet they did not own the design; fortunately Busicom renegotiated the deal for cheaper prices and agreed to share the ownership of the chip set (MCS-4) with Intel as long as Intel did not use it for electronic calculators. Intel marketed the 4004 a few months later. Designers used it for PROM programmers, printers, teletype machines... 4 years later the first PC, the Altair 8800, was released using a fourth-generation Intel design (the 8080, after the 8008, the 4040 and the 4004); 6 years later the Apple II was born (using a different brand of microprocessor, though) --Ezrdr (talk) 13:35, 25 April 2011 (UTC)

I will say this very clearly, one last time, because I don't think it has sunk in yet :( The problem is not largely with the facts, it is the presentation. Calling it serendipitous might be true, but it is still OR without a source that says it was fortunate/accidental - something the source does not say. My other major issue is that all of this is shoved at the end of the section about mechanical calculators in the 1600s - it is a strong claim of influence across the ages of development of computing, and so it needs a strong, specific source. You are placing a strong focus on the importance of mechanical calculators without actually supporting it in a source (instead trying to shore it up with some tangentially related sources) --Errant (chat!) 14:02, 25 April 2011 (UTC)

ErrantX, the main protagonist, has left this discussion (in this sub-section), so I guess this discussion is over, but I'd like to bring up a different question which stemmed from this debate: In Wikipedia, can an appropriate neologism be used to describe an event that happened and was described in the media before the neologism became popular? I don't think that this question should be discussed here, but it might be worth thinking about. For anyone wanting to research the events surrounding the invention of the first microprocessor further, I'm attaching the references of the two books that I have on the subject:

  • Bit by Bit, an illustrated history of computers by Stan Augarten, Ticknor & Fields, 1984
  • The untold story of the computer revolution, Bits, Bytes, Bauds & Brains by G. Stine, Arbor house, 1985

In this last book, you will read on pages 164-165 that "In spite of a management structure that favored and encouraged innovation, the introduction of the first microprocessor chip met with some hesitation on the part of Intel's board of directors....Intel's marketing people estimated that the entire worldwide market for microprocessors would be only a few thousand units per year!".
Also, the movie Serendipity, which made me discover this word and its meaning, was released in 2001.
--Ezrdr (talk) 15:29, 26 April 2011 (UTC)

I haven't left. I've let you have your way for the moment, because it was like beating my head against a brick wall communicating the problem to you :) You have a moratorium while I go through the big pile of source material I dug out to research this, but I do still strongly suggest you a) find a source (which still has not appeared) and b) try to improve the wording (which leaves a lot to be desired). Will be back with sourced suggestions in a week or so. Re your question above... it quite obviously depends on the context, subject and the sources. So the answer is Yes. And No. --Errant (chat!) 15:48, 26 April 2011 (UTC)
Welcome back, just realize that we are only talking about what's in the title of this discussion.--Ezrdr (talk) 16:29, 26 April 2011 (UTC)
Are you referring to "serendipity" when you talk about a neologism above? The word is old, far older than the microprocessor! So hardly an issue. On the other hand, no source has currently been provided to support such an analysis of the invention --Errant (chat!) 16:43, 26 April 2011 (UTC)
Maybe it has turned into a VERY LONG discussion regarding a very small amount of text. Almost 36 kB of talk incurred over changes of about 1 kB :S
Feeling much better that the discussion has become moderate once again. I hope no one is feeling harassed or ridiculed anymore. In most cases my (tiny little brain's) insights coincide with ErrantX's. Although I'm observing the talk, my responses are limited. Hope to see a consensus in general soon. --নাফী ম. সাধ nafSadhtalk | contribs 17:02, 26 April 2011 (UTC)

Please stick to the current discussion, which is described by its title. It's OK to discuss my rhetorical question about the use of neologisms but it has to happen in a different discussion.
This is very simple:

  • Serendipity is when you find something that you were not expecting to find
  • Intel invented the first microprocessor while Busicom, a Japanese company, was paying them to develop a set of programmable calculator chips

That's it!
If you have a different question, it should be in a different discussion. Please!
--Ezrdr (talk) 17:50, 26 April 2011 (UTC)

The word is intended to mean a fortunate discovery or a piece of luck, not just an accidental one. So, yes, it does need to be sourced as a fortunate discovery. You started the blooming off-topic discussion!--Errant (chat!) 18:02, 26 April 2011 (UTC)
What's fortunate is that Busicom's engineers, management and marketing teams did not understand what was in their electronic calculator engine; otherwise (they OWNED the design) it could have been: Busicom inside...
OK, the story: Ted Hoff, employee number 12, was hired in 1968, a year before the project (ref: Stine, p. 163). It is a very, very small company, and here is what happened (ref: Stine, p. 163):
"After a thorough analysis of the Busicom requirements, Hoff concluded that the proposed Busicom calculator design was far too expensive to be cost-effective in the desk calculator marketplace of the time. He felt that what Busicom wanted wasn't possible with the state of the art and the price structure that Busicom would have to set up to stay in business, even with manufacturing costs in Japan being far lower than in the United States. But Ted Hoff was also experimenting with one of the first IC computers, the PDP-8 minicomputer made by DEC..."
Please, take the time to read this reference, talk to friends about it, challenge its authenticity, but it is time to end this charade!
--Ezrdr (talk) 05:37, 27 April 2011 (UTC)
Which is an interesting source about how the microprocessor was invented and came to be Intel's beast. However nothing in there describes it as fortunate or serendipitous; as you pointed out above, all material must be sourced to a verifiable reference. You appear to be sourcing the story of the discovery of the microprocessor - and then saying "wasn't that lucky". It may well be, but your view is irrelevant :) --Errant (chat!) 08:10, 27 April 2011 (UTC)

Is the Electronic calculator a direct descendant of the Mechanical calculator?

The first electronic calculator was the ANITA Mk7, developed by Sumlock Comptometer, the British maker of the Comptometer mechanical adding machine. The calculator had the same basic user interface found on mechanical and electromechanical machines, with sets of columns of 9 keys. The story of its invention can be found in the second paragraph of Sumlock ANITA calculator#History of ANITA calculators --Ezrdr (talk) 14:04, 25 April 2011 (UTC)

This is unrelated to the issues I raised above, and certainly not something I would dispute. --Errant (chat!) 14:12, 25 April 2011 (UTC)
You know what, I fucking give up, it's impossible to have a reasoned discussion over here, so have it your way. --Errant (chat!) 14:19, 25 April 2011 (UTC)
My strong stance is that the disputed edits by Ezrdr, which we wanted and tried to resolve, should be eliminated. Other editors' action in this regard is NECESSARY.
It looks like a resolution is not possible here. --নাফী ম. সাধ nafSadhtalk | contribs 15:25, 25 April 2011 (UTC)

Edit request from Andres.felipe.ordonez, 30 May 2011

Grammar error: 1st paragraph in the "Programs" section

change: "...modern computers based on the von Neumann architecture ARE OFTEN HAVE machine code in the form of..."

to: "...modern computers based on the von Neumann architecture OFTEN HAVE machine code in the form of..." Andres.felipe.ordonez (talk) 19:31, 30 May 2011 (UTC)

 DoneJasper Deng (talk) 19:48, 30 May 2011 (UTC)

Missing Google Chrome operating system from Operating system section

Google Chrome OS has been in beta for quite some time and is launching on June 15th. — Preceding unsigned comment added by 173.16.191.111 (talk) 06:42, 1 June 2011 (UTC)

Rather more importantly, we seem to be missing an "operating system" section entirely William M. Connolley (talk) 07:24, 1 June 2011 (UTC)
The IP user might be talking about the Operating system row-group of the Computer software table. Using the word section (especially by a new user) is not a sin!
Whatever, Google Chrome OS is not a new OS; it is already listed, because it is just another flavor of Linux. » nafSadh did say 14:55, 1 June 2011 (UTC)
Agreed. Chrome is more properly a "Window Manager" - like KDE or Gnome. If you take one of those fancy Cr-48 Chrome-powered laptops that Google gave away last year and flip the little switch that's hidden behind a black sticker at the back of the battery compartment, you'll see that it boots up into an absolutely standard Linux "shell" and you can do all of the usual command-line stuff with no sign of any Chrome. Ditto Android, Kindle and a whole bunch of other things that falsely lay claim to being "operating systems". SteveBaker (talk) 15:36, 1 June 2011 (UTC)

EXPLAIN THE GENERATIONS OF COMPUTER IN AS EASY WORDING AS YOU CAN, AND ALSO HEADINGS OF CHARACTERISTICS OF COMPUTER & USES OF COMPUTER — Preceding unsigned comment added by 61.5.137.131 (talk) 09:44, 10 June 2011 (UTC)

For easy wording, you can go to Simple English Wikipedia » nafSadh did say 10:53, 10 June 2011 (UTC)

The ugly gallery in the lead should go. Either use one image representing "the computer" (maybe the ENIAC or some other old device which doesn't hit on anyone's pet computer as much as a more recent image would do) or a decent collage like in the 1980s article. SpeakFree (talk) 01:15, 30 June 2011 (UTC)

The Colossus computer was not involved in breaking the cipher messages from the Enigma machine, but rather those from the Lorenz cipher SZ40/42.--TedColes (talk) 05:36, 30 June 2011 (UTC)
I agree that the ugly conglomerate of images needed to go. The Colossus pic was reverted because it wasn't stored-program. I found a usable shot of EDSAC and put that in. -R. S. Shaw (talk) 07:10, 2 July 2011 (UTC)
The gallery now looks nice. If you change it, please don't mess it up by introducing unnecessary whitespaces. SpeakFree (talk)(contribs) 22:59, 28 March 2012 (UTC)

Edit request from 82.128.87.32, 6 August 2011


82.128.87.32 (talk) 08:50, 6 August 2011 (UTC) A computer is called a system. It is a programmable machine designed to sequentially and automatically carry out a sequence of arithmetic or logical operations. The particular sequence of operations can be changed readily, allowing the computer to solve more than one kind of problem.

Conventionally a computer consists of some form of memory for data storage, at least one element that carries out arithmetic and logic operations, and a sequencing and control element that can change the order of operations based on the information that is stored. Peripheral devices allow information to be entered from an external source, and allow the results of operations to be sent out.

A computer's processing unit executes a series of instructions that make it read, manipulate and then store data. Conditional instructions change the sequence of instructions as a function of the current state of the machine or its environment.

The first electronic computers were developed in the mid-20th century (1940–1945). Originally, they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs).[1]

Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space.[2] Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from mp3 players to fighter aircraft and from toys to industrial robots are the most numerous.

It would help if you could identify what you want changed. --Wtshymanski (talk) 20:06, 6 August 2011 (UTC)

Magnetic storage

Edam said that "Zuse also later built the first computer based on magnetic storage in 1955". The Manchester Mark 1 used magnetic drum storage in 1949. --TedColes (talk) 06:49, 30 October 2011 (UTC)

Limited function computers

Hello, I want to object to the last sentence of the Limited function computers section, which, as of today, claims that "Living organisms (the body, not the brain) are also limited-function computers designed to make copies of themselves; they cannot be reprogrammed without genetic engineering."

In my opinion, this sentence is redundant here, because it is only tangential to the topic of the article. Also, it is unnecessarily both disputable and controversial: controversial in the sense that it suggests the bodies of living organisms are "designed", and disputable because it is not stated what exactly is meant by reprogramming. If the organism is viewed as a chemical computer, for instance, then "reprogramming" could be performed, for example, by changing the concentration of some chemical in the environment. Most importantly, this example is superficial because there is little sense in considering a body without a brain, even more so in an article about computers. Also, what about animals that do not have a clearly defined "brain"? Another thing is that, to the best of my knowledge, it is not currently feasible to perform genetic engineering manipulations on a whole living multicellular organism, as such manipulations are restricted to single cells only.

My last and least important objection is that, while the brain is a part of the body, the sentence suggests otherwise.

I suggest 1) removing the sentence, or 2) quoting the source of this claim and expanding the sentence to comprehensively reflect the source's claim. As expert opinions on whether living organisms are or are not limited-function computers probably differ, both sides of this argument should then be represented here --195.39.74.163 (talk) 23:33, 18 December 2011 (UTC)

It is indeed a ridiculous claim. In fact the whole section is ridiculous, so I've removed it. As a matter of interest, though, studying bodies without brains is far from senseless. I studied psychology in a more robust age, when pithing a living frog's brain was considered acceptable. It's quite remarkable to see how little a frog relies on its brain for even apparently quite complicated tasks such as swimming. Malleus Fatuorum 06:15, 19 December 2011 (UTC)

File:Acer Aspire 8920 Gemstone by Georgy.JPG Nominated for Deletion

An image used in this article, File:Acer Aspire 8920 Gemstone by Georgy.JPG, has been nominated for deletion at Wikimedia Commons in the following category: Deletion requests December 2011
What should I do?

Don't panic; a discussion will now take place over on Commons about whether to remove the file. This gives you an opportunity to contest the deletion, although please review Commons guidelines before doing so.

  • If the image is non-free then you may need to upload it to Wikipedia (Commons does not allow fair use)
  • If the image isn't freely licensed and there is no fair use rationale then it cannot be uploaded or used.

This notification is provided by a Bot --CommonsNotificationBot (talk) 16:15, 30 December 2011 (UTC)

  • Computer (disambiguation): A computer is a program machine that receives input, stores and manipulates data, and provides output in a useful format.
  • Computer: A computer is a programmable machine designed to sequentially and automatically carry out a sequence of arithmetic or logical operations.

These two definitions should be synchronized so that Wikipedia is consistent with itself. Wbm1058 (talk) 01:52, 21 January 2012 (UTC)

I believe all four of the above should be redirected to computer, where general-purpose and special-purpose should be defined and a clear distinction drawn between them. Neither microcontroller nor embedded system defines special-purpose computer. It seems embedded system is the better fit; at least the last paragraph of its lede discusses the issue:

In general, "embedded system" is not a strictly definable term, as most systems have some element of extensibility or programmability. For example, handheld computers share some elements with embedded systems such as the operating systems and microprocessors that power them, but they allow different applications to be loaded and peripherals to be connected. Moreover, even systems that do not expose programmability as a primary feature generally need to support software updates. On a continuum from "general purpose" to "embedded", large application systems will have subcomponents at most points even if the system as a whole is "designed to perform one or a few dedicated functions", and is thus appropriate to call "embedded".

Should we create a new section in computer, titled General- and special-purpose computers, where each is defined? From those definitions a better definition for computer itself might fall out. It would be nice to find some good sources for this. Wbm1058 (talk) 02:44, 21 January 2012 (UTC)

Refer to earlier discussions above for issues with the definition of computer. Wbm1058 (talk) 14:31, 21 January 2012 (UTC)

Is there a difference between a stored-program computer and a general-purpose computer? If so, what's the difference? Wbm1058 (talk) 14:39, 21 January 2012 (UTC)

Computer hardware is the collection of physical elements that comprise a computer system. So, from that definition, it follows that a clear definition of computer is required to determine what exactly constitutes computer hardware, as opposed to ordinary electronic hardware. Wbm1058 (talk) 15:18, 21 January 2012 (UTC)

Re general-purpose computer vs. special-purpose computer: From books.google.com, FORTRAN IV: a programmed instruction approach:
"A special-purpose computer utilizes a fixed or stored program to solve the problem it was designed for. The same problem is solved repeatedly, using different sets of data."
I could not have explained it in a better way. Somewhere in the article this should be pointed out. As I mentioned on my talk page, early computers such as the ABC were clearly special-purpose. Nowadays, a GPU is a good example of a special-purpose computer, while a GPGPU is generally programmable.
While a microcontroller, to which special-purpose computer currently redirects, is used for special-purpose applications, as a computer it is generally programmable and thus a general-purpose machine. Nor is a redirect to embedded system correct. These links should be fixed. Nageh (talk) 15:38, 21 January 2012 (UTC)
Re stored-program computer: All stored-program computers are general-purpose. However, there are general-purpose machines that take the program in the form of interchangeable hardware components, such as plug boards. Nageh (talk) 15:38, 21 January 2012 (UTC)
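
To make the stored-program distinction concrete, here is a toy machine in Python with a made-up four-instruction set (purely illustrative, not taken from any source above): the program lives in the same memory as the data, so it can be replaced, or even modified, by ordinary writes, whereas a plug-board machine fixes the sequence in physical wiring.

  def run(memory):
      acc, pc = 0, 0                       # accumulator and program counter
      while True:
          op, arg = memory[pc]             # fetch the next instruction from memory
          pc += 1
          if op == "LOAD":    acc = memory[arg]         # decode and execute
          elif op == "ADD":   acc += memory[arg]
          elif op == "STORE": memory[arg] = acc
          elif op == "HALT":  return memory

  # Cells 0-3 hold the program; cells 4-6 hold the data. It computes 2 + 3.
  mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
         2, 3, 0]
  assert run(mem)[6] == 5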

The discussion in this section developed from my edits to change some of the many hardware links on Wikipedia to computer hardware links. I've since become aware of an electronic hardware article, which I think should sometimes be linked to rather than computer hardware, though which to link to may be somewhat a judgement call. My thought is that when talking about "hardware" at the device level, link to computer hardware, or personal computer hardware, network hardware, graphics hardware or video game hardware and when talking about the internals of a particular device, electronic hardware. That makes more sense to me than making the decision based on whether the "hardware" is programmable or not (i.e., general- or special-purpose). Electronic hardware is stuff like Schottky transistors and silicon gates. Examples:

Wbm1058 (talk) 21:39, 23 January 2012 (UTC)

special-purpose computers

Another article to link to when discussing special-purpose computers: Application-specific integrated circuit. I suppose these are often found in embedded systems. Wbm1058 (talk) 17:07, 18 February 2012 (UTC)

Edit request on 12 February 2012

Computers are fun. 98.165.76.134 (talk) 21:39, 12 February 2012 (UTC)

Indeed--Jac16888 Talk 21:40, 12 February 2012 (UTC)

Edit request on 17 April 2012

Edit Request: "and it will carry process them." changed to "and it will process them."

This edit request is in reference to the second sentence under Programs (http://en.wikipedia.org/wiki/Computer#Programs). The sentence ends with, "and it will carry process them." I believe "carry" should be removed, so it reads, "and it will process them." Perhaps at one time the sentence read "and it will carry them out", but the "carry" was not removed when it was changed to "process them".

This is my first edit request, so I hope this is the correct way to do it. Thanks! -Mark

Markrummel83 (talk) 15:56, 17 April 2012 (UTC)

Thanks! I see TedColes has already carried out the request. Nageh (talk) 17:34, 17 April 2012 (UTC)

Edit request

I confess near-total ignorance about how computers work (hence my interest in learning from this page), but I am a medieval historian (Western, 450-1450) with a PhD, and I must say that the second paragraph under Limited-function early computers, beginning "Around the end of the 10th century...", is in multiple respects inaccurate, and likely a farce. I suggest it be removed; I think it is beyond being modified. I will leave it to those who know Wikipedia better to take such action as they see fit. 216.117.19.94 (talk) 02:54, 30 May 2012 (UTC) — Preceding unsigned comment added by 216.117.19.94 (talk) 02:51, 30 May 2012 (UTC)

Can you be more specific? What about the two sources cited?--TedColes (talk) 07:08, 30 May 2012 (UTC)

Hi, Ted. In response to your request, I have followed the links to the two sources cited in that paragraph. Unfortunately, they provide no evidence for the claims made. The first - Felt, Dorr E. (1916). Mechanical arithmetic, or The history of the counting machine. Chicago: Washington Institute. p. 8 - actually makes no reference whatsoever to "a machine invented by the Moors that answered either Yes or No to the questions it was asked" as stated in the article. What is more, the source itself gives no reference to any contemporary (i.e. 10th-century) historical documents that might validate its claims. The second - the parlour review january 1838 - suffers from the same deficiency. It simply asserts - "it is said..." - that Albertus Magnus devised a talking earthenware head which Thomas Aquinas later destroyed. Again, no reference to the historical record. Have we a letter from the 13th century confirming this claim? A chronicle? A history? Or some other supporting document? No. Just an assertion. Until we have a source that can provide a credible historical reference to these claims - preferably to a contemporary, 13th-century, Latin-language text - I see no reason to foist them on the interested and sincere reader. Hope this helps. 216.117.19.94 (talk) 01:15, 2 June 2012 (UTC)

Advantages of Computer

Computer processing is very fast and accurate in comparison with a human being. It is many times faster than a human being; it never gets tired and works consistently. — Preceding unsigned comment added by 122.160.78.104 (talk) 07:34, 28 June 2012 (UTC)

Archive bot

I've added auto archiving to this talk page. It needs it. It should archive everything over 60 days old (yes, even the stuff from 2009) except for the last 5 threads. There will always be a minimum of 5 threads on here, regardless of their age. The first archiving should happen at some point in the next 24 hours. - X201 (talk) 15:29, 17 October 2012 (UTC)

First bug - found by Grace Hopper

As I understand it, what was actually said was "I've found the first computer bug"; she was being ironic. The concept of bugs was already well understood before this date. http://cs-www.cs.yale.edu/homes/tap/Files/hopper-wit.html — Preceding unsigned comment added by 93.97.31.112 (talk) 21:06, 28 October 2012 (UTC)

Computer

Cool article---- — Preceding unsigned comment added by 174.102.84.111 (talk) 16:06, 9 November 2012 (UTC)

Edit request on 17 November 2012

Please change

Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s had been largely replaced by

to

Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s they had been largely replaced by


Please change

While some computers may have strange concepts "instructions" and "output" (see quantum computing), modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language.

to

While some computers may have strange concepts for "instructions" and "output" (see quantum computing), modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. Voi8 (talk) 19:53, 17 November 2012 (UTC)

Done Rivertorch (talk) 06:18, 18 November 2012 (UTC)

Suggestion for Misconceptions section.

I think the author of the "Misconceptions" section showed insight in including such a section. However, I believe he or she has the misconception that a computer must be a device. Specifically, unless you consider a human being to be a biological device, I believe the first modern computers were actually women employed by the U.S. Army during WWI to create tables that soldiers in the field who operated cannons could use to determine the amount of powder to place in a cannon and the firing angle needed to deliver their cannonballs and other destructive loads specific distances under particular wind conditions. I'm sorry I have no reference for this information other than my HS Math instructor, but it might be researched. I believe the job title of such persons included the word "Computer." — Preceding unsigned comment added by 174.62.95.48 (talk) 17:33, 1 December 2012 (UTC)

I've updated the article to note that (according to SOED) the word - referring to a person - dates back to the mid 17th century. http://thefreedictionary.com/computer also includes the "person" definition, as does https://en.wiktionary.org/wiki/computer#English. Realistically though, the definition as a person is obsolete; in modern usage the term invariably refers to a device - or can someone produce a reference to the contrary?
I think an example of current usage of the word "computer" for anything other than an electronic device would be helpful. There are examples such as the slide rule and the billiard-ball computer, but I can't think of a currently-used, practical device that anyone calls a computer that is not electronic. Mitch Ames (talk) 01:33, 2 December 2012 (UTC)

Digital computers were also developed in Germany, like the world's first functional program-controlled Turing-complete computer, the Z3, which became operational in May 1941

That's just missing from the article. You write that the first digital computers were constructed in the U.S. and the U.K., but you don't mention Germany, although the "world's first functional program-controlled Turing-complete computer, the Z3, which became operational in May 1941" was constructed in Germany by Konrad Zuse. http://en.wikipedia.org/wiki/Konrad_Zuse — Preceding unsigned comment added by 77.117.246.168 (talkcontribs) 12:08, 2 November 2012‎


Seems like someone wants to rewrite history... the article is not editable! 79.239.50.153 (talk) 22:38, 15 December 2012 (UTC)

The article has been semi-protected for more than five years due to high levels of vandalism. If you'd like to suggest a change, please follow the procedures outlined here. Rivertorch (talk) 22:50, 15 December 2012 (UTC)

Edit request on 5 December 2012

Mr. Rahul Sharma Lucknow

What Computer and Information Systems Managers Do Computer and information systems managers, often called information technology managers (IT managers or IT project managers), plan, coordinate, and direct computer-related activities in an organization. They help determine the information technology goals of an organization and are responsible for implementing the appropriate computer systems to meet those goals.

Work Environment Most large companies have computer and information systems managers. The largest concentration of IT managers works for computer systems design and related services firms. Most IT managers work full time.

How to Become a Computer and Information Systems Manager A bachelor’s degree in computer or information science plus related work experience is typically required. Many computer and information systems managers also have a graduate degree.

Pay The median annual wage of computer and information systems managers was $115,780 in May 2010.

Job Outlook Employment for computer and information systems managers is projected to grow 18 percent from 2010 to 2020, about as fast as the average for all occupations. Growth will be driven by organizations upgrading their IT systems and switching to newer, faster, and more mobile networks.

Similar Occupations Compare the job duties, education, job growth, and pay of computer and information systems managers with similar occupations.

O*NET O*NET provides comprehensive information on key characteristics of workers and occupations.

Contacts for More Information Learn more about computer and information systems managers by contacting these additional resources.

Not done. This is not a suitable sub-section for this article. --Wtshymanski (talk) 14:43, 5 December 2012 (UTC)

Edit Request - "Professions and organizations - Computer Engineering" - on 24 December 2012:

Computer engineering is the study of both the "hardware & software" sides of computer systems. http://en.wikipedia.org/wiki/Computer#Professions_and_organizations I see it is included with "Hardware-related" professions only. I suggest adding it to "Software-related" too, or you could insert a new row, "Hardware & Software Related", and then include the fields that combine H/W & S/W in their study (like computer engineering). I also suggest adding these references about computer engineering:

  • http://www.acm.org/education/education/curric_vols/CE-Final-Report.pdf "Page 4: Computer engineering is defined as the discipline that embodies the science and technology of design, construction, implementation, and maintenance of software and hardware components of modern computing systems and computer-controlled equipment. Computer engineering has traditionally been viewed as a combination of both computer science (CS) and electrical engineering (EE)."
  • http://www.wisegeek.com/what-is-a-computer-engineer.htm "and integrating software options with the hardware that will drive the applications.", "Some of the common tasks associated with the computer engineer include software design that is customized for a particular industry type. Operating systems that are peculiar to the culture of a given company often require the input of a computer engineer, ....."

Thanks in advance. --41.46.105.96 (talk) 04:45, 24 December 2012 (UTC)
EDIT:
Here is my request in the form of (Change X to Y):
Please change:
Software-related: Computer Science, ..., Video game industry, Web design
to:
Software-related: Computer Science, Computer Engineering, ..., Video game industry, Web design
Found here: http://en.wikipedia.org/wiki/Computer#Professions_and_organizations 41.46.102.52 (talk) 16:11, 25 December 2012 (UTC)

Not done - this article has a link to Computer engineering already which would be a better place to discuss that topic. --Wtshymanski (talk) 23:12, 24 December 2012 (UTC)
Thanks for your reply, but you got me wrong. I don't mean to argue about the definition of CE. This page is about "Computer"; I am just saying: *this* page, which is about computers, states that "CE is the hardware study of computers", which is FALSE. My aim is to correct this FALSE info on *this* page, and I have shown by the references & citations above that the CE classification on *this* page is NOT CORRECT and needs to be revised. Thanks again :D 41.46.102.52 (talk) 16:03, 25 December 2012 (UTC)

Update needed in "Software" Section

Under the Software section of this article, where it talks about various operating systems, it seems that Windows 8 is not present. I believe this is an error and it should be updated. FranktheTank (talk) 14:31, 7 January 2013 (UTC)

 done Mitch Ames (talk) 13:42, 8 January 2013 (UTC)

what is a computer and what does it do?

Lengthy text hatted—apparently unrelated to improving article
The following discussion has been closed. Please do not modify it.

A computer is a programmable electronic device that accepts input; performs processing operations; outputs the results; and provides storage for data, programs, or output when needed. Most computers today also have communications capabilities. This progression of input, processing, output, and storage is sometimes called the information processing cycle.

Data is the raw, unorganized facts that are input into the computer to be processed. Data that the computer has processed into a useful form is called information. Data can exist in many forms, representing text, graphics, audio, and video. One of the first calculating devices was the abacus. Early computing devices that predate today's computer include the slide rule, the mechanical calculator, and Dr. Herman Hollerith's punch card tabulating machine and sorter. First-generation computers, such as ENIAC and UNIVAC, were powered by vacuum tubes; second-generation computers used transistors; and third-generation computers were possible because of the invention of the integrated circuit (IC). Today's fourth-generation computers use microprocessors and are frequently connected to the Internet and other networks. Some people believe that fifth-generation computers will likely be based on artificial intelligence.

A computer is made of hardware (the actual physical equipment that makes up the computer system) and software (the computer's programs). Common hardware components include the keyboard and mouse (input devices), the CPU (a processing device), monitors/display screens and printers (output devices), and storage devices and storage media (such as CD and DVD drives, hard drives, USB flash drives, and flash memory cards). Most computers today also include a modem, network adapter, or other type of communications device to allow users to connect to the Internet or other networks.

All computers need system software, namely an operating system (usually Windows, Mac OS, or Linux), to function. The operating system assists with the boot process and then controls the operation of the computer, such as allowing users to run other types of software and manage their files. Most software programs today use a variety of graphical objects that are selected to tell the computer what to do. The basic workspace for Windows users is the Windows desktop.

Application software consists of programs designed to allow people to perform specific tasks or applications, such as word processing, web browsing, photo touch-up, and so on. Software programs are written using a programming language. Programs are written by programmers; computer users are the people who use computers to perform tasks or obtain information. — Preceding unsigned comment added by 41.46.4.216 (talk) 18:06, 7 March 2013 (UTC)
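As a small aside to the hatted text above, the input, processing, output, and storage cycle it describes can be sketched in a few lines of Python; the squaring operation and the results.txt file name are invented purely for illustration:

    # A toy illustration of the information processing cycle:
    # input -> processing -> output -> storage.

    def process(data):
        # "Processing": turn raw data into useful information.
        # Squaring is an arbitrary stand-in for any computation.
        return data * data

    raw = input("Enter a number: ")      # input
    info = process(int(raw))             # processing
    print("Result:", info)               # output

    with open("results.txt", "a") as f:  # storage for later use
        f.write(raw + " -> " + str(info) + "\n")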

If this has any bearing on the article, please feel free to explain how, and uncollapse this discussion. Rivertorch (talk) 23:14, 7 March 2013 (UTC)

Edit request on 11 April 2013

HARDWARE: Some important components of hardware are listed below: 1. Input 2. Memory 3. Processor, etc. - 118.107.133.149 (118.107.133.149) 07:47, 11 April 2013‎

Like the ones in the Components section? - X201 (talk) 08:06, 11 April 2013 (UTC)

Hoaxes

Around the end of the 10th century, the French monk Gerbert d'Aurillac brought back from Spain the drawings of a machine invented by the Moors that answered either Yes or No to the questions it was asked. Again in the 13th century, the monks Albertus Magnus and Roger Bacon built talking androids without any further development (Albertus Magnus complained that he had wasted forty years of his life when Thomas Aquinas, terrified by his machine, destroyed it).

This was in the "history of computing" section. First, I think the Moorish machine (a brazen head) was either a hoax or a glorified Magic 8-Ball sort of thing — either way, there's not much computation going on. (If the "machine" spoke aloud, it was certainly a hoax.) Second, I'm a bit baffled that it didn't mention that Magnus and Bacon's machine was a hoax. Sure, it may seem too obvious to mention, but leaving it out makes it looks like the sort of random nonsense that people sometimes put in Wikipedia. Finally, this really has very little to do with computing. People of the time had very little notion of what we would call a computer, and would likely not be able to readily equate the concept of talking machines with computation. I suspect these people weren't imagining computing devices, but rather talking inanimate objects. Does the notion of a talking mirror bring computation to mind? Surely not. I'm removing this from the article. - furrykef (Talk at me) 12:41, 29 May 2013 (UTC)

thoughts

Hmm, I'd bet many pounds this article was heavily edited by a Brit. I guess that's obvious to anyone reading it.

Only a Brit would take a manual describing the construction of EDVAC - "First Draft of a Report on the EDVAC" - run off and make a copy of it, and then try to claim they were first. The EDVAC was a working von Neumann computing machine, and was presented to the public in 1947. It just wasn't able to be delivered to the customer until 1949 because of patent disputes. So claiming that British copies of it, demonstrated in 1948 and based on the EDVAC manual, were first is just plain wrong in so many ways.

Looking down the article, at the integrated circuit and other areas, I'm proud to note that in most places no American felt the need to point out that every US first was in fact American. This does seem to be a European obsession. — Preceding unsigned comment added by Dkelly1966 (talkcontribs) 17:25, 29 May 2013 (UTC)

Right. Chauvinism being unheard of among Americans. Rivertorch (talk) 18:27, 29 May 2013 (UTC)

ABC vs Z in lede

I see we now have a debate over who was first with the computer in the introduction of this article. I don't think that's anything but troll-bait in general, and it doesn't help explain what a computer is anyway. We have both the ABC and Z3 computers in the history section, and I think that's where they belong, only. I'm thinking of deleting both from the lede. --A D Monroe III (talk) 22:20, 26 June 2013 (UTC)

After no objections for a week, I have removed it. A D Monroe III (talk) 21:54, 2 July 2013 (UTC)
IIIraute has reverted my removal of an expanding argument about who was first to make the computer from the introduction of the article, with the comment "restore sourced content - no consensus for removal".
As to "sourced", yes, the content is sourced, but the sources listed duplicate the sources in the history section, where they belong.
As to "consensus", I did state the reasons for my doing this just above. And, yes, while no one else specifically agreed with my proposal, no one disagreed, for a week. No objections or comments counts as consensus for obvious and trivial concerns, as I consider this to be. Myself, I would respond here before I'd do such a revert.
The fragments of computer history that IIIraute restored are incomplete, and have been and will be a target for people to insert their POV. That's bad enough, but more importantly, a debate on who's first has nothing to do with the basic explanation of a computer, so it doesn't belong in the lede at all. The article's history section duplicates this information already, in a more complete and coherent fashion.
If IIIraute or others don't specifically state some counter-arguments, I will restore my edit in the next day or so. A D Monroe III (talk) 16:27, 3 July 2013 (UTC)
I don't see why this information should be removed - especially since you didn't care to correct or remove the "The first electronic digital computers were developed between 1940 and 1945 in the United Kingdom and United States." claim.--IIIraute (talk) 16:41, 3 July 2013 (UTC)
Can't we get away from nationalistic promotion of "firsts"? Which computer is counted as first depends heavily on a whole range of adjectives before the word "computer". Such adjectives include "electronic" "digital" "programmable" "automatic" "binary" and "stored-program". Also, it is contentious to date the time that is appropriate in this context. Should it be the time that the idea first arose, the time that a reasonably complete design was written down, the time that some component first worked, the time that the reasonably complete machine was first demonstrated in a laboratory or workshop, or the time that it first started to serve users addressing real problems?--TedColes (talk) 17:08, 3 July 2013 (UTC)
To IIIraute's point, I did remove the "UK and US" part, as that implies some claim to being first; you're putting it back. (BTW, it left out Germany, promoting future "fixes" by "pro-German" editors – real or imagined.) I'm not saying the intro shouldn't mention history, but it should only mention it. The whole point of the introduction is to summarize, not give details, especially details that provoke POV reactions.
I agree with TedColes. Any debate on "first" will come to arguments on definition of "computer", which usually ends up based on bias, or just being loud and insensitive.
But, really, please just read the intro, but skip this "ABC vs Z3" middle paragraph. It just reads better without it. The paragraph contradicts itself, is disjointed, and even stoops to weasel words. It's the result of conflicting editors with conflicting views. If we leave it there, it will get worse.
And, again, detailing claims to "first" doesn't help explain the topic of computers. It doesn't belong in the intro. A D Monroe III (talk) 21:00, 3 July 2013 (UTC)
You are right, you did remove the "UK and US" part - my mistake - I am really sorry! It reads better without it. I have reverted my revert of your edit. --IIIraute (talk) 22:33, 3 July 2013 (UTC)

A lot of information from the women in computing article deserves to be in this article as well.

I put up the tag that mentions how there is information missing when it comes to women in computing in this article. Lovelace isn't mentioned, and the use of "computer" to refer to women isn't mentioned at all in this article (human computer talks about humans as computers and even has pictures of women serving as computers). My changes to this article are to include the usage of "computer" to refer to humans, especially female humans.--JasonMacker (talk) 00:16, 27 April 2013 (UTC)

I added a paragraph in the history of computing section on Lovelace and added a picture of her as well. The information was pulled from the lead of Ada Lovelace, as well as Lovelace's mention in Women in computing#Timeline_of_women_in_computing.--JasonMacker (talk) 00:32, 27 April 2013 (UTC)

I replaced the artificially darkened picture of the ENIAC with a much clearer one of the main control panel.--JasonMacker (talk) 01:05, 27 April 2013 (UTC)

Has this issue now been resolved with the addition of information about Lovelace? I'm assuming it has (given the lack of discussion in this section for quite a long time, and the fact that I can't see a problem with this article any more), so I'll remove the tag, let me know if there still are issues, so that the tag can be put back. Cliff12345 (talk) 02:13, 2 July 2013 (UTC)

As much as I understand the appeal of, and even delight in the idea that the world's first computer programmer was a picturesque Victorian-era noblewoman with a flowery name and title (who moreover happened to be the daughter of a celebrated and influential Romantic poet), I sense a subtle sexist double standard here: Babbage was at least equally important to the history of the computer – Babbage could be described as the inventor of computing hardware and Lovelace of software –, but only Lovelace is depicted here, while a portrait of Babbage is not included (although available). Keep in mind that this helps perpetuate the insidious stereotype that women's primary function is decorative, not intellectual. I'm not trying to insinuate any of this to be conscious, let alone intentional, lest I offend the principle of assuming good faith, but it's worth considering.
It might seem like a trivial detail if you've never really tried to understand feminism and what it is really all about (I would never have spotted subtle details like this either not long ago, and even considered it ridiculous to find fault in something like this), but once you develop a sensitivity to the issue, it does stand out. Especially since mention of Ada's role was added to the article specifically to further the cause of feminism. I'm sure you guys are well-intentioned, in fact! (I'm a guy myself, by the way – my first name appears feminine to many Anglophones, I've found.) --Florian Blaschke (talk) 13:16, 29 July 2013 (UTC)

Edit request on 22 October 2013

I have a video that I would like to embed in the computer components section of a Wikipedia article. I had to create it for school, at Alverno College. Can you give me permission to embed this video? It is one minute and 54 seconds long.

This is a digital representation of the components of a "slimline" desktop computer.

Tlenyard (talk) 00:36, 22 October 2013 (UTC)

  • Please tell me exactly where you want this video and how you want it formatted and I'll consider adding it for you. Your alternative is to make ten useful edits in the next four days which will make you autoconfirmed and allow you to edit it yourself. Technical 13 (talk) 02:32, 22 October 2013 (UTC)

Edit request on 31 October 2013

replace {{p.|61-62}} with pp. 61–62; there is no template:p. 174.56.57.138 (talk) 19:55, 31 October 2013 (UTC)

Done. Thanks. --    L o g  X   20:01, 31 October 2013 (UTC)

Computer

What is a Computer? A computer is a programmable machine. The two principal characteristics of a computer are: it responds to a specific set of instructions in a well-defined manner, and it can execute a prerecorded list of instructions (a program).

Modern Computers Defined: Modern computers are electronic and digital. The actual machinery (wires, transistors, and circuits) is called hardware; the instructions and data are called software. All general-purpose computers require the following hardware components:

  • Memory: enables a computer to store, at least temporarily, data and programs.
  • Mass storage device: allows a computer to permanently retain large amounts of data. Common mass storage devices include disk drives and tape drives.
  • Input device: usually a keyboard and mouse, the input device is the conduit through which data and instructions enter a computer.
  • Output device: a display screen, printer, or other device that lets you see what the computer has accomplished.
  • Central processing unit (CPU): the heart of the computer, this is the component that actually executes instructions.

In addition to these components, many others make it possible for the basic components to work together efficiently. For example, every computer requires a bus that transmits data from one part of the computer to another. — Preceding unsigned comment added by 182.182.124.217 (talk) 10:50, 10 December 2013 (UTC)
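To make the "prerecorded list of instructions" idea above concrete, here is a minimal sketch in Python of a toy machine that fetches and executes instructions one by one; the three-instruction mini language (LOAD/ADD/PRINT) is hypothetical, invented only for illustration:

    # A toy fetch-execute loop: the "program" is a prerecorded list of
    # instructions, and the machine responds to each one in a
    # well-defined manner.

    program = [
        ("LOAD", 2),     # put 2 in the accumulator
        ("ADD", 40),     # add 40 to it
        ("PRINT", None), # output the result
    ]

    accumulator = 0
    for opcode, operand in program:  # fetch each instruction in order
        if opcode == "LOAD":
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "PRINT":
            print(accumulator)       # prints 42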

Usefulness to mere mortals required.

Hi Wikipedia people

Er - I don't understand lots and lots of things in this article. In fact it just bamboozles me half the time. I'm a college lecturer and have been using computers to make music with since 1985. I somehow think your aim should be for someone like me to understand your article, but it's way way too tech savvy to actually be of worth to people who wish to learn something from your work.

You all evidently have fantastic knowledge, but you really need to work out how to share that knowledge with others who seek it, rather than just passing it around yourselves. Coming up with an acronym and saying that that is a "language" or a "compiling language" is not much use to people who don't know what a computer language is...

Please TRY to address those who have come to you for knowledge in a way that does not put them off returning to these wonderful resources.

My Name Is Andy. I do Not write for WP, but I feel that I may be allowed to comment. Thankyou — Preceding unsigned comment added by 109.154.8.72 (talk) 21:11, 9 December 2013 (UTC)

You're not alone. Click this link to read a Wikipedia guideline that implores editors to make articles readable (this article isn't much worse than other articles). You might prefer to click Simple:Computer for a simpler version, but many Simple English Wikipedia articles are no simpler than English Wikipedia; editors' vanity is almost as big a problem over there.
Do you know how to click links? I think you mean the table at Computer#Languages (click to see what it is). If you read that table and you don't know what an assembly language is, for instance, click the blue link where it says assembly language. Art LaPella (talk) 00:57, 10 December 2013 (UTC)

Thankyou very much Art LaPella - for taking the time to respond. Yes, I do realise that I can get definitions of various terms by clicking links - but of course this is a very disruptive process for the learner and I would at least like to feel that a subject as important as The Computer could at least open with enough information to satisfy the enquiring mind without the over use of complex terminology that requires navigation away from the subject in hand. However, you seem to have a similar view. So I will not preach to the converted. Andy — Preceding unsigned comment added by 109.154.8.72 (talk) 00:29, 11 December 2013 (UTC)

New sections: ‎Advantages/Disadvantages of computers

I'm not a fan of these recently-added new sections. They are a mix of obvious statements with a few questionable ones thrown in, and it's in list form, which we generally don't go for on Wikipedia. Plus, it's not sourced. Is there anything that can be done to fix this, or should we just remove them? --A D Monroe III (talk) 18:59, 17 December 2013 (UTC)

I agree with this. Given the importance of the topic and the large number of good, well referenced articles relating to it, I think that this article has a very long way to go to get up to a similar standard.--TedColes (talk) 22:42, 17 December 2013 (UTC)
Remove it. The inanity of it reminds me of a grade-school textbook. Thanatosimii (talk) 19:57, 4 January 2014 (UTC)

These sections are inappropriate. They generally detract from the article and violate WP:USEPROSE, WP:CITE, WP:VER, and perhaps WP:POV and/or WP:OR. Needs to go. --R. S. Shaw (talk) 23:06, 15 January 2014 (UTC)

Category:Computers in fiction

Am I to assume that when a technology becomes widespread enough in the real world, listing its instances in fiction becomes pointless? Someone a thousand years from now may not take particular notice when they replicate their food or walk through a teleporter; they would probably see those things as being as mundane as we see things like using a car or a refrigerator; however, someone from the past without that technology would certainly notice it. CensoredScribe (talk) 19:25, 18 March 2014 (UTC)

Not necessarily. The question is not whether or not the element is common or not, but rather whether or not it is a defining element of the subject. For example, Speed Racer has fictional categories for racing drivers and motorsports. Both are reasonably common in real life, but both are defining characteristics of Speed Racer (also, Mach Five is in "Fictional racing cars", a defining characteristic of the car).
For comparison, Tron is (appropriately, IMO) in "Artificial intelligence in fiction". It would be impossible to discuss Tron without discussing the fictional computer. Star Trek, OTOH, is not in "Artificial intelligence in fiction" or any similar category, despite the ship's computer or Data being significant elements of the franchise. The computers are not defining elements of the franchise. - SummerPhD (talk) 03:09, 19 March 2014 (UTC)

Semi-protected edit request on 24 April 2014

A computer is an electronic device that takes input from the user, analyzes it, processes it, and gives the desired output, and it also provides the capability of storing data for future use. Cncreate (talk) 05:32, 24 April 2014 (UTC)

Not done: as you have not requested a change.
If you want to suggest a change, please request this in the form "Please replace XXX with YYY" or "Please add ZZZ between PPP and QQQ".
Please also cite reliable sources to back up your request, without which no information should be added to any article. - Arjayay (talk) 07:20, 24 April 2014 (UTC)

Computer System Information

'To view the detailed computer system information:' If you have a personal computer and you want to see all the details of your PC, then you have to open the "Run" dialog box. After opening Run, type the command "msinfo32.exe" and simply press the "Enter" key. After that, a window will pop up containing all the computer details. — Preceding unsigned comment added by Rhlraypure (talkcontribs) 10:27, 25 September 2014 (UTC)

Section degradation is odd

I am not about to make an edit, but the section on "Degradation" and the links to ants are a bad joke and should be removed. — Preceding unsigned comment added by 131.180.145.184 (talk) 13:33, 17 December 2014 (UTC)

Bell labs, Silicon Valley

The entire wiki page on the history of computers is seriously lacking. There is an overwhelmingly British bias in the history section. The article hardly mentions the role of transistors and microprocessors, random access memory, or modern programming languages, all invented in the US. It's the equivalent of cutting the history section off at the ancient Greek computers. I am saying this as a computer scientist myself. The vast majority of the history of computers was written in the last 30 years, not the 1940s. — Preceding unsigned comment added by 68.198.27.88 (talk) 01:29, 11 February 2015 (UTC)

You are free to add more sourced facts. But be aware there are some other articles about the topic:
--Kgfleischmann (talk) 05:21, 11 February 2015 (UTC)

Semi-protected edit request on 9 February 2015

The word "medieval" in the introduction needs a "the" in front of it, because otherwise it's bad English. Or perhaps reword it to "the Middle Ages" (if that is indeed the same thing). 121.74.155.190 (talk) 21:25, 9 February 2015 (UTC)

Done, thanks. —Nizolan (talk) 00:02, 10 February 2015 (UTC)

hjj


uu ↔ — Preceding unsigned comment added by 112.198.134.17 (talk) 04:09, 17 February 2015 (UTC)

How to Add Subtitles

Lots of media players will allow you to select multiple subtitle files to play with your movie, but sometimes you just can't load the subtitles, no matter how hard you try. In these cases, you may want to hardcode the subtitles into the video file itself. This means that the subtitles will always appear, regardless of what media player you are using. To do this, you will need to re-encode the video file, which will add the subtitles directly to the frames. www.subtitl.xyz, read on after the jump to find out how. — Preceding unsigned comment added by Lolopopococo (talkcontribs) 10:04, 31 March 2015 (UTC)
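For what it's worth, the re-encoding step described above is commonly done with ffmpeg's subtitles filter; a minimal sketch in Python, assuming ffmpeg (with libass support) is installed and on the PATH, and with input.mp4 and subs.srt as placeholder file names:

    # Burn ("hardcode") subtitles into the video frames by re-encoding.
    import subprocess

    subprocess.run(
        [
            "ffmpeg",
            "-i", "input.mp4",            # source video
            "-vf", "subtitles=subs.srt",  # draw subtitle text onto each frame
            "output.mp4",                 # re-encoded video with burned-in subs
        ],
        check=True,  # raise an error if ffmpeg fails
    )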

Semi-protected edit request on 18 May 2015

Yossi reiche ris awesome 101.2.171.162 (talk) 04:55, 18 May 2015 (UTC)

Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format. ekips39 (talk) 05:18, 18 May 2015 (UTC)

Semi-protected edit request on 9 June 2015

Please link 'Geoff Tootill' in the section on the 'Manchester Small-Scale Experimental Machine to the relevant wikipedia page at https://en.wikipedia.org/wiki/Geoff_Tootill.

This is my father and I intend to flesh out the biography. I don't think I am able to make the change myself, even though I am logged in as a registered user. Thanks, Peter Tootill

 Done. You will be able to edit "semi-protected" articles such as this one once you have made ten edits with the account. Your request here was your second edit. -- John of Reading (talk) 11:01, 9 June 2015 (UTC)

Clarification Sought on First Turing Complete Computer

It's not clear whether the first Turing Complete (TC) computer actually built and running was the one by C. Babbage's son, or the German Z3. A distinction and description should be made between the first TC design, and the first TC machine actually built and running. It perhaps should also be noted that the Z3 probably didn't take advantage of TC features while in use. Thus, there may be a third category: first computer that actually took advantage of TC features, although an unambiguous definition may be tricky to craft. 146.233.0.201 (talk) 21:57, 29 July 2015 (UTC)

Semi-protected edit request on 28 August 2015

59.88.42.73 (talk) 15:46, 28 August 2015 (UTC)

You need to tell us what edit you want made to the article. Add it below and set the answered field in the template to "no". - X201 (talk) 15:50, 28 August 2015 (UTC)

Please swing by and help improve this new article! :D--Coin945 (talk) 03:30, 2 October 2015 (UTC)

Some mention should be made of this 1913 device in the section on early computers, particularly if more details about it, how it works, when it was built, etc. are available. http://gothamist.com/2015/10/15/grand_central_computer_video.php 173.160.221.10 (talk) 17:35, 5 November 2015 (UTC)

'General purpose' part of the definition????

Why is a computer defined as a general purpose device? There are many many special purpose computers!! 82.72.139.164 (talk) 15:45, 4 December 2015 (UTC)

Because that is a definition, not a description of one specific example of a computer. Andy Dingley (talk) 16:01, 4 December 2015 (UTC)
I'm not sure what you mean by that. 82.72.139.164 (talk) 19:08, 9 December 2015 (UTC)

Hello fellow Wikipedians,

I have just added archive links to 3 external links on Computer. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 07:24, 18 January 2016 (UTC)

Semi-protected edit request on 30 January 2016

2602:301:779F:DCA0:BCB9:44CA:70AD:384C (talk) 18:02, 30 January 2016 (UTC)

Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format. Datbubblegumdoe[talkcontribs] 18:18, 30 January 2016 (UTC)

Hello fellow Wikipedians,

I have just added archive links to one external link on Computer. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true to let others know.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 15:28, 19 February 2016 (UTC)


Notes were easy to understand

The computer notes were easy to understand for me as a Form 6 computer student. Singhtanisha (talk) 18:40, 22 February 2016 (UTC)

Appreciate your feedback, but please be aware that this page is for discussions about improving the article. -- ChamithN (talk) 18:45, 22 February 2016 (UTC)

Semi-protected edit request on 26 February 2016

Waahwooh (talk) 18:51, 26 February 2016 (UTC)

Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format. EvergreenFir (talk) Please {{re}} 19:26, 26 February 2016 (UTC)

Hello fellow Wikipedians,

I have just added archive links to one external link on Computer. Please take a moment to review my edit. If necessary, add {{cbignore}} after the link to keep me from modifying it. Alternatively, you can add {{nobots|deny=InternetArchiveBot}} to keep me off the page altogether. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—cyberbot IITalk to my owner:Online 07:42, 1 March 2016 (UTC)

"The Atanasoff–Berry Computer (ABC) was the world's first electronic digital computer, albeit not programmable."[2

According to my sources, the A-B computer was first successfully used in summer 1941, while the Zuse Z3 was already successfully used in May 1941. http://de.wikipedia.org/wiki/Computer — Preceding unsigned comment added by 178.115.250.216 (talkcontribs) 12:45, 2 November 2012‎

Edit request: fix anachronism in Roman vs Babylonian abacus

Currently the article states "The Roman abacus was used in Babylonia as early as 2400 BC." This makes no chronological sense - the Babylonian culture preceded the Roman culture. As a simple emergency fix, until someone finds a better solution, please replace with "The precursor of the Roman abacus was used in Babylonia as early as 2400 BC." — Preceding unsigned comment added by 86.158.154.88 (talk) 10:26, 12 May 2016 (UTC)

 Done --A D Monroe III (talk) 15:24, 8 July 2016 (UTC)

Edit request: "programmable" contradiction

The article begins "A computer is a general purpose device that can be programmed ..."

In the section "Analog computers" is "... many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable ..."

One or the other or both need repair. Thanks, 73.71.159.231 (talk) 17:52, 14 June 2016 (UTC)

Analog computers are specialist, not general purpose computers, and are not programmable. However, they are normally considered on topic here due to the close historical links. For example, the Bombe was an analog computer. GliderMaven (talk) 19:17, 14 June 2016 (UTC)
Being "programmable" is an attribute, not a definition. A digital electronic computer is not necessarily a programmable computer, see Digital electronic computer. This article needs repair. 73.71.159.231 (talk) 20:15, 14 June 2016 (UTC)
Actually, it's Digital electronic computer that needs work; it's a stub (or worse, redundant with this article). Current computers are programmable by all useful definitions, even if original analog computers were not. Similarly, most of the rest of the lede did not apply to all analog or mechanical precursors to modern computers; that's not important. --A D Monroe III (talk) 21:52, 14 June 2016 (UTC)
Ah - you've trapped me. This article begins with "A computer is a general purpose device that can be programmed". Very clever - restrict "computer" to "general purpose" and it would seem they have to be programmable. Of course!!! Just a little detail - is it true that there are no special purpose computers? 73.71.159.231 (talk) 01:59, 15 June 2016 (UTC)
We're trying to understand your concern; is it "programmable" or "general purpose"? If it's some combination, that's hard to follow. Can you make a specific statement of what it should be changed to? --A D Monroe III (talk) 17:00, 20 June 2016 (UTC)
One stupidly nerdy point about Wikipedia is that the opening sentences don't define the term so much as the topic; the topic here is general purpose computers; that doesn't imply there are no special purpose computers; if you want to investigate other uses of the term 'computer' you should check out the disambiguation page that is linked from the top. GliderMaven (talk) 19:45, 20 June 2016 (UTC)
It would be an error to require all computers, by definition, to be "general purpose". Significant numbers of them are single specialised purpose. As general purpose computers (which means most digital computers) become cheaper to provide as overall systems, then the number of special-purpose computers will reduce. It is not though a defining requirement to be general purpose. Andy Dingley (talk) 17:13, 20 June 2016 (UTC)
Is the topic "general purpose" then? That's different from the header just above, but okay.
As I see it, the idea behind "general purpose" is that computers are popular because they are complex machines that can be easily modified -- software. While special purpose computerized products exist (in fact, it could be argued that no truly general purpose ones exist), the computer technology they utilize is (nowadays) based on the ability to execute any given instructions. Gone are the days when designing a new product meant designing a new programming language, operating system, and cpu architecture; the computer inside isn't designed for the product, but vice-versa, made easy by the very general purpose nature of modern computers. The computer hardware I'm using to write this could be found in a jet plane, or a factory, or a slaughterhouse.
Now, is that stated clearly in the lede? No. Is it sourced? No. Are these the issues? --A D Monroe III (talk) 23:22, 20 June 2016 (UTC)
The lede continues to be modified for various views on this, yet also seems to get worse for opposing views. I've expanded the lede in this edit to attempt to combine all the concerns stated above, as I understand them. The added wording skirts around the non-programmable or non-general purpose aspects of some early computers, even though those are no longer significant. It also adds a paragraph of history to highlight these changes to computers. A history summary is needed anyway, per WP:LEAD, since that's a large part of the article. Comments?
(A minor point; this adds several links, many of which also appear later in the body. Some editors are quick to eliminate any and all duplicate links; I'm not. Let's leave this until we see changes to the lede have settled down.)
--A D Monroe III (talk) 15:44, 8 July 2016 (UTC)

please change ((digital)) to ((Digital data|digital))

Done — Andy W. (talk ·ctb) 21:03, 9 July 2016 (UTC)

Semi-protected edit request on 5 October 2016

There is a simple grammatical error that I wish to change so people will not get accustomed to the use of wrong grammar.

Flame Rider (talk) 15:40, 5 October 2016 (UTC)

Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format. -- John of Reading (talk) 16:14, 5 October 2016 (UTC)

Hello fellow Wikipedians,

I have just modified 3 external links on Computer. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 07:43, 29 November 2016 (UTC)

Hello fellow Wikipedians,

I have just modified 2 external links on Computer. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 20:34, 31 December 2016 (UTC)

Semi-protected edit request on 1 January 2017

Header text Header text Header text
Example Example Example
Example Example Example
Example Example Example

202.70.191.30 (talk) 02:52, 1 January 2017 (UTC)
Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format. DRAGON BOOSTER 06:08, 1 January 2017 (UTC)

Edit request: fix notes:22 (Dead Link)

The link under Notes that is number 22, "Crash! The Story of IT: Zuse", is currently a dead link. There is an updated article at [4]; all that needs to be changed is the URL it links to. Msearce (talk) 01:34, 14 July 2016 (UTC)

 Done. Also changed "replacement of" decimal to "rather than using" decimal since Zuse didn't know of the Babbage decimal machine, but designed his binary one completely independently. --A D Monroe III (talk) 19:32, 14 July 2016 (UTC)

Very nice Alshamiri1 (talk) 14:56, 6 March 2017 (UTC)

Hello fellow Wikipedians,

I have just modified one external link on Computer. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 23:18, 12 June 2017 (UTC)

Resettability

This article contains a major mistake in how the functionality of a computer is defined. The mistake is that the definition does not mention that a computer is always resettable in order to carry out successive operations, and is therefore in fact different from a system comparable to a child being instructed to eat (defined by gender) his or her peas and doing so. This is a two-part complaint, concerning: i.) resettability of operation and ii.) inevitability of operation — Preceding unsigned comment added by 88.65.146.27 (talk) 16:20, 1 August 2017 (UTC) 88.65.146.27 (talk) —Preceding undated comment added 16:15, 1 August 2017 (UTC)

additional errors in definition of computer

paragraph 2:

sentence 1: my personal computer did not control a part of Wikipedia, because I was reading the Wikipedia article titled "Computer".
Ergo: not all computers are control systems.
sentence 2, subsentence 3: it needs to be "general purpose devices" instead of "in general purpose devices", due to the nonexistence of the latter.

paragraph 3:

sentence 1, subsentence 2: there were people not in possession of an abacus, even during a version of the badly defined period of "ancient times"; therefore they could not have been aided by simple manual devices like the abacus. Instead, only some people were enabled to receive aid through using the abacus.
I do acknowledge the use of "ancient times" as useful for the sake of using stylistic devices, which is said to increase the likelihood that some people understand the meaning of statements.

paragraph 4:

sentence 4: a modern computer needs to have at least one input (e.g. a set of buttons) and one output (e.g. a display). If a modern computer does not contain one of the two, then it is useless, because it would either be a black box with or without input, or it would not be able to execute operations due to the lack of input. Therefore, peripheral devices are not optional if a modern computer, or any other computer, is to be of use to the user.

88.65.146.27 (talk) —Preceding undated comment added 16:58, 1 August 2017 (UTC)

Hello fellow Wikipedians,

I have just modified 3 external links on Computer. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 14:34, 2 August 2017 (UTC)

Introduction description is semantic/other whimsy?

"device that can be instructed to carry out arbitrary sequences" 7viii2k17rdg

Question: Computer use is non arbitrary - rather inter-posed procession of tool use & simulacra systems to r/k selection symbiotic type mutation/exploration?

Mechagnosis fractal of a always destined machined reality could have hard research to use other than 'frame drag'/'latent potential mutation relevance' synarchies? First level description correctness?

User:text_mdnp ~, 7 August 2017 (UTC)

"Arbitrary" is correct. It means that any sequence of instructions may be defined (i.e. "arbitrary" choices, not choices restricted by the processor), and the processor will then carry them out. It's useful for this sequence to form a valid and correct program, but the processor will run incorrect programs as well as correct ones.
I cannot understand any of your posting on this page. Andy Dingley (talk) 09:40, 7 August 2017 (UTC)
The Internet has become self-aware and is editing Wikipedia! --Wtshymanski (talk) 20:44, 7 August 2017 (UTC)
Self aware? That's more than I'm seeing. Andy Dingley (talk) 22:26, 7 August 2017 (UTC)

Semi-protected edit request on 13 April 2018

The full form of the word 'computer' should be written there, with the working of the machine described in the article. I just wanted to make an edit on that by adding two more lines (at least 20 more words), making it easier, because 80% of the audience for this topic are teens and youngsters. Yadav sharn117 (talk) 17:32, 13 April 2018 (UTC)

Not done. It's not clear what you want done. Please state specifically what you would like to see in the form "change x to y". --Wtshymanski (talk) 19:57, 13 April 2018 (UTC)

Semi-protected edit request on 26 April 2018

In the Software section of the article, can you please remove the Operating systems header because it lists software other than operating systems? 192.107.120.90 (talk) 13:37, 26 April 2018 (UTC)

 Done Removed. Was done by Ryanking16 with this edit. Looks like they forgot to "reaarange more [and] add sections in few days." ChamithN (talk) 17:16, 26 April 2018 (UTC)

More Display for PC at home... like page of book

Is there a normal PC (not "mobile") with a larger display (for Windows, Apple, Kindle, etc.) that can "move to the right and back left" like the normal pages of a big book? — Preceding unsigned comment added by 93.38.65.148 (talk) 06:40, 21 June 2018 (UTC)

Lead images purport to show computers from different eras, but none are from earlier than the late '80s at most

And there are earlier images within the article already to use.--occono (talk) 18:53, 23 June 2018 (UTC)

Semi-protected edit request on 22 July 2018

2405:205:A161:5315:9259:920C:B18F:518A (talk) 13:54, 22 July 2018 (UTC)
 Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format and provide a reliable source if appropriate. L293D ( • ) 14:11, 22 July 2018 (UTC)

Semi-protected edit request on 1 September 2018

Please let me help you guys edit this page Jonnie jpt (talk) 16:55, 1 September 2018 (UTC)

Not done: Hi Jonnie jpt! Wikipedia would always be glad to have more volunteers, but unfortunately this article has been a frequent target of vandalism, so editing by newly registered users has been disabled. If you have a specific fact you'd like to add, write it down here in the format "change XXX to YYY" or "after the text ZZZ add new text WWW", reactivate this request, and we will be happy to make the change for you–just be sure to be specific, or otherwise we may not be able to understand your requested edit. If you would like the ability to edit this article yourself, please make at least six more edits and you'll be able to edit semi-protected articles like this one. Best, Altamel (talk) 18:28, 1 September 2018 (UTC)

Is the history section too British-centric?

The content of the History section of this article is very British-centric and might lead one to believe that most major advancements in digital computing occurred principally in the U.K. Even a cursory examination of the works of Brian Randell, an Englishman himself and probably the preeminent and best known historian of early digital computing, shows a more balanced approach to American accomplishments and advancements. I am not suggesting an edit specifically, but I am suggesting as part of clean-up of this article, a little more balanced approach in this section would improve completeness and quality. Ray Trygstad (talk) 17:53, 7 September 2018 (UTC)

Semi-protected edit request on 22 November 2018

please add my links to this https://yogeshksahu.blogspot.com/2018/07/computer-what-is-computer.html Yogeshwarsahu (talk) 13:53, 22 November 2018 (UTC)

 Not done. Not without some compelling reason. –Deacon Vorbis (carbon • videos) 14:19, 22 November 2018 (UTC)

Semi-protected edit request on 17 February 2019

19th century -> 20th century

Under Etymology "From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations.[3]" — Preceding unsigned comment added by 199.58.98.69 (talk) 07:29, 18 February 2019 (UTC)

 Not done. "19th century" was intended and correct in the context of beginning to take on the modern meaning. By 1960 the "machine" meaning was fully established, although the "human" meaning had not quite disappeared. By 1970, the "human" meaning was fully obsolete. --R. S. Shaw (talk) 20:56, 21 February 2019 (UTC)
    I, uh, 20th century IS 1900s. 19th century means 1800s.
    199.58.98.69 (talk) 22:25, 21 February 2019 (UTC)

Semi-protected edit request on 22 March 2019

I am suggesting these changes to improve the page's grammar and writing style.

1. Please change this: The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation.

To this: The sector was developed in the late 16th century and found application in gunnery, surveying and navigation. It was a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots.

Because: The original text is a run-on sentence that can be restructured to improve readability.

2. Please Change this: Babbage's failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow.

To this: Babbage's failure to complete the analytical engine can be attributed to difficulties of politics and financing, as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else.

Because: The writing style can be improved by omitting unnecessary words.

3. Please change this: Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.

To this: Rather than the more difficult to implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.

Because: "harder-to-implement" is not grammatically correct.

4. Please change this: This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.[30]

To this: This design was also fully electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.[30]

Because: "all-electronic" is also not grammatically correct.

5. Please change this:

The U.S.-built ENIAC[37] (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US.

To this:

The ENIAC[37] (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US.

Because: The repetition of "the US" at the start and the end of the sentence makes it redundant. SFU-CMPT376W (talk) 11:00, 22 March 2019 (UTC)

@SFU-CMPT376W:
  1.  Not done I agree that the parenthetical "a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots" is way too long, but pushing it to the end of the paragraph makes the result sound weird. We should describe what the instrument does before how it's used. We can workshop this paragraph together and then I can make the finalized changes for you.
  2.  Done
  3.  Not done "Harder-to-implement" is a valid English construction.
  4.  Not done "All-electronic" is also a valid English construction.
  5.  Done
Qzekrom 💬 they/them 08:43, 23 March 2019 (UTC)

Semi-protected edit request on 18 July 2019

`From the end of the 19th century the word slowly began to take on its more familiar meaning` should be `From the end of the 20th century the word slowly began to take on its more familiar meaning` Cavilacion (talk) 18:45, 18 July 2019 (UTC)

 Done, kind of. I removed this line as the source is no longer available and the content it covers is restated in more detail just below in the same section. Alduin2000 (talk) 23:16, 18 July 2019 (UTC)

Semi-protected edit request on 4 August 2019

remove Greer, Jim in ref. [75]

It is incorrect: it is a repeat of Greer, James C. (the same author; Jim is short for James). The correct author list is Colinge, Jean-Pierre and Greer, James C. Saneness (talk) 10:37, 4 August 2019 (UTC)

 Done Highway 89 (talk) 21:26, 4 August 2019 (UTC)

The Moth

The section on "Bugs" propagates the old trope that Admiral Hopper coined the term "Bug." It includes the famous photo of the technician's notebook that she used to show off that contained a page with a dead moth taped to it. Read what the tech wrote underneath the moth:

   First actual case of bug being found.

He wrote that, and Admiral Hopper used to show it off because it was funny. Everybody knew what a "bug" in a computer was, but here was the first time that a "bug" was found to be caused by an actual bug. A little research will show that engineers had been using "bug" to describe unexplained bad behavior in systems since before computers were invented. — Preceding unsigned comment added by 75.149.30.179 (talk) 14:47, 16 September 2019 (UTC)

computer

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer including the hardware, the operating system (main software), and peripheral equipment required and used for "full" operation can be referred to as a computer system. This term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster.

Computers are used as control systems for a wide variety of industrial and consumer devices. This includes simple special purpose devices like microwave ovens and remote controls, factory devices such as industrial robots and computer-aided design, and also general purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other computers and their users. — Preceding unsigned comment added by 197.26.90.95 (talkcontribs) 14:35, 26 November 2019 (UTC)

Merge

It has been suggested that digital electronic computer be merged with this article. --mikeu talk 17:15, 4 December 2019 (UTC)

That's true, but no case has been made and there has been no support in over a year since the proposal was made. I've therefore closed the proposal, without prejudice to later proposals. Klbrain (talk) 09:56, 22 February 2020 (UTC)

Semi-protected edit request on 28 February 2020


I request that the article explain the definition of what makes a computer a computer. What defines a computer is its ability to receive information, process that information, store the result, and give an output. I make this request because I believe it will give the reader a better understanding of what defines a computer. I think this should be positioned near the beginning of the page. Thank you for your time. ASimpleHuman290 (talk) 15:20, 28 February 2020 (UTC)

 Not done. It's not clear what changes you want to make. –Deacon Vorbis (carbon • videos) 15:35, 28 February 2020 (UTC)
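As an aside, the receive/process/store/output description in the request above is essentially the classic input-process-output model. A minimal sketch in C, with every detail purely illustrative:

    #include <stdio.h>

    /* The four abilities named above, in miniature:
     * receive information, process it, store the result, give an output. */
    int main(void)
    {
        int n;
        if (scanf("%d", &n) != 1)   /* receive information */
            return 1;
        int result = n * n;         /* process that information */
        /* 'result' now holds (stores) the outcome in memory */
        printf("%d squared is %d\n", n, result);   /* give an output */
        return 0;
    }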

Professions and organizations

Are the Software Freedom Conservancy, the Linux Foundation or the OSI notable enough to be added to the Free/open source software groups shortlist? Perhaps this list should also link to List_of_free_and_open-source_software_organizations? --88.96.197.246 (talk) 12:00, 8 June 2020 (UTC)

Semi-protected edit request on 11 August 2020

There are statements needing citations in the article, and I found some dead links and a few source articles to cite. Itsvarunsingh (talk) 06:09, 11 August 2020 (UTC)

 Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format and provide a reliable source if appropriate.  Ganbaruby! (Say hi!) 07:02, 11 August 2020 (UTC)

Wrong definition

In first paragraph there is:

"A 'complete' computer including the hardware, the operating system (main software), and peripheral equipment required and used for 'full' operation can be referred to as a computer system."

This is not true, in my opinion: a computer does not need an operating system to be complete. I propose changing this sentence to:

"A Turing-complete computer can be used to simulate any Turing machine."

Second issue: "computer system" has two meanings. The first is synonymous with computer hardware architecture. The second is computer system = the combination of hardware, software, user and data. The current definition is not correct. 83.31.52.208 (talkcontribs) 11:44, 17 November 2020 (UTC)

Wikipedia is based on WP:Verifiability, not editors' opinions or reasoning. Changes to the basic definition of this subject will require authoritative sources to cite. Present those, and the article will get changed after they are evaluated and accepted. Without those, the article cannot be changed. --A D Monroe III(talk) 22:51, 29 November 2020 (UTC)
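On the Turing-machine wording proposed above, a minimal table-driven sketch in C may make concrete what "simulating a Turing machine" involves; the example machine (it increments a binary number on the tape) and every name in the code are illustrative only:

    #include <stdio.h>
    #include <string.h>

    /* A tiny table-driven Turing-machine simulator (illustrative sketch).
     * rule[state][symbol] gives: symbol to write, head move, next state. */

    struct rule { char write; int move; int next; };  /* move: -1 left, 0 stay */

    enum { HALT = -1 };

    /* map tape characters to symbol indices: '0' -> 0, '1' -> 1, blank -> 2 */
    static int sym(char c) { return c == '0' ? 0 : c == '1' ? 1 : 2; }

    int main(void)
    {
        /* State 0 ("carry pending"), scanning right to left:
         *   on '0' or blank: write '1' and halt
         *   on '1':          write '0', move left, stay in state 0
         * (The left tape edge is not handled; this is only a sketch.) */
        struct rule table[1][3] = {
            { {'1', 0, HALT}, {'0', -1, 0}, {'1', 0, HALT} }
        };

        char tape[32];
        memset(tape, '_', sizeof tape);   /* '_' is the blank symbol */
        memcpy(tape, "1011", 4);          /* input: binary 1011 = eleven */

        int head = 3, state = 0;          /* start at the least-significant bit */
        while (state != HALT && head >= 0) {
            struct rule r = table[state][sym(tape[head])];
            tape[head] = r.write;
            head += r.move;
            state = r.next;
        }
        printf("%.4s\n", tape);           /* prints 1100, binary twelve */
        return 0;
    }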

"Computer components" listed at Redirects for discussion

A discussion is taking place to address the redirect Computer components. The discussion will occur at Wikipedia:Redirects for discussion/Log/2021 June 24#Computer components until a consensus is reached, and readers of this page are welcome to contribute to the discussion. Piotr Konieczny aka Prokonsul Piotrus| reply here 08:11, 24 June 2021 (UTC)

This article states that peripheral equipment is needed for a computer system to be complete. I think it may be beneficial to add more about this to the history section; facts on how peripheral equipment has changed over time could be a nice addition. --GoKnights2021 (talk) 18:04, 11 September 2021 (UTC)

Semi-protected edit request on 27 September 2021

A computer is a desktop or laptop: a machine that makes human work very easy. ((Jecek)) 223.187.216.88 (talk) 17:13, 27 September 2021 (UTC)

 Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format and provide a reliable source if appropriate. Interesting Geek (talk) 18:04, 27 September 2021 (UTC)

Semi-protected edit request on 30 November 2021

A computer is an electronic machine that works on the model of taking input from the user, processing or analysing that input, and then giving an output. Imdeepak2332 (talk) 19:56, 30 November 2021 (UTC)

 Not done: it's not clear what changes you want to be made. Please mention the specific changes in a "change X to Y" format and provide a reliable source if appropriate. ScottishFinnishRadish (talk) 20:00, 30 November 2021 (UTC)

Semi-protected edit request on 17 January 2022

Typo

"In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which, thRough a system of pulleys and cylinders and over, could predict the perpetual calendar for every year from AD 0 (that is, 1 BC) to AD 4000, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. " 94.216.25.88 (talk) 19:32, 17 January 2022 (UTC)

 Done Thanks for pointing this out. Uberlyuber (talk) 20:00, 17 January 2022 (UTC)

Lead paragraphs are too US-centric

Most of the examples of “computers” listed early in the opening paragraphs of the article are mainly US devices and look like a commercial for IBM. Babbage’s Difference Engine as well as Colossus should be up there as well. In fact, if I remember correctly, that’s how the layout used to look. Inadvertent Consequences (talk) 15:07, 27 June 2022 (UTC)

Computers

I don’t see any text regarding the difference between 32 and 64 bit computers, which I think is very important. — Preceding unsigned comment added by 2601:248:500:9540:64C0:C70E:1B0B:8631 (talk) 14:55, 5 July 2022 (UTC)

We don't have anything regarding the difference between 16 and 32 bit computers either, nor between 8 and 16 bit computers. Really the only significant difference is the size of the data registers, hence how large an integer may be to have arithmetic performed upon it in a single instruction without overflow. You could read up on 32-bit computing and 64-bit computing if you like. --Redrose64 🌹 (talk) 19:45, 5 July 2022 (UTC)
Computers with larger word sizes also tend to have more instructions and more registers than computers of smaller word sizes for various reasons (mostly, they were designed later and the more bits your instruction word has, the more things you can cram in there).
The other main difference is address space size. While there are various kludges to escape the address space size limits on 8 and 16 bit computers using multi-register addresses, these are mostly absent in 32 and 64 bit computers, thus enabling use of a flat address space for all uses. Such flat address space programming was largely restricted to very simple programs on 8 and 16 bit computers, so there is a significant difference. And as for 32/64 bit computers, the limitation to 2/3.5/4 GB of memory on 32 bit computers is well known to professionals and informed amateurs alike. So it's more than just being able to crunch larger numbers.
Additionally, many computers do have some means to do arithmetic in excess of their word size. Many 8 bit computers have limited support for 16 bit arithmetic (e.g. in special 16 bit registers) and for a more recent case, 32 bit x86 has support for 64 bit arithmetic through MMX and SSE registers. So it's not all about arithmetic either. Best might be to understand 8/16/32/64 bit as kind of a generation/size class roughly describing the feature set of a computer. --FUZxxl (talk) 17:51, 18 July 2022 (UTC)
The term "32 bit computer" is generally understood to denote the width of the data bus, this is independent of the width of the address bus. --Redrose64 🌹 (talk) 06:04, 19 July 2022 (UTC)
Modern computers have data busses that are far wider than 32 or 64 bit. Nobody calls a modern x86 computer a 512 bit computer just because it can transfer whole cache lines at a time between CPU and memory. For a more historical example, the 8086 was available both with an 8 and a 16 bit databus (8088/8086), yet both variants of the chip are firmly 16 bit processors. So data bus size is not the deciding factor either. It is *a* factor though. --FUZxxl (talk) 15:02, 19 July 2022 (UTC)
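To make the thread's word-size point concrete, here is a minimal C sketch of multi-word arithmetic: 64-bit addition built from 32-bit halves, roughly the way a compiler for a 32-bit target emulates wider arithmetic (the helper name add64 and the sample values are illustrative):

    #include <stdint.h>
    #include <stdio.h>

    /* Add two 64-bit values held as 32-bit (high, low) halves,
     * as a compiler for a 32-bit machine might have to do it. */
    static void add64(uint32_t ahi, uint32_t alo,
                      uint32_t bhi, uint32_t blo,
                      uint32_t *rhi, uint32_t *rlo)
    {
        *rlo = alo + blo;                /* add low halves; may wrap around */
        uint32_t carry = (*rlo < alo);   /* wrap-around means a carry out */
        *rhi = ahi + bhi + carry;        /* add high halves plus the carry */
    }

    int main(void)
    {
        uint32_t hi, lo;
        /* 0x00000001FFFFFFFF + 1 = 0x0000000200000000 */
        add64(0x00000001u, 0xFFFFFFFFu, 0x00000000u, 0x00000001u, &hi, &lo);
        printf("%08lX%08lX\n", (unsigned long)hi, (unsigned long)lo);
        return 0;
    }

On a 64-bit machine the whole add64 helper collapses to a single add instruction, which is the practical difference being discussed above.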

Lead sentence

The lead sentence is "A computer is a digital electronic machine that can be programmed to carry out sequences of arithmetic or logical operations (computation) automatically". Is this accurate considering the article discusses older computers, such as analogue and mechanical ones? To encompass all these types of computers, would a better lead sentence be "A computer is a machine used to perform arithmetic or logical operations (computations)"? Spekkios (talk) 19:55, 7 November 2022 (UTC)

Torres Quevedo's electromechanical arithmometer

I think the Electromechanical section should include a mention of Torres Quevedo's electromechanical arithmometer (1920), possibly the first digital calculator in history, but I couldn't find much about it on the English Wikipedia. (There's some info on the Spanish Wikipedia though.) —Cousteau (talk) 00:36, 19 December 2022 (UTC)

Go for it! doi:10.1109/MAHC.2021.3082199 looks like a decent source. Freoh (talk) 11:11, 19 December 2022 (UTC)


Semi-protected edit request on 16 December 2022

Change: A computer is a digital electronic machine that can be programmed to carry out sequences of arithmetic or logical operations (computation) automatically. Modern computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. A computer system is a nominally complete computer that includes the hardware, operating system (main software), and peripheral equipment needed and used for full operation. This term may also refer to a group of computers that are linked and function together, such as a computer network or computer cluster.

To: A computer is a machine that can be programmed to carry out sequences of arithmetic or logical operations (computation) automatically. Modern computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. A computer system is a nominally complete computer that includes the hardware, operating system (main software), and peripheral equipment needed and used for full operation. This term may also refer to a group of computers that are linked and function together, such as a computer network or computer cluster.

Why: Computers are not only digital. The article itself discusses different types of computers later on. 5.20.131.252 (talk) 07:02, 16 December 2022 (UTC)

 Partly done: I moved digital electronic to the second sentence. It seems worth mentioning digital electronics to me, given that it's the basis for most things we think of as computers. Freoh (talk) 22:08, 24 December 2022 (UTC)