History of computers

The history of computer science began long before the modern discipline of computer science emerged in the twentieth century, and was hinted at in the centuries prior. The progression from mechanical inventions and mathematical theories towards modern concepts and machines formed a major academic field and the basis of a massive worldwide industry.[1]


Prehistoric people did not have the Internet, but it appears that they needed a way to count and make calculations. The limitations of the human body’s ten fingers and ten toes apparently led early humans to construct tools to help with those calculations. Archaeological evidence suggests that humankind invented an early form of computing aid: a bone carved with prime numbers, dated to around 8,500 BC.[2]


Early calculating devices

The abacus was the next leap forward in computing, appearing between 1000 BC and 500 BC. This apparatus used a series of movable beads or stones whose positions were changed to enter a number, and changed again to perform mathematical operations. Leonardo da Vinci is credited with designing the world’s first mechanical calculator around 1500. In 1642, Blaise Pascal’s adding machine upstaged da Vinci’s design and moved computing forward again.
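In modern terms the abacus is a positional device: each column of beads holds one decimal digit, and arithmetic proceeds column by column with carries. A minimal illustrative sketch in Python, not a model of any particular historical abacus:

    # Each column stores one digit (a count of beads); addition moves
    # column by column, carrying a bead when a column overflows past 9.
    def to_columns(n):
        """Split a number into columns, least significant digit first."""
        return [int(d) for d in reversed(str(n))]

    def abacus_add(a, b):
        """Add two numbers the way an abacus operator does: by column."""
        ca, cb = to_columns(a), to_columns(b)
        digits, carry = [], 0
        for i in range(max(len(ca), len(cb))):
            total = carry
            total += ca[i] if i < len(ca) else 0
            total += cb[i] if i < len(cb) else 0
            digits.append(total % 10)  # beads left standing in this column
            carry = total // 10        # one bead carried to the next column
        if carry:
            digits.append(carry)
        return int("".join(str(d) for d in reversed(digits)))

    print(abacus_add(276, 848))  # 1124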




In 19th-century England, the mathematician Charles Babbage proposed the construction of a machine that he called the Difference Engine. It would not only calculate numbers but would also be capable of printing mathematical tables. Although he was unable to construct the actual device, and earned quite a few detractors among England’s literate citizens for it, Babbage made a place for himself in history as the father of computing. The Computer History Museum in Mountain View, California (near San Jose) built a working replica from the original drawings, and visitors can see the device in operation there. Not satisfied with the machine’s limitations, Babbage drafted plans for the Analytical Engine. He intended this computing device to use punched cards as the control mechanism for calculations, a feature that would make it possible for the computer to use previously performed calculations in new ones.
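The Difference Engine rested on the method of finite differences: once the first few values and differences of a polynomial are set up, every further table entry needs nothing but repeated addition, which gears can perform reliably. A short Python sketch of the method (illustrative only; x² + x + 41 is a polynomial Babbage reportedly used in demonstrations):

    # Method of finite differences: tabulate a polynomial using only
    # additions once the initial differences are known.
    def initial_differences(values, order):
        """Derive the starting value and differences from sample values."""
        table = [list(values)]
        for _ in range(order):
            prev = table[-1]
            table.append([b - a for a, b in zip(prev, prev[1:])])
        return [row[0] for row in table]

    def tabulate(diffs, count):
        """Produce successive values; each step is one addition per column."""
        diffs = list(diffs)
        out = []
        for _ in range(count):
            out.append(diffs[0])
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return out

    f = lambda x: x * x + x + 41
    seed = initial_differences([f(0), f(1), f(2)], order=2)  # [41, 2, 2]
    print(tabulate(seed, 6))  # [41, 43, 47, 53, 61, 71]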

The first programs and programmable computers

Babbage’s idea caught the attention of Ada Byron Lovelace, who had an undying passion for mathematics. She also saw the possibility that the Analytical Engine could produce graphics and music. She helped Babbage move his project from idea toward reality by documenting how the device would calculate Bernoulli numbers, and she later received recognition for writing the world’s first computer program. The United States Department of Defense named a computer language, Ada, in her honor in 1979.
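Her notes described how the engine would generate Bernoulli numbers step by step. As an illustration of the quantity involved, here is one standard modern algorithm (Akiyama–Tanigawa) in Python; it is a sketch of the same computation, not a transcription of Lovelace’s actual program:

    from fractions import Fraction

    # Akiyama-Tanigawa algorithm for the n-th Bernoulli number
    # (using the convention in which B1 = +1/2).
    def bernoulli(n):
        a = [Fraction(0)] * (n + 1)
        for m in range(n + 1):
            a[m] = Fraction(1, m + 1)
            for j in range(m, 0, -1):
                a[j - 1] = j * (a[j - 1] - a[j])
        return a[0]

    print(*[bernoulli(n) for n in range(0, 9, 2)])
    # 1 1/6 -1/30 1/42 -1/30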


The computers that followed built on each previous success and improved on it. In 1943 the first programmable electronic computer, Colossus, appeared; it was pressed into service to decipher coded German messages during World War II. ENIAC, nicknamed “the Giant Brain”, followed in 1946 as the first general-purpose electronic computer. In 1951 the U.S. Census Bureau became the first government agency to buy a computer, the UNIVAC.

Computers reach consumers

The Apple II expanded the use of computers to consumers in 1977. The IBM PC for consumers followed in 1981, although IBM mainframes had long been in use by government and corporations. The development of network technology and increases in the processing capabilities of microcomputers made consumer Internet use possible by 1991. The evolution of the computer has continued since then, and new uses emerge every year.

Computers and video games


A researcher at the University of Cambridge made the first official video game in 1952. It was a version of tic-tac-toe (noughts and crosses), programmed on the EDSAC vacuum-tube computer. Video games later grew into a mass medium; titles such as Pokémon Red (http://www.pokemon.com/) became hugely popular in the 1990s.

Video games have progressed enormously in the 60 years since. A new benchmark for both content and graphics is The Elder Scrolls V: Skyrim, made by Bethesda Softworks, with a set release date of 11 November 2011. Skyrim will be the fifth installment in the Elder Scrolls series, and the game has already been pirated for the Xbox ahead of its release.

There are also approaches to video game creation that differ greatly from the style of Skyrim. Minecraft, for example, also released in November 2011, is a sandbox game that uses deliberately simple, blocky graphics to create huge virtual worlds that players can manipulate, as in the sketch below.
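A minimal sketch of the sandbox idea, assuming nothing about Minecraft’s actual engine: the world is a sparse map from block coordinates to block types, so even a huge world only costs memory for the blocks that differ from empty space.

    # Hypothetical sketch: a block world as a sparse mapping from
    # (x, y, z) coordinates to block types.
    world = {}

    def set_block(x, y, z, block):
        """Place a block; placing 'air' digs the position out."""
        if block == "air":
            world.pop((x, y, z), None)
        else:
            world[(x, y, z)] = block

    def get_block(x, y, z):
        """Unset positions default to empty space."""
        return world.get((x, y, z), "air")

    set_block(0, 64, 0, "dirt")
    set_block(0, 65, 0, "torch")
    set_block(0, 64, 0, "air")  # dig the dirt back out
    print(get_block(0, 65, 0), get_block(0, 64, 0))  # torch air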


First generation computers

Modern computing can probably be traced back to the Harvard Mk I and Colossus (both of 1943). Colossus was an electronic computer built in Britain at the end of 1943 and designed to crack the German Lorenz cipher. The Harvard Mk I was a more general-purpose, electro-mechanical, programmable computer built at Harvard University with backing from IBM. These computers were among the first of the “first generation” computers.



First generation computers were normally based around wired circuits containing vacuum valves and used punched cards as the main (non-volatile) storage medium. Another general-purpose computer of this era was ENIAC (Electronic Numerical Integrator and Computer), completed in 1946. Typical of first generation computers, it weighed 30 tonnes, contained 18,000 electronic valves, and consumed around 25 kW of electrical power. It was, however, capable of an impressive 100,000 calculations a second.


Punched cards and data processing

In the late 1880s, Herman Hollerith invented the recording of data on a machine-readable medium. Prior uses of machine-readable media had been for control, not data. “After some initial trials with paper tape, he settled on punched cards ...”[23] To process these punched cards he invented the tabulator and the keypunch machine. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith’s company, which later became the core of IBM. By the end of the 19th century a number of ideas and technologies that would later prove useful in the realization of practical computers had begun to appear: Boolean algebra, the vacuum tube (thermionic valve), punched cards and tape, and the teleprinter.
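To make the idea of data on a machine-readable medium concrete, the sketch below encodes characters as punch positions in a 12-row card column. The row assignments follow the later IBM card code that grew out of Hollerith’s work; his original 1890 census encoding differed, so this is illustrative rather than historical:

    # One character per card column; a column is identified by which
    # of the 12 rows (12, 11, 0, 1..9) are punched.
    def punches(ch):
        """Return the punched rows for one character's column."""
        if ch.isdigit():
            return [ch]                # digits: a single punch in that row
        i = ord(ch.upper()) - ord("A")
        if i < 9:
            return ["12", str(i + 1)]  # A-I: zone 12 plus digits 1-9
        if i < 18:
            return ["11", str(i - 8)]  # J-R: zone 11 plus digits 1-9
        return ["0", str(i - 16)]      # S-Z: zone 0 plus digits 2-9

    for ch in "CENSUS1890":
        print(ch, punches(ch))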




It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine.[22] Limited finances and Babbage’s inability to resist tinkering with the design meant that the device was never completed; nevertheless his son, Henry Babbage, completed a simplified version of the analytical engine’s computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. The machine was given to the Science Museum in South Kensington in 1910.


Second generation: the transistor

The next major step in the history of computing was the invention of the transistor in 1947. This replaced inefficient valves with a much smaller and more reliable component. Transistorised computers are normally referred to as “second generation” machines and dominated the late 1950s and early 1960s. Despite using transistors and printed circuits, these computers were still bulky and strictly the domain of universities and governments.


Third generation: the integrated circuit

The explosion in the use of computers began with “third generation” computers, which relied on Jack St. Clair Kilby’s invention, the integrated circuit or microchip. The first integrated circuit was produced in September 1958, but computers using them did not begin to appear until 1963. While large mainframes such as the IBM 360 increased storage and processing capabilities further, the integrated circuit allowed the development of minicomputers that began to bring computing into many smaller businesses. Large-scale integration of circuits led to the development of very small processing units; an early example is the processor used for analysing flight data in the US Navy’s F-14A Tomcat fighter jet, developed by Steve Geller, Ray Holt, and a team from AiResearch and American Microsystems.



Fourth generation: the microprocessor

On November 15, 1971, Intel released the world’s first commercial microprocessor, the 4004. Fourth generation computers were developed, using a microprocessor to place much of the computer’s processing abilities on a single (small) chip. Coupled with another of Intel’s inventions, the RAM chip (kilobits of memory on a single chip), the microprocessor allowed fourth generation computers to be even smaller and faster than ever before. The 4004 was capable of only 60,000 instructions per second, but later processors (such as the 8086, on which all of Intel’s processors for the IBM PC and compatibles are based) brought ever-increasing speed and power. Supercomputers of the era were immensely powerful: the Cray-1 could calculate 150 million floating point operations per second. The microprocessor also allowed the development of microcomputers, personal computers that were small and cheap enough to be available to ordinary people. The first such personal computer was the MITS Altair 8800, released at the end of 1974; it was followed by computers such as the Apple I and II, the Commodore PET, and eventually the original IBM PC in 1981.


Modern computers

Although processing power and storage capacities have increased beyond all recognition since the 1970s, the underlying technology of LSI (large-scale integration) and VLSI (very-large-scale integration) microchips has remained basically the same, so most of today’s computers are widely regarded as still belonging to the fourth generation.[3]


References
