
12/13/2023

THE DEVELOPMENT OF COMPUTERS IN THE WORLD

"development of computers" Development of computers in the world spans a very long period of time, starting from the 40s until the present.

These smart machines called computers have experienced ups and downs, evolving from crude early devices into equipment that can fairly be called highly sophisticated.


Tracing the history of computer technology also reflects the development of human thinking, because each era's inventors had to think through and find solutions to the problems of their time.


The invention of the computer is closely connected to mathematics: most of the figures behind the development of computers in the world had educational backgrounds in mathematics.



According to the Big Indonesian Dictionary, a computer is an automatic electronic device that calculates or processes data carefully according to the instructions given, presents the results of that processing, and can run multimedia systems.

The development of computer technology has also transformed human life. At first, computers were used only for calculating; indeed, the name derives from the Latin computare and the English to compute.

Both words mean to count, so the invention of calculating devices is part of the early history of computers. Historically, that calculating tool was the abacus, the source of the initial idea for the computer.

According to the kemendikbud.go.id page, the abacus was used daily by the Romans and Greeks for calculation, much as calculators are used today. Later, in 1642, the French mathematician Blaise Pascal succeeded in building an automatic calculating machine.

From this invention emerged the idea, in the history of computer development, of extending the calculator into a more sophisticated machine, one used not only for calculating but for other purposes as well.

Then, in 1672, Pascal's design was further developed by the German mathematician Gottfried Wilhelm Leibniz.

Leibniz incorporated the binary number system into the design of Pascal's calculating machine so that it could also handle multiplication and division.
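As a quick illustration of why binary suits machine calculation, multiplying in binary reduces to shifting and adding, for example:

\[
5 \times 3 \;=\; 101_2 \times 11_2 \;=\; 101_2 + 1010_2 \;=\; 1111_2 \;=\; 15
\]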

Nearly two centuries later, in the mid-1800s, the English mathematician George Boole developed the binary idea into a mathematical system, later popularly known as Boolean algebra.
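Boolean algebra works on just two values, 0 (false) and 1 (true), combined with the operations AND, OR, and NOT. A few of its basic laws give the flavor:

\[
a \land 1 = a, \qquad a \lor 0 = a, \qquad \overline{a \land b} = \overline{a} \lor \overline{b}
\]

The last identity is one of De Morgan's laws, which later proved fundamental to digital circuit design.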

Furthermore, citing a book published by the Ministry of Education and Culture in 2020, the next key figure was the English mathematician Charles Babbage, who designed a mechanical calculating machine that could be programmed.

For this creativity, he was named the father of the computer, the main source of the idea of building machines that could be programmed by humans.
The development of computers across generations

1. First generation computers

In the subsequent history of development, the first generation computer was pioneered by Konrad Zuse of Germany. These first generation machines were enormous; the largest weighed 30 tons, stood 2.4 meters high, and stretched 30 meters long.

You can imagine that if these first generation computers were still in use today, only a handful of people would want them: a single machine was the size of a house.

Incidentally, work on the first generation Atanasoff-Berry Computer began in 1937, with the purpose of solving systems of linear equations. The Colossus computer was then created for a special military purpose: breaking secret German codes.
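To give a sense of the problem the Atanasoff-Berry machine targeted, a system of linear equations is a set of equations solved simultaneously, for example:

\[
\begin{cases} 2x + y = 5 \\ x - y = 1 \end{cases} \quad\Rightarrow\quad x = 2,\ y = 1
\]

The Atanasoff-Berry Computer was designed to handle systems of up to 29 such equations.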

The history of computer development continued to grow, no longer only for the special needs described above but also for general purposes. In 1946 the ENIAC computer was born in Philadelphia, United States. This first generation machine was enormous because it relied on thousands of vacuum tubes, which filled an entire room, to process and store data.

2. Second generation computers

The term second generation computer emerged in 1951, after transistors were developed to replace the vacuum tubes of the first generation era discussed above.

The use of transistors made computers smaller, more energy efficient, and cheaper. Besides transistors, magnetic memory was also introduced during the second generation.

It is not surprising that computers of that era began to run faster than their predecessors. For commercial purposes, IBM released its IBM 650 and IBM 700 series in 1953.

3. Third generation computers

The third generation of computer development began in 1964, the year integrated circuit (IC) chips, invented by Jack Kilby, were introduced into computer machines.

This brilliant idea arose because earlier computers heated up quickly and could not withstand prolonged use. Kilby's chip accommodates the functions of many components of the computer's circuit board in one small, thin package.

In other words, Kilby's chip became the brain of electronic equipment, eventually containing millions of transistors. For example, an Intel Pentium 4 microprocessor clocked at up to 3.8 GHz contains approximately 125 million transistors.

4. Fourth generation computers

After IC chips came into use, computers began to adopt silicon microprocessors. This marks the historical start of the fourth generation of computers, in 1971.

The origins of silicon chips actually date back to 1958, through the skilled hands of the American electrical engineers Jack Kilby and Robert Noyce as initiators. However, it was not until 1971 that their silicon chips were used in computer technology by the US Air Force.

For this spectacular work, Kilby was awarded the Nobel Prize in Physics in 2000, and his invention was designated an IEEE Milestone. In a cnn.com reader poll of 119 respondents on the most important invention in the world, 24 percent named silicon the most important discovery, followed in second place by the World Wide Web (www) at 20 percent.

Since then, computers have been produced in ever smaller forms that can be carried anywhere for work, now known as laptops or notebooks.

5. Fifth generation computers

The latest computers of the current digital era are called the fifth generation. The first sign was the emergence of very large scale integration (VLSI) in the early 1980s.

The most striking difference from the fourth generation and earlier is transistor capacity: VLSI-based microprocessor chips can hold up to billions of transistors.

In the future, the development of computers is predicted to be even more vibrant. Several semiconductor experts have begun discussing plans to replace silicon as the basic material for making chips; indium gallium arsenide, or InGaAs, is one alternative to silicon transistors.

The reason is that InGaAs transistors are more promising for use in high-speed, low-power logic applications in the future.
