
The History of the Computer, and How It TURNED Our Lives UPSIDE DOWN

The history of computers, from the origins of the most ancient peoples to the software wars of the 1990s and the artificial intelligence of the 2020s. The video was made drawing inspiration from the new Asus Zenbook 14 OLED, an AI-powered laptop thanks to processors up to the Intel®️ Core™️ Ultra 9. @asusitalia #notebook #laptop #ASUS #Zenbook #zenbook14oled
Discover it here 👉 https://estore.asus.com/it/asus-zenbook-14-oled-ux3405ma-pp032w.html?utm_source=vanillamagazine&utm_medium=kol&utm_campaign=24q1_zenbook-14-oled-ux3405_2in1
Discover Vanilla Magazine's first book, "L'Ultima Ora", in presale on AMAZON 👉 https://amzn.to/3v2Cs6m
To support our project, become a member: https://www.youtube.com/channel/UC_OSSIRAJ1QhAqDO113JVYg/join
Follow us on WhatsApp: https://whatsapp.com/channel/0029VaCxomnIyPtbZ4Ta6S0n
SUBSCRIBE: https://bit.ly/30HCoX4
You can become a supporter on PATREON and access exclusive perks: https://www.patreon.com/VanillaMagazine
Logo animation by Filippo Marchetti: https://linktr.ee/orange_wedge
------------------------------------------------------------------------------------------
📩 Business requests: business(at)vanillamagazine.it
-------------------------------------
Music: Inspired by Kevin MacLeod
Link: https://incompetech.filmmusic.io/song/3918-inspired
License: http://creativecommons.org/licenses/by/4.0/
Music: Energetic & Drive Indie Rock by WinnieTheMoog
Free download: https://filmmusic.io/song/10443-energetic-drive-indie-rock
Licensed under CC BY 4.0: https://filmmusic.io/standard-license

Vanilla Magazine

7 days ago

To speak of the history of computers means opening up a world almost as big as the history of civilization itself. I know you are interested in the blows exchanged between Microsoft and Apple, in Turing's machine and in the role of Artificial Intelligence; we will get to all of them, and I will list the chapters in the timeline below so you can jump to the part that interests you most. First of all, I want to present the computer for what it is on an abstract level: a machine capable of doing calculations, of saving them, and of being programmed. I am writing this video on a computer, and so is the device you hold in your hand when you are on social media, or the one you are using now if you are watching Vanilla on a TV set. Computers are integrated into all the objects we use daily, because everywhere calculations have to be made, data needs to be saved, and programmable actions have to be handled. Even in the 1980s in Italy it was common to hear it called a "calculator": that word covered all the instruments that could execute calculations, even though the definition is not always appropriate.

Before going on, let me thank Asus for having supported the making of this video, thanks to which we have created content that is also educational for those who study the history of computing. They did not only support us: they let us try out their new portable 14-inch Zenbook OLED, which we will put to the test to make this documentary, and they will supply equipment dedicated to Vanilla Magazine for the production of future videos that, we hope, will be increasingly complex and articulated, to the "joy" of Gioele, who works with me on these documentaries. But he will explain everything later.

A computer is a much more advanced version of an instrument we used as children in kindergarten, the abacus, which has grown up and nowadays even allows the simulation of the human mind through Artificial Intelligence, the most complex object, so to speak, that exists. The abacus was used by the Sumerians and the Chinese from at least 2000 BC, but it was the Greeks who invented the Antikythera mechanism, the most ancient mechanical calculator in history. At the time, that mechanism was used to calculate the lunar phases, the rising of the sun, the movement of the planets, the equinoxes, the months, the days of the week and perhaps even the dates of the Olympic games, but it was still far from what we understand a computer to be today.
To see something more similar, we must move centuries ahead, to the 17th century, when the experiments began that led to the Pascaline, the first mechanical calculator in history. Compared to today's machines it did very little: it added and subtracted numbers of up to 12 digits, based not on 10 but on the subdivisions of the livre, the French currency of the time. Yet the design was so effective that similar machines were produced at least until the beginning of the 1900s. As a rudimentary example of a computer it was still limited, but I believe that, over the centuries, its success beat all records.

The 1600s were the century of mechanical calculators, but it was during the 1800s that the foundations were laid for the theoretical revolution of information technology. In 1833, the first real computer in history was designed: Charles Babbage's Analytical Engine. It could not only calculate numbers but also save them as data to be used again in the future. In short, it was a real computer; too bad it was never built. Building it would have required a great deal of financing and, above all, a large space: the Analytical Engine was to be powered by a steam engine and would have been 30 meters long and 10 meters wide. Forget about the portable ASUS Zenbook... Babbage's machine remained only a theoretical project, but it was for that machine that Ada Lovelace, a mathematician and daughter of Lord Byron, wrote the first computer program in history: an algorithm able to calculate the Bernoulli numbers.
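To give a flavor of what that program computed, here is a minimal modern sketch in Python. It is emphatically not Lovelace's actual program (her Note G was written for the Analytical Engine's mill and store, not for any modern language); it is only an illustration of the same calculation, derived from the standard recurrence that defines the Bernoulli numbers.

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # Solve the recurrence for B_m from all previously computed values.
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B

print([str(b) for b in bernoulli_numbers(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```

A machine that could run a loop like this, storing intermediate results and reusing them, is exactly what Babbage designed and never managed to build.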
Before we see anything concrete in modern computing, we must jump to 1889, when Herman Hollerith patented the punched cards that worked with his tabulator. Hollerith was 29, a young mining engineer, and he took part in a competition to build a machine capable of processing the immense quantity of data of the 1890 census, after the data of the previous one, of 1880, had still not been fully processed: there was simply too much of it! Hollerith entered the competition with his tabulating machine, which conceptually went back to Babbage's machine but worked in a very simple way: a hole indicated a man while its absence indicated a woman, and a series of other holes, or their absence, could encode more complex questions, and so on. The machine examined up to 800 cards a minute (a fabulous speed for those times, impossible for a human), and the tabulation of the 1890 census was completed in 50 days; for the previous one, 10 years had not been enough.
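To make the mechanism concrete, here is a toy model in Python of how a punched card can encode answers as holes and how a tabulator can count them. The field layout is hypothetical, invented purely for illustration; the real 1890 cards had a far richer layout.

```python
# Hypothetical hole positions, for illustration only (not the 1890 layout).
FIELDS = {"male": 0, "female": 1, "married": 2, "single": 3}

def punch(answers):
    """Encode a set of answers as a card: 1 = hole, 0 = no hole."""
    card = [0] * len(FIELDS)
    for answer in answers:
        card[FIELDS[answer]] = 1
    return tuple(card)

def tabulate(cards):
    """Count the holes per position, as the tabulator's electric
    counters did when a pin passed through a hole in the card."""
    counts = dict.fromkeys(FIELDS, 0)
    for card in cards:
        for name, pos in FIELDS.items():
            counts[name] += card[pos]
    return counts

cards = [punch({"male", "married"}),
         punch({"female", "single"}),
         punch({"female", "married"})]
print(tabulate(cards))  # {'male': 1, 'female': 2, 'married': 2, 'single': 1}
```

The whole census then reduces to feeding cards through the machine and reading off the counters, which is why the speed of 800 cards a minute mattered so much.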
That engineer was not just anybody, and he does not disappear from this story, because we will meet him again shortly: in 1896, Herman Hollerith, an American of German origin, founded the Tabulating Machine Company which, after changing its name and merging with other companies, became IBM in 1924.

We are already in the 1900s, so let us leave the 1800s behind with one last mention of a system that today hides behind every single piece of information you exchange or enter into any of your devices: Boolean algebra. No, this is not a math lesson, but if, in the history of computers, we did not also speak of operations with 0s and 1s, we would just be peddling hot air. In 1854, the English mathematician George Boole invented Boolean algebra, which allows complex operations to be carried out using the binary number system: precisely those 0s and 1s that take on the values of true and false. He did not invent the binary system, which had already appeared in the 1600s, but he was the one who tied the 0s and 1s to operators such as AND, OR, NOT, NAND, XNOR and so on.
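For anyone who wants to see those operators in action, here is a small Python sketch: each operator maps 0s and 1s (false and true) to 0s and 1s, and every more complex digital operation in your devices is built by composing them.

```python
# The basic Boolean operators named above, acting on 0 (false) and 1 (true).
def AND(a, b):  return a & b
def OR(a, b):   return a | b
def NOT(a):     return 1 - a
def NAND(a, b): return NOT(AND(a, b))
def XNOR(a, b): return NOT(a ^ b)   # true when a and b are equal

# Print the full truth table for two inputs.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), NAND(a, b), XNOR(a, b))
```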
By the beginning of the 1900s, in theory, the computer had already been invented, as had the system of calculation that governs its operations at the software level. What was still missing was the translation of this theory into a physical object that any person could use. The actual computer, as we know it today, did not yet exist, and it took almost a century to get there. There were many crucial stages in the 1900s, but the first one we must mention is 1939, when Konrad Zuse, a German scientist, invented the Z1, the first true programmable computer in history that worked like modern computers. It did not do many calculations: it ran at just one hertz, that is, one cycle per second, but its conceptual structure was similar to that of modern PCs. There was a unit for calculation, a unit for memory and an electric motor that generated the power to do the calculations.

It is 1939, the Second World War, and everyone had a pressing need for calculators capable of any operation, from computing the trajectories of ballistic projectiles to deciphering the enemy's coded messages. At this point, one of the most famous computer scientists of all time entered the scene: Alan Turing. Turing was an English mathematician who worked on top-secret code-breaking projects at Bletchley Park, and he conceived the "Universal Turing Machine": a theoretical machine capable of simulating any other Turing machine, and thus of carrying out any computation that can be described as an algorithm.
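To make the idea tangible, here is a minimal Turing machine simulator in Python. It is a sketch of the concept, not Turing's own formalism: a machine is just a transition table, and the "universal" insight is that such a table is itself data, so one machine can read another machine's table and simulate it.

```python
# A minimal Turing machine simulator (an illustrative sketch).
# A machine is a transition table: (state, symbol) -> (new_state, write, move).

def run(table, tape, state="start", pos=0, max_steps=1000):
    tape = dict(enumerate(tape))          # sparse tape; blank cells read "_"
    for _ in range(max_steps):
        symbol = tape.get(pos, "_")
        if (state, symbol) not in table:  # no applicable rule: halt
            break
        state, write, move = table[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Example machine: walk right, flipping 0s and 1s, until the blank at the end.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}
print(run(flip, "010011"))  # -> 101100
```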
I will not go into too much detail, but it was on this principle that the construction of the Colossus was based, the calculator that, in 1943, managed to decipher the coded messages exchanged by the Nazis. Without Turing, and consequently without the Colossus, the Second World War might have ended differently... To give an idea, the Colossus could decipher about 4,000 German, Italian and Japanese messages a day, with a computing power resembling that of a small processor of the 1990s. When the war ended, all the Colossus machines were destroyed and the scientists were forbidden to talk about what they had done during the war. Turing was one of the fathers of computer science with his ideal machine, and he was one of the architects of England's victory in the war, but he met a terrible end because of an absurd and backward law; to learn about it, I direct you to the film The Imitation Game or to his biography online.

From the 1950s onward, with the impulse the war years had given to the race for calculators, there was a rush to create products that ordinary people could use, but 30 more years would be needed to reach that result. We will not analyze those thirty years of trial and error in detail.
Computers were as big as rooms or even entire buildings and processed little data per second, while in those same years the programming languages we still use every day were being developed. We have all heard their names: Fortran, COBOL, Pascal, C and C++. These programming languages are still in use today, while the computers on which they first ran are now only exhibited in museums. The first commercial computer was called UNIVAC I; it was made by Remington Rand and sold, in the 1950s, at the modest sum of one million dollars each, which amounts to about 11 or 12 million dollars today. The computer was very small for the era: it was as big as a Fiat Panda and weighed about 7 tons, positively portable compared to its predecessor, the ENIAC, which weighed 30 tons and filled entire rooms.

We have moved from computers as large as a house to an extremely "portable" UNIVAC I as big as a large desk. How was it possible to reduce the dimensions so much? The revolution took place in 1947 with the invention of the transistor, an electronic component that replaced the thermionic valve. To get an idea of the improvement, consider that a thermionic valve is more or less the size of a light bulb, while a transistor is about the size of a USB stick. But the real leap came a few years later when, around 1958, Jack Kilby and Robert Noyce invented the first integrated circuit: the part of the computer that reads the input and sends the output in the form of electrical signals, all within a few centimeters of space.
All the physical and theoretical components needed to build modern computers were now there: integrated circuits, programming languages, and the logical theory behind their operation. In the 1960s, however, computers were still machines for specialists; almost no one could use them unless they were trained technicians, because carrying out any operation required knowing the specific language of that operating system and typing in instructions the machine could understand. Here, among other things, is an Italian curiosity: in 1965 the Olivetti Programma 101 was presented in New York. Informally it was called the "Perottina", after the engineer Pier Giorgio Perotto who led the research group, and it is considered by many to be the first desktop computer in history. It was a programmable calculator with the ability to store some data, but in practice it carried out the functions of a calculator, while the HP 9100A, presented in 1968, was the first real personal computer; it revisited many of the Perottina's patents, for which HP paid Olivetti 900,000 dollars in compensation.

We are close to modern PCs, but something was still missing. It was time for a change: the moment of the graphical interface. The first version was developed at the end of the 1960s by Xerox and, in 1973, it debuted on the first personal computer with a graphical interface and a mouse: the Xerox Alto.
It was a revolutionary product: windows could overlap, it had an object-oriented language and a connected printer, and it was a PC in the modern sense of the word. It was so far ahead of its time that almost no one bought it; it remained a project confined to the university world, almost relegated to the role of an experimental product. But in California a race was underway to design a revolution, and in that race one figure stood out from all the rest: Steve Jobs. Jobs was not a designer, nor a technician in the strict sense of the word, but the prototype of the American entrepreneur. He gathered the most capable people around him, worked them to exhaustion, and developed products that created and satisfied a need at the same time. In 1976 the Apple I hit the market, a computer with a wooden case, produced in 200 units and sold for 666.66 dollars each, a reference to the number of the devil. In 1977 the Apple II was released, a very cheap computer: it cost only 1,195 dollars and was an all-in-one, keyboard and monitor in a single block, able to run programs such as Apple Writer, VisiCalc, Screenwriter and so on. Computers did calculations and were useful for writing and printing, but also for video games, which in that period entered the collective imagination as entertainment for the young. The operating system of the Apple II was Apple DOS, while other computers used other versions of DOS; some even had more than one operating system, such as the IBM PC (the IBM 5150), the best seller of its era, which in 1981, the year of its launch, sold about 200,000 units and became the best-selling home computer. Among its three operating systems, the IBM PC also had MS-DOS, and the brand became a synonym for quality in the world of desktop computers, and would remain so for at least another decade.

In 1981, for the first time in history, a portable computer went on sale. It was based on a Xerox project dating back a few years, to 1976. The Osborne 1 was, for those times, a feather: it weighed 10 kg, ran the CP/M operating system and became a commercial success without precedent. Its 5-inch screen makes us smile today, and the operations it could perform were very limited, but the company produced 10,000 units a month, and the reason was simple: it integrated screen and keyboard in the same case and cost only 1,795 dollars, about 6,500 dollars in 2024 money, a revolution for anyone who needed computing power on the move.
Speaking of portables: Gioele, how is editing the video going on the Zenbook?

Hi Matteo, and hello everyone! I believe that with the Osborne I might have had some difficulty! I will interrupt the video for a second to take you behind the scenes of our documentaries. As Matteo was saying, our work has literally become gigantic because, as you may have noticed, the videos have become much more like television documentaries than YouTube videos. For this reason Asus contacted us so we could try out their Asus Zenbook 14 OLED. With its 1.2 kg of weight and only 14 mm of thickness it is perfect for working on the go and lets you take it practically anywhere, and I literally mean anywhere. This unit is equipped with the Windows 11 Home operating system and, thanks to a latest-generation processor such as the Intel Core Ultra 9, it lets us edit complex, demanding videos in 4K and then export them quickly. Besides a large computing capacity, the Zenbook has brought a huge change to the world of portable computers because it integrates software dedicated to Artificial Intelligence, which helps in every field of application. The AI can intervene, for example, on the quality of the webcam image or on correcting the audio coming from the speakers. As for video editing, here I am working on a segment Matteo sent me about the calculators of the 1970s. There are many layers: the microphone audio, the transitions, the effects on the images, the movement of the inset images, the music, the background and, in short, you get the idea. Rendering the video takes very little time. For editing I use DaVinci Resolve or CapCut; in this case, the export of a 25-minute 4K video is done in about 15 minutes. Matteo was talking about the computers of the past, with which doing something similar would have taken decades. Thanks to a device like the Asus Zenbook the work has become faster, and we can edit at a steady pace and, above all, make documentaries that only a few years ago would have required many people with many computers. Now, instead, ASUS and I are enough. Oh yes, and Matteo too; but, if necessary, we can recreate him with AI.

After its Alto, Xerox did not waste time and commercialized the Xerox Star, the first widely distributed computer to use windows, icons, a mouse and a pointing system. It was really a computer as we understand it today, and it is also how our smartphones, tablets and TVs work.
The Star had only one problem: it cost as much as an apartment. It was marketed at 75,000 dollars in its basic configuration and sold 25,000 units; it certainly wasn't a revolution in terms of popularity. Xerox's graphical interface was revisited by Apple, which in 1983 commercialized the Lisa, a computer that had been in development for 6 years and that perfected the desktop concept created by Xerox, along with its graphical interface. On this point there is an urban legend that Apple stole the interface from Xerox: that Jobs was passing by, saw the mouse, and the Xerox staff told him "go ahead, take it". This is absolutely not true. Rather, Xerox bought large quantities of Apple shares at an agreed price precisely because it had allowed Steve Jobs's engineers to learn from Xerox's engineers, and so Xerox itself profited from Apple's future successes. The Lisa was a refined computer for its time, with great technical characteristics but also a great price: it sold at 9,945 dollars, much less than the Xerox Star but still very expensive, worth about 30,000 euros today.

Steve Jobs had been pushed out of the Lisa project and had devoted himself to a small team that was trying to develop a cheap computer, not affordable for everyone but affordable for many, a truly revolutionary product: the Macintosh. It was presented in a theater where Jobs was showered with applause. I believe its advertisement, built on the theme of George Orwell's 1984, was the most beautiful ever made, together with Apple's own Think Different campaign a few years later. The first Mac was cheap, only 2,495 dollars; it had a mouse and a keyboard, a graphical interface, and used visual metaphors such as the wastebasket, notes and the desktop, and it spread all over the world. Schools and companies bought it, it was the favorite of everyone who worked in graphics, and some parents who could afford it bought it for home use. It was the beginning of the revolution, but another player spoiled the party and inserted himself into Steve Jobs's and Apple's story: William Henry Gates, known as Bill. Since 1982, Bill Gates had been selling MS-DOS as the operating software of the IBM PC, and the success of that device had greatly favored his company's growth.
While the competition worked very effectively on the graphical interface but thought only of a closed system of computer sales tied to the operating system, Bill Gates reasoned in a totally different way. He was not interested in selling machines; he wanted to earn from the software. In 1985 he presented the first version of Windows, called Windows 1.0: substantially a graphical interface for DOS, but one that worked very well. It allowed windows to be moved over one another (hence the name Windows) and allowed the PC to be used by anyone. Windows started out selling on IBM machines but then extended its frontiers to other manufacturers and, up until version 3.0, it remained simply a graphical shell running on DOS. It was years later that things were revolutionized, with Windows 95. In this version, DOS was no longer the operating system that launched the graphical interface, and Windows became a mature product used in every home. Inside it there was an application that revolutionized all our lives, a program that no longer exists today but that, for people of my generation, was the gateway to a fundamental world: Internet Explorer. Yes, because if earlier computers had been used to do calculations, to process graphics and even to edit the first videos, they had always been used in closed environments; with Windows 95 and Internet Explorer, the computer opened up to the world. The race was on to spread it into every home and office, as had never happened for any other device.
Now, I do not want to recount the history of every machine and every computer on the planet, but I do want to reflect on how an electronic object went from being used mainly in technical fields to becoming part of our everyday life. During the 1980s, computers began to transform from instruments for calculation and content creation into devices embedded in every object of our daily life. Portable consoles, cellular phones, the first cars with processors integrated into their dashboards, and so on, were born. This period saw the introduction of the Game Boy, Motorola phones and all those electronic devices that were the precursors of the real computer revolution in everyday life: having computers that regulate all human activities.

There are various milestones in the computer revolution after the advent of the Internet; I will point out 3 of them:
- The advent of Wi-Fi and mobile networks at the beginning of the 2000s.
- The marketing of the iPhone in 2007.
- The spread of Artificial Intelligence in 2023.
There are many more stages, devices and software that have marked the history of computers and information technology, but I believe these were the main ones. The spread and improvement of technology for transmitting data over wireless networks made it possible to produce devices that no longer need to be imagined as connected to a network through a cable but can be carried around freely: a conceptual revolution without which there would be no reason for any of the devices we now use daily to exist. The smartphone you are holding in your hand is useful only because it connects you to other people and to an unlimited database of information, thanks to the Internet. The mobile network allows each of us to have a device, almost anywhere in the world, able to communicate with the rest of the planet and to process information in real time precisely because of its connection: a revolution without precedent in the history of mankind.

Then, in 2007, came the presentation of the pivotal device for the development of everything that followed: the iPhone. I can narrate this as a direct witness because I watched that presentation live on the Internet, and there was huge anticipation for something we did not know would turn out to be just a phone, a new iPod or something else; with the iPhone, Apple revolutionized everyone's life.
That was the moment when computers ended up in the hands of everyone in the world, who could play, calculate, browse the Internet, watch videos, communicate: in short, do everything that information technology makes possible, in the palm of a hand. The latest revolution is Artificial Intelligence, which broke through in 2023 even though it had been in the air for decades, at least since the theoretical formulations of Alan Turing. Far more than automation or smart TVs and the like, Artificial Intelligence gives the user a result that is original with respect to the request and has never been produced before: it is a re-elaboration of the data we feed into the machine. I can ask it for a photograph, a video, a completely new piece of text compared to everything mankind has produced so far. It is man who tells the machine to produce original content, something that confronts us with ethical and existential questions without precedent. Artificial Intelligence can be used by computer companies, as ASUS does with its Zenbook, to improve the performance and capabilities of their devices and, in a few years, we will find it in cars, refrigerators and who knows how many other objects designed for people. In particular, the Zenbook 14 OLED is ready to skilfully manage Artificial Intelligence thanks to its new processor, the Intel Core Ultra 9.

The computer revolution was, for a long time, something reserved for experts, at least from its beginnings until the end of the 1900s, but today it influences the lives of us all, and it does so thanks to trials and attempts, failures and successes. Only one thing is certain: there is no turning back. There will be an ever-growing need for calculators in every human activity, to simplify operations that are complex today and that, tomorrow, could become very simple. And to think that it all began with the abacus...

Comments

@Vanilla_Magazine

Discover the Asus Zenbook 14 OLED here 👉 https://estore.asus.com/it/asus-zenbook-14-oled-ux3405ma-pp032w.html?utm_source=vanillamagazine&utm_medium=kol&utm_campaign=24q1_zenbook-14-oled-ux3405_2in1

@muffo1965

Hi Matteo, this video took me back in time. As an almost-sixty-year-old (I am 58), I got into computing in 1985 when, at nineteen, I convinced my mother to buy me the Spectrum Plus (which I still have, in perfect working order). My first PC, used for gaming, yes, but also to learn the first rudiments of computer programming. Years later I got my first PC proper, a 486 without a math coprocessor, running DOS and Windows 3.1. Christmas 1994: on Christmas Eve I installed Windows 95, and so on, step by step, up to Windows 11 today. It feels like an eternity, and seeing today a processor in a phone that, according to many, has the same computing power as the machines that once filled entire rooms would have seemed almost science fiction back then. Many things have been lost over time (who still remembers the browser that competed with Internet Explorer?). And look at the Internet side: who still remembers the 56k modems that used the telephone line and charged you by usage? Or BBS networks? Today most of us use fiber, extremely fast, and those who can afford it even satellite dishes. But I do wonder one thing: what if a worldwide failure blocked everything? We would no longer be able to perform even the simplest operation. Let's hope it never happens... it would be a catastrophe!

@vivianamerlo7180

My goodness, Matteo, what a beautiful video!! It was like retracing my whole working life. I was almost 17 and at the very start of my career when they let me try a Rank Xerox in the notary's office where I worked. For me it was a thrilling moment, but also one of disbelief. With that machine the work was done in a very short time and with no more errors. It was an incredible moment. It was 1976. And then I lived through everything that came after. Thank you, as always, to you and your team for the precious content you give us. A hug ❤

@riqualificoaffitto

The Apollo missions made the miniaturization of PCs possible; before that they took up an entire room. A video on the history of the Internet and Tim Berners-Lee would be great too.

@riccardoscalabrin7868

I was expecting a mention of Faggin, if only out of local pride.

@martinlutherwrong4040

Thanks :-) Babbage's machine was in fact built (except for the printing section), but only in the 20th century, for historical reasons, at the end of the 1980s, and it is exhibited in a major museum in London, with scheduled times to watch a practical demonstration of it in operation.

@stefaniazanin6781

What an evolution, from a simple abacus... thank you Matteo

@giuseppeo757

Hi, beautiful video. Like others, I felt like a child again: imagine, the first computer I ever touched was a PDP-11... Now, though, I expect a video on operating systems, maybe on 86-DOS or QDOS. Thanks.

@ggtv2000

Out of curiosity, what role did OLIVETTI play in the development of computerized systems? Thanks.

@andreac.1130

Excellent video as always. I understand that compressing the entire history of the computer into a short video means making choices, but the home computers widespread in the 1980s were omitted entirely. Those were the first true computers to reach every home because, although mostly used for video games, they had enough processing power to be used as real personal computers. Not mentioning companies like Commodore, Sinclair, Atari, or the MSX consortium is a bit of a hole in the history of computing. Generation X was raised on those machines.

@leonardoalaimo4435

Thanks Matteo & co. I studied these things decades ago, and this was a dive into the past.

@identifiantidentifie397

Very concise but very interesting, thank you team Vanilla.

@PunsNook

0:08: 💻 The computer as an abstract machine capable of calculation, data storage and programmability. 4:01: 💻 Herman Hollerith's invention of punched cards to process the data of the 1890 census. 8:05: 💻 Turing's important contribution in the Second World War and the evolution of calculators toward usable products. 12:00: 💻 The evolution of the computer through the introduction of the calculator, the first personal computer and the graphical interface. 15:48: 💻 The evolution of the Zenbook laptop with the Windows 11 Home operating system and the Intel Core Ultra 9 processor. 19:05: 💻 The rise of the Macintosh and Bill Gates's challenge in the computer industry. 22:44: 📱 The evolution of mobile technology revolutionized our lives and global connectivity. Summary generated using Tammy AI

@mariellapellizzari8613

So much water under the bridge... Beautiful video. Thanks Matteo, and a hug 🤗🤗🤗

@AlbertoPirrotta

The Olivetti Lettera 22, legendary.

@RobinBagnolati

You didn't mention my beloved Windows 3.1 and 3.11. Bad, bad 😂 Just kidding, excellent video as always

@marcofortuna84

Excellent content as always. But as a good Vicentino I would have liked to see Alfio Faggin mentioned; you surely know of him, and I consider him fundamental in the evolution of PCs.

@luigisilvestri404

You didn't talk about the battle that took place in Europe between the Sinclair ZX Spectrum and the Commodore VIC-20

@valeriacolle4168

Good morning 😊. Nice to wake up in your company!

@lindaburks

The best video/documentary ever. Congratulations to Gioele too!