From Zuse to the M1 Chip: The importance of the laptop in our times

Rafael Perez Medina
July 12, 2022

We can consider the Treaty of Paris, together with the Treaty of Versailles and the Geneva Conventions, among the most important agreements in the history of our time. After the Second World War, populations built, with great effort, a state of peace and prosperity through the economy, intellectual property, and assisted social and economic experiments. With the evolution of technology came an explosion of invention, optimization and data processing: from the ABC (Atanasoff-Berry Computer), capable of performing one operation every 15 seconds, to the 5 nm transistors of the Apple M1, whose source-to-drain distances we measure in atoms (1).

In the last 77 years, we have witnessed and taken part in the most explosive period in the short history of Homo sapiens on this planet. Geographical, cultural and ethnic contrasts have brought the human population to this: a time of complexity, diversity and detail never before recorded. Every detail, every analysis, every methodology, production process, supply chain and system in the broadest sense is the most sophisticated ever conceived.

The computer, symbol of our times, has accelerated the improvement of all processes, all systems and all populations. It is the object of modernity. As early as 1941, Konrad Zuse had created the Z3, the first programmable electromechanical computer, which read its programs from punched film. In 1943, Tommy Flowers, building on the codebreaking work of Alan Turing and his colleagues, created the Colossus, the first programmable electronic digital computer. At the same time, "The Giant Brain" (the ENIAC) was being designed in the USA, and in 1945 one of its first tasks was to perform calculations for the hydrogen bomb. Calculations that took it about 20 seconds would have taken an electromechanical computer like Zuse's about 40 hours.

The leap was exponential: with the invention of the first transistor in 1947 by John Bardeen and Walter Brattain, working under William Shockley at Bell Labs, and the first random-access memory, used by "The Manchester Baby" in 1948, we arrived at an object that could only improve in design or processing power. Today, one of the big problems is that transistors have become so small that the distance between source and drain is measured in atoms. A transistor works like a light switch, with an "on" state and an "off" state; because the distance between the components is now so short, phenomena like quantum tunneling (2) start to occur and the two states of the switch blur together.
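To get a feel for why shrinking transistors makes tunneling a problem, we can sketch a rough estimate. The snippet below is an illustrative physics sketch, not a model of any real transistor: it uses the simple WKB approximation for an electron tunneling through a rectangular barrier, with an assumed barrier height of 1 eV.

```python
import math

# Rough WKB estimate of electron tunneling probability through a
# rectangular barrier: T ~ exp(-2 * d * sqrt(2 * m * U) / hbar).
# Illustrative sketch only; real transistor barriers are more complex.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # one electronvolt, J

def tunneling_probability(barrier_width_m, barrier_height_ev=1.0):
    """Probability that an electron tunnels through the barrier."""
    kappa = math.sqrt(2 * M_E * barrier_height_ev * EV) / HBAR
    return math.exp(-2 * kappa * barrier_width_m)

# Shrinking the barrier from 5 nm to 1 nm raises the leakage
# probability by many orders of magnitude:
print(tunneling_probability(5e-9))  # vanishingly small
print(tunneling_probability(1e-9))  # no longer negligible
```

The key point is the exponential dependence on width: halving the source-to-drain distance does not double the leakage, it multiplies it enormously, which is why the "off" state stops being truly off at atomic scales.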

If we make an exception for space equipment, then in terms of the materials used, the resources required and the complexity of the processes, the most expensive object ever created by humans is undoubtedly the chip. Suffice it to say that this tiny rectangle of silicon and electrodes sits at the end of the most complex supply chain humanity has ever put in place, due in part to the difficulty of sourcing the rare materials necessary for its creation and the intricacy of the manufacturing process: materials subjected to the most extreme conditions ever observed under a microscope or a spectrometer, or simply imagined in theoretical calculations.

Add to this that many critical materials are difficult to mine for a variety of reasons, one being their scarcity. Setting all this up to create a computer is very complex, yet its optimization has been extremely fast, to the point of doubling the number of transistors every 18 to 24 months, as described by Gordon Moore in 1965 (3).
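The doubling described by Moore is simple compound growth, and a few lines of arithmetic show how fast it runs away. The starting figure below (the 2,300 transistors of the Intel 4004 from 1971) is just a convenient, well-known reference point for the illustration.

```python
# Moore's law as compound growth: transistor count doubles
# roughly every 18-24 months.
def transistor_count(start_count, years, doubling_period_years=2.0):
    """Project a transistor count `years` ahead, doubling each period."""
    return start_count * 2 ** (years / doubling_period_years)

# Starting from the 2,300 transistors of the Intel 4004 (1971),
# a 2-year doubling period projected 50 years ahead gives:
projected = transistor_count(2_300, 50)
print(f"{projected:.1e}")  # about 7.7e10 transistors
```

That projected figure lands within the right order of magnitude for today's largest chips, which is exactly why Moore's 1965 extrapolation became an industry roadmap rather than a mere observation.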

But let us leave aside the history of the computer to ponder what it represents today for our daily life, trying to understand and clarify how its use, its rehabilitation after use, and its modularity, upgradeability and adaptability fit into the complex path that starts at design and, through all the successive steps (extraction and processing of materials, creation of components, assembly and distribution), finally arrives at the user.

Join us in discovering the world of mobile computers and learning the practices that will allow us to use them durably, efficiently and sustainably over time. This document aims to give you all the information you need and to invite you, the reader, to a unique analysis of the various challenges and opportunities offered by this device.


(1) Who Invented Computers? A Short History.

(2) Transistors and the End of Moore's Law, Veritasium, with Andrea Morello.

(3) Moore, Gordon E., "Cramming More Components onto Integrated Circuits," Electronics Magazine, Vol. 38, No. 8, April 19, 1965.