Can something replace silicon for making chips?

Rafael Perez Medina
March 9, 2023

Since the first silicon integrated circuits of the 1960s, silicon has been the go-to material behind the worldwide proliferation of ICT equipment and the manufacturing processes that produce it. Its low cost and abundant natural deposits have enabled the most complex supply and manufacturing chain ever built by mankind, packing ever more transistors into nanometric spaces. That approach is now reaching its physical limit: Moore's law has slowed in recent years.
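The slowdown is easier to appreciate against the original trend. Moore's law predicted transistor counts doubling roughly every two years, and a quick back-of-the-envelope calculation shows how aggressive that curve is (the starting figures below, the Intel 4004's roughly 2,300 transistors in 1971, are historical; the projection itself is only an illustration):

```python
# Illustrative back-of-the-envelope: Moore's law projects transistor
# counts doubling roughly every two years.
def moores_law_count(start_count, start_year, year, doubling_period=2.0):
    """Projected transistor count at `year`, doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Starting from the Intel 4004 (~2,300 transistors, 1971):
projected_2021 = moores_law_count(2_300, 1971, 2021)
print(f"Projected 2021 count: {projected_2021:.2e}")  # ~7.7e10
```

Real flagship chips of the early 2020s carry tens of billions of transistors, so the historical curve held remarkably well before flattening, which is precisely why the physical limits of silicon now matter.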

Although silicon has been the king material for integrated circuits, it has several properties that do not work in its favor, as this Autodesk article points out: 

  • Electrons Go Crazy. Many of today’s circuits are as small as 7nm wide, and when you’re trying to send electrons down transistor pathways in these tiny silicon spaces, they often become unstable and difficult to control. What do you do when electrons go rogue and start interfering with other signals? Hope for the best, I guess.
  • Mobility Issues. There’s also the problem of electron mobility. Yeah, you can pack billions of transistors into a space the size of a red blood cell, but silicon itself doesn’t provide the environment for electron mobility that other materials, like indium or graphene, do.
  • High Heat Problems. Another issue is that the more you pack into silicon, the higher the temperature climbs with all of that activity, leading to degraded performance. Today’s ICs with billions of transistors require a ton of fans just to keep everything cool. Think of the giant heatsink strapped to your computer’s processor.
  • Lazy With Light. Silicon is also terrible at transmitting light. And with the widespread use of lasers and LEDs, manufacturers are starting to use alternative semiconductor materials for photonic applications to work around silicon’s deficiencies.

  • Wasted Power. Despite all of the power pumping out of a silicon-based circuit, a great deal of energy is lost along the way. The graph in the Autodesk article shows 20nm processors already tipping past a 50/50 split between usable power and power that goes to waste (1).
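The heat and wasted-power points above come down to the same physics: only the switching (dynamic) power does useful work, while leakage is dissipated as heat. A minimal sketch of that split, using entirely hypothetical wattage figures chosen only to illustrate the roughly 50/50 ratio described for 20nm parts:

```python
# Illustrative sketch of the useful-vs-wasted power split in a chip.
# Dynamic (switching) power does useful work; leakage power is lost as
# heat. The wattages below are hypothetical, not measured data.
def useful_fraction(p_dynamic_w, p_leakage_w):
    """Fraction of total chip power that performs useful switching work."""
    total = p_dynamic_w + p_leakage_w
    return p_dynamic_w / total

# Hypothetical 20nm-class chip: 45 W of switching, 47 W lost to leakage.
share = useful_fraction(45.0, 47.0)
print(f"Useful fraction: {share:.0%}")  # ~49%
```

As feature sizes shrink, leakage grows relative to switching power, which is why the split drifts toward (and past) 50/50 on smaller nodes.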

Despite these disadvantages, silicon sits atop a multi-billion-dollar industry with specialized distribution networks and a deeply entrenched supply chain. Even so, there are several candidates to replace it in different industries and applications.

The application of flexible circuits in the healthcare industry is more than a decade old. Applications such as optical detectors, flat-panel displays, and sensor arrays have been built on the thin-film transistor model. These flexible integrated circuits can serve as thermal sensors for artificial skin, while flexible vibration sensors and photodetectors could work as an electrically powered artificial inner ear (auris interna) and retina, respectively, as well as electronic implants (2).

On the other hand, bronze is often used in integrated circuits, but it is not always the best material outside an office or home. Operating conditions also determine how resilient an integrated circuit must be: if it must operate in a very corrosive environment, bronze should be replaced by stainless steel; if its workload is supercomputing, aluminum may be better suited to the low temperatures involved (3).

The last candidate to replace silicon could be graphene: it is a better conductor, more resistant to temperature changes, and very flexible. Although these properties make it a viable candidate, extracting a gram of usable graphene to build a processor remains economically unfeasible, at around $800 per gram (1). Moreover, when graphene is doped with impurities to turn it into a semiconductor, its electrical behavior begins to change. Although it is a great option for many reasons, it still has a long way to go before it becomes a real contender to replace silicon.