
I started playing with electronics a while ago, making simple logic gates using transistors. I know modern integrated circuits use CMOS instead of transistor-transistor logic. The thing I can't help wondering about is how CPUs are designed.

Is design still done at a (sub)logic gate level, or is there not much innovation in that area anymore, and have we moved on to a higher level of abstraction? I understand how an ALU is built, but there is a lot more to CPUs than that.

Where do the designs for the billions of transistors come from? Are they mostly auto-generated by software, or is there still a lot of manual optimization?

Overv
  • I'd say Verilog or VHDL. – avakar Mar 30 '12 at 15:27
  • While these topics are fascinating, we seem to be a long way from ["practical, answerable questions based on actual problems that you face"](http://electronics.stackexchange.com/faq). Also, I can imagine an entire book that answers this question. – Martin Mar 30 '12 at 15:31
  • Isn't it VLSI now? http://en.wikipedia.org/wiki/Vlsi – AngryEE Mar 30 '12 at 15:38
  • @Martin I strongly agree - for even a single processor, this could be a large article. – W5VO Mar 30 '12 at 15:45
  • @W5VO No worries, I just needed to know what to look for. With "Verilog" and "VHDL" this question was pretty much answered for me. – Overv Mar 30 '12 at 15:49
  • @Overv, there is still a lot of work in making sure the base blocks you are plugging together are optimized at the gate level; then you just plug those optimized blocks together in an optimized way! – Kortuk Mar 30 '12 at 16:13
  • I voted to re-open -- while I agree that a complete answer telling "everything you need to know to build an entire CPU from scratch" is not a good match for this site, I think a brief overview and [a few](http://electronics.stackexchange.com/questions/16206/how-are-most-alus-built-and-is-it-possible-to-build-your-own) [links](http://electronics.stackexchange.com/questions/4185/what-are-different-types-of-computer-architectures) would be a good answer here. – davidcary Mar 30 '12 at 17:59
  • This is a fascinating question. Time to jump down the rabbit-warren of research required to answer it! – Polynomial Apr 01 '12 at 12:40
  • I'm surprised we don't have any people who work in the semiconductor industry who can comment on more complex ASIC design here. Since I am no expert, this is just some stuff I've heard: There is a lot of licensing of IP cores, and the field by which it is all put together is called VLSI. I believe design is done in VHDL/Verilog with highly optimized synthesizing tools - how this gets down to the wafer level and manufactured is beyond my knowledge. – Jon L Apr 07 '12 at 16:07
  • Early CPUs had thousands of transistors, not billions. The design of the [MOS Technology 6502](https://en.wikipedia.org/wiki/MOS_Technology_6502) (around 4300 transistors) has passed into the public domain. You should be able to find it on the internet. – Solomon Slow Feb 20 '18 at 15:16
  • Sounds like you have learned a bit about _combinational logic_ (e.g., how ALUs work). The "lot more than that" you are looking for might be [sequential logic](https://en.wikipedia.org/wiki/Sequential_logic). That's the part that makes things happen over time. A good place to start would be understanding [finite state machines](https://en.wikipedia.org/wiki/Finite-state_machine). – Solomon Slow Feb 20 '18 at 15:19

3 Answers


CPUs and SoCs are very likely designed using hardware description languages; Verilog and VHDL are the two major players.

These languages allow different levels of abstraction. In VHDL, you can define logic blocks as entities; an entity contains input and output ports. Within the block you define the required logic. Say you define a block with inputs A and B and output C. You could simply write `C <= A and B;`, and you have basically created an AND gate block. This is possibly the simplest block you can imagine.
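As a minimal sketch (the entity and port names here are mine, purely for illustration), that AND block could be written in VHDL like this:

```vhdl
-- Minimal illustrative block: C is the AND of inputs A and B.
library ieee;
use ieee.std_logic_1164.all;

entity and_block is
    port (
        A : in  std_logic;
        B : in  std_logic;
        C : out std_logic
    );
end entity and_block;

architecture rtl of and_block is
begin
    C <= A and B;  -- the block's entire behaviour
end architecture rtl;
```

Note that VHDL signal assignment uses `<=`, and that the entity (the port list) is kept separate from the architecture (the behaviour); that separation is part of what makes blocks reusable.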

Digital systems are typically designed with a strong hierarchy. One may start at the 'top level' with the major functions a CPU requires: processor core(s), memory, PCI Express, and other buses. At this level, the buses and communication signals between memory and processor may already be defined.

When you step one level down, each block is defined in terms of its inner workings. Take the example of a microcontroller: it may contain a UART interface. The actual logic required to make a functional UART is defined one level below. There, a lot of other logic may be required to generate and divide the required clock, buffer data (FIFO buffers), and report data to the CPU (some kind of bus system).
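To give a flavour of that lower level, here is a sketch of one small piece such a UART would need: a baud-rate tick generator that divides the system clock. The names, the generic, and the divide ratio are all assumptions for illustration, not from any particular design:

```vhdl
-- Hypothetical baud-rate generator: divides the system clock down to
-- a one-cycle tick at the baud rate.
library ieee;
use ieee.std_logic_1164.all;

entity baud_gen is
    generic (
        DIVISOR : positive := 868  -- e.g. 100 MHz clock / 115200 baud
    );
    port (
        clk       : in  std_logic;
        rst       : in  std_logic;
        baud_tick : out std_logic   -- pulses high for one clock cycle
    );
end entity baud_gen;

architecture rtl of baud_gen is
    signal count : natural range 0 to DIVISOR - 1 := 0;
begin
    process (clk)
    begin
        if rising_edge(clk) then
            if rst = '1' then
                count     <= 0;
                baud_tick <= '0';
            elsif count = DIVISOR - 1 then
                count     <= 0;
                baud_tick <= '1';  -- one tick per DIVISOR clock cycles
            else
                count     <= count + 1;
                baud_tick <= '0';
            end if;
        end if;
    end process;
end architecture rtl;
```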

The interesting thing about VHDL and digital design is the reuse of blocks. You could, for example, just copy and paste the UART block in your top level to create two UARTs (well, maybe not quite that easily: only if the UART block is capable of some kind of addressing!).
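To make the reuse concrete with the much simpler AND block sketched earlier (a real dual-UART top level would follow the same pattern, with the addressing logic on top), a structural top level might instantiate the block twice like this:

```vhdl
-- Illustrative structural reuse: two copies of the same block,
-- wired to different signals. All names are hypothetical.
library ieee;
use ieee.std_logic_1164.all;

entity top is
    port (
        a0, b0, a1, b1 : in  std_logic;
        c0, c1         : out std_logic
    );
end entity top;

architecture structural of top is
begin
    u0 : entity work.and_block port map (A => a0, B => b0, C => c0);
    u1 : entity work.and_block port map (A => a1, B => b1, C => c1);
end architecture structural;
```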

This design isn't gate-level design of any kind. The VHDL can also be 'compiled' (synthesized) so that it is finally translated into logic gates, and a machine can optimize this far better than a human could (and quicker, too). For example: say the internals of block A require an inverter before outputting a signal, and block B takes this output signal and inverts it once again. Two inverters in series don't do much, right? Correct, so you may just as well leave them out. However, in the top-level design you won't be able to spot the two inverters in series; you just see two ports connected. A compiler optimizes this far quicker than a human.
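A tiny self-contained illustration of that double-inverter case (names hypothetical): described like this, a synthesis tool will cancel the two inversions, wire `y` straight to `x`, and discard both gates, even though neither 'block' sees the redundancy on its own:

```vhdl
-- Two inverters in series: block A inverts on the way out,
-- block B inverts again on the way in.
library ieee;
use ieee.std_logic_1164.all;

entity double_inv is
    port (
        x : in  std_logic;
        y : out std_logic
    );
end entity double_inv;

architecture rtl of double_inv is
    signal a_out : std_logic;
begin
    a_out <= not x;      -- inverter at the output of "block A"
    y     <= not a_out;  -- inverter at the input of "block B"
    -- Synthesis reduces this to y <= x; both gates disappear.
end architecture rtl;
```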

Basically, digital system design consists of describing how the logic should behave, and the computer is used to figure out the most efficient way to lay out the individual logic gates.

Hans
  • Just as there is still a place for assembly code in software, lower-level hardware design can be cost-effective in some cases. E.g., SRAM cells are so commonly used that highly optimized designs are developed to optimize for density (last-level cache), access latency (L1 cache), or other characteristics, especially at an integrated design-manufacturer like Intel. – Dec 17 '12 at 03:48
  • @Paul the intriguing question that raises for me is how much Intel invests in hand-optimizing their designs vs. writing software optimization passes that achieve the same performance improvements dynamically and more generally. – Ponkadoodle Jun 04 '15 at 05:40

Let me simplify and expand my previous comments and connect the dots for those who seem to need it.

Is design still done at a (sub)logic gate level?

  • YES

Design is done at many levels, and the sub-logic level is always different. Each fabrication shrink demands the most brilliant physics, chemistry, and lithographic process experience, as the structure of a transistor changes and its geometry changes to compensate for trade-offs while shrinking down to atomic levels, at a cost of ~$billions for each binary step down in size. Achieving 14 nm geometry is a massive undertaking in R&D, process control, and management, and that is still an understatement!

For example, the job skills required to do this include: "FET, cell, and block-level custom layouts, FUB-level floor plans, abstract view generation, RC extraction, and schematic-to-layout verification and debug using phases of physical design development including parasitic extraction, static timing, wire load models, clock generation, custom polygon editing, auto-place and route algorithms, floor planning, full-chip assembly, packaging, and verification."

Is there not much innovation in that area anymore?

  • WRONG. There is significant and heavily funded innovation in semiconductor physics; judging by Moore's Law and the number of patents, it will never stop. The savings in power and heat, and thus the quadrupling in capability, pay off each time.

Have we moved on to a higher level of abstraction?

  • It never stopped moving. Consider the demand for more cores; for doing more in one instruction, as in ARM RISC CPUs; for more powerful embedded µCs/MCUs; for smart RAM like DDR4, which has ECC by default and sectors, like flash, with priority bits for urgent memory fetches. CPU evolution and architectural changes will never stop.

Let me give you a hint: go do a job search at Intel, AMD, TI, or AD for engineers and read the job descriptions.

Where do the designs for the billions of transistors come from?

  • They came from adding more 64-bit blocks of hardware. But now, heading toward nanotube-scale failures, thinking has to change from the top-down approach of blocks to the bottom-up approach of nanotubes to make it work.

Are they mostly auto-generated by software?

  • With tongue firmly planted in cheek: actually, they are still extracting designs from the spaceships at Area 51 and have a way to go... until we are fully nano-nanotube compliant. An engineer goes into the library and says, "nVidia, we would like you to join us over here on this chip," and it becomes a part, which goes into a macro-block. The layout can be replicated like the ants in Toy Story, but explicit control over all connections must be manually routed and checked, with DRC and auto-routing used for comparison. Yes, automation tools are constantly being upgraded to remove duplication and wasted time.

Is there still a lot of manual optimization?

  • Considering that one airline saved enough money to pay your salary by removing just one olive from the first-class dinner, Intel will be looking at ways to remove as many atoms as possible within the time-frame. Any excess capacitance means wasted heat and performance, and oops, also more noise, not so fast...

But really, CPUs grow like Tokyo: not overnight, but tens of millions live there now, after steady improvement. I didn't learn how to design at university, but by reading and trying to understand how things work I was able to get up to speed in industry pretty fast. I got 10 years of experience in my first 5 years, across aerospace, nuclear instrument design, SCADA design, process monitoring, antenna design, automated weather station design and debug, OCXOs, PLLs, VLF receivers, two-way remote control of Black Brant rockets... and that was just my first job. I had no idea what I could do.

Don't worry about the billions of transistors, and don't be afraid of what to learn or how much you need to know. Just follow your passion and read trade journals in between sleep; then you won't look so green on the job, and it won't feel like work anymore.

I remember having to design a 741-like op-amp as part of an exam one time, in 20 minutes. I have never really used it, but I can recognize the good designs from the great ones. But then, it only had 20 transistors. *(op-amp schematic image)*

But how to design a CPU must start with a spec, namely: why design this CPU, and what measurable benchmarks must it achieve? For example:

  • Macro instructions per second (MIPS), which matter more than CPU clock. Intel's Itanium chip, for instance, is based on what they call an Explicitly Parallel Instruction Computing (EPIC) design, while Transmeta patented a CPU design with very long instruction word code-morphing microprocessors (VLIWCMM); they sued Intel in 2006, closed shop, and settled for ~$200 million in 2007.
  • Performance per watt (PPW), for when power costs exceed the cost of the chip (as in servers).
  • FLoating point Ops Per Second (FLOPS), for math performance.

There are many more metrics, but never base a CPU's design quality on its GHz speed (see the megahertz myth).

So what tools du jour are needed to design CPUs? The list would not fit on this page, stretching from atomic-level physics design, to dynamic-mesh EMC physical EM/RF design, to front-end design verification test engineering, where the skills required include:

  • front-end RTL simulation;
  • knowledge of IA and computer architecture and system-level design;
  • logic verification and logic simulation using either VHDL or Verilog;
  • object-oriented programming and various CPU, bus/interconnect, and coherency protocols.

Tony Stewart EE75
    "Verilog" and "VHDL" only scratches the surface of all these naive yet inspiration searching questions. **The real world is much more analog than digital than you realize.** – Tony Stewart EE75 May 17 '12 at 16:30
  • Do you have an explanation of the op-amp circuit anywhere? All I can see is a cascoded OTA; the rest is circuit voodoo. – CyberMen May 17 '12 at 20:33
  • Wow. Too bad it's mostly irrelevant to the question. – Dave Tweed Oct 19 '12 at 22:18
  • I must say that this was a very amusing read: the gradual shift from the writer toying with the original question, to his attempt to blow the reader's mind with numbers and vocab, then providing some self-help, followed by a reminiscence of his schoolboy days with just a *hint* of arrogance, and finally moving on to the clichéd "it's so complicated that I could never summarize it here". Absolutely beautiful. – Ponkadoodle Jun 04 '15 at 05:53
  • And even though that comment of mine was somewhat satirical, I hope you will take it light-heartedly. I honestly did enjoy the read. – Ponkadoodle Jun 04 '15 at 05:53

AMD's Overview of CPU design

Intel's version

Neither of these provides much detail, but they are interesting nonetheless. Don't accept this as an answer; others have considered your question in detail and have put more effort into answering it.

rdivilbiss
  • I had seen that TomsHardware page before. However, it explains how processors are *manufactured*, not how they're *designed*. – stevenvh Apr 07 '12 at 16:57