
What are the steps taken by the CPU to sum two numbers (2+2), from the keyboard input to the display on the screen?

For example: reading the ASCII code ... converting the typed number to binary ... sending it to the CPU ... printing it on the screen?

RollRoll
  • To close voters: please don't VTC this as "unclear what you're asking." Just because *you* don't understand basic compiler theory doesn't mean this question isn't absolutely clear to those of us who do. – Mason Wheeler Jan 20 '17 at 20:28
  • @MasonWheeler: I will VTC because it is incredibly broad though. – whatsisname Jan 20 '17 at 20:36
  • What is wrong with you people? We have someone here who wants to learn and simply doesn't know where to start, and you deny anyone even the opportunity to point him in the right direction?!? – Mason Wheeler Jan 20 '17 at 20:40
  • Just migrate this question to the appropriate site. – Christopher Francisco Jan 20 '17 at 20:49
  • @ChristopherFrancisco Which would that be? (Also, this is not something you can do when it's already been put on hold.) – Mason Wheeler Jan 20 '17 at 20:50
  • @MasonWheeler I was looking for the appropriate site but I can't find one related to this topic. Closest one is SO but it's still for programming (actually writing code) questions so it might get closed... – Christopher Francisco Jan 20 '17 at 20:53
  • Well, I tried to figure out the appropriate site; I posted it a while ago on Server Fault and they said it doesn't belong there. I'm happy to give more details about this question. I just don't know how to be more specific. Honestly, I don't understand the complaints about my question. – RollRoll Jan 20 '17 at 20:55
  • @ChristopherFrancisco: Before we migrate this to a site that may or may not exist, can you explain how this is not related to software engineering? Does software engineering not include programming at high levels as well as low levels like assembly? – AnotherDeveloper Jan 20 '17 at 20:56
  • I'm inclined to believe that *as is*, this question is indeed much too broad (does OP want high level pseudocode, low level (pseudo)assembly, or machine code? cpu only or cpu + peripherals?), but should be fixable. I'm inclined to believe it's probably on topic, as understanding what's going on with code can be useful in writing it, though depending on how clarification goes (if at all), it may turn out to be off topic. – 8bittree Jan 20 '17 at 20:58
  • They are only asking for steps, and for 99% of stuff the steps are the same, i.e. the compiler/interpreter parses raw code into a syntax tree, then emits machine code (possibly with intermediate compilers/interpreters and optimization in between); the CPU executes the machine code, part of which is opcodes that basically tell the CPU to shunt data or code to different bits of hardware, in this case possibly to an ALU but certainly to some bit of hardware that implements an adder; the adder returns a result, and more machine code will shunt that into memory somewhere and eventually on to the display – jk. Jan 20 '17 at 21:00
  • @AnotherDeveloper I could be wrong, but this site (Software Engineering) is for questions relating to architecture, design principles, etc. I think of it as if I'm in front of a whiteboard trying to figure out a way to solve something. Those kinds of questions. Again, I could be wrong. – Christopher Francisco Jan 20 '17 at 21:01
  • @ChristopherFrancisco And compilers don't fall under architecture and design? – Mason Wheeler Jan 20 '17 at 21:04
  • @MasonWheeler While I agree this should be reopened, I don't think accusations of trolling help; afaict everyone is well meaning here, it's just a difference of opinion – jk. Jan 20 '17 at 21:04
  • @ChristopherFrancisco Maybe this would be the right site: http://cs.stackexchange.com/help/on-topic – AnotherDeveloper Jan 20 '17 at 21:08
  • @jk. I agree. I also feel that deciding what questions are on or off topic or of sufficient quality is a balancing act by people. That said, to me, the greater Stack Exchange community tends to be a little overzealous in marking questions as insufficient. Over time, I am finding more answers to my questions outside of Stack Exchange, and I believe there is a correlation here. – AnotherDeveloper Jan 20 '17 at 21:12
  • I think it's too broad because it involves keyboard and screen (i.e. OS) – imel96 Jan 20 '17 at 21:45
  • I don't know if it's too broad per the rules....I think it could be fun to write an answer though. – John Wu Jan 20 '17 at 23:26
  • I decided to reopen as there were already 4 reopen votes on it. I can understand how this might be considered broad if you truly wanted to give a comprehensive answer, but I still feel like this can be solved with a high level overview without getting into mucky details of computer architecture and logic gate combinations. – maple_shaft Jan 22 '17 at 21:42
  • Not qualified to answer this, but this link is fairly good at describing how CPU logic gates can be combined to create adders. If you understand how transistors can be combined to create NAND gates, and NAND gates can be combined to create other logic gates, then you can somewhat grasp how a CPU with many transistors can be wired together to create an OP code that triggers an ADD operation (see the sketch after these comments). http://www.cs.bu.edu/courses/cs101/old/2013spring/slides/CS101.06.GatesCircuitsAdder.ppt.pdf – maple_shaft Jan 22 '17 at 22:01
  • "from the keyboard input" makes this a **gigantic** question – Richard Tingle Jan 22 '17 at 22:28
  • I agree and am voting to close for the same reason. (Source: Co-wrote a compiler in the 1980s.) – Blrfl Jan 22 '17 at 22:35
  • [Add 2 numbers in assembly language and print the result](http://stackoverflow.com/a/23682807/102937) – Robert Harvey Jan 22 '17 at 23:45
  • My complaint with this is that there are very few applications where you type in equations and get a result. A computer is not a calculator. – user1118321 Jan 23 '17 at 06:08
  • @user1118321 A computer is most certainly a calculator. Just because humans often use it for things other than inputting just two numbers, adding them, and printing the result, doesn't make it not a calculator. Anyway, many applications fit the general format here perfectly: read input, convert to a more appropriate format, perform some operations using that input to generate some output, convert that output to a more appropriate output, write the output. Using addition as the work here is a good choice: it avoids all the noise that would be added by using a web browser or a spreadsheet. – 8bittree Jan 23 '17 at 15:17
  • @RichardTingle Apologies, I missed that part about "keyboard input". I wonder if it can be edited out without making the existing answer confusing? – maple_shaft Jan 23 '17 at 16:07
  • I don't know just how low level you are asking about, but I made a couple videos to explain how computers work at the lowest level http://youtube.com/playlist?list=PL5sUefBdGhUVwK08Tgef92m4l5pvsCKvC – jhocking Jul 02 '19 at 13:24
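As a rough illustration of maple_shaft's point above, here is a hypothetical C sketch (not taken from the linked slides) of a 1-bit full adder built purely out of NAND operations, chained to compute 2 + 2. Every name here is made up for illustration:

```c
#include <stdio.h>

/* A "NAND gate" as a C function: the universal building block. */
static unsigned nand(unsigned a, unsigned b) { return !(a & b) & 1u; }

/* XOR built from 4 NAND gates. */
static unsigned xor_gate(unsigned a, unsigned b) {
    unsigned n = nand(a, b);
    return nand(nand(a, n), nand(b, n));
}

/* AND and OR, also from NANDs. */
static unsigned and_gate(unsigned a, unsigned b) { unsigned n = nand(a, b); return nand(n, n); }
static unsigned or_gate(unsigned a, unsigned b)  { return nand(nand(a, a), nand(b, b)); }

/* 1-bit full adder: sum = a ^ b ^ cin, carry = ab | cin(a ^ b). */
static void full_adder(unsigned a, unsigned b, unsigned cin,
                       unsigned *sum, unsigned *cout) {
    unsigned axb = xor_gate(a, b);
    *sum  = xor_gate(axb, cin);
    *cout = or_gate(and_gate(a, b), and_gate(cin, axb));
}

int main(void) {
    /* 2 + 2 is binary 10 + 10: chain two adders, bit by bit. */
    unsigned s0, c0, s1, c1;
    full_adder(0, 0, 0, &s0, &c0);   /* bit 0: 0 + 0 */
    full_adder(1, 1, c0, &s1, &c1);  /* bit 1: 1 + 1 carries out */
    printf("2 + 2 = %u\n", (c1 << 2) | (s1 << 1) | s0);  /* prints 4 */
    return 0;
}
```

An ALU's adder is essentially this, replicated 32 or 64 times with the carries chained (plus faster carry tricks).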

1 Answer


The "higher level" steps are as you've guessed: Get characters from keyboard, convert to integers, add the integers, convert the result back to characters, then display those characters.

The CPU does none of this itself.

The CPU executes tiny little instructions that each do a very simple (and very specific) thing. These instructions are the fundamental (lowest level) building blocks of software. Each of those high level steps represents many instructions.
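For a sense of scale: even the single step "add the integers" is a handful of these instructions. A compiler might turn the C statement below into something roughly like the x86-64 in the comment (illustrative only, not a guaranteed compiler output):

```c
#include <stdio.h>

int main(void) {
    long a = 2, b = 2;
    /* The one high-level action below is already several machine
     * instructions; a compiler might emit roughly:
     *   mov rax, [a]    ; load a from memory into a register
     *   add rax, [b]    ; add b to it
     *   mov [sum], rax  ; store the result back to memory
     */
    long sum = a + b;
    printf("%ld\n", sum);
    return 0;
}
```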

For a detailed example, consider what it takes to get one character from the keyboard (making multiple assumptions about the computer's hardware and the OS being used, and over-simplifying):

  • Some sort of controller (that the keyboard is connected to) will send an IRQ (Interrupt ReQuest) to the CPU, and the CPU will (sooner or later) respond by starting an interrupt handler.
  • The OS's interrupt handler will figure out what the IRQ was and invoke a device driver's interrupt handler.
  • The device driver's interrupt handler will do whatever it has to do to get the byte from the controller (this can be several layers of "complex" for some cases - e.g. USB). Then it'll send that byte to a keyboard driver.
  • The keyboard driver will figure out a "scan code", which typically involves a state machine to figure out if multiple bytes are part of the same scan code (or part of a new scan code). Then it will typically convert the "potentially multi-byte scan-code" into a "fixed size integer key-code".
  • Then the keyboard driver will use the key-code and various lookup tables and other meta-data (that depend on which keyboard layout is being used) to determine if there is/isn't a "character" (Unicode codepoint?) associated with that key. Note that a lot of keys simply don't have any character.
  • The keyboard driver will combine this with other information to form some sort of "key press event"; and send that event somewhere (e.g. to a GUI).
  • The "key press event" will make its way through various processes (e.g. from X to GUI to terminal emulator to shell to foreground console app) until it finds its way to an application. This can involve stripping a lot of useful information at some point (terminal emulator) to make it work for legacy stdin.
  • Once the key/character arrives at the application, there's typically some sort of input buffering that allows the user to edit (and supports things like backspace, delete, cursor movement, cut/copy/paste, etc). Also, the "current buffer" is typically being displayed while the user edits it (so that they can see what they're doing). Usually, when the user presses "enter", the entered text is considered complete. This may all be done by a library (e.g. the C standard library).
  • Then the application determines if the input is valid. E.g. if it's expecting a string representing a number but the user typed "FOO" then it may (should) display an appropriate error message and reject the input.
  • While doing input validation, or after doing input validation, (or instead of doing input validation, for extremely bad software), the application converts the input text (a string representing a number) into an integer (see the sketch after this list).
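As a rough sketch of those last two steps (validation plus conversion), here is a hypothetical `parse_int` helper - a simplified stand-in for library routines like `strtol`, which also handle signs, overflow, whitespace, and other bases:

```c
#include <stdbool.h>
#include <stdio.h>

/* Convert a string of decimal digits to an integer, rejecting bad input.
 * Returns true on success; a simplified stand-in for strtol(). */
static bool parse_int(const char *s, long *out) {
    long value = 0;
    if (*s == '\0' || *s == '\n') return false;   /* empty input is invalid */
    for (; *s != '\0' && *s != '\n'; s++) {
        if (*s < '0' || *s > '9') return false;   /* e.g. reject "FOO" */
        value = value * 10 + (*s - '0');          /* shift in one decimal digit */
    }
    *out = value;
    return true;
}

int main(void) {
    long n;
    if (parse_int("22", &n)) printf("parsed %ld\n", n);  /* parsed 22 */
    if (!parse_int("FOO", &n)) puts("rejected FOO");     /* rejected */
    return 0;
}
```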

Note that all of the above can easily add up to thousands of tiny little instructions being executed by the CPU, even though it covers only a fraction of the work (barely more than one of the "higher level" steps we started with), and even though I didn't provide any details of how the input buffer is displayed while it's being edited (font engine, text layout engine, 2D graphics renderer).

Across all of the higher level steps (get 2 numbers from the user, add them, then display the result), the total number of instructions that a CPU executes can (literally) run into the millions.

Brendan
  • "convert to integers, add the integers, convert the result back to characters" - The CPU almost certainly directly performs all the instructions necessary for these three tasks. These three tasks are also much simpler than the I/O tasks. The actual work for all three of these combined is not more than a few dozen instructions and can be done entirely in user space, meaning no extra complexity from calling into the kernel, and even calling into a library for the conversions doesn't necessarily add much more than a few `push`es, a `call` and a `return`. – 8bittree Jan 23 '17 at 15:06
  • @8bittree: Ignoring all the work that a library function does (and only caring about the instructions used to call the library function) does not give a complete picture of "the steps/instructions a CPU actually does". – Brendan Jan 25 '17 at 01:56
  • I didn't say anything about ignoring the work done by a library function. In fact, I specifically mentioned the extra work that using a library function, rather than your own code, would add. – 8bittree Jan 25 '17 at 02:01
  • @8bittree: In that case, (except for pointless "splitting hairs" - e.g. "CPU doesn't do each high level step" vs. "CPU does all the tiny little steps that make up a high level step") I have no idea what the point of your comment was. – Brendan Jan 25 '17 at 03:40
  • There were two things that I was addressing in my comment: 1) Your statement "The CPU does none of this itself" seems to imply, at least to me, that it offloads all those tasks to a co-processor or external device, which is rarely, if ever, the case for the three tasks I quoted in my original comment. 2) You're not explicit about it, but in several places, you seem to imply that those same three steps are roughly equivalent in complexity to the I/O steps, and need thousands or millions of instructions to complete, which is a gross misrepresentation of the complexity of those three steps. – 8bittree Jan 25 '17 at 22:44
  • @8bittree: I mostly wanted to point out that the CPU doesn't do high level steps (there's no "get keypress from user" instruction); and if you want to be picky, getting keyboard input actually begins with a micro-controller that polls a grid of switches and communicates with another micro-controller (half the work is done before the CPU gets an IRQ). – Brendan Jan 30 '17 at 03:02
  • @8bittree: Something like (e.g.) converting a string into an integer can easily be more work than the I/O. Consider things like accepting `1.234e4` and `four`, and/or internationalisation (e.g. `1,234.0` vs. `1.234,0`), instead of only thinking of the crude/minimal "joke code" from places like the standard C library. – Brendan Jan 30 '17 at 03:05