13

I'm not talking just about how memory is assigned and managed (things you can learn from C, for example), but rather about the hardware side: how each component of the computer works internally and how the components communicate with each other.

How many of you know all of this?

Adrian
  • 147
  • 1
  • 3

9 Answers

31

It depends on what you do.

If you're an embedded developer (and you're writing very near the metal in a very small device), you need to know every in and out of every component in the system.

If you're a systems developer (and are writing operating systems or device drivers or maybe even databases), then you'll need to know just about everything there is to know about low-level hardware interfaces.

If you're a games developer and late in your project (where you're optimizing things), you need to know the ins and outs of the CPU cache and graphics architectures you'll be using.
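The cache effect described here can be made concrete with a short C sketch (the function names and the assumption of 64-byte cache lines are illustrative, not from the answer): both functions below compute the same sum, but the row-major version walks memory sequentially and typically runs several times faster on a large matrix.

```c
#include <stddef.h>

#define N 1024

/* Row-major traversal touches memory sequentially, so each cache
 * line (typically 64 bytes) fetched from RAM is fully used before
 * the next one is loaded. */
long sum_row_major(int m[N][N]) {
    long sum = 0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            sum += m[i][j];
    return sum;
}

/* Column-major traversal strides N * sizeof(int) bytes between
 * accesses, so on a matrix much larger than the cache nearly every
 * access misses, even though the arithmetic is identical. */
long sum_col_major(int m[N][N]) {
    long sum = 0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            sum += m[i][j];
    return sum;
}
```

Timing the two loops on a matrix that doesn't fit in cache is a classic way to see why this knowledge matters during an optimization pass.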

If you're an applications developer, you don't need to know any of this except keeping memory usage at reasonable levels.

If you're a web developer, it's handy to know internet protocols, but none of this other stuff is necessary except how to deal with memory issues.

greyfade
  • 11,103
  • 2
  • 40
  • 43
  • 20
    Knowing about your machine always makes you a better developer, even if you only do line-of-business applications. Such knowledge may not be required, but it is certainly important. – Robert Harvey Dec 29 '10 at 19:22
  • @Robert Harvey: Correct, which is why I say "don't *need* to know." It certainly doesn't hurt to know these things anyway. :) – greyfade Dec 30 '10 at 04:41
  • 2
    +1, but I know (more than) a few web developers who never consider memory usage. – Tim Post Dec 30 '10 at 19:13
  • -1 for "If you're an applications developer, you don't need to know any of this except keeping memory usage at reasonable levels." Some of the larger problems in the systems around here are caused by application designers not knowing how to use databases and then blaming the dba's when "things are running slow, and its not my code; see, there's the profile". – Andrew Hill Sep 08 '14 at 08:16
  • @AndrewHill: When is that a hardware matter and not an algorithmic one? The question is about understanding low-level hardware behavior, not high-level algorithms. – greyfade Sep 08 '14 at 17:16
12

No, you don't have to, but I think it is an excellent idea. Learning a general overview of how things work at the logical level has really helped me in development.

Michael K
  • 15,539
  • 9
  • 61
  • 93
6

I think the comments about application developers and web developers are wrong.

For example, if someone is developing a web application that involves a lot of security work, like SSH or some other encryption algorithm, it is pretty important that they know what type of hardware they are running on so they can determine whether the machine can actually handle the workload. Another example might be a server that hosts some sort of downloadable content. You had better know the capabilities of the disk drive, and of whatever type of bus interface it is attached to, if you expect a reasonably large number of requests.

From an application standpoint, if you are developing some sort of CAD program, or something that does 3D rendering, you can expect those applications to be compute-intensive, both algorithmically and graphically. It would be prudent to understand the hardware to make sure the application is responsive and usable.

I am not saying that you have to go as far as understanding the ins and outs of something like the PCI protocol, but you had better understand what the interface and hardware are capable of.

Ultimately, it is important regardless of what type of development you do. The level of detail necessary for you to understand is debatable.

Pemdas
  • 5,385
  • 3
  • 21
  • 41
2

For a professional programmer, I look at it as a holistic approach to understanding the entire system, rather than just knowing the syntax of a given programming language du jour. I find it helps programmers (and analysts) make smart design decisions, and make more informed algorithm & data structure choices.

In my own experience, the best programmers tend to know about the inner workings to varying degrees, whether from understanding the native assembly instruction set for a target platform, an introductory computer organization course at school, rudimentary digital electronics, or being able to follow detailed descriptions of the CPU and GPU cores in the latest models. The best have a more complete knowledge than their less stellar peers.

mctylr
  • 1,173
  • 6
  • 12
1

I'm not sure if it helps, but I feel more comfortable when I know more about the system I'm working on than I really need to. When I was younger, I didn't like not knowing the assembly language of the system I was working on; either I've changed, or I've learned enough of them that one more isn't going to expand my feel for the system much. I've never been much of a hardware guy, but I can take a computer apart, name the different subsystems, and explain how they interact.

David Thornley
  • 20,238
  • 2
  • 55
  • 82
0

What little I know I learned in a computer architecture class almost 25 years ago, and on a non-real-world architecture at that.

I do primarily applications programming on a variety of platforms. There was a period where I was developing code to run on not just commodity x86 hardware, but also on Sparc, PA-RISC, and other server architectures. Knowledge at that level simply isn't required for the work I do.

John Bode
  • 10,826
  • 1
  • 31
  • 43
0

I teach Java programming as an AP high school class. I've found that when students know something about the internal workings of a computer, it does help them understand programming concepts.

I don't go overboard - just simple concepts such as how things are stored in memory seem to help the students.
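One such concept, that a value is just a pattern of bytes at an address, can be shown in a few lines of C (a classroom-style sketch of my own, not from the answer; the function names are made up):

```c
#include <stdio.h>
#include <string.h>

/* Copy out the raw bytes of an int so students can see that a value
 * like 258 (0x00000102) is stored as individual bytes in memory.
 * On a little-endian machine the low byte, 0x02, comes first. */
void int_bytes(int v, unsigned char out[sizeof(int)]) {
    memcpy(out, &v, sizeof v);
}

void show_storage(void) {
    int a = 258;
    unsigned char b[sizeof(int)];
    int_bytes(a, b);

    printf("int occupies %zu bytes:", sizeof a);
    for (size_t i = 0; i < sizeof a; i++)
        printf(" 0x%02x", b[i]);
    printf("\n");

    char s[] = "hi";  /* 3 bytes: 'h', 'i', and the terminating '\0' */
    printf("\"hi\" occupies %zu bytes\n", sizeof s);
}
```

Seeing the bytes laid out like this tends to demystify things like integer overflow and why strings need a terminator.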

bestattendance
  • 173
  • 2
  • 6
0

I'm going to answer this as asked in the title:

Yes, a developer should know the inner workings of the hardware. How much depends on the type of developer & their goals, available time, and personal interest. The priority should, of course, be on the immediate tools, techniques, etc., that they employ in their area. This opinion is along the lines of being a well rounded individual. The more you know of things outside of your craft, without sacrificing your craft, the better.

That doesn't mean that you need to get crazy with the Cheese Whiz. Have a good overview of the hardware, how the pieces interact, how operating systems use them. Along these lines, I suggest reading an operating systems concepts book for all developers.

Do I know all of this? Heck no. I've forgotten so much useless information about SCSI that it's not even funny. However, learning about it was an invaluable experience. I've also forgotten a lot of other hardware related details, but recall the important concepts that I learned from that knowledge.

So, I certainly suggest learning about the hardware. Do it at a reasonable pace, depending on your needs. Learn as much of the details as you reasonably can, but focus on the concepts.

George Marian
  • 4,360
  • 26
  • 31
0

Every developer should know the basic concepts associated with computer engineering: binary arithmetic, base conversion, Boolean algebra, logic gates, composition of a CPU (and what the components do), cache, virtual memory, compression, error detection and correction...
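A few of these basics (binary arithmetic, base conversion, Boolean algebra) fit in a handful of lines of C; this is an illustrative sketch of my own, and the function names are made up:

```c
#include <string.h>

/* Base conversion: render v as a binary string. A moving one-bit
 * mask and a Boolean AND extract each digit in turn.
 * buf must have room for bits + 1 characters. */
void to_binary(unsigned v, unsigned bits, char *buf) {
    for (unsigned i = 0; i < bits; i++)
        buf[i] = (v & (1u << (bits - 1 - i))) ? '1' : '0';
    buf[bits] = '\0';
}

/* Binary arithmetic via Boolean algebra: a half adder built from
 * the same gates (XOR and AND) that a CPU's ALU is built from. */
void half_adder(unsigned a, unsigned b, unsigned *sum, unsigned *carry) {
    *sum   = a ^ b;   /* XOR gate: 1 + 1 = 0 ... */
    *carry = a & b;   /* AND gate: ... carry the 1 */
}
```

Chaining full adders built from this pattern is exactly how hardware adds integers, which is the kind of connection between gates and code being suggested here.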

The extent to which these should be known depends on the job of the programmer. As others have said, an embedded systems developer has to be far more familiar with hardware capabilities, and with how to write software that best utilizes the hardware, than, say, a web developer.

Thomas Owens
  • 79,623
  • 18
  • 192
  • 283