15

A quote from the Wikipedia article "High-level programming language":

A high-level programming language is a programming language with strong abstraction from the details of the computer. In comparison to low-level programming languages, it may use natural language elements, be easier to use, or be more portable across platforms. Such languages hide the details of CPU operations such as memory access models and management of scope.

That is, the higher the level of the programming language, the further the programmer is removed from the hardware the program runs on.

Now, I do not know the usage statistics across language levels, but I suspect that higher-level languages are increasingly replacing lower-level ones. If so, could this lead to a deficit of programmers with computer architecture knowledge? Would that be a problem for the industry?

Jonas
gablin

4 Answers

16

It can, but likely won't lead to a problem.

It's just economics. If the vast majority of people lose the ability to understand the underlying architecture, and there is still a huge NEED to understand the underlying architecture, then the ones who do will have jobs and get paid more, while those who don't will only have jobs where that is not needed (and may still get paid more...who knows?).

Is it helpful to know? Absolutely. You'll likely be a better programmer for it. Is it necessary in most cases? No. That's why abstraction is so great: we stand on the shoulders of giants without having to be giants ourselves (but there will always be giants around).

Ryan Hayes
  • 4
    But all abstractions are leaky. Knowing the underlying architecture is a must if you want to be the go-to guy for troubleshooting leaky abstractions. – dsimcha Oct 20 '10 at 18:19
  • 5
    @dsimcha, Agreed, but to be the go-to guy you need "the others" to come to you ;-) If everyone needs to know everything, the abstraction has failed miserably. – Preets Oct 21 '10 at 13:50
  • 1
    @Preets, And that is why many abstractions *have* failed miserably. To even have room for a go-to guy to exist is proof that an abstraction has already failed. – Pacerier Aug 27 '14 at 00:19
  • @Ryan, It will actually lead to a problem where in the future the world is flooded with apps full of subtle bugs due to layers upon layers of leaky abstractions. It's amazing enough now that companies like Google with *infinite* resources can still have bugs in their core apps. – Pacerier Aug 27 '14 at 00:29
  • 3
    @Pacerier Google has far from infinite resources, and they make applications that are several orders of magnitude more complex and consist of several orders of magnitude more lines of code than most others. Claiming everyone should know low-level computer stuff because all abstractions can leak is like saying everyone should know how to build a house from scratch using no tools because a storm might come and tear their home down. It's just not feasible (or smart) to spend resources like that. – sara Jun 20 '16 at 09:52
9

I think so. It's a trend that has me worried. No abstraction is perfect; if there were a perfect way to simplify any complex problem, it would replace the original very quickly. (That's happened in the past, occasionally with computers, and a lot more frequently in other fields that don't worry as much about backwards compatibility as we do, such as physics.)

What this means is that every time you use an abstraction, there's some important piece of essential complexity that it's hiding from you. If you don't know what that is, why it's there and what it's doing, you end up accidentally writing big train wrecks, and not knowing how to fix them because you don't know what's really going on.
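
To make that concrete (a minimal sketch in C, with an arbitrary array size): the language presents a two-dimensional array as plain indexed memory, but the CPU cache leaks straight through that abstraction. Both functions below compute the same sum, yet on typical hardware the column-major walk is several times slower, and without some knowledge of the memory hierarchy you can neither predict that nor fix it.

```c
/* Minimal sketch of a leaky abstraction: same result, very different
 * performance, because the cache hierarchy leaks through the "array is
 * just indexed memory" abstraction. N is an arbitrary illustrative size. */
#include <stdio.h>
#include <time.h>

#define N 2048

static double grid[N][N];

/* Walks memory in storage order: consecutive accesses hit the same cache line. */
static double sum_row_major(void) {
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += grid[i][j];
    return s;
}

/* Same arithmetic, but each access jumps N doubles ahead, defeating the cache. */
static double sum_column_major(void) {
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += grid[i][j];
    return s;
}

int main(void) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            grid[i][j] = 1.0;

    clock_t t0 = clock();
    double a = sum_row_major();
    clock_t t1 = clock();
    double b = sum_column_major();
    clock_t t2 = clock();

    printf("row-major:    %.0f in %.3fs\n", a, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("column-major: %.0f in %.3fs\n", b, (double)(t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}
```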

Anyone who tries to tell you otherwise is either selling snake oil or simply doesn't have much experience with serious software. At work, I work on a program that runs a good percentage of all the TV and radio stations in the USA. As stations and networks get bigger and more complex, quick-and-dirty techniques that worked fine for designing a product for one small station end up hitting big technical walls when implemented for a network with 50 stations and 200 channels! Without a deep understanding of how the language works (and an efficient language in the first place), and a deep understanding of how the database works, our coders would never have been able to make the product scale successfully.

This isn't an isolated story, either. Software continues to grow more and more complex, not simpler, and I'm afraid that this level of technical expertise is going to become something of a lost art, and tomorrow's programs will be worse than today's, not better.

Mason Wheeler
  • 5
    I've heard it expressed as "you need to know the abstraction one layer down from where you work". Well, it was rather pithier; my memory's flawed. So if you're working in C or Delphi, you should know how assembly works. If you're working in Smalltalk or Java, you should know how your VM works. (Arguably, you should _always_ know _something_ about assembly!) If you're working with TCP, you should know how IP works. And so on. – Frank Shearar Oct 21 '10 at 13:12
  • "[every time ...] there's some important piece of essential complexity that it's hiding from you" I find this to simply not be true. this is like saying that every time you do the dishes, unless you have a solid understanding of exactly how plumbing and the city's water grid and pumping stations and the chemical reactions from your soap and the food, then you are *missing out* and *can never do the dishes quite good enough*. soap producers are not selling "snake oil" simply because they'd claim you don't need a PhD in organic chemistry to apply soap to plates. – sara Jun 20 '16 at 10:34
  • 1
    I mean, you could take it even further: why should it be enough to just know assembly? That's just a hand-holding high-level abstraction over the binary CPU instructions. But wait! Machine code? That's just an abstraction! You need to learn how the CPU is constructed using transistors to build logic gates! And the bus and the registers. But wait! Transistors? That's just an abstraction for a certain configuration of atoms. And atoms are just an abstraction over fluctuations in quantum fields. In the end, this just makes `javascript:alert("Hello world")` require a PhD in string theory. – sara Jun 20 '16 at 10:38
  • 1
    @kai You're being [a bit ridiculous](https://en.wikipedia.org/wiki/Reductio_ad_absurdum). I tend to agree with you that it's not *usually* the case that you need to understand several layers down for *most* applications. But we're talking about edge cases, not every day stuff. Sure, you don't need to know how the plumbing works to do the dishes. But if dirty water starts filling up your sink, you should probably learn a bit about it before you try to fix it. Or, you can just ram a piece of rebar down the drain until it works again, and who cares where the water goes. (cont...) – DrewJordan Jun 20 '16 at 13:20
  • Or learn it first, just in case. Mason's answer might be biased towards cases where performance is a more important constraint than, say, time to market. There's plenty of software that goes out without being fully finished, bug-free and optimized. But that doesn't mean that you shouldn't strive to understand the techniques used. You might need them someday. – DrewJordan Jun 20 '16 at 13:28
  • 1
    @DrewJordan my point is just that: unless you are a professional who *needs* to know about and fix a certain class of issues (or just have a keen private interest), then it's quite frankly absurd to claim that you *need* to know the inner workings and weird subtleties of every tool you're using. I merely drew the standpoint to its logical conclusion. Sure, knowing how an Ethernet cable is built would allow you to build a new one, given the right tools, when your internet connection fails, but spending time and money on gaining that knowledge when you work 20 layers above is just wasteful. – sara Jun 20 '16 at 13:59
  • 1
    @DrewJordan of course SOMEONE needs to know all this stuff so someone can fix it when an abstraction breaks, but the whole POINT of abstractions is to keep the number of people who need to know the details to a minimum so the vast majority can focus on getting stuff done. Where you arbitrarily draw the line of where things get "too low level" just depends on what you personally happen to work with, as illustrated by my argumentum ad absurdum (which isn't a fallacy, I'll have you know!) – sara Jun 20 '16 at 14:01
  • @kai: And my point is that if you *are* a professional or want to become one, (which encompasses the vast majority of users on this site,) eventually you're going to run into these sorts of problems and need to fix them whether you want to or not. (*Professionals* don't stay at the `javascript:alert("Hello world")` level for very long!) And if you don't know how things work under the hood, this is the point where everything comes to a screeching halt for you. – Mason Wheeler Jun 20 '16 at 14:24
  • 1
    @MasonWheeler to some degree, yes, which is why I added my caveat about stuff that is actually relevant to your work, where you could reasonably expect a positive return on the time you spend learning it. Even if I am a world-class professional chef, if the stove breaks I call someone who can repair it; I don't take a course in electrical engineering and fix it myself, because that is a waste of time and resources. – sara Jun 20 '16 at 14:39
5

Yes, I think people will understand hardware much less as languages progress (and, similarly, as instruction sets progress). But as has been noted in many other places, the primary constraint on most programs nowadays is not CPU time or efficiency, but programmer time. If the people who design languages keep doing their job of making the abstractions efficient, and if people keep using those abstractions properly, then an understanding of the computer architecture is not entirely necessary; at least, a complete knowledge of it is not fundamental to being a good programmer these days.

1

No, it won't lead to a deficit of programmers with computer architecture knowledge. Languages are used to solve problems in a particular domain. If you want to solve a particular problem, you use the appropriate language, or one that is good enough given your resources.

In reality, which domains actually need knowledge of the computer's architecture, or need to be tied to a particular hardware architecture? Operating systems? Device drivers? Sure, but even then only parts of such code need specific architectural knowledge.

Performance improvement? Yes, you can apply knowledge of the computer's architecture to improve the performance of algorithms. But two other factors have a bigger impact on performance: the use of better algorithms and knowledge of the language's runtime environment.
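
To illustrate (a minimal sketch in C, with arbitrary, made-up sizes): replacing a linear scan with a one-time sort and binary search changes the asymptotic cost of a batch of membership tests, which buys far more than any architecture-specific tuning of the scan itself.

```c
/* Minimal sketch: algorithm choice vs. low-level tuning.
 * Looking up M keys in an array of N ints by linear scan is O(N*M);
 * sorting once and using binary search is O((N + M) log N).
 * N and M are arbitrary illustrative sizes. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* O(N) per lookup. */
static int contains_linear(const int *data, size_t n, int key) {
    for (size_t i = 0; i < n; i++)
        if (data[i] == key)
            return 1;
    return 0;
}

/* O(log N) per lookup, after a one-time O(N log N) sort. */
static int contains_binary(const int *sorted, size_t n, int key) {
    return bsearch(&key, sorted, n, sizeof *sorted, cmp_int) != NULL;
}

int main(void) {
    enum { N = 100000, M = 20000 };
    int *data = malloc(N * sizeof *data);
    for (int i = 0; i < N; i++)
        data[i] = i * 2;                      /* even values only */

    clock_t t0 = clock();
    int hits_linear = 0;
    for (int k = 0; k < M; k++)
        hits_linear += contains_linear(data, N, k);
    clock_t t1 = clock();

    qsort(data, N, sizeof *data, cmp_int);    /* one-time cost */
    int hits_binary = 0;
    for (int k = 0; k < M; k++)
        hits_binary += contains_binary(data, N, k);
    clock_t t2 = clock();

    printf("linear: %d hits in %.3fs\n", hits_linear, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("binary: %d hits in %.3fs\n", hits_binary, (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(data);
    return 0;
}
```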

In essence, more abstract programming languages solve problems for which knowledge of the computer architecture isn't necessary, and they allow more problems to be solved. The people who use them aren't using them to solve machine-dependent problems. People who need to solve machine-dependent problems will continue to use machine-capable languages. This isn't a zero-sum issue.

Huperniketes