Many textbooks on the C programming language say that C is a high-level programming language, but many tutors online say that C is a middle-level programming language. Why is that?
-
recommended reading: **[Discuss this ${blog}](http://meta.programmers.stackexchange.com/questions/6417/discuss-this-blog)** – gnat Dec 29 '14 at 13:24
-
I empathize with your confusion. This is the sort of classification that would aggravate me to see on an exam because even a "wrong" answer is correct. Pray you never have this as a true/false or multiple-choice exam question. If you do, answer based on what the textbook said, not what you feel is correct (I feel evil recommending going against your gut). – Brandon Dec 29 '14 at 16:11
-
Historically 'high[er]-level languages' were those that, unlike assembly language, allowed the programmer to abstract from the specific machine and write code in a more problem-domain way. Fortran and Cobol were 'high-level' because of that in early days. C is but a thin veil above the machine-specific stuff, but its very purpose is to be portable, machine-unspecific. So by 1960s standards C was high-level. Compared to C++14 or Haskell or OCaml or Rust, C is not-so-high level, of course. – 9000 Jan 06 '15 at 03:05
-
Most of these "modern high-level languages" are really just scripting languages. The difference is whether they are compiled or interpreted. – MarcusJ Jun 04 '17 at 00:29
-
Asm is high level. The abstraction provided by asm over hardware is enormous. – curiousguy May 29 '18 at 17:18
2 Answers
Historically, everything that abstracts over assembly code was called high-level. C certainly does that. This definition is also relatively clear-cut, in contrast to what follows.
Over time, we created more and more programming languages and invented more and more abstractions and tools. Compared to, say, Python, the C language is positively primitive in semantic richness and level of abstraction over the hardware. With that in mind, many people find it misleading to call C high-level, when there is a wealth of languages that are far higher above the hardware.
So now "high-level" usually means "abstracts a lot over hardware" and "low-level" means "abstracts little". This is the definition your tutors use. Not everyone agrees though, and old texts don't magically adopt the new terminology, so you still see the old use of "high-level" (under which C is high-level) floating around. Keep in mind that many good C books are basically newer editions of books released twenty years ago.
-
For something to be considered a proper low-level programming language, it must allow a programmer to easily operate at the same level of abstraction as the underlying hardware. While the C Standard would allow implementations to behave in such fashion, there is no formally-recognized distinction between implementations that operate on that level of abstraction versus those that do not. – supercat Jul 02 '18 at 20:34
It is a higher level language than machine code (assembly), which is the point of view that C programming books and tutorials come from.
In that respect it is a high level programming language.
However, it is still very close to the hardware - much more so than other, more modern languages (Java, C# and such) - when viewed from this point of view, it is a middle level programming language.

-
Then why can't we clearly say that C is a middle-level programming language? In textbooks we don't find that. – CodeIt Dec 29 '14 at 12:24
-
@manutd - that's a question to book authors ;) - but that's the thing. From their point of view, C is a high level language. That's what they deal with. The concept of "high level language" is fuzzy - there is no single definition agreed by all. – Oded Dec 29 '14 at 12:27
-
Unfortunately, agreeing opinions in a community isn't how defining works. Its creators and its concepts are higher level than the assembly language of the processor - this is the defining line. This is what defines it as high level. Any book you pick up on computer science will define it in this way. Tutors may not, and people's opinions may not, but the actual science of how computers and processors work, and how it has been defined there, is very clear. – Ryan Rentfro Apr 23 '16 at 20:46
-
C# allows programmers to access a range of memory as a sequence of 64-bit, 32-bit, 16-bit, or 8-bit values as convenient; if memory holds 16-bit pixels, for example, code can process pixels individually or in groups of 2 or 4, depending upon what's best for the algorithm. C as defined by the Standard doesn't define a practical means by which programmers can request similar semantics. Many C dialects are lower level than C#, of course, but the language defined by the Standard isn't. – supercat Feb 10 '17 at 19:38