29

All programming languages have design flaws, simply because no language can be perfect, just as with most (all?) other things. That aside, which design fault in a programming language has annoyed you the most through your history as a programmer?

Note that a language being "bad" just because it isn't designed for a specific thing isn't a design flaw, but a design decision, so don't list such annoyances. If a language is ill-suited for what it is designed for, that is of course a flaw in the design. Implementation-specific and under-the-hood things do not count either.

Anto
  • 11,157
  • 13
  • 67
  • 103
  • 6
    Note that this is constructive for language designers (mistakes to avoid), if someone would like to question how constructive this question is. – Anto Mar 05 '11 at 16:08
  • Obvious dupe of http://programmers.stackexchange.com/questions/20985/what-mistakes-do-language-writers-often-make-which-doom-their-language – greyfade Mar 05 '11 at 16:33
  • 1
    @greyfade: Not really, this is about flaws in the actual language; that seems to be about things which lower adoption of a language, which could include a bad standard library or just a bad website for the language. Some answers list e.g. bad syntax, but that isn't a *specific* design flaw – Anto Mar 05 '11 at 16:54
  • 8
    Greatest flaw in any programming language? Humans. – Joel Etherton Mar 07 '11 at 12:42
  • If these answers are the greatest flaws, then I'm really impressed by programming languages. – Tom Hawtin - tackline Mar 12 '11 at 19:27
  • Wow, only 1 answer on BASIC and 1 on VB. Guess everyone got tired of bullying that language...or no one is using it anymore :) – Todd Main Mar 12 '11 at 21:03
  • @JoelE That's funny, but could you be more precise? – Mark C Mar 13 '11 at 15:17
  • @MarkC - Programming languages are designed by humans. Humans are imperfect and will unavoidably create flawed things. Humans will also use these things in flawed manners. – Joel Etherton Mar 13 '11 at 19:12
  • @Joel And the Wright brothers are responsible for 9/11. – Rei Miyasaka Mar 18 '11 at 06:01
  • @Rei Miyasaka - Your comment makes absolutely no sense. However, I must commend you for an absolutely unnecessary use of an enormous tragedy to make an inflammatory remark that has no pertinence whatsoever to the topic at hand. – Joel Etherton Mar 18 '11 at 10:03
  • @Joel My point is that "humans are the greatest flaw in programming languages" is absolutely meaningless and adds nothing to a discussion about specific flaws in *programming languages*, not people. Are you trying to encourage a political lynching to sidestep an argument? Because as much as it'll work in light of recent exaggeration of seemingly *everything*, it's a pretty neurotic thing to do. If you'd understood the point I was making, I think you'd have seen that my comment was an example of a comparable absurdity, not an endorsement of a crime or a belittlement of any loss of lives. – Rei Miyasaka Mar 18 '11 at 19:37
  • @Rei Miyasaka - No, my argument was tongue in cheek. Your comment was tasteless. I understood your point, and I think you easily could have chosen a different metaphor to make it. – Joel Etherton Mar 18 '11 at 20:39
  • @Joel How is the mere act of *mentioning a tragedy* in an analogy tasteless? That's seriously just politics. I don't see anyone finding any offense in using earthquakes or tsunamis or heck even nukes (and there's only one specific incident of that) as metaphors for happenings of epic proportions. Either way this conversation is utterly ridiculous. For the sake of people reading, let's agree to stop. – Rei Miyasaka Mar 18 '11 at 20:44
  • @Rei Miyasaka: The key is that you don't see anyone USING earthquakes or tsunamis or nukes (good job bringing it up though) as metaphors. Yes. I agree. You should just shut the hell up. – Joel Etherton Mar 19 '11 at 00:34
  • I beg to differ: http://www.urbandictionary.com/define.php?term=nuke – Rei Miyasaka Mar 19 '11 at 03:23
  • I invoke the power of Godwin's Law on you! Discussion over :p – MattDavey Oct 07 '11 at 16:03

49 Answers

42

One of my big annoyances is the way switch cases in C-derived languages default to falling through to the next case if you forget to use break. I understand that this is useful in very low-level code (e.g. Duff's Device), but it is usually inappropriate for application-level code, and is a common source of coding errors.

I remember reading about the details of Java for the first time in about 1995; when I got to the part about the switch statement, I was very disappointed that they had retained the default fall-through behaviour. This just makes switch into a glorified goto by another name.
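
A minimal JavaScript sketch of the hazard (the describe function is hypothetical, purely for illustration):

// The break after case 1 was forgotten, so an input of 1
// silently runs the code for case 2 as well.
function describe(n) {
  var result;
  switch (n) {
    case 1:
      result = "one";
      // missing break -- execution falls through
    case 2:
      result = "two";
      break;
    default:
      result = "many";
  }
  return result;
}

describe(1); // "two", not "one"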

Greg Hewgill
  • 10,181
  • 1
  • 46
  • 45
  • That's why it's good there's no switch statement in Python. – Christopher Mahan Mar 05 '11 at 22:16
  • 3
    @Christopher Mahan: `switch` doesn't have to work that way. For example, Ada's [case/when statement](http://www.adahome.com/rm95/rm9x-05-04.html) (equivalent to switch/case) does not have fall-through behaviour. – Greg Hewgill Mar 05 '11 at 22:40
  • @Christopher Mahan: You could easily implement a switch construct which implicitly breaks after each case. Python could easily have a switch statement which does this. EDIT: Ninja'd by Greg =D – Ed S. Mar 05 '11 at 23:19
  • but if elif else works just fine, no? – Christopher Mahan Mar 06 '11 at 01:24
  • In Python, you can use `dict` to take care of many of the cases where'd you'd use `switch` in C. – dan04 Mar 06 '11 at 07:31
  • 2
    @Greg: `switch`-like statements in languages unrelated to C don't have to work that way. But if you use C-style control flow (`{`...`}`, `for (i = 0; i < N; ++i)`, `return`, etc.), language inference will make people expect `switch` to work like C, and giving it Ada/Pascal/BASIC-like semantics would have confused people. C# requires `break` in `switch` statements for the same reason, although makes it less error-prone by forbidding silent fallthrough. (But I wish you could write `fall;` instead of the ugly `goto case`.) – dan04 Mar 06 '11 at 08:05
  • 4
    then don't write switch, write if() else. The thing you're complaining about is what makes switch better: it's not a true/false conditional, it's a numeric conditional, and that makes it different. You could also write your own switch function though. – jokoon Mar 06 '11 at 11:40
  • @jokoon, a glorified indexed goto table + program block. –  Mar 07 '11 at 09:34
  • Using switch lets the compiler perform optimizations that it can't do if you use if-else. – Dunk Mar 07 '11 at 20:22
  • 9
    -1 I consider it a benefit to allow fall through, there is no danger from it other than stupidity, yet it grants additional function. – Orbling Mar 08 '11 at 00:25
  • 11
    The problem isn't that `switch` *allows* fallthrough. It's that most uses of fallthrough are not intentional. – dan04 Mar 08 '11 at 00:57
  • 1
    I think that mandating `break` in C# is a flaw - sure, a warning, but mandatory `break`? If it's mandatory then what's the point of it? – Kirk Broadhurst Mar 08 '11 at 02:58
  • 2
    @Kirk Broadhurst: Technically `break` is not mandatory in a C# switch statement, if it was then it would be pointless since it could just be assumed. What is mandatory is a jump statement of some sort, such as `break`, `return` or `goto`. (Except in an empty `case`, where fall-through is allowed.) – Tim Goodman Mar 08 '11 at 04:07
  • 1
    @dan04: I agree, it's accidental fall through that's the problem, and requiring an explicit `fall` would prevent this. However, I suspect (just a guess) that this was left out of C# as a deliberate design decision to avoid bugs due the order of the cases being changed. `goto case` can achieve the same functionality without that risk. Yes, everyone hates gotos, but I'm not sure how strong that sentiment would be if its use had always been restricted to the way it's used in C#'s `switch` statement (and especially if we further restrict ourselves to only forwards-jumping gotos). – Tim Goodman Mar 08 '11 at 04:19
  • @Kirk -- if the case is empty, you don't need a break; you can leave it out and have it fall through to the next case. You also don't need it if you exit control, using `return` for instance. So this works: `switch (ch) { case 'a': case'e': case 'i': case 'o': case 'u': return "vowel"; default: return "non-vowel"; }` – Rei Miyasaka Mar 09 '11 at 20:29
  • @Tim Goodman I really prefer C#'s `switch` for that reason. Normally, you just need break or return, but if you really want fallthrough behavior, then goto can be used to do so. I think that allows the language to best match the programmer's intent. – CodexArcanum Mar 09 '11 at 21:43
  • @Orbling -- most programming languages have as part of their design goals that they reduce the chance of human error. I get that it adds additional function, but that function doesn't require this particular risk. – Rei Miyasaka Mar 09 '11 at 22:06
  • @Rei Miyasaka: Whilst I agree with that up to a point, that threshold need not be high. It is up to a programmer not to be crap, not the language to wrap them in cotton wool. – Orbling Mar 09 '11 at 22:10
  • 1
    @Orbling There's a difference between a programmer that's an idiot and a programmer that's tired. Tired programmers are the sort that make mistakes like neglecting breaks, and they should be wrapped in cotton wool. – Rei Miyasaka Mar 09 '11 at 22:29
  • @Rei Miyasaka: For some idiotic reason, I rarely sleep much - I've had about 3 hours a night average for many many years and I can say that in the quarter of a century that I've been coding, I've never introduced a fall-through where I did not intend one. – Orbling Mar 10 '11 at 00:49
  • 1
    @Orbling Wish I were like you, I can't even talk straight without at least 6 hours of sleep :) – Rei Miyasaka Mar 10 '11 at 01:06
  • @Orbling Just curious, do you do the 15 minutes every 3 hours thing, or do you sleep 3 hours at a time? – Rei Miyasaka Mar 10 '11 at 01:14
  • 1
    @Rei Miyasaki: Usually my daily sleep is in two sections. Depends on the day. Sometimes I'll have 2 1/2, then a couple of hours up and another 1 1/2. Other days, just 1 hour, then back up. Also, you definitely do not want to be like that, lack of sleep is not good for you when sustained. – Orbling Mar 10 '11 at 01:40
41

I've never really liked the use of = for assignment and == for equality testing in C-derived languages. The potential for confusion and errors is too high. And don't even get me started on === in Javascript.

Better would have been := for assignment and = for equality testing. The semantics could have been exactly the same as they are today, where assignment is an expression that also produces a value.
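
As a minimal JavaScript sketch of the kind of bug this invites:

var x = 5;
// Intended: x == 10 (comparison). Actually assigns 10 to x,
// and the assignment expression evaluates to 10, which is truthy.
if (x = 10) {
  console.log("always taken; x is now " + x);
}

With := for assignment, mistyping a single character could not silently turn a comparison into an assignment.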

Greg Hewgill
  • 10,181
  • 1
  • 46
  • 45
  • +1 I like ML style <- for assignment. – Nemanja Trifunovic Mar 06 '11 at 01:49
  • Or Fortran .eq. .lt. .gt. etc – Martin Beckett Mar 06 '11 at 01:52
  • 3
    @Nemanja Trifunovic: I originally thought of that suggestion, but it has an unfortunate ambiguity in C with less-than comparison against a negative number (i.e. `x<-5`). C programmers wouldn't tolerate that kind of required whitespace :) – Greg Hewgill Mar 06 '11 at 02:35
  • In imperative languages, assignment is more common than equality testing, so it's convenient to have an assignment operator you can type with one keystroke. It just doesn't mix well with allowing non-booleans in `if` statements. – dan04 Mar 06 '11 at 07:20
  • @Martin: Fortran 90 does support the `<` and `>` operators. – dan04 Mar 06 '11 at 07:33
  • 7
    @Greg: I would prefer `:=` and `==` because it would be too easy to forget the `:` and not be notified, as is already the case (though reversed) when you forget a `=` today. I am thankful for compiler warnings on this... – Matthieu M. Mar 06 '11 at 13:31
  • 13
    On almost all keyboards I've ever used, ":=" requires changing the shift key while typing it. On the one I'm using now, ':' is uppercase and '=' is lowercase, and I've had that reversed. I type a lot of assignments, and don't need that sort of typing hassle in them. – David Thornley Mar 07 '11 at 19:00
  • 14
    @David Thornley: Code is read *many* more times than it is written. I don't buy any arguments about "typing hassle". – Greg Hewgill Mar 07 '11 at 19:06
  • 8
    @Greg Hewgill: Sure it's read more often than written. However, the problem between `=` and `==` isn't in reading, because they're distinct symbols. It's in writing and making sure you got the right one. – David Thornley Mar 07 '11 at 19:45
  • @Greg Hewgill: Would it be fair to say you don't like C-style languages? – Orbling Mar 08 '11 at 00:27
  • 1
    @Orbling: No, but I believe there are some aspects of C-style languages that could have been designed better. – Greg Hewgill Mar 08 '11 at 04:26
  • 3
    You'd be surprised how this little difference makes it *incredibly* easier to teach programming to beginners. `=` already has a very specific meaning in mathematics, so I think it's important that we as a newer discipline respect that and find our own symbol to avoid confusion. – Rei Miyasaka Mar 18 '11 at 19:43
  • I like Prolog's `=`, meaning unification - an interesting amalgamate between assignment and comparison. – configurator Oct 07 '11 at 16:26
  • 2
    It would only take a few times typing `;=` or `:+` for me to get sick of `:=` as an assignment operator. – Kyralessa Oct 07 '11 at 17:51
  • Even Dennis agrees with you http://cm.bell-labs.com/cm/cs/who/dmr/chist.html – Martin Beckett Jul 15 '12 at 05:20
29

The choice of + in Javascript for both addition and string concatenation was a terrible mistake. Since the language is dynamically typed and coerces operands freely, this leads to byzantine rules that determine whether + will add or concatenate, depending on the exact content of each operand.

It would have been easy in the beginning to introduce a completely new operator such as $ for string concatenation.
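
A few examples of those rules in action (all standard JavaScript behaviour):

1 + 2         // 3
"1" + 2       // "12" (one string operand forces concatenation)
1 + 2 + "3"   // "33" (left to right: 3, then "33")
"10" - 1      // 9   (- has no string meaning, so both operands become numbers)
[] + {}       // "[object Object]"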

Greg Hewgill
  • 10,181
  • 1
  • 46
  • 45
  • I hated this back when I used ActionScript (ECMAScript based) – Anto Mar 05 '11 at 21:38
  • 1
    Java suffers from this, as well. – Barry Brown Mar 06 '11 at 05:04
  • 13
    @Barry: Not really. `+` makes a lot of sense as a string concatenation operator in a strongly typed language. The problem is that Javascript uses it but is not strongly typed. – Mason Wheeler Mar 06 '11 at 06:02
  • The problem is that there are some types where both addition and concatenation make sense, but you only have the one `+` operator. NumPy arrays, for example, suffer from this. – dan04 Mar 06 '11 at 07:40
  • 13
    @Mason: `+` does not make sense for concatenation, because concatenation is definitely not commutative. It's an abuse as far as I am concerned. – Matthieu M. Mar 06 '11 at 13:36
  • 12
    @Matthieu: Umm... why does that matter? Concatenation isn't commutative (like addition is), but the addition of two strings is nonsensical so no one thinks of it that way. You're inventing a problem where none exists. – Mason Wheeler Mar 06 '11 at 13:49
  • 4
    just switch to C++ and add any kind to esoteric overloading to the operator of your choice – Newtopian Mar 07 '11 at 05:50
  • 1
    @Mason Wheeler: I find at the higher levels of programming, programmers fall in to two camps, "strongly typed (anal)" and "who cares". – Orbling Mar 08 '11 at 00:31
  • 8
    @Matthieu: Commutativity isn't the issue here. If JS had a Matrix class, would you consider it an abuse for it to have an `*` operator? – dan04 Mar 08 '11 at 00:56
  • @dan04: I've always been bugged by the non-commutativity of the `*` on matrices :p @Orbling: I probably fall into the "(anal)" category, since Haskell is the language I prefer with regard to typing. – Matthieu M. Mar 08 '11 at 07:56
  • 3
    @Matthieu M. I like Haskell for a lot of things, but the general inability to overload operators (but ability to define new ones) leads to the opposite problem: too many byzantine operators. I'm particularly fond of things like `|>>` and `|&|` and `***`. – CodexArcanum Mar 09 '11 at 21:52
  • 1
    @CodexArcanum: Ah yes, I can only agree about the bizarre operators. I much prefer *named* methods wherever possible, and I look at operators only when thinking about DSEL, which is not that common. – Matthieu M. Mar 10 '11 at 07:11
  • 2
    Use PHP for a while where '.' is the concat operator. I have a feeling you'll grow to dislike it even more. Personally, + makes a lot more sense to me intrinsically. – Evan Plaice Mar 18 '11 at 17:08
  • @Mason Wheeler: if + on strings were commutative, no one would have to complain about it in the first place. I think Matthieu M. has a good point. – Codism Oct 07 '11 at 17:05
24

I find Javascript's default to global scope a major issue, and often a source of bugs if you don't use JSLint or the like.
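
A minimal sketch of the trap (the tally function is hypothetical):

function tally(items) {
  // i was meant to be local; without var it silently
  // becomes a property of the global object.
  for (i = 0; i < items.length; i++) {
    // ...
  }
}

tally([1, 2, 3]);
console.log(i); // 3 -- the counter leaked into global scope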

Zachary K
  • 10,433
  • 2
  • 37
  • 55
  • 2
    Default to global? I'm glad I don't use JS then... – Anto Mar 05 '11 at 16:31
  • 5
    Actually it is a very nice language, with a few bad design choices in it. This is the major one: if you don't scope a variable, it will be scoped as a global. The good thing is that by using Doug Crockford's JSLint program you can catch this error and a bunch more. – Zachary K Mar 05 '11 at 16:36
  • 1
    Just use `var` whenever you declare a variable and you are good to go. And don't tell me it's too much typing because Java forces you to declare all the types twice and nobody complains about it being a shitty design choice. –  Mar 06 '11 at 06:10
  • 3
    @davidk01: People do complain about that. Similarly, people complained about having to declare `std::map::const_iterator` variables in C++ enough that `auto` is being added to that language. – dan04 Mar 06 '11 at 07:03
  • @davidk01, Oh I do, and I use JSLint to catch me when I miss it (it's been known to happen) – Zachary K Mar 06 '11 at 07:53
  • But JSLint will hurt your feelings! – Mateen Ulhaq Mar 12 '11 at 22:00
  • 1
    the `"use strict"` directive, added in es5, changes undeclared references into an error. – Sean McMillan Oct 07 '11 at 15:14
  • Or better yet use CoffeeScript! It solves the problem – Zachary K Oct 21 '11 at 06:58
22

The preprocessor in C and C++ is a massive kludge, creates abstractions that leak like sieves, encourages spaghetti code via rat's nests of #ifdef statements, and requires horribly unreadable ALL_CAPS names to work around its limitations. The root of these problems is that it operates at the textual level rather than the syntactic or semantic level. It should have been replaced with real language features for its various use cases. Here are some examples, though admittedly some of these are solved in C++, C99 or unofficial but de facto standard extensions:

  • #include should have been replaced with a real module system.

  • Inline functions and templates/generics could replace most of the function-like macro use cases.

  • Some kind of manifest/compile time constant feature could be used for declaring such constants. D's extensions of enum work great here.

  • Real syntax tree level macros could solve a lot of miscellaneous use cases.

  • String mixins could be used for the code injection use case.

  • static if or version statements could be used for conditional compilation.

dsimcha
  • 17,224
  • 9
  • 64
  • 81
  • 2
    @dsimcha: I agree with the `#include` issue, but the module system was invented... afterward! And C and C++ aim for a maximum of backward compatibility :/ – Matthieu M. Mar 06 '11 at 13:34
  • @Matthieu: Yes, module systems were invented afterwards, but we're talking about hindsight here anyhow. – dsimcha Mar 06 '11 at 15:13
  • 3
    I agree that this is a massive pain in various body parts, but the multi-pass design has the advantage of simplicity: a C compiler doesn't need to know much about the context of operation before it can successfully compile a chunk of C code. This can only be qualified as a design error if you can show that the costs of using a hypothetical module system within the C language itself (e.g. C++-like classes) is always lower than or comparable to the present cpp-based `#include` hacking. – reinierpost Mar 08 '11 at 09:00
  • I agree. However, some people think that preprocessing is a good thing, and complain that it is not supported in (for example) Java. – Stephen C Mar 10 '11 at 01:49
  • 5
    @Stephen: I agree that Java with a preprocessor might be better than Java without, **but only because** Java doesn't have several of the "real" features necessary to replace the preprocessor. In languages like D, which include such features and Python, which gets flexibility in other ways by being dynamic, I don't miss it one bit. – dsimcha Mar 10 '11 at 03:55
  • @reinierpost: A C compiler might not need much context, but a C++ compiler does. – dan04 Mar 11 '11 at 13:18
  • @dan04: That's because C++ started out as another preprocessor for C. It's part of the design philosophy: don't rock the boat. If you don't want that approach, there's Java. – reinierpost Mar 14 '11 at 08:54
  • @dsimcha I definitely miss the conditional compilation option in python. Mostly because you can't support python 2x and 3k source in the same file. The only converse option is to run 2to3 and support two different version branches of the source. Which most library maintainers aren't willing to do. That's one of the main reasons why roughly 300 of 10k+ of the projects in PYPI are available to use in 3k. – Evan Plaice Mar 18 '11 at 17:13
  • @StephenC: Java would do well to follow C# in adding conditional compilation, either using CPP's syntax like C# or new keyword with block like D. – Jan Hudec Oct 07 '11 at 16:06
21

One could list hundreds of mistakes in hundreds of languages, but IMO it is not a useful exercise from a language design perspective.

Why?

Because something that would be a mistake in one language would not be a mistake in another language. For instance:

  • Making C a managed (i.e. garbage collected) language or tying down the primitive types would limit its usefulness as a semi-portable low-level language.
  • Adding C-style memory management to Java (for example, to address performance concerns) would break it.

There are lessons to be learned, but the lessons are rarely clear cut, and to understand them you have to understand the technical trade-offs ... and the historical context. (For instance, the cumbersome Java implementation of generics is a consequence of an overriding business requirement to maintain backwards compatibility.)

IMO, if you are serious about designing a new language, you need to actually use a wide range of existing languages (and study historical languages) ... and make up your own mind what the mistakes are. And you need to bear in mind that each of these languages was designed in a particular historical context, to fill a particular need.

If there are general lessons to be learned they are at the "meta" level:

  • You cannot design a programming language that is ideal for all purposes.
  • You cannot avoid making mistakes ... especially when viewed in hindsight.
  • Many mistakes are painful to correct ... for users of your language.
  • You have to take account of the background and skills of your target audience; i.e. existing programmers.
  • You cannot please everyone.
Stephen C
  • 25,180
  • 6
  • 64
  • 87
  • 9
    -1 -- programming languages follow design goals. A feature of a language that works against these goals is a design flaw, unless it is a necessary compromise for one of its other goals. Not many languages are made with the intention of satisfying everyone, but all languages should attempt to satisfy the people they set out to satisfy in the first place. This kind of postmodernistic political correctness is quite stifling to programming language research and development. – Rei Miyasaka Mar 09 '11 at 22:08
  • My point is that it is difficult to learn lessons without taking into account the design goals. – Stephen C Mar 10 '11 at 01:47
  • 1
    seems to me that the OP already accounted for your answer in the second paragraph of the question. – Aidan Cully Mar 10 '11 at 09:14
20

C and C++: All those integer types that don't mean anything.

Especially char. Is it text or is it a tiny integer? If it's text, is it an "ANSI" character or a UTF-8 code unit? If it's an integer, is it signed or unsigned?

int was intended to be the "native"-sized integer, but on 64-bit systems, it isn't.

long may or may not be larger than int. It may or may not be the size of a pointer. It's pretty much an arbitrary decision on the part of the compiler writers whether it's 32-bit or 64-bit.

Definitely a language of the 1970s. Before Unicode. Before 64-bit computers.

dan04
  • 3,748
  • 1
  • 24
  • 26
  • 10
    C was not designed to be a one-size-fits-all language, hence this cannot be a design error. It was designed to model a CPU, as a portable assembler avoiding CPU-specific assembler code. However, it was fixed in Java. –  Mar 05 '11 at 17:10
  • 7
    I think the bigger problem is newer languages that continue to use the same meaningless terms, despite history showing us it's a terrible idea. At least the C guys noticed their mistake and created standard int types. – Mark H Mar 05 '11 at 17:48
  • 5
    A char is not a utf-8 unit. A utf-8 character can take more than 8 bits to store. C is not a language of the 1970s, I'm using it for a project now (voluntarily). – dan_waterworth Mar 05 '11 at 18:45
  • 4
    C is little more than a high-level abstraction of the PDP-11 processor. For example, pre and post incrementation were directly supported by the PDP-11. – bit-twiddler Mar 05 '11 at 22:32
  • 5
    This is a terribly misguided answer. First off, C and C++ are not interchangeable. Second, the language spec clearly defines what a char is - *An object declared as type char is large enough to store any member of the basic execution character set.*. Third, C is not a "language of the 70's", it is a language that lives close to the hardware and is probably the language that ultimately allows all of your high level abstractions to actually make sense to a CPU. You come off as a person who knows only high level languages and has no appreciation for how things actually work. -1 – Ed S. Mar 05 '11 at 23:18
  • First off, I'm well aware that C and C++ are different languages. But this is something that **C++ didn't fix**. And it's actually a **worse** problem in C++ because of overloading. – dan04 Mar 06 '11 at 06:11
  • 1
    Secondly, C's definition of `char` is *not* a clear definition. Java and C#'s "`char` is a UTF-16 code unit" is a clear definition. C's definition is very much a 1970s "every language/OS combination has its own character encoding" model. When Linux has UTF-8 `char` and Windows has CP125x `char` while recommending the use of UTF-16 `wchar_t` instead, it's very painful to write cross-platform code that correctly deals with strings. – dan04 Mar 06 '11 at 06:26
  • Third, the C(++) integer model doesn't match the hardware either. To the hardware, a 32-bit integer is a 32-bit integer; having an `int`/`long` distinction where both types are the same size is an artificial distinction made by C. – dan04 Mar 06 '11 at 06:54
  • Programming with numbers is extremely reliable and fast. On top of that, letting the programmer do integer operations with characters happens to be very useful. Your answer is wrong. – jokoon Mar 06 '11 at 11:43
  • @jokoon: I'm not saying to disallow integer operations with characters. I'm saying to make `char` and `byte` separate types. A character is not a byte. – dan04 Mar 06 '11 at 18:23
  • "byte" as a type is useless on 32 bits processors, there is no use to it besides characters. char is 8 bit because it saves memory. – jokoon Mar 06 '11 at 19:53
  • No, jokoon, char is 8 bits because at the time C was developed, most languages used 8-bit character encodings. – dan04 Mar 07 '11 at 01:12
  • 2
    But, in C a char is not 8 bits, it's CHAR_BIT bits. This is, admittedly, 8 on all platforms I can lay my hands on, but the fact that there is a per-compiler/host macro to tell you how many bits a char has should be an indication that this is not written in stone. – Vatine Mar 07 '11 at 13:59
  • 1
    Well, there are 36-bit systems with 9-bit bytes, but these days they're used more for nitpicking like this than they are for actual programming. – dan04 Mar 07 '11 at 14:36
  • I wonder if C++0x will have this sometimes annoying "problem"? – Mateen Ulhaq Mar 12 '11 at 21:55
18

null.

Its inventor, Tony Hoare, calls it the "billion dollar mistake".

It was introduced in ALGOL in the 60s, and exists in most of the commonly used programming languages today.

The better alternative, used in languages like OCaml and Haskell, is the Maybe (or option) type. The general idea is that object references cannot be null/empty/non-existent unless there's an explicit indication that they may be so.

(Although Tony's awesome in his modesty, I think almost anyone would have made the same mistake, and he just happened to be first.)
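
A minimal JavaScript illustration of why null is worse than a merely "empty" value (the shout function is hypothetical):

function shout(s) {
  return s.toUpperCase() + "!";
}

shout("hi"); // "HI!"
shout("");   // "!" -- the empty string is still a valid string
shout(null); // throws TypeError at runtime: nothing in the
             // function's signature warned the caller

In a language with a Maybe type, shout's signature would state whether a missing value is allowed, and the compiler would force callers to handle that case.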

rahmu
  • 1,026
  • 6
  • 16
Rei Miyasaka
  • 4,541
  • 1
  • 32
  • 36
  • Disagree, even with its inventor!!! null is the empty value for the pointer/reference data type. Strings have the empty string, sets have the empty set (Pascal empty set = []), integers have 0. In most programming languages that use null / nil / whatever, if a variable is assigned properly a null error can be prevented. – umlcat Mar 10 '11 at 18:52
  • 4
    @user14579: Every language that supports any kind of set or string or array has {}, but such is still semantically appropriate and will not crash unless you already have something that could possibly cause an array bounds error -- but that's another issue. You can process an empty string to uppercase all the characters, which will result in an empty string. You try the same thing on a null string, and without proper consideration, it'll crash. The problem is that this proper consideration is tedious, often forgotten, and makes it difficult to write single-expression functions (i.e. lambdas). – Rei Miyasaka Mar 10 '11 at 20:15
  • 1
    I make money every time I type null...oh right someone loses money every time I type null. Except this time. – kevpie Mar 12 '11 at 12:59
  • @kevpie Tony saw you write that, and he had a heart attack resulting in a really small but expensive medical bill. (The font size is 1/1000 inch.) – Mateen Ulhaq Mar 12 '11 at 22:15
  • 3
    @umlcat - when you have languages with pattern matching like OCaml, Haskell, and F#, using the Maybe x | None pattern prevents you from forgetting the null case at compile-time. No amount of compile-time trickery can catch an error in languages where null is the established idiom. Since you have to explicitly choose not to deal with the null case in languages that have the Maybe and Some monad, they have a serious advantage over the "null" approach. – JasonTrue Mar 12 '11 at 22:27
  • @muntoo a null a day keeps the doctor away. I thought I may find something nifty to say with regard to nulls and the spaceship operator <=>. I gave up. – kevpie Mar 12 '11 at 23:08
  • 1
    @Jason -- I like to think of `maybe` as a null opt-in, whereas the null exception is an opt-out. Of course there's some things to be said about the difference between runtime errors and compile-time errors too, but just the fact that null is essentially injecting behavior is noteworthy in itself. – Rei Miyasaka Mar 13 '11 at 00:11
14

I get the feeling that the people who designed PHP didn't use a normal keyboard; they don't even use a Colemak keyboard, because they should have realized what they were doing.

I am a PHP developer. PHP isn't fun to type.

Who::in::their::right::mind::would::do::this()? The :: operator requires holding shift and then two key presses. What a waste of energy.

Although->this->is->not->much->better. That also requires three key presses with the shift being in between the two symbols.

$last = $we.$have.$the.$dumb.'$'.$character. The dollar sign is used a tremendous number of times and requires the awkward stretch up to the very top of the keyboard plus a shift key press.

Why couldn't they design PHP to use keys that are much faster to type? Why couldn't we.do.this() or have vars start with a key that only requires a single keypress - or none at all (JavaScript) and just pre-define all vars (like I have to do for E_STRICT anyway)!

I'm no slow typist - but this is just a lame design choice.

Mateen Ulhaq
  • 968
  • 3
  • 11
  • 21
Xeoncross
  • 1,213
  • 1
  • 11
  • 24
13

The use of desktop-inspired forms within asp.net.

It always felt like a fudge and got in the way of how the web actually works. Thankfully asp.net-mvc does not suffer in the same way, though credit goes to Ruby etc. for that inspiration.

dove
  • 121
  • 4
  • 18
    Isn't that a library thing? – nikie Mar 05 '11 at 16:16
  • @nikie that's a good point;) – dove Mar 05 '11 at 16:34
  • 1
    @nikie Actually, the XML-based ASPX code is a language, so you can twist this one to work. :D – CodexArcanum Mar 09 '11 at 21:58
  • 2
    The real problem with ASP.NET, I think, is how hard it tries to hide the details of the web from the programmer. There's actually some really neat, useful stuff going on in ASP.NET, but you have to fight so hard and dig so deep to get at it. – CodexArcanum Mar 09 '11 at 21:59
  • 1
    On the other hand there are thousands and thousands of simple and successful data collection apps out there that were put together using the "classic" desktop app thing. The only bad thing was that until MVC the only Microsoft option was the windows forms. – ElGringoGrande Mar 12 '11 at 02:27
  • @Codex That's still not a problem with XML nor with the domain-specific language made atop it. – Rei Miyasaka Apr 14 '11 at 10:07
13

For me it is PHP's absolute lack of naming and argument ordering conventions in its standard library.

JASS's requirement to nullify references after the referenced object was released/removed (otherwise the reference would leak and several bytes of memory would be lost) is more serious, but since JASS is a single-purpose language, it is not that critical.

Matěj Zábský
  • 1,529
  • 12
  • 18
  • 9
    The lack of conventions in PHP's stdlib is arguably not a *language design* flaw. –  Mar 05 '11 at 16:15
  • 3
    @delnan: The lack of conventions is a result of how PHP was designed, and therefore has a lot to do with the language design. Also it is not clear to me that there is a clear distinction between libraries and language. Lisp in particular has a proud tradition of bootstrapping one language on top of another. – btilly Mar 08 '11 at 17:35
  • 1
    The truly remarkable thing about JASS was that it had reference counting on handles, but wouldn't clean them up unless they were manually destroyed (and the graphic interface created functions that leaked memory everywhere)! – Craig Gidney Oct 07 '11 at 17:39
13

The greatest design flaw that I face is that Python was not designed like Python 3.x to begin with.

dan_waterworth
  • 7,287
  • 2
  • 34
  • 45
  • 1
    Well, not even Guido can get *everything* right at once... –  Mar 05 '11 at 21:12
  • 5
    @delnan, oh I know, and Python < 3 is still a staggeringly good language, but it is a little annoying to have a better language in the form of Python 3.x that I can't use because it breaks all of the modules that I need. – dan_waterworth Mar 06 '11 at 06:19
  • Continue to lobby for your python 3.x modules! Meanwhile I will keep writing in 2.5.4. Thanks to SO I actually am reminded that 3.x is alive and well. – kevpie Mar 12 '11 at 12:55
  • @kevpie First, lobby to add conditional compilation to python to make the transition easier for library maintainers. 2to3 is not a maintainable solution in the long term. – Evan Plaice Mar 18 '11 at 17:16
12

Array decay in C and consequently C++.

Nemanja Trifunovic
  • 6,815
  • 1
  • 26
  • 34
  • I wish for proper array support too. In C++ you can prevent decay using abstruse syntax and templates... but it's more of a hack :/ – Matthieu M. Mar 06 '11 at 13:38
  • 1
    Note that this is the reason that C++ has to have separate `delete` and `delete[]` operators. – dan04 Mar 08 '11 at 02:35
  • You can always put an array into a struct and have it passed around by value if you like, but this is usually more awkward than the original problem. In C++, you can usually avoid the need to use arrays. – David Thornley Mar 08 '11 at 15:40
  • 2
    At least in the C case, the argument against proper array support is "array bounds checking is expensive", particularly given the way that C pointer arithmetic works. – Stephen C Mar 10 '11 at 01:51
  • @Stephen C: What does array bounds checking have to do with array decay? – Nemanja Trifunovic Mar 10 '11 at 03:35
11

Primitive types in Java.

They break the principle that everything is a descendant of java.lang.Object, which from a theoretical point of view adds complexity to the language specification, and from a practical perspective makes the use of collections extremely tedious.

Autoboxing helped alleviate the practical drawbacks but at the cost of making the specification even more complicated and introducing a big fat banana skin: now you can get a null pointer exception from what looks like a simple arithmetic operation.

biziclop
  • 3,351
  • 21
  • 22
  • It would cause terrible performance issues if you removed primitive types. And autoboxing can be messed up pretty easily, so it improves nothing. – deadalnix Jun 05 '11 at 00:02
  • At the time, this was a sensible decision from the Java designers. The performance gains given the machines / VMs available in the 90s outweighed the conceptual advantages of unifying everything around java.lang.Object. – mikera Oct 08 '11 at 04:14
10

I know Perl best, so I'll pick on it.

Perl tried many ideas. Some were good. Some were bad. Some were original and not widely copied for good reason.

One is the idea of context - every function call takes place in list or scalar context, and can do entirely different things in each context. As I pointed out at http://use.perl.org/~btilly/journal/36756 this complicates every API, and frequently leads to subtle design issues in Perl code.

The next is the idea of tying syntax and data types so completely. This led to the invention of tie to allow objects to masquerade as other data types. (You can also achieve the same effect using overload, but tie is the more common approach in Perl.)

Another common mistake, made by many languages, is to start off by offering dynamic scoping rather than lexical. It is hard to revert this design decision later, and leads to long-lasting warts. The classic description of those warts in Perl is http://perl.plover.com/FAQs/Namespaces.html. Note that this was written before Perl added our variables and static variables.

People legitimately disagree on static versus dynamic typing. I personally like dynamic typing. However, it is important to have enough structure to let typos be caught. Perl 5 does a good job of this with strict. But Perl 1-4 got this wrong. Several other languages have lint checkers that do the same thing as strict. As long as you are good about enforcing lint checking, that is acceptable.

If you're looking for more bad ideas (lots of them), learn PHP and study its history. My favorite past mistake (long ago fixed because it led to so many security holes) was defaulting to allowing anyone to set any variable by passing in form parameters. But that is far from the only mistake.

btilly
  • 18,250
  • 1
  • 49
  • 75
  • 5
    Yea, Perl has a lot of mistakes, because the folks who built it were trying new ideas, and when you do that you often get them wrong. (Perl also has some very good stuff, and is the standard for Regexps that everyone else seems to have copied) – Zachary K Mar 05 '11 at 17:04
  • @zachary-k: Absolutely agreed. And I tried to make that clear before I began walking through issues. – btilly Mar 05 '11 at 17:39
  • 4
    Lisps were originally dynamically scoped, and over time got changed to being lexically scoped (at least in Scheme and Common Lisp). It's not impossible to change. – David Thornley Mar 07 '11 at 19:07
  • 4
    @david-thornley: It is impossible unless you sacrifice backwards compatibility somewhere. Scheme was always lexically scoped. Common Lisp was lexically scoped from the time it was standardized, but various Lisp communities had their struggles adopting it. And Emacs Lisp is still using dynamic scoping, even though there has been a desire to change it for a long time. – btilly Mar 07 '11 at 19:41
  • 1
    BTW, many of the things people dislike Perl for weren't invented in Perl but taken from other languages, mostly the Bourne shell. – reinierpost Mar 08 '11 at 09:05
  • @reinierpost - Absolutely true. – btilly Mar 08 '11 at 17:31
  • Context-sensitive variables take a while to learn and are nearly unintelligible to beginners. But once you get your head around it: totally awesome. – Barry Brown Oct 07 '11 at 19:28
  • @BarryBrown I agree that it is convenient once you understand it. However given how frequently it causes problems in APIs, and how many don't take advantage of it, I think it is a misfeature. – btilly Dec 07 '11 at 20:15
10

JavaScript's ambiguity between code blocks and object literals.

  {a:b}

could be a code block, where a is a label and b is an expression; or it could define an object with an attribute a whose value is b.
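
One way to see both parses, since eval treats its input as a statement where possible:

eval("{ a: 1 }")   // 1 -- a block: a is a label, 1 an expression
eval("({ a: 1 })") // an object -- the parentheses force an
                   // expression, so it's an object literal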

user281377
  • 28,352
  • 5
  • 75
  • 130
10

I'm going to go back to FORTRAN and whitespace insensitivity.

It pervaded the specification. The END card had to be defined as a card with an 'E', an 'N', and a 'D' in that order in columns 7-72, and no other nonblanks, rather than a card with "END" in the proper columns and nothing else.

It led to easy syntactic confusion. DO 100 I = 1, 10 was a loop control statement, while DO 100 I = 1. 10 was a statement that assigned the value 1.1 to a variable called DO100I. (The fact that variables could be created without declaration, their type depending on their first letter, contributed to this.) Unlike in other languages, there was no way to use spaces to separate out tokens to allow disambiguation.

It also allowed other people to write really confusing code. There are reasons why this feature of FORTRAN was never duplicated again.

David Thornley
  • 20,238
  • 2
  • 55
  • 82
  • 1
    In a language where you can redefine literals that's the worst example - I mean it only caused one tiny little spacecraft to crash – Martin Beckett Mar 08 '11 at 00:10
  • Early on, you could write DAMNATION in place of DIMENSION, and it would work. – Mike Dunlavey Mar 08 '11 at 14:32
  • It's still being taught, by people outside of CS. I still have to deal with those who a) fight against declarations, b) fight against whitespace, c) like "continuation lines", d) use 6-character names, or 4, e) are stumped when they see `(test ? a : b)`, f) insist on using `**`, g) can't handle case-sensitivity. Most of this was because of keypunches in the 50s. – Mike Dunlavey Mar 08 '11 at 14:38
  • @Mike Dunlavey: Really? The only intro I ever had to FORTRAN parsing, we separated out the IF and DO and assignment statements and recognized all others by their first three letters. Even that would have caught DAMNATION (although not DIMNESS). – David Thornley Mar 08 '11 at 16:19
  • @David: Well this was around '65, on an IBM 360. It might have been a legend carried forward from Fortran 2 (Load-and-go) on the 1620, where I first did real code in '62. (Not to date myself :) – Mike Dunlavey Mar 09 '11 at 00:46
  • @David Thornley: Did you really write 'END' ***card***? – oosterwal Mar 09 '11 at 22:38
  • 1
    @Martin Beckett - redefining literals in FORTRAN was really a compiler flaw rather than a language feature. It certainly wasn't an intentional language feature. – Stephen C Mar 10 '11 at 01:56
  • 1
    @oosterwal: I certainly did. I might be wrong, but I vaguely remember the language definition based on punch cards. They were the main way of inputting FORTRAN programs back then, and the idea of an 80-column line with columns 73-80 reserved is from punch cards. – David Thornley Mar 10 '11 at 14:30
  • @David Thornley: I do remember those days. I got into programming at the tail end of the punch card era (early '80s) and never actually had to use them, but clearly remember the window at the university computing center where programmers could drop off their decks for batch jobs. During my first or second year, I had one class where we had to turn in our ***handprinted*** programs on special green and white FORTRAN program forms that had boxes for each character on each line and special margin breaks on either side of column 6 and after column 72. – oosterwal Mar 10 '11 at 16:07
  • In the early days of BASIC (a direct descendant of FORTRAN), there was a version used on XEROX/Honeywell mainframes where whitespace insensitivity would allow ambiguous statements like `150 FOR T = A TO M`, which was interpreted by the computer as `150 FORT = ATOM`. – oosterwal Mar 10 '11 at 16:19
9

One of the biggest issues with BASIC was the lack of any well-defined method to extend the language beyond its early environments, leading to a bunch of completely incompatible implementations (and a nearly irrelevant post-facto attempt at any standardization).

Almost any language will get bent into general purpose use by some crazy programmer. It's better to plan for that general purpose usage at the beginning in case that crazy idea takes off.

hotpaw2
  • 7,938
  • 4
  • 21
  • 47
  • 2
    +1: Any language without proper modules and libraries is a mistake waiting to happen. COBOL suffered from this, also, leading to peculiar variants that aren't compatible. – S.Lott Mar 10 '11 at 16:07
8

I believe in DSLs (domain-specific languages) and one thing I value in a language is if it allows me to define a DSL on top of it.

In Lisp there are macros - most people consider this a good thing, as do I.

In C and C++ there are macros - people complain about them, but I was able to use them to define DSLs.

In Java, they were left out (and therefore in C# as well), and the lack of them was declared to be a virtue. Sure it lets you have intellisense, but to me that's just an hors d'oeuvre. To do my DSL, I have to expand by hand. It's a pain, and it makes me look like a bad programmer, even though it lets me do a heck of a lot more with tons less code.

Mike Dunlavey
  • 12,815
  • 2
  • 35
  • 58
  • 4
    I'd agree that any language without decent macros has one huge unfixable design flaw. But what do you mean by '*they were left out*'? C preprocessor was not any kind of a decent macro system. Java is not derived from any proper language with macros. – SK-logic Mar 06 '11 at 09:04
  • 1
    You can write your DSL in an external macro processing language (like m4, say, among a myriad of others). – JUST MY correct OPINION Mar 06 '11 at 10:42
  • @JUST: I've heard that, but not seen it, and in a group it's just not considered an option because the IDE is so entrenched, with its intellisense and everything. I'm enough of an outlier without introducing that :-) – Mike Dunlavey Mar 06 '11 at 15:45
  • 4
    @SK: I won't say the C preprocessor is a decent macro system compared to Lisp (for example). But, compared to *nothing*, it's extremely useful. – Mike Dunlavey Mar 06 '11 at 15:46
  • Intellisense, but any kind of automatic language analysis benefits from the omission of advanced syntactic features. E.g. typechecking or dead code detection. – reinierpost Mar 08 '11 at 09:07
  • @reinierpost, it is not a problem if your syntax extensions are really advanced. You can infer intellisense rules automatically, and code analysis works on an intermediate code level, where all the advanced features are already eliminated. – SK-logic Mar 08 '11 at 12:01
  • 4
    @reinierpost: I'm thinking of things I could do in Lisp, such as introduce control structures like differential execution and backtrack. These could be done with Lisp macros. In C/C++ I could do differential execution with C macros (and a little programmer discipline), but not backtrack. With C# I can do neither. What I get in exchange is things like intellisense. BFD. – Mike Dunlavey Mar 08 '11 at 12:57
  • @MikeDunlavey: You can't backtrack in C/C++ because you don't have `call/cc`, not because of the different macro system. – configurator Oct 07 '11 at 17:26
  • @configurator: You can backtrack in Common Lisp (in at least some senses of the word), which does not support continuations. – David Thornley Oct 07 '11 at 20:18
  • 1
    @David: The way I did it was I had a macro to wrap around ordinary code, such as a list of statements. It would take the `cdr` of the list and form a lambda closure out of it (i.e. a continuation) and pass it as an argument to the `car` of the list. That was done recursively, of course, and would "do the right thing" for conditionals, loops, and function calls. Then the "choice" function just turned into a normal loop. Not pretty, but it was robust. Problem is, it makes it super easy to make overly-nested loops. – Mike Dunlavey Oct 07 '11 at 21:16
7

Statements, in every language that has them. They do nothing that you can't do with expressions and prevent you from doing lots of things. The existence of a ?: ternary operator is just one example of having to try to get around them. In JavaScript, they are particularly annoying:

// With statements:
node.listen(function(arg) {
  var result;
  if (arg) {
    result = 'yes';
  } else {
    result = 'no';
  }
  return result;
})

// Without:
node.listen(function(arg) if (arg) 'yes' else 'no')
munificent
  • 2,029
  • 13
  • 11
  • I'm confused here: Do you just want a simpler way to do things? – TheLQ Mar 08 '11 at 01:38
  • He wants everything to be an expression – compman Mar 08 '11 at 02:58
  • 2
    Correct. Expressions for everything. – munificent Mar 08 '11 at 06:11
  • 1
    Lisp works well for this. – David Thornley Mar 08 '11 at 15:50
  • I suspect that the statements were blindly inherited from Fortran and further. – SK-logic Mar 08 '11 at 17:18
  • @David: Yup. Ruby too. – munificent Mar 08 '11 at 20:08
  • 1
    @SK-logic: I suspect that statements were blindly inherited from machine language, through FORTRAN, ALGOL, and COBOL. – David Thornley Mar 08 '11 at 21:15
  • 1
    I'm pretty sure machine language is the common ancestor, and that's just a reflection of the fact that modern computers based on von Neumann architecture execute instructions sequentially and modify state. Ultimately when IO happens, there are going to be expressions that don't yield meaningful data, so statements aren't entirely useless in indicating semantically that some code has only side effects. Even languages that have a notion of `unit` type (aka `()`) instead of statements have special consideration to ensure that they don't throw warnings or otherwise behave strangely. – Rei Miyasaka Mar 09 '11 at 21:03
  • I experimented with adding statement-expressions to C. It does not complicate the compiler at all. If you try to assign `void`, the type checker will complain - that's it. You don't have to add a first-class `unit` type. – SK-logic Mar 10 '11 at 10:36
  • Well, in that case `void` basically *is* `unit`, but you're right that it works without any problems. – munificent Mar 10 '11 at 17:01
  • They're nice when you've got to `#define` stuff in strong typed languages (C++). `#define min(a,b) (a < b ? a : b)` `if(min(4,7) < 5) goto hell;` – Mateen Ulhaq Mar 12 '11 at 22:28
  • @muntoo: your example is just a bad use of macro. – Codism Oct 07 '11 at 17:59
  • @Codism The `#define min(a,b) (a < b ? a : b)` is separate from the example `if(min(...`, just in case you didn't notice. C isn't dynamically typed, so the `min()` macro is 'necessary', if you don't want to make a `min_int()`, `min_unsigned_int()`, `min_float()`, `min_double()`, etc, which would all contain the same code anyways. – Mateen Ulhaq Oct 07 '11 at 23:21
6

For me, it is the design problem that plagues all of the languages that were derived from C; namely, the "dangling else." This grammatical problem should have been resolved in C++, but it was carried forth into Java and C#.
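
A short JavaScript sketch of the problem, with deliberately misleading indentation (the grammar is inherited from C):

var a = true, b = false;

if (a)
  if (b)
    console.log("a and b");
else
  console.log("not a"); // wrong: this runs when a && !b, because
                        // the else binds to the nearest if

Requiring braces, or an explicit end keyword as Wirth did in Modula, removes the ambiguity.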

bit-twiddler
  • 2,648
  • 1
  • 16
  • 17
  • 3
    One of the core goals of C++ was to be fully backward compatible with C. If they had drastically changed semantic behavior it may not have caught on like it did (or at least, that was the thought at the time) – Ed S. Mar 05 '11 at 23:31
  • 2
    @Ed S., however, elimination of the "dangling else" problem could have been accomplished by eliminating the (a.k.a. ) grammatical production and incorporating the curly braces into the conditional and iterative control structures like they did when they added the try/catch exception handling control structure. There is no excuse for not rectifying this grammatical ambiguity in Java and C#. Currently, the defensive workaround for this grammatical ambiguity is to make every statement that follows a conditional or iterative control statement a compound statement. – bit-twiddler Mar 06 '11 at 00:19
  • 1
    What do you mean by that? This isn’t a “grammatical problem” (it’s completely unambiguous). How would you “resolve” it? I actually find the rules in C satisfactory. Arguably, only a Python-like syntax (= meaningful indentation) can really solve this problem. Furthermore, I’m actually very happy that modern languages do *not* mandate braces. I agree that all C-like syntaxes suck but dangling-else is the least of their problems. – Konrad Rudolph Mar 06 '11 at 16:51
  • It is a well-documented shortcoming of C and Pascal. Wirth corrected the problem when he designed Modula. However, the grammatical ambiguity was carried forth in the C descendants. If one feeds the C grammar to an LALR(1) parser generator, one receives a shift-reduce conflict. This conflict arises because the parser cannot determine which "if" to match with an "else." The practice of matching the "else" to the innermost "if" is a deus ex machina technique that is brute-forced into the parser to resolve the conflict. This mod introduces a context-sensitive element into a context-free grammar. – bit-twiddler Mar 06 '11 at 19:16
  • 1
    Continuing: I think that Python's use of indentation as a means by which to delineate a statement list is weird beyond belief. This technique violates the "separation of concerns" principle by tightly coupling lexical scanning with syntax analysis. A context-free grammar should be able to be parsed without knowing anything about the layout of the source. – bit-twiddler Mar 06 '11 at 19:33
  • @bit-twiddler: C++ kept C's syntax intact, that being a main design goal. That means that the `if` statement syntax was kept as is. New features that were added, such as exception handling, could be done more consistently. – David Thornley Mar 07 '11 at 19:05
  • 3
    @bit-twiddler: No, it doesn't. The Python lexer just converts the whitespace to the appropriate INDENT and DEDENT tokens. Once that's done, Python has a pretty conventional grammar (http://docs.python.org/reference/grammar.html). – dan04 Mar 08 '11 at 00:53
  • @Dan: A grammatical production such as "suite: simple_stmt | NEWLINE INDENT stmt+ DEDENT" is about as close as it gets to injecting knowledge of the layout of the source into the parser. Granted, the parser does not have to keep track of indentation levels, but it is still a bit of a kludge to form grammatical productions around whitespace. – bit-twiddler Mar 08 '11 at 17:35
  • Dangling elses are more a fault of LALR parsers than of the language itself. There's no ambiguity in the language, and it's trivial to write a recursive descent parser that handles them correctly. Requiring curlies for all `if` statements would make chained `else if` statements hideous with no actual benefit to the user. – munificent Mar 18 '11 at 01:42
  • @munificent: The "dangling else" problem is not an LALR parser generated problem. It is a well-documented grammatical problem (see http://en.wikipedia.org/wiki/Dangling_else). One still has to introduce the deus ex machina context-senstive technique of matching an "else" with the nearest "if" in a leftmost derivation (LL) parser in order to resolve the problem (recursive-descent is an LL-based parsing technique). LL parsers are fun to implement, but they are far less powerful than LR parsers. LALR parsers belong to the LR parser family. – bit-twiddler Mar 18 '11 at 02:29
  • @bit-twiddler: You say LL parsers are just "fun" and LR parsers are more powerful, but the majority of the world's code is being parsed by "far less powerful" recursive descent parsers: GCC, LLVM, Microsoft's C# compiler, et al. – munificent Mar 18 '11 at 21:12
  • @munificent: That's not what I said. I said that LR parsers are more powerful than LL parsers, which is true. With that said, even LL parsers choke on the "dangling else" problem. The rule to match an "else" to the innermost "if" must be forced into the parser because it is not in the grammar. Without this brute-force addition, a parser would not know how to handle an "if" followed by an "if" followed by an "else", because that combination can be resolved two different ways. The grammar is ambiguous. – bit-twiddler Mar 19 '11 at 01:30
  • continued: The reason why people have moved to writing recursive-descent parsers for the C family is the need for more control over the parsing process than the major LL and LR parser-generating tools afford. This level of control is needed so that the compiler writer can brute-force deus ex machina techniques into the parsing process. These rules are not defined by the grammar (e.g., matching an "else" with the innermost "if"). They are in essence hacks that are required to resolve ambiguities in the grammar. – bit-twiddler Mar 19 '11 at 02:11
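
To make the ambiguity the thread is arguing about concrete, here is a minimal C++ sketch (f and g are stand-in functions):

void f();
void g();

void choose(bool a, bool b) {
    if (a)
        if (b)
            f();
    else        // the indentation suggests this matches "if (a)",
        g();    // but the innermost-if rule attaches it to "if (b)"
}

The grammar alone admits both attachments of the "else"; C-family compilers pick the inner one by fiat, which is exactly the extra-grammatical rule bit-twiddler describes.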
6

I think all the answers so far point to a single failing of many mainstream languages:

There is no way to change the core language without affecting backward compatibility.

If this is solved then pretty much all those other gripes can be solved.

EDIT.

This can be solved in libraries by having different namespaces, and you could conceive of doing something similar for most of the core of a language, though this might then mean you need to support multiple compilers/interpreters.

Ultimately I don't think I know how to solve it in a way that is totally satisfactory, but that doesn't mean a solution doesn't exist, or that more can't be done.

jk.
  • 10,216
  • 1
  • 33
  • 43
  • 1
    How would you go about "solving" this? – Bjarke Freund-Hansen Mar 09 '11 at 18:36
  • I'm not sure it can be totally solved - obviously keeping things out of the core language and in a standard library helps, so I think I'd try to push this as far as it could go – jk. Mar 09 '11 at 21:05
  • 1
    By backwards compatible, do you mean newer compilers should be able to compile old code? – Rei Miyasaka Mar 10 '11 at 01:08
  • I mean newer compilers shouldn't change the meaning of old code; failing to compile would be a subset of that – jk. Mar 10 '11 at 09:04
  • In most cases, when a particular feature doesn't exist or needs to change, people make a new language based on the previous one: C with Classes => C++ – umlcat Mar 10 '11 at 18:54
  • This is solved, albeit differently than you suggest, in [Newspeak](http://NewspeakLanguage.Org/). (Where "is solved" means "is one of the overarching design goals, and there is a strategy for how to solve it, it just(!) needs to be implemented.") – Jörg W Mittag Mar 11 '11 at 02:36
  • @jk.: I'm not sure how that would work. You could wind up with a program written in what is effectively several different languages, each with its quirks. – David Thornley Mar 11 '11 at 20:59
  • yep, hence no satisfactory solution. I'm really just identifying the problem. – jk. Mar 12 '11 at 09:03
  • @Jörg what is the strategy that Newspeak has? – jk. Mar 12 '11 at 09:07
  • @jk: First off: I cannot do it justice in just one comment, please read [Gilad Bracha's blog](http://GBracha.BlogSpot.Com/search?q=Objects+as+Software+Services), watch [the video](http://YouTube.Com/?v=_cBGtvjaLM0), read the papers [(1)](http://Bracha.Org/oopsla05-dls-talk.pdf), [(2)](http://Bracha.Org/objectsAsSoftwareServices.pdf). In short: cloud computing, but not the way you think about it. Most of us already have all our data in the cloud, but not our code. But, in OOP, code and data are inseparable (and code is just data, anyway), so why not put our code in the cloud as well? – Jörg W Mittag Mar 12 '11 at 11:20
  • ... We have no idea what the representation of our Google Calendar is, and we don't care. Why would we care about the representation of our objects? (In fact, *not knowing* the representation of objects is the *very definition* of OOP, something which Java, C#, C++ and friends get *horribly* wrong.) Now, if we don't care what our objects look like, that means that their representation can change without breaking anything, and if they are all in the cloud, this means that they are all in *one place*, and this means that refactorings can be applied across *all code ever written*. – Jörg W Mittag Mar 12 '11 at 11:23
  • ... and *that* in turn means that language features can be changed in backwards-incompatible ways, as long as the translation can be automated. That's *why* the language is called *Newspeak*, because just like the Newspeak language in George Orwell's *1984*, it *shrinks*, unlike any other programming language, which can only ever grow. [But, like I said: please go directly to the source, my explanation doesn't do it justice, mainly because I don't 100% understand it myself yet.] – Jörg W Mittag Mar 12 '11 at 11:26
4

Java's silent integer arithmetic overflow: Integer.MAX_VALUE + 1 quietly wraps around to Integer.MIN_VALUE, with no exception or other diagnostic.

fretje
  • 310
  • 1
  • 10
Mauricio
  • 141
  • 4
4

Both Java and C# have annoying problems with their type systems due to the desire to maintain backwards compatibility while adding generics. Java doesn't like mixing generics and arrays; C# won't allow some useful signatures because you can't use value types as bounds.

As an example of the latter, consider that

public static T Parse<T>(Type<T> type, string str) where T : Enum

alongside or replacing

public static object Parse(Type type, string str)

in the Enum class would allow

MyEnum e = Enum.Parse(typeof(MyEnum), str);

rather than the tautological

MyEnum e = (MyEnum)Enum.Parse(typeof(MyEnum), str);

tl;dr: think about parametric polymorphism when you start designing your type system, not after you publish version 1.

Peter Taylor
  • 4,012
  • 1
  • 24
  • 29
  • 2
    The inability to restrict types to enum is annoying in C#, but you can kind of work around it like this `MyMethod(T value) where T : struct, IComparable, IFormattable, IConvertible` But you still have to test for an enum and it's a hack. I think the bigger lack in the C# generics is no support for higher kinds, which would really open up the language to some cool concepts. – CodexArcanum Mar 09 '11 at 22:10
4

The worst sin of a programming language is not being well defined. A case I remember is C++, which, in its origins:

  1. Was so ill defined that you could not get a program to compile and run by following books or examples.
  2. Once you tweaked the program to compile and run under one compiler and OS, you'd have to start over if you switched compilers or platforms.

As I remember, it took about a decade to get C++ defined well enough to make it as professionally dependable as C. It's something that should never happen again.

Something else I consider a sin (should it go in a different answer?) is having more than one "best" way to do some common task. This is the case with (again) C++, Perl, and Ruby.

Apalala
  • 2,283
  • 13
  • 19
  • 1
    I don't see how to avoid ill-definedness in an evolving language. Or, for that matter, in a predesigned language where the original designer missed some important points (like Pascal). – David Thornley Mar 11 '11 at 21:02
  • @David Thornley Well-definedness is well defined. Bugs notwithstanding, most programming language designers get it right from the start. Tools can check that a grammar is unambiguous when it's complete (C++ requires at least three grammars), and the semantics for standard implementations should be specified. – Apalala Mar 13 '11 at 01:04
  • I agree that well-defined languages are possible, but that's not going to happen in an evolving language like pre-Standard C++. The alternative is that each language is thoroughly designed before release, and that's not necessarily the way to get the best languages. I'd say that most programming language designers get things wrong at the start, since language design is extremely complicated. – David Thornley Mar 13 '11 at 23:19
  • I guess I'm having trouble understanding what you mean by "well defined". Is your complaint that different C++ compilers didn't actually compile the same language? – Sean McMillan Oct 07 '11 at 15:24
4

ALTER

When I learned COBOL, the ALTER statement was still a part of the standard. In a nutshell, this statement let you change at runtime where a GO TO in some other paragraph jumped to, silently redirecting the flow of the program.

The danger was that you could put this statement in some obscure section of code that was rarely accessed and it had the potential to completely change the flow of the rest of your program. With multiple ALTER statements you could make it nearly impossible to know what your program was doing at any point in time.
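
For readers who never met ALTER, here is a loose C++ analogy (hypothetical names; COBOL's actual syntax was ALTER ... TO PROCEED TO ...): a jump dispatched through a mutable pointer that some distant, rarely executed line can silently rewire.

#include <cstdio>

static void normalPath(void)  { std::puts("normal processing"); }
static void specialPath(void) { std::puts("special processing"); }

// Everyone "calls" this procedure through a mutable pointer.
static void (*process)(void) = &normalPath;

int main() {
    // Somewhere obscure and rarely reached, the flow is rewired,
    // roughly what ALTER did to the target of a GO TO:
    process = &specialPath;

    // Every later invocation now silently means something else.
    process();
    return 0;
}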

My university instructor, very emphatically, stated that if he ever saw that statement in any of our programs he would automatically flunk us.

oosterwal
  • 1,713
  • 14
  • 16
  • It does have good use cases though - stubbing or memoization. Instead of writing `v() { if (not alreadyCalculatedResult) { result = long(operation); alreadyCalculatedResult = true; } result; }` you say `v() { result = long(operation); v = () => result; result; }` – configurator Oct 07 '11 at 17:36
3

Classes in C++ are some kind of forced design pattern in the language.

There is practically no difference at runtime between a struct and a class, and it is so hard to see what the real, true programming advantage of "information hiding" is that I want to list it here.

I'm going to be downvoted for that, but anyway: C++ compilers are so hard to write that this language feels like a monster.

jokoon
  • 2,262
  • 3
  • 19
  • 27
  • 2
    Information hiding is important because it lets you hide implementation-specific details, which are likely to change, from the accessible parts of the API (the "UI" of the API), thus making changes to the program easier and less painful (see the sketch after these comments). – Anto Mar 06 '11 at 20:57
  • 1
    The UI of the API... no, seriously, I don't buy it. – jokoon Mar 06 '11 at 21:36
  • 3
    This difference is not the most revolting part of C++, not even close. The only difference is a default access modifier (public for structs, private for classes). C++ is a horrible, monstrous language, but certainly not in this part. – SK-logic Mar 07 '11 at 12:45
  • sk-logic: well, I could say the horrors start there. – jokoon Mar 07 '11 at 13:46
  • 2
    Information hiding is good; you can find discussions of that all over. The only prominent software book I can think of that was against it was Brooks' "The Mythical Man-Month", and he later considered it the biggest mistake in the book. If you don't understand the advantages, you really aren't qualified to make the judgment you're making. – David Thornley Mar 07 '11 at 19:10
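
A tiny sketch of the advantage described in these comments (Temperature is a made-up class): callers can reach only the public surface, so the private representation may change without breaking them.

#include <cstdio>

class Temperature {
public:
    void   setCelsius(double c) { kelvin = c + 273.15; }
    double celsius() const      { return kelvin - 273.15; }
private:
    // Today the representation is kelvin; tomorrow it could be
    // celsius or fixed-point. No caller can tell, so none can break.
    double kelvin;
};

int main() {
    Temperature t;
    t.setCelsius(20.0);
    std::printf("%.1f\n", t.celsius());   // prints 20.0
    return 0;
}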
3

I feel like I'm opening myself up to get flamed, but I really hate the ability to pass plain old data types by reference in C++. I only slightly hate being able to pass complex types by reference. If I'm looking at a function:

void foo()
{
    int a = 8;
    bar(a);
}

From the calling point, there is no way to tell that bar, which may be defined in a completely different file, is:

void bar(int& a)
{
    a++;
}

Some might argue that doing something like this is just bad software design, and that the language is not to blame, but I don't like that the language lets you do this in the first place. Using a pointer and calling

bar(&a);

is much more readable.

Jeff
  • 916
  • 6
  • 10
  • +1 I don't agree with you, but I appreciate your reasoning. – Jon Purdy Mar 09 '11 at 22:12
  • @Jon I'd actually be very interested on what you think. Do you carry the "don't blame the language" view? – Jeff Mar 09 '11 at 22:20
  • 6
    @Jeff: For one thing, the primary reason that reference semantics made their way into C++ was operator overloading, for which uniform reference behaviour simply makes sense. More importantly, though, C++ is designed to be versatile and provide very fine-grained features, *even if* doing so incurs significant risk of programmer error. So yeah, at least in this particular case, don't blame the language. I'd rather be able to make mistakes than let a language get in my way. – Jon Purdy Mar 10 '11 at 04:47
  • @Jon agreed, it would be very odd for pass by reference to apply to everything besides PODs. I'd prefer it if this feature were missing entirely from C++ as an alternative, but we can agree to disagree :). Thanks for the input! – Jeff Mar 10 '11 at 05:54
  • Java seems to not like pointers as much as you do. – Mateen Ulhaq Mar 12 '11 at 22:35
  • @muntoo I think Java's way is fine. With Java you will always know how your passed data will be handled by looking at the calling point of a method. My beef is with C++, probably because I'm crazy :). – Jeff Mar 13 '11 at 18:24
  • c# got this right – Codism Oct 07 '11 at 18:19
3

Although every language has its faults, none are nuisances once you know about them. Except for this pair:

Complex syntax coupled with wordy APIs

This is particularly true of a language like Objective-C. Not only is the syntax overwhelmingly complex but the API uses function names like:

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath

I'm all for being explicit and unambiguous, but this is ridiculous. Each time I sit down with Xcode, I feel like a n00b, and that's really frustrating.

Dimitry
  • 361
  • 1
  • 7
  • The Objective-C syntax is overwhelmingly complex? Haven't you seen C++? And the actual method name there is `tableView:cellForRowAtIndexPath:`, which is very descriptive in my opinion. –  Oct 07 '11 at 23:28
  • Learn to touch-type. – finnw Nov 21 '11 at 02:57
2
  1. signed chars in C - an abomination invented to let mathematicians have big collections of small items (see the sketch after this list)
  2. Using case to carry semantic content - again for mathematicians, who don't need to talk, and never have enough space for their formulas
  3. Let/Plain assignment vs. Set assignment in Basic dialects - no mathematicians involved here I think
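
For example, a minimal C++ sketch of the item-1 surprise (the Latin-1 byte is just an illustration; whether plain char is signed is implementation-defined):

#include <cstdio>

int main() {
    char c = (char)0xE9;   // the Latin-1 encoding of 'e-acute'
    // Where plain char is signed (most x86 ABIs), this prints -23 rather
    // than 233, and passing c to isupper() and friends is then undefined.
    std::printf("%d\n", (int)c);
    return 0;
}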
Ekkehard.Horner
  • 221
  • 1
  • 4
  • I'm lost as to your signed chars comment, could you explain? – Winston Ewert Mar 05 '11 at 22:27
  • 1
    To a human, the concept of (un)signed chars and the necessity to tell the compiler to use unsigned chars as default is surely just as insane as to a mathematician the claim that 2 != 2, because the second 2 is uppercase, bold, or italic. – Ekkehard.Horner Mar 05 '11 at 23:16
  • 5
    The problem is that C confounds the concept of "char" (i.e., part of a text string) and "byte" (i.e, `(u)int_least8_t`). Signedness makes perfect sense for small integers, but no sense at all for characters. – dan04 Mar 06 '11 at 07:16
  • @dan04: I agree, I wish they had different types for small integers (signed and unsigned), byte and character. When you explain to newbies that to manipulate raw memory they need to cast to a `char*`... like a C-String, they get really confused. – Matthieu M. Mar 06 '11 at 13:40
  • C# got this right with separate `sbyte`, `byte`, and `char` types. – dan04 Mar 06 '11 at 19:06
  • There's no particular problem with `signed char` or `unsigned char`. The real problem is that `char` could be either (it's implementation-defined). If `char` always meant `unsigned char`, it would behave as we expect. If it always meant `signed char`, it would always behave the same way, even if it was the wrong way. – David Thornley Mar 07 '11 at 19:13
  • Why do you say these design choices were made (or not made) for mathematicians? What do mathematicians have to do with C, Java, or BASIC? – Mark C Mar 13 '11 at 15:44
  • Because programming languages are designed to fit their users' needs. C's many numerical types are good for a mathematician who wants to work with big matrices without wasting bits or floating-point ops on integer numbers; Javascript's design decision to allow all numbers, as long as they are doubles, is OK for Web stuff, but won't please number crunchers. (A more subjective reason: at school I was humiliated once too often by my math teacher.) – Ekkehard.Horner Mar 13 '11 at 16:16
  • @Ekkehard Thank you for the explanation. I am sorry your teacher did that. (Put the notifier "@name" . For details on how it works, see ["How Do @comment Replies Work?](http://meta.stackoverflow.com/questions/43019/how-do-comment-replies-work/43020#43020).) – Mark C Mar 27 '11 at 19:06
2

Mine has to be the UMTA Specification Language, a macro language that translated into ANSI Fortran. USL's use of blanks was hideous.

USL would allow a blank in a name. So instead of "LASTTANGO" you could name your macro "LAST TANGO". But this could also mean a macro "LAST" followed by another macro named "TANGO". Read code like "LAST TANGO IN PARIS" and the combinatorial possibilities are horrid.

USL did not use begin/end or {} to indicate subsidiary code, it used spacing. Following an IF statement, all lines that were indented more than the IF statement were conditional upon that IF. Sounds easy, eh? But try tracking conditionals through several pages; try adding an ELSE with exactly the right indentation.

USL was born and died in a U.S. government agency back around 1980.

Andy Canfield
  • 2,083
  • 12
  • 10
  • 3
    Python seems to be doing quite well with indenting that represents block structure. But thankfully it doesn't allow spaces in identifiers! – Greg Hewgill Mar 06 '11 at 02:40
  • Side note: the possibilities aren't combinatorially horrid, they're just exponentially horrid. – Jon Purdy Mar 08 '11 at 00:13
  • 1
    @Jon: Andy said that "the combinatorial possibilities are horrid", where "combinatorial" is an adjective modifying the noun "possibilities". He didn't say that "the possibilities are combinatorially horrid", where "combinatorially" is an adverb modifying the adjective "horrid". (Firefox is saying that "combinatorially" is spelled wrong.) He didn't say anything about how horrid the possibilities were. – compman Mar 09 '11 at 23:52
  • @user9521: You're right, but I wasn't correcting, just adding more information. – Jon Purdy Mar 10 '11 at 04:35
  • [Time flies like an arrow.](http://en.wikipedia.org/wiki/Time_flies_like_an_arrow;_fruit_flies_like_a_banana) – Ed Staub Oct 07 '11 at 16:48
2

Perl's flattening of lists... Conversely, it's also one of its best features.

red-dirt
  • 3,668
  • 1
  • 22
  • 26
2

In old Ultima Online / Sphere server scripting there was no division at all, nor decimal points, though the game itself obviously used them. You could make a somewhat crude divide function, but just barely.

It is difficult to overstate what a train wreck that one flaw was; it made the whole language extremely cumbersome.

1

In C, and inherited by C++, Java, and C#:

You cannot parse the language without building a symbol table, because code like this:

Foo Bar();

could be declaring a variable Bar of type Foo, or could be declaring a function Bar that returns a Foo and takes no arguments. (In C++, the grammar resolves this particular case in favour of the function declaration, the famous "most vexing parse", but the parser can only make that choice once it knows whether Foo names a type. The other languages have similar flaws.)

This means you can't parse these languages without building a symbol table, making analysis tools much harder to write.

(Javascript and Go manage to avoid this problem.)
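
A closely related C++ illustration (hypothetical names): the same token sequence is an expression or a declaration depending on what Foo currently names, and only the symbol table can say which.

namespace expr {
    int Foo = 6, Bar = 7;
    int baz() { return Foo * Bar; }   // Foo is a variable: a multiplication
}

namespace decl {
    typedef int Foo;
    Foo * Bar;                        // Foo is a type: declares a pointer
}

int main() { return expr::baz(); }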

Sean McMillan
  • 5,075
  • 25
  • 26
1

The m4 macro language lacks a loop construct. Instead, the documentation recommends that you create your own using recursion. That kept me from bothering with the language, pretty much for ever.

Michael Kohne
  • 10,038
  • 1
  • 36
  • 45
1

Probably not the greatest design flaw but a flaw nonetheless in my opinion...

The keyword final in the context of variables in Java.

I wish there were a non-final / mutable / reassignable keyword instead to indicate that the reference held by a variable can be changed.

Good practices suggest that you should use the keyword final liberally in all cases especially when you are writing multi-threaded applications.

I follow this advice and use the final keyword whenever I can. It is very rare that I need a variable to be non-final. This makes my source code a lot more cluttered and forces me to type more than I really should have to.

c_maker
  • 8,250
  • 4
  • 35
  • 53
1

C++'s making objects value types instead of reference types, and thereby ruining the chance to implement object-oriented programming in any sane way. (Inheritance and polymorphism simply don't mix with statically-allocated, fixed-size, pass-by-value data structures.)
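
A minimal sketch of the mismatch (Shape and Circle are made up): pass a derived object by value and the derived part is sliced away, so the virtual call silently stops being polymorphic.

#include <cstdio>

struct Shape {
    virtual const char* name() const { return "Shape"; }
    virtual ~Shape() {}
};

struct Circle : Shape {
    const char* name() const { return "Circle"; }
};

void byValue(Shape s)      { std::puts(s.name()); }  // copies and slices: prints "Shape"
void byRef(const Shape& s) { std::puts(s.name()); }  // keeps dynamic type: prints "Circle"

int main() {
    Circle c;
    byValue(c);
    byRef(c);
    return 0;
}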

Mason Wheeler
  • 82,151
  • 24
  • 234
  • 309
  • 2
    Plus, all the concern about correctly implementing copy constructors, `operator=`, and `std::swap` specializations just doesn't exist in a reference semantics language. – dan04 Mar 06 '11 at 19:05
  • 3
    @dan04: Wrong; the same issues are there (except for swap). They're just approached differently, with issues like assignment vs. cloning. The difference between having two pointers to the same thing and two pointers to two different but identical things doesn't go away by changing the language. – David Thornley Mar 07 '11 at 19:16
  • 1
    @David: In a reference-semantics language, a lot of objects just *don't need* to be copied. This includes any immutable type. – dan04 Mar 08 '11 at 00:43
  • 1
    @dan04: Correct: "a lot of objects". As long as you can use immutable types only (and C++ is not really a functional language), you're in great shape, and you don't have to worry about C++ copy constructors or assignment operators because the compiler-generated ones will work fine. The reason to write your own is normally that the object manages some sort of resource, and in that case there will be a difference between two pointers to the same thing and two pointers to different things. – David Thornley Mar 08 '11 at 15:53
  • "Reference types everywhere" and "systems programming language, garbage collection may not be available" don't mix. (It's not a coincidence that almost all languages that follow your prescription rely on garbage collection.) There would have been a much better case for omitting inheritance (an overrated feature) from C++. – rwallace Aug 25 '11 at 15:54
  • @rwallace: There are over [2 million Delphi programmers](http://www.sfgate.com/cgi-bin/article.cgi?f=/g/a/2011/08/15/prweb8712420.DTL) who would dispute your assertion that reference-type objects require garbage collection. – Mason Wheeler Aug 25 '11 at 16:11
1

Java, PHP, C#, Delphi, Vala: conflating "pointers to objects" with "plain objects" into what is usually called "references". In C++ and Object Pascal, you can create objects as static variables, or as dynamic variables using pointer syntax.

Example (C++) :

x = new XClass();
x->y = "something";
if (x != NULL) {
  x->doSomething();
}

Example (Java / C#) :

x = new XClass();
x.y = "something";
if (x != null) {
  x.doSomething();
}
umlcat
  • 2,146
  • 11
  • 16
1

In non-Java-like languages: the concept of "GOTO", or jumping. Being able to jump around the code in a non-linear way is perhaps the most illogical feature in any written language.

This has been the most misused and irrelevant concept. Once I see one of these in a 300-line function, I know I'm in for a cooking lesson in spaghetti. The exception is error handling; this is the only acceptable use of the concept of jumping.

It breaks good modern programming practices. Gotos are only acceptable for the purpose of error trapping, not as a lazy way to terminate loops or skip code.

Writing code is an art form that should be oriented toward readability. Among the many aspects of readability is linear execution: one entry point, one exit for any function, and it must flow down the page, with no jumps or gotos.

Not only does this make the code more readable, it also, by its very nature, helps you write higher-quality code. One hack begets another and another. You'll generally find that once gotos are misused, you also get multiple exit statements out of functions. Tracing conditions and logic for testing becomes infinitely more difficult and immediately reduces the robustness of any code you may produce.

I wish gotos would be banished forever; they were used in Assembly coding years ago, and that is where they should remain.

angryITguy
  • 253
  • 2
  • 11
  • Another zealous `goto` basher. Go on, implement a state machine without it (see the sketch after these comments). And do not forget, code in modern languages is not only written by humans; it is often generated. And `goto` is a very important feature which simplifies compilation significantly. I wish `goto` bashing would be banished forever, and those responsible excluded from the programmers' caste. – SK-logic Mar 13 '11 at 11:08
  • And as for a readability - read this: http://www.literateprogramming.com/adventure.pdf – SK-logic Mar 13 '11 at 11:11
  • 1
    @SK-logic Generally speaking, would a GOTO be better replaced by a function call? – Mark C Mar 13 '11 at 16:06
  • 3
    @Mark C, no, of course not! See the state machine example - it will be severely obscured, and a potential for optimisation will be damaged. Also, it will make it even harder for higher level languages compilers to target a language that way. – SK-logic Mar 13 '11 at 16:49
  • 1
    @SK-logic Yes, but in the cases that GOTOs call reusable code, it is a function call with only slightly different syntax, right? I do not completely understand your example (FSM), but I understand intuitively that sometimes a lack of GOTOs could obscure program flow. I think you mean the case where the called block has GOTOs to other parts of the program. (Could you get equivalent functionality if function calls optionally did not return program control to the caller?) It seems GOTOs have two uses---function calls and control flow. – Mark C Mar 13 '11 at 20:12
  • 2
    @Mark C, in order to simulate `goto` behaviour with function calls, you'd have to translate the code into an SSA-form, then transform all the basic blocks into tiny functions with numerous parameters, each performing a tail-call of another basic block function. Firstly, it is a mess, and secondly, tail calls are not supported by many languages and runtimes. Actually, this transform is a great tool for reasoning about an imperative code with `goto`-s or any other control flow constructions (which should be lowered to `goto` first). – SK-logic Mar 13 '11 at 20:53
  • @SK-Logic .....Yet another "goto zealot". Note that I was referring to concepts where ppl jump around the code in a non-linear fashion. So, I think we agree on better "program flow". But, not *everyone* is building state machines, and there are other technologies than C/C++ where goto's cause more problems than solve. Goto's are like giving an uzi to a 5 year old. They are used indiscriminately by those who don't fully understand how to use it effectively, and the damage is random and widespread, I only support the use of Goto's for error handling...and that's about all. – angryITguy Mar 13 '11 at 21:59
  • @giulio, embrace Haskell if you're so much into bondage and discipline. And let the mature (> 5yo) programmers decide what level of dangerous low-level features they need to use. It is just irresponsible to talk about "banishing forever" something that you are simply unable to comprehend. You could not even understand my main point - code generation. – SK-logic Mar 14 '11 at 01:30
  • 1
    @SK-logic I want to say this without being critical of people: Not everyone has the same standard or background when they write programs. – Mark C Mar 14 '11 at 02:38
  • @SK-Logic. It's evident that you a. have too much time on your hands, and b. Don't get the point of the discussion. You're just itching for an argument. Your contributions are counter productive. – angryITguy Mar 14 '11 at 04:07
  • @giulio, now read once again, and try to think first before rushing into a zealous fight for your funny religion: firstly, it really, really *sucks* to compile a high-level language into a target which does not have any form of `goto`. If you knew anything about code generation, you would agree. Otherwise, educate yourself first. Secondly, really *a lot* of things are better expressed in the form of state automata. Your point that "not *everyone*" is building state machines is simply a lie - in fact, almost everyone does. If you don't know it, you're not programming at all. – SK-logic Mar 14 '11 at 09:50
  • @Mark C, I believe in people. I always prefer to think that they're smart and they know what they're doing, instead of staying behind them with a whip and fiercely punishing them for doing anything that counts as heresy (that's exactly what @giulio wants to do). For this reason I even think that there is absolutely nothing wrong with, say, inline assembly. `goto` is such a minor offence that it does not even deserve a discussion. – SK-logic Mar 14 '11 at 09:53
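
For the curious, the kind of thing SK-logic means, as a minimal C++ sketch (the automaton is made up): each label is a state and each goto a transition, which is the shape machine-generated code often takes.

#include <cstdio>

// Accepts strings over {a, b} that end in "ab".
int endsInAB(const char* s) {
    int c;
start:
    c = *s++;
    if (c == 'a') goto sawA;
    if (c == 'b') goto start;
    return 0;
sawA:
    c = *s++;
    if (c == 'a') goto sawA;
    if (c == 'b') goto sawAB;
    return 0;
sawAB:
    c = *s++;
    if (c == 'a') goto sawA;
    if (c == 'b') goto start;
    return 1;   // input ended right after "ab"
}

int main() {
    std::printf("%d %d\n", endsInAB("aab"), endsInAB("aba"));   // prints: 1 0
    return 0;
}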
1

The biggest flaw often seen in many programming languages is inability to 'bootstrap' - often it's not possible or practical to implement the language and its libraries using only that language itself.

When this is the case, language designers (and implementors) are skipping the hard part of making the language useful for really interesting and challenging tasks.

artem
  • 331
  • 1
  • 5
  • I remember seeing an ad campaign for a COBOL compiler, claiming it was written in itself. I thought that a bit excessive. – David Thornley Mar 11 '11 at 21:04
  • 1
    @DavidT DEFINITION COBOL. ARRAY OF CHARACTERS: {'C', 'O', 'B', 'O', 'L'}. (mostly-made-up syntax) – Mark C Mar 13 '11 at 20:18
1

Cramming array, list and dictionary into one abomination called PHP's "associative arrays".

vartec
  • 20,760
  • 1
  • 52
  • 98
1

I have two things which I dislike in C++:

Implicit conversion to bool (made worse by the fact that you can implement conversion operators for types, such as operator int() const {...})

Default parameters: yes, they aid backward compatibility of interfaces when adding new stuff, but then so does overloading - so why do this?

Now combine the two together and you have a recipe for disaster.
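
A minimal sketch of that disaster (Socket and transmit are hypothetical names): the implicit conversion makes both an accidental call and a meaningless test compile cleanly.

#include <cstdio>

class Socket {
public:
    Socket() : fd(-1) {}                 // -1 means "not connected"
    operator int() const { return fd; }  // the "handy" implicit conversion
private:
    int fd;
};

void transmit(int fd, bool flush = true) {
    std::printf("transmit on fd %d, flush=%d\n", fd, (int)flush);
}

int main() {
    Socket s;
    transmit(s);   // compiles silently: Socket converts to int, flush defaulted
    if (s)         // compiles too, and fd == -1 converts to *true*
        std::puts("socket looks valid, but it was never connected");
    return 0;
}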

Nim
  • 1,363
  • 1
  • 11
  • 17
  • +1 for implicit conversion to bool. It allows you to write a class A with a conversion operator to bool, a function foo() with return type A, and then if (foo()). When I found an example of this in some legacy code I thought foo() was just testing a condition. Instead it was constructing an object and was being misused to test whether that object could be constructed or not. Very hard to read! – Giorgio Oct 07 '11 at 17:25
0

Coldfusion's nested CFLoop behavior.

John Cromartie
  • 481
  • 2
  • 9
0

The decision to implement the goto operator in PHP 5.3.
What reason could there possibly be to encourage bad programming practices that were -wisely- not implemented in previous versions?

Lucius
  • 166
  • 4
  • 4
    Overzealous `goto` bashing amuses me. It is not a *bad programming practice*, it is an essential form of control flow. PHP code is often generated rather than hand-written, and for a compilation target language `goto` is a must-have. – SK-logic Mar 07 '11 at 11:56
  • 2
    `goto` is only *essential* in languages that lack exceptions, and *very* essential in languages that lack `while`, `else`, `switch`, etc. (like old line-numbered BASIC). – dan04 Mar 07 '11 at 14:45
  • 3
    @dan04, exceptions are not a substitute for a `goto`. Think of implementing a state machine efficiently with no `goto`. And it is quite a common pattern when compiling from some higher-level language. Many other things are also much easier to compile if your target language provides a proper `goto`, and can be a hell if only structural control flow is available. – SK-logic Mar 07 '11 at 15:07
0

Lack of array handling for input variables in SQL. Even when a database vendor has added on a feature to support this (I'm looking at you, SQL Server), it can be poorly designed and awkward to use.

HLGEM
  • 28,709
  • 4
  • 67
  • 116
0

The Delphi / Pascal language doesn't allow multiline strings without using concatenation.

0

Javascript's omission of many of the date formatting and manipulation functions that almost every other language (including most SQL implementations) has. This has been a source of frustration for me recently.

Joshua Carmody
  • 798
  • 1
  • 7
  • 16
0

Optional parameters in VB. When adding features to code it is too easy to do it as an optional parameter, then another, then another, so you end up with all these parameters that are only used in newer cases that were added after the initial code was written, and probably aren't passed at all by older code.

Thankfully this was solved for me by switching to C# and using overloads.

Bruce McLeod
  • 281
  • 2
  • 4
0

RoboCom, whose assembly language lacks bitwise operations.

While it hardly counts as a productive language with any real value other than learning and entertainment, RoboCom is a game where you program virtual robots to participate in code battles. Programmers have to make the most of their clock cycles so that their robots move before their opponents do.

If a language is illsuited for what it is designed for, that is of course a flaw in the design.

I found it quite irritating for a language to lack bitwise operations, especially when the goal of the game is elimination by optimization. That, in my book, is a real design flaw, since many optimizations can be made using bitwise operations.
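
Typical examples of what such a language takes away (a small C++ sketch; each line would cost several instructions without bitwise operators):

#include <cstdio>

int main() {
    unsigned x = 12345;
    unsigned mul8  = x << 3;                 // x * 8 without a multiply
    unsigned mod16 = x & 15;                 // x % 16 without a divide
    unsigned flags = (1u << 0) | (1u << 3);  // several booleans packed in one word
    std::printf("%u %u %u\n", mul8, mod16, flags);
    return 0;
}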

Wish I could have contributed something slightly more useful. :)

Statement
  • 245
  • 1
  • 9