41

So, there are a bunch of questions appearing asking is X evil, is Y evil.

My view is that there are no language constructs, algorithms, or tools which are evil, just ones which are badly used. Hell, if you look hard enough there are even valid uses of goto.

So does absolute evil, that is, something which is utterly incompatible with best practice in all instances, exist in programming? And if so, what is it? Or is it just bad programmers not knowing when something is appropriate?

Edit: To be clear, I'm not talking about things programmers do (such as not checking return codes or not using version control - they're choices made by bad programmers), I mean tools, languages, statements, whatever which are just bad...

Jon Hopkins
  • I share your view on this, but I look forward to seeing if any of the answers turn up a true monstrosity. – Adam Lear Dec 21 '10 at 14:35
  • 21
    **Null is evil! The Billion Dollar Mistake** http://programmers.stackexchange.com/questions/22912/null-references-the-billion-dollar-mistake-closed – Amir Rezaei Dec 21 '10 at 14:51
  • 22
    @Amir Rezaei, null is necessary if you cannot know the value at the time the record is inserted! The ways to get around using nulls are far worse. – HLGEM Dec 21 '10 at 14:53
  • 4
    @HLGEM: Can you share how alternatives like Haskell's "Maybe" are "far worse"? – LennyProgrammers Dec 21 '10 at 14:57
  • 24
    200 rep cap per day is truly evil. –  Dec 21 '10 at 15:07
  • 1
    It might help if you would define "evil". –  Dec 21 '10 at 15:08
  • @Developer Art - If you read the question I do - Something which is incompatible with best practice in all instances. – Jon Hopkins Dec 21 '10 at 15:14
  • 1
  • @Lenny222, databases can't store maybe, and "not known" is not the same thing as maybe. Worse is trying to find a default value for an unknown cost: not zero, since you might have legitimate data that needs a value of zero; not -1, since then you have to write more complex code to exclude it from all reporting, etc.; same thing for unknown dates. Yes, for unknown string values you could say 'Unknown', but other data types don't work that way. – HLGEM Dec 21 '10 at 15:24
  • 4
    @HLGEM: This might be true for current SQL-databases, i thought we were talking about programming languages. – LennyProgrammers Dec 21 '10 at 15:27
  • @Lenny222, and what part about the various flavors of SQL being programming languages did you not understand? – HLGEM Dec 21 '10 at 15:39
  • 2
    @HLGEM: my mistake, I took your "alternatives to null are not possible in SQL" for "there is no feasible alternative to null in any programming language" in the given context. The latter is a wrong statement, until proven. – LennyProgrammers Dec 21 '10 at 15:46
  • 2
    Yes - the device between the programmer and his computer, commonly referred to as a "keyboard". – Craige Dec 21 '10 at 20:00
  • 3
    Yes there is, but I'll get downvoted and people arguing with me for saying them. Proof is my 100% serious answer http://stackoverflow.com/questions/406760/whats-your-most-controversial-programming-opinion/409825#409825. And yes, singletons are evil no matter how you use them. –  Dec 21 '10 at 20:25
  • @HLGEM: Using null to mean maybe is dumb, but it isn't evil. It's just architects (using the term loosely) not thinking through the consequences of a decision, but in certain contexts it can be a valid choice. ("Null/unknown is not possible, therefore we define null to mean maybe.") – SilverbackNet Dec 21 '10 at 22:41
  • 4
    When the highest voted answer is a self-admitted rant, it's time to close the question as [not constructive](http://blog.stackoverflow.com/2010/09/good-subjective-bad-subjective/). –  Dec 22 '10 at 01:16
  • 2
    Redundant code is evil. – Vanchinathan Chandrasekaran Dec 22 '10 at 02:05
  • Do you mean, besides C#, VBA, and Visual Studio? – orokusaki Dec 22 '10 at 04:50
  • 2
    @Mark - That's one answer that's not constructive. Both the question and many of the answers are constructive. If you don't like an answer then down vote it, don't close the question. – Jon Hopkins Dec 22 '10 at 09:15
  • Shouldn't this be tagged `worst-practices`? – SLaks Dec 22 '10 at 17:36
  • 2
    Fascinating mix of humor and humorless here... Also, Hitler loved using `goto`. – Shog9 Dec 22 '10 at 22:23
  • Bogosort is a pretty evil algorithm. – Carson Myers Dec 23 '10 at 11:04
  • You might find http://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf interesting (if "evil" include abusing trust) –  Jul 17 '12 at 09:01
  • 1
    Just PHP. Everything else is just annoying rather than truly evil. :-) – Brian Knoblauch Jul 17 '12 at 12:37

40 Answers

77

There is no true evil in programming.

<rant>

The reason so many people think that there are evil things is that it is pounded into their heads when they first take their programming classes. "Don't use goto! Always normalize your databases! Never, ever use multiple inheritance!" These are hammered in because these "evil" practices are so easily abused, not because they are inherently bad. There are so few good uses of them that you can get away with saying "never" at first. What is truly evil is saying, "There is no reason to consider anything that is not a 'best practice'", because there is always a place where that very practice is the perfect fit.

</rant>

Michael K
  • 10
    +1 - Best practice is a generalisation and for all generalisations there will be exceptions. – Jon Hopkins Dec 21 '10 at 14:47
  • 1
    ++ I second your rant. Bravo! – Mike Dunlavey Dec 21 '10 at 14:50
  • 3
    +1: A while back, I added a new goto to some C code I was working on that already had a bunch of them. It was quicker than refactoring it. I told my wife, who's also a programmer, and she asked, "Honey, are you okay? Do you have a fever?" I'm currently working on some code where another guy wrote a C goto that jumps into the middle of a loop. I'd never write it myself, and I curse inwardly every time I see it, but it meets the ultimate test: it works. – Bob Murphy Dec 21 '10 at 16:23
  • Someone will always come along and optimize a previously impossibly slow algorithm, that using previously was "evil". Times change, but superstitions rarely seem to. – SilverbackNet Dec 21 '10 at 22:44
  • 1
    +1 - I worked on some C code right out of school where "goto exit" or "goto error_exit" was used often. I have to say it made for cleaner-looking code than having multiple return statements. My motto is that everything has its place. (even singletons ;-) ) – eSniff Dec 22 '10 at 01:48
  • 1
    @JonHopkins -- which is why I much prefer to talk of "good practices". – Richard Dec 22 '10 at 09:43
  • 1
    +1. Maybe tags are evil, for what isn't a rant in programmers.SE? – MAK Dec 22 '10 at 14:56
  • 1
    In school they teach you that goto is evil, for the same reason you tell little kids that there are monsters living in the well - for their own safety you want to keep them away from it until they know better. –  Aug 12 '11 at 14:16
  • I'm not saying what you said is wrong..... except your very first sentence, which is bolded. That's very wrong. See the accepted answer and my answer. Murph's is good as well, although less evil than the others. Evil = will cause you trouble and has a non-obsolete replacement (although maybe not existing at the time, just as exceptions don't exist in C) that will have none of the troubles, or significantly fewer. –  Oct 07 '12 at 02:44
58

Guns don't kill people, people kill people.

In the same way, dev-tools are not evil, the things programmers do with them could be.

guiman
  • 1
    That's a great analogy. – Maxim Zaslavsky Dec 21 '10 at 20:14
  • Then again, using Visual C++ 6 could be considered evil. I know some people who still do, because it's too much effort to bring the code up to standard C/C++, and in one case it breaks backward compatibility with plugins to use ANYTHING else. – SilverbackNet Dec 21 '10 at 22:47
  • Programmers kill programmers? –  Dec 22 '10 at 02:36
  • Depends on each one, but some might. So watch out while reviewing others' code. – guiman Dec 22 '10 at 03:12
  • 14
    Guns don't kill people. Bullets do. :) –  Dec 22 '10 at 04:56
  • 4
    Guns go a long way in helping people kill people. What’s worse, I don’t understand the point you’re trying to make: who ever claimed that dev-tools are evil?! There must be something clever in this answer since it got so many up-votes. But I completely fail to see it. – Konrad Rudolph Dec 22 '10 at 11:22
  • 1
    @fennec bullets don't kill people, people kill people. unless machines have finally become sentient, in which case, I'll be in my bunker. – antony.trupe Dec 22 '10 at 16:04
  • Guns don't kill people. [Magic missiles do](http://www.popcultcha.com.au/images/magic%20missiles.jpg). – Alex Budovski Dec 23 '10 at 11:44
  • 5
    Guns don't kill people, rappers do. – Jon Hopkins Dec 24 '10 at 09:51
  • People with guns kill people more easily than people without guns. – egarcia Feb 10 '12 at 21:53
  • Horrible analogy. Guns don't randomly fire, explode or do things YOU don't intend. –  Oct 07 '12 at 02:58
46
  1. Magic numbers.
  2. Implicitness is inherently evil, and here's the reason why:
user8865
  • 8
    1. There are reasons to have them. 2. Type inference in Haskell is a type of implicitness that I love. – Matt Ellen Dec 21 '10 at 14:57
  • 1
    +1 for magic numbers, bane of my life at a very big company. Also, "lost in the distant past" reasons for scaling by 1000 instead of 1024, leading to an enormous overhead in a critical loop that everyone was scared to eliminate because they didn't know what else would be affected. – geekbrit Dec 21 '10 at 15:01
  • 7
    Meh, implicitness is very neat sometimes. Explicitness is for assembly-programmers. – Macke Dec 21 '10 at 18:57
  • 6
    Re #2: I see what you did there...Sorry about the downvotes from people who I'm guessing didn't get the joke. – Larry Coleman Dec 21 '10 at 19:05
  • 5
    @Larry, I got the joke. I just disagree. – JSBձոգչ Dec 21 '10 at 21:44
  • Re #2: I used to think the same until I started using Scala. – missingfaktor Dec 22 '10 at 02:30
  • I'm assuming you don't use garbage collectors then? They go around implicitly deleting memory. Or what about function calls? They implicitly push things on the stack. I'm also assuming you turn off all compiler optimizations? They implicitly change your code without you knowing about it. – Jason Baker Dec 22 '10 at 20:46
  • You get the accepted answer for the best attempt at a specific, factual answer, though I'm still of the opinion that it's about misuse more than the actual functions/syntax/whatever. – Jon Hopkins Dec 23 '10 at 16:53
36

Is anything in programming truly evil?

Absolutely. Failure to use your brain and think about what you're doing and why you're doing it is the root of all programming evil.

Greg D
25

Empty generic exception handlers, i.e.:

catch (Exception ex)
{
    // swallow everything, silently
}

I don't doubt that someone can give me a valid use case - but, to be honest, it's going to be seriously creative... and at the very least you need an explanation.

Murph
  • 1
    +1 I was going to add this if someone else hadn't – Conrad Frix Dec 21 '10 at 22:22
  • 1
    Really? "I don't care if this fails or not" is never valid, not in any possible context? Not everything done needs to be guaranteed to succeed or fail, especially in small throwaway utilities. Another case where not thinking first is the real evil. – SilverbackNet Dec 21 '10 at 23:02
  • 4
    I see that in legacy code All. The. Time. What's worse is that people question me when I change it. – George Stocker Dec 22 '10 at 02:01
  • 5
    I add an email notification to it first that sends me an "empty catch block hit" type exception, and release it to prod so I can see WHY that catch is actually there, and under what conditions it's being hit. Then I fix it. – CaffGeek Dec 22 '10 at 05:35
  • @SilverbackNet - I answered that: "at the very least you need an explanation". Ok, throwaway code - but then no-one will ever see it (-: In anything else? You might choose to ignore specific exceptions (with a comment explaining why) but generic ones should, at the very least, be logged (so you can convert them to specifics). And even in your throwaway code, how do you know that the error you're swallowing is the one you think it is? If nothing else, System.Diagnostics.Debug.WriteLine("Error: " + ex); gives you a handy breakpoint. – Murph Dec 22 '10 at 09:19
  • You see this occasionally in python, which is a bit more liberal with exceptions. But it still probably needs an explanation. – Jason Baker Dec 22 '10 at 20:48
  • 2
    I would vote for checked exceptions in Java :D – Nils Dec 22 '10 at 21:05
  • Hey, how am I supposed to debug that? I can set a breakpoint inbetw.. oh, wait, I see :) – lorenzog Dec 22 '10 at 21:12
  • How about when a background thread is calling `Control.BeginInvoke` for the purpose of updating a control on a form which might get asynchronously closed? Without access to the code for the control itself there's no clean way to prevent the exception from occurring, but all the exception really means is "Microsoft should have, but didn't, provide a `Control.TryBeginInvoke` method". – supercat Oct 06 '12 at 18:08
  • Quack! What an ugly duck.... – ncmathsadist Jun 08 '13 at 01:14
19

Perhaps I can flip the question around, and ask if there is anything in programming that is absolutely and perfectly good? If you can't think of one thing (I know I can't), then the concept of evil is also just as muddy.

There are common behaviors that lead to mistakes, misunderstandings, and other general confusion--but to say that language feature X is inherently evil is to admit that you really don't understand the purpose of feature X.

There are common behaviors that can save a lot of heartache and avoid some misunderstandings--but to say that language feature Y is inherently good is to admit that you don't fully understand all the implications of using feature Y.

We are a people of finite understanding and strong opinions--a dangerous combination. Hyperbole is just a way of expressing our opinions, exaggerating facts until they become fiction.

Nevertheless, if I can avoid behaviors that lead to problems and pursue behaviors that avoid them, I just might be a bit more productive. At the end of the day that's what it's all about.

Berin Loritsch
  • 3
    *is anything in programming absolutely and perfectly good?* Clear, well written documentation? – James Dec 21 '10 at 22:22
  • 4
    @James Clear, well-written documentation is often used as an excuse for unreadable code, and can become shackles that need to be maintained together with the code if it's overdone. Clear, well-written, but completely outdated documentation can also become pure evil. – Eugene Yokota Dec 21 '10 at 22:49
  • Working code!!! – Scott Whitlock Dec 22 '10 at 21:25
  • A comment that explains why. That is good. – Tim Williscroft Dec 23 '10 at 05:22
  • Ask a few different developers whether the same code is good or bad, and you will get different answers. To say it is _absolutely_ good then would be somewhat inaccurate. – Berin Loritsch Dec 23 '10 at 11:14
13

The only thing that springs to mind is this:

#define TRUE FALSE
#define FALSE TRUE

But once again, that's just plain old misuse hehe.

Kyle Rosendo
11

I think skinning, auto-updaters that perpetually sit in the systray, and applications that hijack file associations and other system settings are straight evil.

Along with flash-only websites.

whatsisname
10

Everything that happens to work just by accident is inherently evil.

Let's consider the following C program, which happens to actually work on my machine, using default compiler options:

#include <stdio.h>
int main(int argc, char *argv[]) {
   char string[10];
   int y;
   /* string[12] is out of bounds; on this machine the stray write
      happens to land on a byte of y, so the loop terminates anyway. */
   for (y=0; y<10; string[12]++) {
      printf("%d\n", y);
   }
}

Nothing, really nothing, could ever excuse the way this program increments the loop counter. It's just an undefined effect that happens to do the right thing on my machine, with my compiler and my default options.

user281377
10
So does absolute evil, that is something which is utterly incompatible with best practice in all instances, exist in programming? And if so what is it?

Yes; the standard C library function gets(). It's evil enough that the C standards committee has officially deprecated it, and it is expected to be gone from the next version of the standard. The mayhem caused by that one library call is scarier than the prospect of breaking 30+ years' worth of legacy code -- that's how evil it is.

John Bode
  • For those of us who don't speak C care to elaborate on why it's so evil? – Jon Hopkins Dec 22 '10 at 09:12
  • 8
    `gets()` takes a single argument, which is the address of a buffer. Characters are read from standard input into the buffer until a newline is seen. Because all it receives is the address of the buffer, `gets()` has no idea how big the buffer is. If the buffer is sized for 10 characters and the input stream contains 100, those extra 90 characters are written to the memory immediately following the buffer, potentially clobbering the stack. As a result, it's a favored malware exploit. It is unsafe and insecure *by design*. – John Bode Dec 22 '10 at 15:28
8

Easy, IBM Rational ClearCase is an atrocity.

7

So does absolute evil, that is something which is utterly incompatible with best practice in all instances, exist in programming?

Of course not. It's like asking if anything in my toolbox is evil. My hammer is a great "good" to me, unless my four year old gets her hands on it.

AJ Johnson
  • I think that it's even worse when an old man with full use of his capabilities uses it to damage something good. – guiman Dec 21 '10 at 14:51
6

Today's evil was yesterday's perfect. It's evolution.

Heath Lilley
6

Not to be too serious, but ...

We have very myopic views of "evil". People who kill lots of other people are evil. People who steal from others are evil. Every nation (that I know of) has some evil in their past. Some would like to deny it.

Is there evil in programming? We innocent programmers might like to think "not really". However, once I had a conversation with the inventor of a widely-used hierarchical database, on this very subject. Want to know who was one of the best customers? The secret police of Communist Poland.

Is there evil in the world now? You bet. And are they using programmers? You bet.

Mike Dunlavey
  • 2
    Possibly a more evil variant of evil than I was thinking. By comparison with the secret police we're really just talking naughty. – Jon Hopkins Dec 21 '10 at 16:56
  • @Jon: ... and in this season everybody knows "naughty" and "nice" go together :) – Mike Dunlavey Dec 21 '10 at 18:06
  • Genocide is always truly evil; I feel differently about theft. Stealing, although it hurts others, is not done with hurt as the intention, but to improve one's own outcome. If the thief could easily improve their outcome without hurting the person they are stealing from, they might. I consider this immoral but not truly evil. – JD Isaacks Dec 21 '10 at 18:50
  • @John: Robin Hood? I know what you mean. In fact, [Pareto Efficiency](http://en.wikipedia.org/wiki/Pareto_efficiency) is all about that, I think. It's the opposite of popular theory today. OMG, I hope this doesn't start a flame war... – Mike Dunlavey Dec 21 '10 at 18:56
  • @John Isaacks - So you're saying that a thief who makes, say, $6,000 in one hour time by stealing a car from an average programmer who works for say, two months to make $6,000 is only "correcting the imbalance"?! Or could it be that some people like thief in question simply assume they are better than others, so they don't have to work hard? – Jas Dec 21 '10 at 19:50
  • @Jas In that scenario I am not saying that he is correcting the imbalance. I would call him very bad but not evil. I am saying that the thief is probably not stealing the car to intentionally hurt the victim but to better his own circumstances. He doesn't care what the victim's circumstances are afterwards. In *my opinion* this is very immoral indeed, but not truly evil unless the purpose of stealing the car is to worsen the victim's situation. And I do think some stealing is evil, just not all. I am not saying stealing is right either, not at all. It's just not *always* evil. – JD Isaacks Dec 21 '10 at 20:06
  • @Jas, my point is this: not caring how your actions affect others is wrong and immoral, but not necessarily evil. Intentionally doing things to negatively affect others is evil. Again, that is just my opinion; I am not claiming to be authoritative on the subject. – JD Isaacks Dec 21 '10 at 20:06
  • @Jas: @John: I might add, on these fine points, an Oscar Wilde quote: "As long as war is regarded as wicked, it will always have its fascination. When it is looked upon as vulgar, it will cease to be popular." I wonder if the wicked/vulgar distinction has a parallel in immoral/evil. – Mike Dunlavey Dec 21 '10 at 22:01
  • @John Isaacks - Well, the person who does the work of a paid assassin (hitman) is also doing work "professionally" - to increase their wealth. They have nothing personal against their targets and they certainly do not do their KILLINGS to inflict hurt, therefore, according to your way of thinking about it, what they do IS NOT EVIL?! Some serious flaws in your logic there... – Jas Dec 21 '10 at 22:04
  • @Jas, if you want to go there, evil is defined as "profoundly immoral and wicked"; in other words, it's the farthest on the scale of morality. There is no hard and fast way to differentiate it from simply immoral, especially given how most people judge immorality based on their own experiences. (In my opinion, evil is causing suffering primarily for pleasure. It is the motivation, not the act, but not everyone agrees.) – SilverbackNet Dec 21 '10 at 22:58
  • @SilverbackNet - A person might want to further their own goals without any motivation to harm, that is, without really caring if someone else will be hurt in the process. So they know they MIGHT inflict harm, and might not wish to, but still they don't care if they do. In any society on this planet, morality itself is widely understood to be a set of socially acceptable rules of behavior, whereby ruining others for personal gain (knowingly or not) is not acceptable. Ergo, according to your own definition, this "passive infliction of harm" is still evil. – Jas Dec 21 '10 at 23:39
  • @Jas: No, my definition isn't just infliction of harm; it's infliction of harm for its own sake - obtaining pleasure in making others suffer greatly. Other rewards are secondary to the primary goal of hurting others. That's evil, in my book. You can call careless & reckless behavior evil, that's for each to judge. – SilverbackNet Dec 22 '10 at 01:29
6

Null is the root of Evil!

https://softwareengineering.stackexchange.com/questions/22912/null-references-the-billion-dollar-mistake-closed

The Billion Dollar Mistake: "I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. In recent years, a number of program analysers like PREfix and PREfast in Microsoft have been used to check references, and give warnings if there is a risk they may be non-null. More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965." (C.A.R. Hoare, 2009)

Amir Rezaei
  • Amir, obviously a lot of people don’t get this (as evidenced by HLGEM’s comment below the question, and by the low number of votes you have received). Since nullable references are so ingrained in our thinking, it may be hard to realize that they aren’t really natural. So maybe you should explain how the alternatives work, or at least link to articles that do. – If people understood your answer, I’m pretty sure it would be one of the highest ranked here, since the point you made is 100% valid. – Konrad Rudolph Dec 22 '10 at 15:16
  • @Konrad Rudolph There are answers regarding this under the "Null References, The Billion Dollar Mistake" that I have sent. There are languages that have removed support for Null. – Amir Rezaei Dec 22 '10 at 15:21
  • +1 the sad truth is that most programmers have no idea about the alternatives. I'm not against the *existence* of null references, just the notion that it *means* something. It's simply a pointer that points to an invalid memory location - nothing more. Most programmers treat it as having some meaning in the domain (eg the middle name field is not required, therefore it can be null). In my programs *null* carries no additional meaning on top of what is specified in the language specification. To me a null reference is always an error. If a field is optional, I use the appropriate option type. – MattDavey Aug 01 '12 at 09:28
  • @AmirRezaei: What should be done with references which come into existence before there is anything useful for them to point at? I would suggest that it is better to have such references point to something which can obviously not be dereferenced, than require that they point to an object which could be dereferenced legally but not usefully. While there are times it's useful to have references declared in a fashion that requires that they be initialized before anything can see them, requiring that to always be the case would create chicken-and-egg problems. – supercat Oct 06 '12 at 18:14
6

Copy-paste code.

If you don't itch when you are doing that, you are not a real programmer.

egarcia
5

I personally nominate what Donald Knuth's phrase names, "premature optimization is the root of all evil", as the first evil thing in programming, speaking from an experienced point of view (which is to say, I have fallen for this myself).

Actually, the phrase says something like: don't try to tune for a particular environment, a particular PC, or a particular set of users before you have gotten deep into the problem.

4

I'm surprised no one has floated globals as a true evil. There's no better way to end up programming in an environment whose parameters you have no idea of and virtually no control over. Chaos! I have a strict ban on the use of global variables in all of my coding.

Richard J
  • +1000 I'd vote this up higher if I could. Single biggest problem in programming today. Countless articles, dialog, and effort go into educating people about the problem, and I don't think it ever changes. Everyone starts out using them as beginners and seldom sees a problem with them. Almost every project I come into in a professional setting suffers from overused globals (minus Spring-based projects -- mostly). Although you gotta love it when you find BlahBlahManager.getInstance() in a Spring project. Bought the book, still didn't get it. – chubbsondubs Dec 22 '10 at 02:31
  • 6
    Find me a non-trivial application that doesn't have globals. Oh, they're wrapped up and neater and protected and in a class and in a framework and generally better... but if there's just one of something for the application, it's pretty much a global. The issue is the use of inappropriately scoped variables. – Murph Dec 22 '10 at 09:23
  • 2
    Global variables don't deserve such a bad reputation. Like @Murph said, *some things are global*. The file system is global (all processes use the same one); your process is global to all its threads; the memory is global (another process can use up your memory and crash you while it safely uses a memory parachute); the _user_ himself is global (think in terms of UI design). It's not wrong to model something inherently global as global variables -- or a Singleton, or a Service Locator, or maybe a Registry. – kizzx2 Dec 22 '10 at 18:09
3

No tool is inherently evil. Its existence may be utterly foolish for all but a single use case but that does not make it evil. It puts the onus of deciding the proper use on the programmer.

Bob Roberts
3

Well, I thought Microsoft is/was considered evil, and now recently Oracle is the most evil thing in the world.

Uwe Keim
3

I know I said I wouldn't make a post, but I'll write one answer. As much as everyone else says no, there are no evils, I'll say yes, there are some absolute evils.

setjmp/longjmp is pure evil.

  • 1
    +1 I have yet to see a proper use case for Setjmp/LongJmp. – Oliver Weiler Dec 22 '10 at 01:26
  • @Helper Method: haha, thanks. I am glad someone agrees with me –  Dec 22 '10 at 03:14
  • 4
    try/finally/catch+throw are implemented via Setjmp+LongJmp. – comonad Dec 22 '10 at 10:41
  • @comonad: WTF NO! exceptions/throw unwind the stack, calling destructors and such. longjmp throws the stack away, pretending it doesn't exist. –  Dec 22 '10 at 21:51
  • @acidzombie24: huh? whow! did not think of that. exceptions seem to be even more complex than I thought. Still, I could think of an implementation that calls all destructors and then makes a longjmp instead of unwinding the stack. That would probably be even more efficient (depending on the longjmp implementation), but would need a 2nd stack for object and function pointers. – comonad Dec 24 '10 at 09:05
  • 3
    SetJmp+LongJmp can be used to realize CPS (Continuation passing style) via Tail Calls using languages/compilers without PTC (Proper tail calls). It is an optimized version of trampolining.see http://en.wikipedia.org/wiki/Tail_call#Through_trampolining and http://en.wikipedia.org/wiki/Continuation#Kinds_of_continuations – comonad Dec 24 '10 at 09:19
  • @comonad: hmm, interesting, about the continuations/yield. It might be possible to use it for creating continuations, but I never heard of anyone creating a stack (for the CPU) and jumping to it. Actually, just thinking of an implementation, you can do it without having another stack and never use a goto (or setjmp/longjmp), so this wouldn't be useful there either, and you definitely shouldn't use it for trampolining (I can't even imagine WHERE to use it; it's just return values and regular func calls). So still I say setjmp/longjmp is pure evil, and I still haven't seen a use for it in any condition(s). –  Dec 25 '10 at 07:33
  • PS: bonus information. Exceptions take long not because the runtime has to find where in the stack to unwind, but because it has to collect stack information. It has to iterate from where it is all the way back to main() (or the start of the thread) and get each function name and line for the stack trace. You don't need to throw to get the stack information (in .NET). Although I have seen some JavaScript tricks to get the method name by doing just that (throw, then parse the stacktrace message). http://msdn.microsoft.com/en-us/library/system.environment.stacktrace.aspx –  Dec 25 '10 at 07:45
  • @acidzombie24: You only need stacktrace informations for debugging, so either you would use a debugger or won't need stacktraces at all. In C, C++, and probably in all functional languages, too, you can't have stack traces while not debugging, but you are allowed to implement infinite recursions and CPS thanks to some TCOs (tail call optimizations), especially PTC and TCR (tail call recursion). I don't know about C# or .NET, but at least Java's Byte Code and ECMAScript are not designed to be optimized in these ways at all; C/most other languages do not have this very questionable restriction. – comonad Dec 25 '10 at 19:29
  • @acidzombie24: CPS is not about creating another stack at all: in functional languages the program is a graph where a function is a node/object that has an eval method and some references to other nodes. Execution is a graph reduction, where the sequence of methods to call is an evolving data structure instead of a hard-coded list of commands. Each node has a reference to its continuation (the next node to run) passed as a parameter; no method returns, and without PTC the stack will always grow. LongJmp cuts the stack, and trampolines are a slow alternative implementation via a while loop. – comonad Dec 25 '10 at 20:08
  • 1
    Even more evil than setjmp/longjmp is using setjmp/longjmp to implement a poor man's super lightweight threading library (as I've seen done way back in the DOS days). Delightfully evil code! :-) – Brian Knoblauch Jul 17 '12 at 12:45
  • Setjmp/longjmp can be very useful in systems which simulate multitasking by having most "threads" be state machines, and having the main thread call a 'poll' routine to run those state machines any time it has nothing better to do. If the command processor is in a state machine, a setjmp/longjmp may be the best way to handle commands that would require something akin to a soft boot. For example, I once implemented a system which would talk to a bunch of remote devices and relay information to/from a main controller. The main thread talked to the devices, and a poll routine... – supercat Oct 06 '12 at 17:59
  • ...talked to the main controller. If the main controller sent a command to reconfigure the devices, all the memory data structures related to the devices would get reformatted, and any pointers held by the main thread would become invalid. Since any operations in progress would be rendered effectively meaningless, I simply used setjmp/longjmp to have the main thread abandon anything it was doing and restart with the new data structures. I can't think of any cleaner alternative in straight C. – supercat Oct 06 '12 at 18:01
  • @supercat It sounds like you were working with evil. But this doesn't make sense. Why would all the memory get reformatted? It can't, because then the memory holding the address for longjmp would be invalid as well. But when working with evil I guess you have no choice. Did the HW have any interrupts? I can't think of any hardware without at least 1. –  Oct 06 '12 at 21:58
  • @acidzombie24: The objects being controlled were subdivided into up to (IIRC) 16 groups whose lengths could be independently configured up to 254 units each, but there was only enough memory to handle about 1000 units total. Because some systems would have many small groups and others would have a few large ones, I couldn't statically allocate the memory, but the memory configuration wouldn't change except when a "restart using specified memory layout" command was received. What I did was define an `unsigned char main_buffer[MAX_DATA_SIZE];` and allocate space from that. – supercat Oct 06 '12 at 22:17
  • @acidzombie24: When the configuration changed, all previous allocations from `main_buffer` (which took about 3/4 of RAM) would be ignored, and new data structures would be set up there, but most other things in memory would be left alone. Since all pointers to addresses within `main_buffer` would become invalid when new data structures replaced the old ones, I needed to ensure that any functions which had local variables that pointed within that structure would exit before attempting to use those pointers again. In C++, exceptions would have been better, but in C, setjmp/longjmp worked. – supercat Oct 06 '12 at 22:21
  • @supercat I think I get it. You didn't implement threads with it, did you? You just jump to the start of the thread so it can exit and thus reallocate the units for that object group. I suppose you're saying that if you had C++, throwing an exception and catching it at the thread root would have been the same idea? –  Oct 06 '12 at 23:24
  • @acidzombie24: That's basically what I'm saying. I was debating between having nearly every single call to `pc_poll()` be followed by `if (pc_restart) return;`, but that would have cluttered up the source code and created the possibility of Heisenbugs if I missed one. The other possibility I considered would have been to mark the area of RAM holding the configuration so it wouldn't get cleared on startup, and then have the startup code do a checksum validation on that area and allocate space accordingly. When the config changes, update that area and do a hard reset. – supercat Oct 06 '12 at 23:28
  • @acidzombie24: I thought `setjmp`/`longjmp` provided a nice compromise position--it ended up doing about 90% of what would be done on a reset, but with the loaded configuration intact. An alternative would have been to have the unit start in "ready to configure" mode, and require that once a "set configuration" command was received, a "reset" command would be required before changing configuration. That might have been better, except that I had to conform to an existing communications protocol which didn't work that way. – supercat Oct 06 '12 at 23:31
  • @supercat Ah, yeah, I see. That's not a good situation to be in. I still call setjmp/longjmp evil. Exceptions are the replacement that isn't evil (well, according to others they are). I believe they would have solved your problem, but I am not 100% sure. –  Oct 06 '12 at 23:38
  • @acidzombie24: They would have solved my problem perfectly, but at the time I knew nothing about C++ and even if I had, I doubt there were any C++ compilers for the target processor. Setjmp is tricky and often hard to use properly, but there are some cases where there's no good alternative, so it's not evil. More evil in my mind are some ideas, like the notion that `Equals` should match the behavior of `==`, notwithstanding the fact that `Equals` is supposed to behave as an equivalence relation and `==` doesn't, or that structs which will behave as PODs should wrap fields in properties. – supercat Oct 06 '12 at 23:59
  • lol, what? PODs should wrap fields in properties? I don't understand the wrap part, and I don't think I understand what you mean by properties (first I thought you meant using structVar.struct_a=mystruct_a; which I think is fine, but wrap has nothing to do with it, and now I doubt you'd call that a property). –  Oct 07 '12 at 00:05
  • @acidzombie24: Structures in .NET like `Drawing.Point`, rather than simply exposing fields `X` and `Y`, have `X` and `Y` properties that get/set the values of private fields (a property is a special pair of get/set functions). Some people insist mutable structures are evil, but structures with exposed fields work fine; the problem is with struct member functions that alter the value of the underlying struct. – supercat Oct 07 '12 at 04:11
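A minimal sketch of the "soft reboot" pattern supercat describes in the thread above, with all names invented for illustration: the main loop plants a setjmp restart point, and a reconfiguration command longjmps back to it once the old data structures (and any pointers into them) have become meaningless.

```c
#include <setjmp.h>

/* Illustrative stand-ins for the poll routine and the reconfigurable
   buffer described in the comments above; this is not the real system. */
static jmp_buf restart_point;
static int config_version = 0;

static void rebuild_structures(void) {
    config_version++;            /* reallocate from the main buffer here */
}

/* Poll the controller; on a "reconfigure" command, abandon whatever the
   main loop was doing, since its pointers are about to become invalid. */
static void poll_controller(void) {
    if (config_version == 0)     /* pretend one reconfigure command arrives */
        longjmp(restart_point, 1);
}

int run_system(int iterations) {
    if (setjmp(restart_point))   /* nonzero only when reached via longjmp */
        rebuild_structures();
    for (int i = 0; i < iterations; i++)
        poll_controller();       /* may longjmp back to the restart point */
    return config_version;
}
```

Here `run_system(3)` takes the jump once, rebuilds, and then finishes under the new configuration; in C++, a thrown exception caught at the top of the loop would serve the same purpose, as discussed above.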
3

What does the FAQ mean by "such and such is evil"?

It means such and such is something you should avoid most of the time, but not something you should avoid all the time. For example, you will end up using these "evil" things whenever they are "the least evil of the evil alternatives." It's a joke, okay? Don't take it too seriously.

mmdemirbas
  • 507
  • 1
  • 5
  • 10
3

Redundant code is very very evil.

2

Code in your native language (not English), write documentation in your native language. And then outsource the project to an Indian company.

That's evil for you!

P.S.: For the record, it happened, and the Indians didn't find it very funny.

talonx
  • 2,762
  • 2
  • 23
  • 32
Demian Kasier
  • 794
  • 3
  • 13
  • 1
    Maybe they should have conducted business in such a way that they understood that before they took on the project... – Merlyn Morgan-Graham Dec 22 '10 at 01:01
  • How would they even expect that? Do you think it's good to have non-english code/documentation in a global sense? – k25 Jul 13 '11 at 16:34
2

Though any tool can be used for good and evil, some tools are evil because they often surprise programmers who don't use them frequently.

I consider the unsigned right shift operator (>>>) in Java evil (surprisingly improper) when working with integers shorter than 32 bits.

Say you have a byte b with value -1.

byte b = -1;  // binary: 1111 1111

The unsigned right shift operator shifts zeroes into the leftmost bits. So one would expect a shift by 7 to result in 1.

b >>>= 7;  // binary: 0000 0001 ?

But instead this operation does nothing at all. b is still -1.

Even all of the following 25 shifts do nothing:

byte b = -1;
for (int i = 0; i < 25; ++i) {
    b >>>= i;
    System.out.println(b); // always outputs -1
}

This happens because b>>>=7 roughly translates to

                                  1111 1111

1) the byte gets widened to a 32 bit int to make shifting possible
    1111 1111 1111 1111 1111 1111 1111 1111

2) the shift happens
    0000 0001 1111 1111 1111 1111 1111 1111

3) the resulting int gets narrowed to a byte again
                                  1111 1111

You would have to replace

b >>>= i;

by

b = (b & 0xFF) >>> (i % 8);     // >> would also work this time

to make it work as 'expected'.

Robert
  • 101
  • 3
  • 1
    I don't know Java, but why is this evil? Are you saying it should only be used on ints, and a byte isn't an int? I find >>> a bit weird since I know it as D's unsigned shift right, which should not result in a -1. –  Dec 21 '10 at 21:52
  • @acidzombie24 Absolutely correct. >>> is an unsigned shift and >> is a signed shift. But when shifting a byte it first gets extended to an int (filled up with the sign) and then shifted. So even the 'unsigned' shift extends it with its sign. In fact the result is a positive int but after storing it into a byte it's negative again. – Robert Dec 21 '10 at 22:14
  • Whoa, that is kind of evil. Or at least very likely to give you unintended results. I'm just thinking, can't it apply the shift to a byte? If not, converting it to an int and back is... not good. –  Dec 21 '10 at 22:32
  • @acidzombie24 There is no machine code for doing anything with bytes (except loading/storing). On the JVM level 32 bit is the smallest unit. So all the work has to be done either by the compiler or by the programmer. The compiler won't change. I've added the needed adjustment to my post. – Robert Dec 21 '10 at 22:56
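The same promotion trap exists in C, for comparison: a signed char is widened to int (sign-extended) before any shift, so the masking fix shown above is needed there too. A small sketch, assuming the usual two's-complement, arithmetic-shift behavior (right-shifting a negative value is technically implementation-defined in C, though virtually every mainstream compiler sign-fills):

```c
/* b is sign-extended to int before the shift, so the bits shifted in
   are all ones and narrowing back to a byte yields -1 again. */
int shift_naive(signed char b) {
    return (signed char)(b >> 7);
}

/* Mask to the byte's width first, as in the Java fix above. */
int shift_masked(signed char b) {
    return (b & 0xFF) >> 7;
}
```

`shift_naive(-1)` still yields -1, while `shift_masked(-1)` yields the expected 1.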
2

I'm gonna turn this around and say that while there's no absolute evil, there are tools and constructs which make it more likely that we feeble humans, with our limited skull size, will make mistakes.

So I'd say you could talk about the evilness of a construct based on how likely people are to make mistakes with it. Sure, you can cut bread with a knife or with a chainsaw held by its blade, but one is more likely to cause damage than the other, even if you may be able to pull it off with enough care.

Kenji Kina
  • 100
  • 6
1

I'm tempted to say continuations, but I think the correct answer is that there is no objective, absolute evil in programming. On the other hand, even the best tools can be abused.

Larry Coleman
  • 6,101
  • 2
  • 25
  • 34
  • Not many people understand continuations, much less have had a chance to use them extensively. What situations have you encountered that convinced you they were evil? The one instance where continuations really change the game is asynchronous programming, especially in UI programming, where you have to keep pumping events so your UI doesn't freeze up. Asynchronous programming causes all sorts of headaches because reuse works really well in synchronous-style development. Continuations can be used to restore the synchronous style but keep it from freezing the UI. That's pretty awesome. – chubbsondubs Dec 22 '10 at 02:41
  • 1
    I don't think continuations are evil, but I think plenty in the community might, given that any code written using them will automatically be "clever." – Larry Coleman Dec 22 '10 at 10:59
1

Programming, per se, I think, is not inherently evil. However, programming is very often a social activity, and disrespecting those around you can be very evil. People often forget that most code is going to be shared with others; mostly read, sometimes written too. Be it open source, a product that a company is releasing, or a small patch job a consultant is hired for, programs are going to be read.

That's half the reason why so many "considered harmful" articles exist, or why people say "never". Making life difficult for others is the very root of all evil. Isn't it?

pyNem
  • 399
  • 2
  • 2
1

Yes, there is plenty of evil to be had. For example:

Type1 variable1 = function12();
variable5 = variable1.myMethod(variable1+aGlobal);
variable2.otherMethod(anotherGlobal);
Mud
  • 438
  • 3
  • 8
1

Increasing the total cost of the system for insufficient benefit. It could be too much copying and pasting, too complex an architecture, or using pricey but ineffective commercial products. Generally speaking, all software techniques are aimed at reducing the total cost of a system, and if we end up with an overly expensive system then we have done wrong.

David Plumpton
  • 171
  • 3
  • 4
1

I think there are evil things in programming, but I don't use the term pejoratively.

Evil is when code pretends to behave in one way but in reality behaves in a very different fashion, in a way that hurts an unenlightened rational programmer. I often refer to this as a type of "Magic." Magic is anything whose functionality is "hidden" from the programmer, and it comes in different styles.

Example: in Scheme the functions "car" and "cdr" could be implemented using functions only; however, they are not. Instead they are implemented at a lower level, imperatively, because that runs faster on most computers. I'd call this "white magic." It's not evil, but it's definitely magic.

By comparison, the special number NaN in JavaScript is not equal to any other number... even itself. This is "black magic." I don't want to get into a discussion of why you have NaN in JavaScript (or why you have both Infinity and NaN), but you can see why such a simple concept would be useful in a language with only floating-point numbers. However, a constant number which cannot be tested for in the same way as other constant numbers is not something one would expect. Fortunately JavaScript provides isNaN to help solve this issue, but if you are unaware of NaN's unique property you might write the following code and get burned:

if(x == NaN) 

or if you're clever you might try the following with the same results

if(x === NaN)

I jokingly refer to this as getting "mana burned" (it is magic, after all...).

I realize there are good reasons why you would want things which are not numbers to compare unequal even to themselves, but you have to remember that for IEEE floating-point numbers NaN has a specific bit sequence, and in that respect it looks like any other number. If you treat JavaScript's NaN the way you might treat any other IEEE floating-point value, you are liable to get burned. This is both deceptive and frustrating, the former being the reason I refer to it as evil.
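The self-inequality isn't unique to JavaScript; it comes straight from IEEE 754, as a quick C illustration of the same trap shows (the function names here are mine):

```c
#include <math.h>

/* Comparing against NaN is always false, even when x itself is NaN,
   so this "test" can never succeed; it mirrors if (x == NaN) above. */
int naive_is_nan(double x) {
    return x == NAN;
}

/* NaN is the only value that compares unequal to itself... */
int self_unequal(double x) {
    return x != x;
}

/* ...and the reliable test is the classifier the language provides
   (isnan here, isNaN in JavaScript). */
int proper_is_nan(double x) {
    return isnan(x) != 0;
}
```

`naive_is_nan(NAN)` is 0, while `self_unequal(NAN)` and `proper_is_nan(NAN)` are both 1 (and `self_unequal` is 0 for every ordinary number).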

Then again, it's possible people think otherwise...

tzenes
  • 196
  • 4
  • Your point about NaN's brings up something I see as a limitation in many languages and frameworks: a lack of clearly-distinguished comparison operators, often combined with a misguided notion that different methods of comparison should be expected to yield the same results. For example, with floating-point numbers, I'd like to have both a comparison operator where NaN != NaN, but +0 == -0, and an equivalence operator where any value, including NaN, will be equalivalent to itself, but where +0 and -0 are distinct. – supercat Oct 06 '12 at 17:48
1

When the deadline is near, the requirements change, the design changes, and you spend 16 hours in the office, that is evil.

Manoj R
  • 4,076
  • 22
  • 30
0

Is anything in programming truly evil? I mean tools, languages, statements, whatever

Nothing on that list is evil. Here is why.

Humans were created as beings with free will. They may use their powers to go over to the dark side or embrace the light side. This choice defines what comes out of their hands.

Now, those tools and frameworks were created by benevolent people who were genuinely trying to make things somehow better, with good will. Consequently, their creations (successful or not is irrelevant) do not bear the imprint of evil. At the least they are neutral, but not malicious.

Then those tools come into the hands of other individuals. And whatever they do with them depends on those people. Even a debugger could be turned into an evil instrument in the hands of a hacker trying to remove the serial number check in some software.

But that does not in any way redefine the character of the tools. They are all good. Some are more useful, some less. Some more dangerous, some less. But still, they are all good and quite useful in certain scenarios.

And if a programmer by mistake misuses a tool and causes damage, the tool does not become evil. It's just a programmer's mistake: lack of knowledge, ignorance, whatever. But without any evil intent.

0

There's no objective absolute evil.

The problem with things like GOTO is that it's awfully hard to USE it in a way that's not nasty. Enough so that it's hard for me, sitting right here now, to think of an example of GOTO that isn't a code smell. But I'm willing to accept that there might be such uses, and that the problem isn't the tool itself but rather the likelihood of its abuse.

Dan Ray
  • 9,106
  • 3
  • 37
  • 49
  • GOTO is nasty *where there is an alternative* - but there isn't always an alternative (there should be in a modern, high-level language, but we don't always get to use modern, high-level languages) – Murph Dec 21 '10 at 19:08
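For the record, the use of goto that usually gets cited as legitimate is C's cleanup idiom, where a single exit label replaces nested ifs or duplicated cleanup calls. A hedged sketch (the function and file names are invented for the example):

```c
#include <stdio.h>

/* Copy one byte from src_path to dst_path, returning 0 on success and
   -1 on failure; every path funnels through one cleanup label instead
   of repeating the fclose calls in each error branch. */
int copy_first_byte(const char *src_path, const char *dst_path) {
    int result = -1;
    FILE *src = NULL, *dst = NULL;

    src = fopen(src_path, "rb");
    if (!src) goto cleanup;

    dst = fopen(dst_path, "wb");
    if (!dst) goto cleanup;

    int c = fgetc(src);
    if (c == EOF || fputc(c, dst) == EOF) goto cleanup;

    result = 0;                  /* success */

cleanup:                         /* runs on success and on every error */
    if (dst) fclose(dst);
    if (src) fclose(src);
    return result;
}
```

Without goto, each failure branch would either duplicate the cleanup or push the happy path three levels of nesting deep; this single-exit pattern is used pervasively in large C codebases such as the Linux kernel.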
0

Some programming instructions / styles / conventions are very often misused to make terrible (evil, if you will) code, but in the hands of a master programmer they can be used in a very elegant (good, if you will) fashion.

On the other hand, I know of nothing that can prevent incompetent or inexperienced programmers from producing bad (evil, if you will) code, even with the best tools, languages, and so on.

By avoiding often-misused elements, such as goto, the hope is that mediocre programmers can avoid the evil and perhaps find it somewhat easier to write good code.

Jim C
  • 883
  • 4
  • 12
0

Nothing is really evil, not even weapons, yet we sometimes consider them as such. With weapons, however, we usually have a certain level of respect: we are very much aware of their danger and use them with caution. Even so, there are still people too stupid to use them safely.

The same applies to any tool: the harder the consequences of abusing a tool are to see, the more likely people are to abuse it.

In programming everything is more or less virtual; a program is a representation of our thought process, and the long-term consequences of not understanding something entirely, or getting it slightly wrong, are a lot harder to determine than the immediate danger of death we face when handling a gun.

This makes the tools we have at our disposal a lot harder to use well, but it also gives us an easy way to measure the skill of a programmer. Knowing when and how to use the tools you have is crucial to becoming a good programmer. You can always play it safe by restricting your set of tools, but just like the guy to whom everything looks like a nail, eventually you will encounter something that no amount of hitting will fix.

DasIch
  • 141
  • 3
0

The only thing 'evil' is thinking that certain programming things are 'evil'. Various approaches - use this technique, don't use that statement, use this methodology, don't use that language, use this 'best practice', etc. - are opinions.

Unfortunately, too many developers mistake opinions for objective truth.

I cringe every time I see a question on here such as 'why is [x] considered evil?'.

GrandmasterB
  • 37,990
  • 7
  • 78
  • 131
0

Everything has its own valid use, or it wouldn't have been implemented. Things are called evil because people do not use them in a safe or proper manner. Consider this...

Is a sword evil? Nope. Just because someone may try to use a sword in a hospital does not mean it is evil. Is a can opener evil? Nope.

Everything has its proper use. Hence, no evil.

Yoshiyahu
  • 101
  • 2
0

Visible side-effects in unexpected places.
In .Net, examples would be ToString(), Equals() / GetHashCode(), property getters, and implicit conversions.
(To be clear, I'm talking about something more than lazy instantiation or logging code)


throw new Exception(ex.Message);

Explanation: rethrowing like this replaces the original exception, discarding its type and stack trace and keeping only the message.

SLaks
  • 1,204
  • 11
  • 16
0

I agree with the C++ FAQ on this:

[6.16] Will I sometimes use any so-called "evil" constructs?

Of course you will!

One size does not fit all. Stop. Right now, take out a fine-point marker and write on the inside of your glasses: Software Development Is Decision Making. "Think" is not a four-letter word. There are very few "never..." and "always..." rules in software — rules that you can apply without thinking — rules that always work in all situations in all markets — one-size-fits-all rules.

In plain English, you will have to make decisions, and the quality of your decisions will affect the business value of your software. Software development is not mostly about slavishly following rules; it is a matter of thinking and making tradeoffs and choosing. And sometimes you will have to choose between a bunch of bad options. When that happens, the best you can hope for is to choose the least bad of the alternatives, the lesser of the "evils."

You will occasionally use approaches and techniques labeled as "evil." If that makes you uncomfortable, mentally change the word "evil" to "frequently undesirable" (but don't quit your day job to become an author: milquetoast terms like that put people to sleep :-)

Jason Baker
  • 9,625
  • 8
  • 44
  • 67