88

As a long-time C# programmer, I have recently come to learn more about the advantages of Resource Acquisition Is Initialization (RAII). In particular, I have discovered that the C# idiom:

using (var dbConn = new DbConnection(connStr)) {
    // do stuff with dbConn
}

has the C++ equivalent:

{
    DbConnection dbConn(connStr);
    // do stuff with dbConn
}

meaning that remembering to enclose the use of resources like DbConnection in a using block is unnecessary in C++! This seems to be a major advantage of C++. This is even more convincing when you consider a class that has an instance member of type DbConnection, for example:

class Foo {
    DbConnection dbConn;

    // ...
};

In C# I would need to have Foo implement IDisposable as such:

class Foo : IDisposable {
    DbConnection dbConn;

    public void Dispose()
    {       
        dbConn.Dispose();
    }
}

and what's worse, every user of Foo would need to remember to enclose Foo in a using block, like:

   using (var foo = new Foo()) {
       // do stuff with "foo"
   }

Now looking at C# and its Java roots I am wondering... did the developers of Java fully appreciate what they were giving up when they abandoned the stack in favor of the heap, thus abandoning RAII?

(Similarly, did Stroustrup fully appreciate the significance of RAII?)

Jesse C. Slicer
JoelFan
  • 5
    I am not sure what you are talking about with not enclosing resources in C++. The DBConnection object probably handles closing all resources in its destructor. – maple_shaft Nov 07 '11 at 14:54
  • 18
    @maple_shaft, exactly my point! That is the advantage of C++ that I am addressing in this question. In C# you need to enclose resources in "using"... in C++ you do not. – JoelFan Nov 07 '11 at 14:55
  • 13
My understanding is that RAII, as a strategy, was only understood once C++ compilers were good enough to actually use advanced templating, which is well after Java. The C++ that was actually available for use when Java was created was a very primitive, "C with classes" style, with *maybe* basic templates, if you were lucky. – Sean McMillan Nov 07 '11 at 17:48
  • 4
It's even worse than you think in Java, as you need a nested try block inside finally to make it work really correctly: http://stackoverflow.com/questions/161177/does-c-support-finally-blocks-and-whats-this-raii-i-keep-hearing-about/161247#161247 – Martin York Nov 07 '11 at 17:49
  • 8
    "My understanding is that RAII, as a strategy, was only understood once C++ compilers were good enough to actually use advanced templating, which is well after Java." - That's not really correct. Constructors and destructiors have been core features of C++ since day one, well before widespread use of templates and well before Java. – Jim In Texas Nov 07 '11 at 18:15
  • C++/CLI (.NET) has garbage collection as well as RAII... I wonder if it is unique among languages in that way. – JoelFan Nov 07 '11 at 18:38
  • 8
@JimInTexas: I think Sean has a basic seed of truth in there somewhere (though not templates but exceptions is the crux). Constructors/destructors were there from the beginning, but their importance and the concept of RAII was not initially (what's the word I am looking for) realized. It took a few years and some time for the compilers to get good before we realized how crucial RAII is. – Martin York Nov 07 '11 at 20:15
  • @Jim - the term RAII was originally coined relating to a strategy for handling exceptions that occur during a constructor - the principle being that the constructor should be written to never leave semi-constructed objects needing partial destruction. This seems to me like the final ingredient in the RAII resource-management recipe, which is I guess why the term gained its current wider meaning. It's an important development, but neither C++-style RAII nor GC is universally a better solution. –  Nov 07 '11 at 21:08
  • I do not know the details of how Java was developed and why it does not have RAII. I am also not a Common Lisp expert. But it seems to me that RAII can be implemented very easily using an idiom like with-open-file (see cl-cookbook.sourceforge.net/files.html). This might explain why the need for RAII was not felt by the Java designers (some of which were also Lisp programmers). I would like to read comments on this topic from someone who has more information. – Giorgio May 10 '13 at 08:16
  • @Giorgio, the "with-open-file" macro in Lisp is similar to the "using" construct in C#. See the original question for why "using" is not a complete replacement for RAII – JoelFan May 10 '13 at 11:30
  • 1
    @JoelFan: This isn't so much "an advantage of C++" but rather a feature which has up and downsides. You don't _need_ to use a `using` block in C#, it's just that because C# has automatic garbage collection, you don't quite know exactly when the garbage is collected (and thus disposed), which means you can never be sure if your resource is already deallocated. There are plenty of reasons why automatic garbage collection dramatically simplifies coding and reduces the chance of memory leaks and subsequent bugs, which are all the negatives of C++ that you are glossing over in your assessment. – Flater Jan 11 '19 at 07:07

11 Answers

62

Yes, the designers of C# (and, I'm sure, Java) specifically decided against deterministic finalization. I asked Anders Hejlsberg about this multiple times circa 1999-2002.

First, the idea of different semantics for an object based on whether it's stack- or heap-based is certainly counter to the unifying design goal of both languages, which was to relieve programmers of exactly such issues.

Second, even if you acknowledge that there are advantages, there are significant implementation complexities and inefficiencies involved in the book-keeping. You can't really put stack-like objects on the stack in a managed language. You are left with saying "stack-like semantics," and committing to significant work (value types are already hard enough, think about an object that is an instance of a complex class, with references coming in and going back into managed memory).

Because of that, you don't want deterministic finalization on every object in a programming system where "(almost) everything is an object." So you do have to introduce some kind of programmer-controlled syntax to separate a normally-tracked object from one that has deterministic finalization.

In C#, you have the using keyword, which came in fairly late in the design of what became C# 1.0. The whole IDisposable thing is pretty wretched, and one wonders whether it would have been more elegant to have using work with the C++ destructor syntax (~), marking those classes to which the boilerplate IDisposable pattern could be automatically applied.

Stephen C
Larry OBrien
  • 2
What about what C++/CLI (.NET) has done, where the objects on the managed heap also have a stack-based "handle", which provides RAII? – JoelFan Nov 07 '11 at 18:48
  • 3
    C++/CLI has a very different set of design decisions and constraints. Some of those decisions mean that you can demand more thought about memory allocation and performance implications from the programmers: the whole "give em enough rope to hang themselves" trade-off. And I imagine that the C++/CLI compiler is considerably more complex than that of C# (especially in its early generations). – Larry OBrien Nov 07 '11 at 20:48
  • 1
    No... just put the relevant code in the destructor – JoelFan Nov 07 '11 at 20:52
  • (And, as far as it goes, I have no idea how they can GC "through" reference types whose lifetime is determined by the stack.) – Larry OBrien Nov 07 '11 at 20:55
  • 5
    +1 this is the only correct answer so far - it's because Java intentionally doesn't have (non-primitive) stack-based objects. – BlueRaja - Danny Pflughoeft Nov 07 '11 at 20:58
  • 1
    @Larry Obrien... once the stack variable goes out of scope the destructor is called. – JoelFan Nov 07 '11 at 21:13
  • C# does have a destructor syntax, it just doesn't have guarantees on when it will be called. It's a common pattern to make classes which implement `IDisposable` also implement a destructor as a fallback in case the user forgets the `using` clause. – Peter Taylor Nov 07 '11 at 22:33
  • 8
    @Peter Taylor -- right. But I feel that C#'s non-deterministic destructor is worth very little, since you cannot rely on it to manage any kind of constrained resource. So, in my opinion, it might have been better to use the `~` syntax to be syntactic sugar for `IDisposable.Dispose()` – Larry OBrien Nov 07 '11 at 22:59
  • 3
    @Larry: I agree. C++/CLI *does* use `~` as syntactic sugar for `IDisposable.Dispose()`, and it's much more convenient than the C# syntax. – dan04 Nov 08 '11 at 01:59
  • 2
@dan04: From what I understand, it's more than syntactic sugar, since C++/CLI defines an `IDisposable.Dispose` method to include not only the user-specified dispose code, but also disposal calls for `IDisposable` fields that the class owns. – supercat Jul 12 '12 at 02:36
  • 1
    +1 Here's an MSDN link from 2003 [explaining why a deliberate decision was made](http://msdn.microsoft.com/en-us/library/0t81zye4(v=vs.71).aspx) to move away from the COM reference-counting model, so that the garbage collector could cope with circular references and also for performance reasons. "If a group of objects contain references to each other, but none of these object are referenced directly or indirectly from stack or shared variables, then garbage collection will automatically reclaim the memory." – MarkJ Oct 30 '13 at 21:02
43

Keep in mind that Java was developed in 1991-1995 when C++ was a much different language. Exceptions (which made RAII necessary) and templates (which made it easier to implement smart pointers) were "new-fangled" features. Most C++ programmers had come from C and were used to doing manual memory management.

So I doubt that Java's developers deliberately decided to abandon RAII. It was, however, a deliberate decision for Java to prefer reference semantics instead of value semantics. Deterministic destruction is difficult to implement in a reference-semantics language.

So why use reference semantics instead of value semantics?

Because it makes the language a lot simpler.

  • There is no need for a syntactic distinction between Foo and Foo* or between foo.bar and foo->bar.
  • There is no need for overloaded assignment, when all assignment does is copy a pointer.
  • There is no need for copy constructors. (There is occasionally a need for an explicit copy function like clone(). Many objects just don't need to be copied. For example, immutables don't.)
  • There is no need to declare private copy constructors and operator= to make a class noncopyable. If you don't want objects of a class copied, you just don't write a function to copy it.
  • There is no need for swap functions. (Unless you're writing a sort routine.)
  • There is no need for C++0x-style rvalue references.
  • There is no need for (N)RVO.
  • There is no slicing problem.
  • It's easier for the compiler to determine object layouts, because references have a fixed size.

The main downside to reference semantics is that when every object potentially has multiple references to it, it becomes hard to know when to delete it. You pretty much have to have automatic memory management.

Java chose to use a non-deterministic garbage collector.

Can't GC be deterministic?

Yes, it can. For example, the C implementation of Python uses reference counting, and it later added a tracing GC to handle the cyclic garbage that refcounting misses.

But refcounting is horribly inefficient. Lots of CPU cycles spent updating the counts. Even worse in a multi-threaded environment (like the kind Java was designed for) where those updates need to be synchronized. Much better to use the null garbage collector until you need to switch to another one.

You could say that Java chose to optimize the common case (memory) at the expense of non-fungible resources like files and sockets. Today, in light of the adoption of RAII in C++, this may seem like the wrong choice. But remember that much of the target audience for Java was C (or "C with classes") programmers who were used to explicitly closing these things.

But what about C++/CLI "stack objects"?

They're just syntactic sugar for Dispose (original link), much like C# using. However, it doesn't solve the general problem of deterministic destruction, because you can create an anonymous gcnew FileStream("filename.ext") and C++/CLI won't auto-Dispose it.

Glorfindel
dan04
  • 4
    Also, nice links *(especially the first one, which is **highly** relevant to this discussion)*. – BlueRaja - Danny Pflughoeft Nov 08 '11 at 23:11
  • The `using` statement handles many cleanup-related problems nicely, but many others remain. I would suggest that the right approach for a language and framework would be to declaratively distinguish between storage locations which "own" a referenced `IDisposable` from those which do not; overwriting or abandoning a storage location which owns a referenced `IDisposable` should dispose the target in the absence of a directive to the contrary. – supercat Jul 13 '12 at 01:29
  • 1
    "No need for copy constructors" sounds nice, but fails badly in practice. java.util.Date and Calendar are perhaps the most notorious examples. Nothing lovelier than `new Date(oldDate.getTime())`. – kevin cline May 15 '13 at 03:35
  • 2
    iow RAII was not "abandoned", it simply didn't exist to be abandoned :) As to copy constructors, I've never liked them, too easy to get wrong, they're a constant source of headaches when somewhere deep down someone (else) forgot to make a deep copy, causing resources to be shared between copies that should not be. – jwenting May 16 '13 at 08:41
  • @jwenting copy constructors are fine if you default to subobjects or `unique_ptr` instead of `shared_ptr` – Caleth Apr 18 '21 at 11:38
  • @Caleth as stated, they're finnicky and error prone, usually more a pain than a cure – jwenting May 11 '21 at 10:59
  • @jwenting I have no trouble with the implicit copy-constructors for all my types, even the implicitly deleted ones – Caleth May 11 '21 at 11:07
39

Now looking at C# and its Java roots I am wondering... did the developers of Java fully appreciate what they were giving up when they abandoned the stack in favor of the heap, thus abandoning RAII?

(Similarly, did Stroustrup fully appreciate the significance of RAII?)

I am pretty sure Gosling did not get the significance of RAII at the time he designed Java. In his interviews he often talked about reasons for leaving out generics and operator overloading, but never mentioned deterministic destructors and RAII.

Funny enough, even Stroustrup wasn't aware of the importance of deterministic destructors at the time he designed them. I can't find the quote, but if you are really into it, you can find it among his interviews here: http://www.stroustrup.com/interviews.html

Nemanja Trifunovic
  • 2
Interesting link... I wonder what the challenges would be to fundamentally rearchitect the JVM to assure that `finalize` becomes deterministic? Is it possible? Would Java applications be backwards compatible? What are the drawbacks? Maybe this would be a good question to post separately? – maple_shaft Nov 07 '11 at 15:52
  • 11
@maple_shaft: In short, it's not possible. Except if you invented a way to have deterministic garbage collection (which seems impossible in general, and invalidates all the GC optimizations of the last decades in any case), you'd have to introduce stack-allocated objects, but that opens several cans of worms: these objects need semantics, "slicing problem" with subtyping (and hence NO polymorphism), dangling pointers unless perhaps you place significant restrictions on it or make massive incompatible type system changes. And that's just off the top of my head. –  Nov 07 '11 at 16:49
  • 7
    @delnan: the "slicing problem" only occurs if you're being an idiot. It's not a real problem. Just like memory leaks are not a real problem. You can still have polymorphism with stack-based objects. Dangling pointers are also not a real problem- everyone who has a brain cell uses smart pointers for heap objects, so the type system can tell you what you can and can't do with a pointer. – DeadMG Nov 07 '11 at 17:04
  • 13
@DeadMG: So you suggest we go back to manual memory management. That's a valid approach to programming in general, and of course it allows deterministic destruction. But that doesn't answer this question, which concerns itself with a GC-only setting that wants to provide memory safety and well-defined behaviour even if we all act like idiots. That requires GC for everything and no way to kick off object destruction manually (and *all* Java code in existence relies at least on the former), so either you make GC deterministic or you're out of luck. –  Nov 07 '11 at 17:10
  • 26
    @delan. I would not call C++ smart pointers `manual` memory management. They are more like a deterministic fine grain controllable garbage collector. If used correctly smart pointers are the bees knees. – Martin York Nov 07 '11 at 17:54
  • 10
@LokiAstari: Well, I'd say they're slightly less automatic than full GC (you have to think about which kind of smartness you actually want) and implementing them as a library requires raw pointers (and hence manual memory management) to build on. Also, I'm not aware of any smart pointer that handles cyclic references automatically, which is a strict requirement for garbage collection in my books. Smart pointers are certainly incredibly cool and useful, but you have to face they can't provide some guarantees (whether you consider them useful or not) of a fully and exclusively GC'd language. –  Nov 07 '11 at 18:04
  • 11
@delan: I have to disagree there. I think they are more automatic than GC as they are deterministic. OK. To be efficient you need to make sure you use the correct one (I will give you that). std::weak_ptr handles cycles perfectly well. Cycles are always trotted out but in reality they are hardly ever a problem because the base object is usually stack based and when this goes it will tidy the rest up. For the rare cases where it can be a problem, there's std::weak_ptr. – Martin York Nov 07 '11 at 18:14
  • 7
@delan: I have to disagree with your last point. It is actually GC languages that do not provide any guarantees. They provide best effort at retrieving the memory. I find GC is very good for simple example applications but once you migrate to big projects you need to understand the GC in much more detail to make sure it works as efficiently as possible, which sort of defeats the purpose of having the GC in the first place. Also, when a new version of the language appears with a new GC you need to relearn how the GC works. – Martin York Nov 07 '11 at 18:19
  • 5
@LokiAstari: As for your first point, I never doubted that smart pointers are a very nifty way to have deterministic, hassle-free memory management. But fact is, you have to explicitly use them (over raw pointers), and you have to explicitly choose the "right" one (e.g. place a weak_ptr correctly to break a cycle, not use a scoped_ptr if you need copying/refcounting, etc.). That's not 100% automatic to me. Yes, it's deterministic and pretty easy if you know what you're doing, but automatic has a specific meaning (and that meaning isn't "deterministic"). –  Nov 07 '11 at 18:32
  • let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/1737/discussion-between-delnan-and-loki-astari) –  Nov 07 '11 at 18:32
  • 1
@delnan: Since you should never use raw pointers it becomes automatic. The non-automatic part is deciding when it is a good idea to use a raw pointer. – Martin York Nov 07 '11 at 19:05
  • 1
    @delnan: Interesting point. Maybe a good idea would be to have a language that forbids raw pointers and only uses smart pointers. – Giorgio Nov 07 '11 at 21:49
  • 1
    @delnan: Those requirements are unnecessary. If the programmer is an idiot, de-referencing NULL is the least of your worries, and smart pointers are *more* automatic than GC. As mentioned often in this question, it's answers, and comments, good luck getting the GC to collect, say, file handles, database connections, etc. RAII does do those things. – DeadMG Nov 07 '11 at 23:52
  • GC has a problem with reference cycles which means it can't provide reliable destructors. Of course in C++, smart pointers have a reference-cycle issue, but the result is a memory leak rather than collection without destruction. This is usually a non-issue because these smart-pointer reference cycles just don't happen unless you're doing something atypical or stupid - but then non-memory resource management is usually easy enough in GC languages (without reliable destruction) except in atypical/stupid cases too. –  Nov 08 '11 at 00:44
  • 2
Python actually switched - a very long time ago (probably around version 1.5), Python's "garbage collection" was simple reference-counting, a bit like C++ smart pointers but done implicitly. That could leak memory. A change of policy happened at some point - Python still reference-counts, but now it can collect garbage in reference-cycles - the down-side being that destructors aren't guaranteed to occur before collection. Probably very few people noticed the difference. –  Nov 08 '11 at 00:54
  • 2
    @DeadMG: I know that GC can't provide deterministic cleanup; I always said so. The "which is more automatic" question seems to revolve solely around the definition of "automatic". Apart from that, you seem to repeat stuff that I agree and ignore my points that this is correct but irrelevant. BEFORE REPLYING, please think hard about what you're trying to say. If you want to say GC doesn't work with RAII, you're just repeating a fact that's universally agreed-upon. If you want to say RAII is better for resource management than whatever GC'd languages offer, then you're off topic. –  Nov 08 '11 at 14:01
@Steve314: Yes, GC can provide neither deterministic nor reliable (in that some finalizers may *never* be called - good point to raise, wasn't mentioned yet) destruction. That's fact, and nobody (except some poor saps who misunderstood GC in a certain way) claims otherwise. And smart pointers, used correctly, provide deterministic and reliable destruction. Both facts are widely acknowledged. But that wasn't the question. And GC doesn't have "a problem with cycles", its [purpose](http://blogs.msdn.com/b/oldnewthing/archive/2010/08/09/10047586.aspx) is entirely different. –  Nov 08 '11 at 14:06
Marking this one as the answer just because it seems more plausible than Larry's, even though I am pretty torn between the 2. I guess they are both correct in a way. What I take away from them both is (1) Gosling et al. most probably didn't have full knowledge of the significance of RAII (e.g. in preventing such future awkward syntax like "using"), however (2) they did consciously forgo "deterministic finalization" as it was known then, even though most people at the time thought it mainly involved memory management, and... – JoelFan Dec 27 '11 at 15:54
  • 3
    ... (3) even if they would have had today's perspective of RAII, they probably wouldn't have changed their design decision in the end, due to the trade-offs involved, e.g. compiler complexity, additional programmer responsibility, and compromise in the simplicity of the language model – JoelFan Dec 27 '11 at 15:55
  • @Steve314: why do you say "GC has a problem with reference cycles"? No it doesn't. When the last root object (that references any object in the cycle) dies, all objects in the cycle are collected. Indeed, one of the significant advantages of GC languages is that you can't cause a memory leak by having a reference cycle, which will happen in C++ (unless you see there will be a cycle, and use a weak-ptr to break it). I am skeptical of the claim by others that this is minor issue in practice. Data structure is distributed between OS, third party libraries, and end programmers. – ToolmakerSteve Nov 21 '13 at 03:14
  • 2
@ToolmakerSteve - the complete quote is "GC has a problem with reference cycles **which means it can't provide reliable destructors**." In a reference cycle, there's no object that can be safely destructed first - all destructors for objects in the cycle may try to call something in the objects they reference. That's why GC is *only* suitable for cleanup of memory - not other resources (as in C++-style RAII). –  Nov 21 '13 at 12:09
  • 1
    @ToolmakerSteve - In practice, the problem isn't very severe because objects such as file handles can't be in reference cycles. The possibility of reference cycles only tends to happen with data structure nodes, where the only relevant resource is memory. Even so, "tends to" isn't much of a language guarantee, and static proof that a type neither needs reliable destruction nor potentially exists in a reference cycle doesn't really work - nodes that are never cyclic in practice nevertheless have links to nodes of the same type. Enforcing non-cyclic linking at run-time would be costly. –  Nov 21 '13 at 12:16
@ToolmakerSteve - so basically, Java file objects can guarantee that their underlying platform-specific handles get released because the Java designers knew what guarantees they could require from the GC, and you don't need potentially-cyclic linking to implement a file object. But you still don't get a general guarantee of reliable destruction in Java or other GC languages. If you get a reference cycle, the memory may still be freed, but the destructors/finalizers will not be called, so cleanups for owned resources don't get done. –  Nov 21 '13 at 12:20
  • @ToolmakerSteve - BTW - since nulling a pointer when the pointed to object is no longer needed is about as difficult as freeing that pointer, GC languages still have memory leaks anyway. Neither approach is inherently superior. The C++ approach is particularly appropriate to low-level code where you tend to have more resource-management issues, the Java approach is particularly suited to high-level code, and a good programmer should know how to work with both, including the downsides and how to manage them. –  Nov 21 '13 at 12:23
  • @ToolmakerSteve - as it happens, delnan understood my point and objected *correctly* by saying "And GC doesn't have "a problem with cycles", its purpose is entirely different." - though it's within the subjective meaning of "a problem" that I think me and delnan were both correct. Because of reference cycles, GC and reliable destruction are incompatible. But whether that's a problem or not depends on your viewpoint and goals. –  Nov 21 '13 at 12:38
  • @Steve314 - thanks for the very informative responses. Now I know what situation to test for. When I get a chance, will test my C# code to see if reference cycle can cause it a problem with (native, not managed) resource freeing. – ToolmakerSteve Nov 27 '13 at 20:13
  • 1
@delnan: C++ style memory management has two major problems: (1) While it works well when objects naturally have clearly-defined ownerships, immutable instances of objects often have no such ownership; (2) There's no way of knowing whether a pointer's target has been deleted; use of a pointer after the target is deleted is Undefined Behavior. GC completely solves both of those problems, but C++ style management is better for things which naturally require clearly-defined ownership for proper usage. – supercat Feb 15 '14 at 23:35
19

Java 7 introduced something similar to the C# using: The try-with-resources Statement

a try statement that declares one or more resources. A resource is an object that must be closed after the program is finished with it. The try-with-resources statement ensures that each resource is closed at the end of the statement. Any object that implements java.lang.AutoCloseable, which includes all objects which implement java.io.Closeable, can be used as a resource...

So I guess they either didn't consciously choose not to implement RAII or they changed their mind meanwhile.
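For illustration, here is a minimal sketch of the construct (the file name "data.bin" is just a placeholder); FileInputStream implements java.io.Closeable, so it qualifies as a resource and is closed on every exit path:

import java.io.FileInputStream;
import java.io.IOException;

class TryWithResourcesDemo {
    public static void main(String[] args) throws IOException {
        // The stream is declared as a resource of the try statement.
        try (FileInputStream in = new FileInputStream("data.bin")) {
            System.out.println("first byte: " + in.read());
        } // in.close() runs here automatically, even if read() threw
    }
}

It is still opt-in at every call site, though, which is the gap relative to RAII that the comments below point out.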

gnat
Patrick
Interesting, but it looks like this only works with objects that implement `java.lang.AutoCloseable`. Probably not a big deal but I don't like how this feels somewhat constrained. Maybe I have some other object that should be released automatically, but it's very semantically weird to make it implement `AutoCloseable`... – FrustratedWithFormsDesigner Nov 07 '11 at 19:03
  • 9
    @Patrick: Er, so? `using` is not the same as RAII - in one case the caller worries about disposing resources, in the other case the callee handles it. – BlueRaja - Danny Pflughoeft Nov 07 '11 at 20:52
  • 1
    +1 I didn't know about try-with-resources; it should be useful in dumping more boilerplate. – jprete Nov 07 '11 at 22:06
  • 3
    -1 for `using`/try-with-resources not being the same as RAII. – Sean McMillan Nov 07 '11 at 22:35
  • 4
    @Sean: Agreed. `using` and it's ilk are nowhere near RAII. – DeadMG Nov 08 '11 at 00:29
  • When you declare the variable in the try, it gives you the essential feature of RAII, which is not worrying about releasing the resource in all code paths. It's not RAII, but it gives me what RAII gives me, which is what matters. – Kartick Vaddadi Apr 27 '16 at 15:18
  • 1
@kartick if you declare the variable in the try you have already worried about it. That's the whole point – RiaD Apr 18 '21 at 13:11
  • I see your point, but even with RAII, you have to worry about it by deciding whether the DbConnection object (in the given example) should live on the stack or heap. So, RAII or not, you have to worry about it. Am I missing something? – Kartick Vaddadi Apr 26 '21 at 04:21
19

Java intentionally does not have stack-based objects (aka value-objects). These are necessary to have the object automatically destructed at the end of the method like that.

Because of this and the fact that Java is garbage-collected, deterministic finalization is more-or-less impossible (ex. What if my "local" object became referenced somewhere else? Then when the method ends, we don't want it destructed).

However, this is fine with most of us, because there's almost never a need for deterministic finalization, except when interacting with native (C++) resources!


Why does Java not have stack-based objects?

(Other than primitives..)

Because stack-based objects have different semantics than heap-based references. Imagine the following code in C++; what does it do?

return myObject;
  • If myObject is a local stack-based object, the copy-constructor is called (if the result is assigned to something).
  • If myObject is a local stack-based object and we're returning a reference, the result is undefined.
  • If myObject is a member/global object, the copy-constructor is called (if the result is assigned to something).
  • If myObject is a member/global object and we're returning a reference, the reference is returned.
  • If myObject is a pointer to a local stack-based object, the result is undefined.
  • If myObject is a pointer to a member/global object, that pointer is returned.
  • If myObject is a pointer to a heap-based object, that pointer is returned.

Now what does the same code do in Java?

return myObject;
  • The reference to myObject is returned. It doesn't matter if the variable is local, member, or global; and there are no stack-based objects or pointer cases to worry about.

The above shows why stack-based objects are a very common cause of programming errors in C++. Because of that, the Java designers took them out; and without them, there is no point in using RAII in Java.
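As a small illustration of that single Java case (Holder is a hypothetical class made up for this sketch): returning a field hands the caller an alias to the same object, never a copy.

class Holder {
    private final StringBuilder data = new StringBuilder("initial");

    StringBuilder getData() {
        return data; // the caller gets a reference to the same object, not a copy
    }

    public static void main(String[] args) {
        Holder h = new Holder();
        h.getData().append(", mutated by the caller"); // changes the object inside h
        System.out.println(h.getData()); // prints: initial, mutated by the caller
    }
}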

  • 7
    I don't know what you mean by "there is no point in RAII"... I think you mean "there is no ability to provide RAII in Java"... RAII is independent of any language... it does not become "pointless" because 1 particular language does not provide it – JoelFan Nov 07 '11 at 23:36
  • @Joel: I've edited to make it (even) more clear. – BlueRaja - Danny Pflughoeft Nov 07 '11 at 23:46
  • 4
    That's not a valid reason. An object does not have to actually live on the stack to use stack-based RAII. If there is such a thing as a "unique reference", the destructor can be fired once it goes out of scope. See for instance, how it works with D programming language: http://d-programming-language.org/exception-safe.html – Nemanja Trifunovic Nov 08 '11 at 01:44
  • 3
@Nemanja: An object doesn't *have* to live on the stack to have stack-based semantics, and I never said it did. But that's not the problem; the problem, as I mentioned, is the stack-based semantics themselves. – BlueRaja - Danny Pflughoeft Nov 08 '11 at 17:46
  • 1
    @NemanjaTrifunovic: The D version requires you to declare the initialization as `scope`, which looks suspiciously like `using` in C#. The only difference is, in C# you actually create a new scope, whereas in D it uses whatever scope you declared it in. Whether that's better or worse is, quite frankly, a matter of personal preference. – Aaronaught Nov 08 '11 at 20:52
  • 1
    @Aaronaught: The D version comes closer to the C++ version of RAII because it doesn't introduce the scope. One more step would be having the `scope` as default for local variables and using special syntax for ones that don't invoke the destructors at the end of the scope: that's pretty much what C++/CLI does. – Nemanja Trifunovic Nov 08 '11 at 21:00
  • @NemanjaTrifunovic: Now you're arguing defaults, and missing the point in the process; since managed languages *do* have *non-deterministic* "destructors" through finalizers, it is almost always safer to let the GC make the decision, rather than introduce a new class of bugs by forcing programmers to use an explicit syntax when they *don't* want auto-destruction. In fact, most of the time, forgetting to `Dispose` or `Close` is completely harmless because the object is still in gen 0 and the GC will run the finalizer as soon as it goes out of scope. – Aaronaught Nov 08 '11 at 21:24
  • 4
@Aaronaught: The devil is in "almost always" and "most of the time". If you don't close your db connection and leave it to the GC to trigger the finalizer, it will work just fine with your unit-tests and break severely when deployed in production. Deterministic cleanup is important regardless of the language. – Nemanja Trifunovic Nov 09 '11 at 13:52
  • 8
    @NemanjaTrifunovic: Why are you unit testing on a live database connection? That's not really a unit test. No, sorry, I'm not buying it. You shouldn't be creating DB connections all over the place anyway, you should be passing them in through constructors or properties, and that means you *don't* want stack-like auto-destruct semantics. Very few objects that depend on a database connection should actually own it. If non-deterministic cleanup is biting you that often, that hard, then it's because of bad application design, not bad language design. – Aaronaught Nov 09 '11 at 13:56
  • 2
    The creators of Java deliberately left out any distinction between a field that identifies an object to which no other references exist, one that identifies an object which it owns, but to which other non-owning references exist, one that identifies an object owned by someone else, and one that identifies an object that has no owner. Omitting such things simplified the compiler, but did nothing to eliminate programmers' need to make such distinctions if they want to write correct and efficient code. – supercat Sep 12 '14 at 19:54
  • RAII is great anywhere you want an action to happen at the end of a scope no matter how that scope is left. That isn't just "reclaim memory", but things like "re-enable the button this is the onclick for" – Caleth Apr 18 '21 at 11:43
  • @Caleth that's what ARM blocks in Java are for – BlueRaja - Danny Pflughoeft Apr 18 '21 at 21:32
17

Your description of the holes in using is incomplete. Consider the following problem:

interface Bar {
    ...
}
class Foo : Bar, IDisposable {
    ...
}

Bar b = new Foo();

// Where's the Dispose?

In my opinion, not having both RAII and GC was a bad idea. When it comes to closing files in Java, it's malloc() and free() over there.
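A rough Java parallel, for comparison (Bar and Foo here are hypothetical, mirroring the C# above): once the code only sees the Bar interface, nothing says the object must be closed, and try-with-resources will not even accept a resource whose static type is not AutoCloseable.

interface Bar {
    void doWork();
}

class Foo implements Bar, AutoCloseable {
    public void doWork() { /* ... */ }
    public void close() { /* release the underlying resource */ }
}

class Demo {
    public static void main(String[] args) {
        Bar b = new Foo();
        b.doWork();
        // Where's the close()? And `try (Bar r = new Foo()) { ... }` does not compile,
        // because Bar is not a subtype of AutoCloseable.
    }
}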

DeadMG
  • 2
I agree that RAII is the bees knees. But the `using` clause is a great step forward for C# over Java. It does allow deterministic destruction and thus correct resource management (it's not quite as good as RAII as you need to remember to do it, but it's definitely a good idea). – Martin York Nov 07 '11 at 17:58
  • 8
    “When it comes to closing files in Java, it's malloc() and free() over there.” – Absolutely. – Konrad Rudolph Nov 07 '11 at 17:59
  • 9
    @KonradRudolph: It is worse than malloc and free. At least in C you don't have exceptions. – Nemanja Trifunovic Nov 07 '11 at 18:52
  • 1
    @Nemanja: Let's be fair, you can `free()` in the `finally`. – DeadMG Nov 07 '11 at 19:10
  • 4
    @Loki: The base class problem is much more important as a problem. For example, the original `IEnumerable` didn't inherit from `IDisposable`, and there were a bunch of special iterators which could never be implemented as a result. – DeadMG Nov 07 '11 at 19:10
  • 1
    In this case, you could still write `using (b as IDisposable)` because you can put `null` in a `using` clause. Seems a bit anal-retentive, though; if you're so worried about the "base class problem", use an IoC container. No more worries. – Aaronaught Nov 08 '11 at 20:46
14

I'm pretty old. I've been there and seen it and banged my head about it many times.

I was at a conference in Hursley Park where the IBM boys were telling us how wonderful this brand new Java language was, only someone asked ... why isn't there a destructor for these objects. He didn't mean the thing we know as a destructor in C++, but there was no finaliser either (or it had finalisers but they basically didn't work). This is way back, and we decided Java was a bit of a toy language at that point.

Later they added finalisers to the language spec and Java saw some adoption.

Of course, later everyone was told not to put finalisers on their objects because it slowed the GC down tremendously: it had to not only lock the heap but move the to-be-finalised objects to a temp area, since those methods could not be called while the GC had paused the app; instead they would be called immediately before the next GC cycle. And worse, sometimes the finaliser would never get called at all when the app was shutting down. Imagine never having your file handle closed.

Then we had C#, and I remember the discussion forum on MSDN where we were told how wonderful this new C# language was. Someone asked why there was no deterministic finalisation and the MS boys told us how we didn't need such things, then told us we needed to change our way of designing apps, then told us how amazing GC was and how all our old apps were rubbish and never worked because of all the circular references. Then they caved in to pressure and told us they'd added this IDisposable pattern to the spec that we could use. I thought it was pretty much back to manual memory management for us in C# apps at that point.

Of course, the MS boys later discovered that all they'd told us was... well, they made IDisposable a bit more than just a standard interface, and later added the using statement. W00t! They realised that deterministic finalisation was something missing from the language after all. Of course, you still have to remember to put it in everywhere, so it's still a bit manual, but it's better.

So why did they do it when they could have had using-style semantics automatically placed on each scope block from the start? Probably efficiency, but I like to think that they just didn't realise. Just like they eventually realised you still need smart pointers in .NET (google SafeHandle), they thought that the GC really would solve all problems. They forgot that an object is more than just memory and that GC is primarily designed to handle memory management. They got caught up in the idea that the GC would handle this, and forgot that you put other stuff in there; an object isn't just a blob of memory that doesn't matter if you don't delete it for a while.

But I also think that the lack of a finalise method in the original Java had a bit more to it - that the objects you created were all about memory, and if you wanted to delete something else (like a DB handle or a socket or whatever) then you were expected to do it manually.

Remember Java was designed for embedded environments where people were used to writing C code with lots of manual allocations, so not having automatic free wasn't much of a problem - they never did it before, so why would you need it in Java? The issue wasn't anything to do with threads, or stack/heap, it was probably just there to make memory allocation (and therefore de-alloc) a bit easier. In all, the try/finally statement is probably a better place to handle non-memory resources.
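For reference, the manual discipline that sentence refers to looks roughly like this (the file name "data.bin" is just a placeholder); the cleanup runs on every exit path, but only because the caller remembered to write it:

import java.io.FileInputStream;
import java.io.IOException;

class TryFinallyDemo {
    public static void main(String[] args) throws IOException {
        FileInputStream in = new FileInputStream("data.bin");
        try {
            System.out.println("first byte: " + in.read());
        } finally {
            in.close(); // reached whether the try body completes or throws
        }
    }
}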

So IMHO, the way .NET simply copied Java's biggest flaw is its biggest weakness. .NET should have been a better C++, not a better Java.

gbjbaanb
  • IMHO, things like 'using' blocks are the right approach for deterministic cleanup, but a few more things are needed as well: (1) a means of ensuring that objects get disposed if their destructors throw an exception; (2) a means of auto-generating a routine method to call `Dispose` on all fields marked with a `using` directive, and specifying whether `IDisposable.Dispose` should automatically call it; (3) a directive similar to `using`, but which would only call `Dispose` in case of an exception; (4) a variation of `IDisposable` which would take an `Exception` parameter, and... – supercat Jul 13 '12 at 01:34
  • ...which would be used automatically by `using` if appropriate; the parameter would be `null` if the `using` block exited normally, or else would indicate what exception was pending if it exited via exception. If such things existed, it would be much easier to manage resources effectively and avoid leaks. – supercat Jul 13 '12 at 01:36
11

Bruce Eckel, author of "Thinking in Java" and "Thinking in C++" and a member of the C++ Standards Committee, is of the opinion that, in many areas (not just RAII), Gosling and the Java team didn't do their homework.

...To understand how the language can be both unpleasant and complicated, and well designed at the same time, you must keep in mind the primary design decision upon which everything in C++ hung: compatibility with C. Stroustrup decided -- and correctly so, it would appear -- that the way to get the masses of C programmers to move to objects was to make the move transparent: to allow them to compile their C code unchanged under C++. This was a huge constraint, and has always been C++'s greatest strength ... and its bane. It's what made C++ as successful as it was, and as complex as it is.

It also fooled the Java designers who didn't understand C++ well enough. For example, they thought operator overloading was too hard for programmers to use properly. Which is basically true in C++, because C++ has both stack allocation and heap allocation and you must overload your operators to handle all situations and not cause memory leaks. Difficult indeed. Java, however, has a single storage allocation mechanism and a garbage collector, which makes operator overloading trivial -- as was shown in C# (but had already been shown in Python, which predated Java). But for many years, the party line from the Java team was "Operator overloading is too complicated." This and many other decisions where someone clearly didn't do their homework is why I have a reputation for disdaining many of the choices made by Gosling and the Java team.

There are plenty of other examples. Primitives "had to be included for efficiency." The right answer is to stay true to "everything is an object" and provide a trap door to do lower-level activities when efficiency was required (this would also have allowed for the hotspot technologies to transparently make things more efficient, as they eventually would have). Oh, and the fact that you can't use the floating point processor directly to calculate transcendental functions (it's done in software instead). I've written about issues like this as much as I can stand, and the answer I hear has always been some tautological reply to the effect that "this is the Java way."

When I wrote about how badly generics were designed, I got the same response, along with "we must be backwards compatible with previous (bad) decisions made in Java." Lately more and more people have gained enough experience with Generics to see that they really are very hard to use -- indeed, C++ templates are much more powerful and consistent (and much easier to use now that compiler error messages are tolerable). People have even been taking reification seriously -- something that would be helpful but won't put that much of a dent in a design that is crippled by self-imposed constraints.

The list goes on to the point where it's just tedious...

gnat
Gnawme
  • 6
    This sounds like a Java versus C++ answer, rather than focusing on RAII. I think C++ and Java are different languages, each with its strengths and weaknesses. Also the C++ designers didn't do their homework in many areas (KISS principle not applied, simple import mechanism for classes missing, etc). But the focus of the question was RAII: this is missing in Java and you have to program it manually. – Giorgio Nov 07 '11 at 20:15
  • 4
    @Giorgio: The point of the article is, Java seems to have missed the boat on a number of issues, some of which relate directly to RAII. Regarding C++ and its impact on Java, Eckels notes: "You must keep in mind the primary design decision upon which everything in C++ hung: compatibility with C. This was a huge constraint, and has always been C++'s greatest strength... and its bane. It also fooled the Java designers who didn't understand C++ well enough." The design of C++ influenced Java directly, while C# had the opportunity to learn from both. (Whether it did so is another question.) – Gnawme Nov 07 '11 at 20:29
  • 2
    @Giorgio Studying existing languages in a particular paradigm and language family is indeed a part of the homework required for new language development. This is one example where they simply whiffed it with Java. They had C++ and Smalltalk to look at. C++ didn't have Java to look at when it was developed. – Jeremy Nov 07 '11 at 20:33
  • @Gnawme, but C# didn't learn RAII :) – JoelFan Nov 07 '11 at 20:53
  • I did not say that Java designers did not do their homework. I said that C++ designers did not do it either. C++ has RAII and I miss it when using Java. Java is simpler and cleaner and I miss it when using C++. – Giorgio Nov 07 '11 at 20:58
  • Typo, I meant: "I did not say that Java designers did their homework." – Giorgio Nov 07 '11 at 21:04
  • 1
    @Gnawme: "Java seems to have missed the boat on a number of issues, some of which relate directly to RAII": can you mention these issues? The article you posted does not mention RAII. – Giorgio Nov 07 '11 at 21:22
  • 2
@Giorgio Sure, there have been innovations since the development of C++ that account for many of the features you find lacking there. Are any of those features that they *should* have found looking at languages established before the development of C++? That's the kind of homework we are talking about with Java - there is no reason for them not to consider every C++ feature in the development of Java. Some like multiple inheritance they intentionally left out - others like RAII they seem to have overlooked. – Jeremy Nov 08 '11 at 18:24
  • 1
    @Jeremy: Well, for example, Modula2 appeared in 1978 and, still, C++ does not have a decent module concept: I cannot say "import ClassA from ModuleB;" and have the compiler search for the appropriate module / class definition. ;-) – Giorgio Nov 08 '11 at 19:53
  • @Giorgio I believe we can expect modules... sometime this century, at the rate the proposal is advancing. It's been a work in progress for a while, but they keep postponing it yet because it's not ready. – Justin Time - Reinstate Monica Dec 22 '16 at 05:18
11

The best reason is much simpler than most of the answers here.

You can't pass stack allocated objects to other threads.

Stop and think about that. Keep thinking... Now C++ didn't have threads when everyone got so keen on RAII. Even Erlang (separate heaps per thread) gets icky when you pass too many objects around. C++ only got a memory model in C++11; now you can almost reason about concurrency in C++ without having to refer to your compiler's "documentation".

Java was designed from (almost) day one for multiple threads.

I've still got my old copy of "The C++ Programming Language" where Stroustrup assures me I won't need threads.

The second painful reason is to avoid slicing.

Tim Williscroft
  • 1
    Java being designed for multiple threads also explains why the GC isn't based on reference counting. – dan04 Nov 08 '11 at 01:01
  • 1
    You don't need stack allocated objects to make use of RAII - it is enough to have a stack-based reference to an object that would trigger the destructor when it goes out of scope; see how it is handled with C++/CLI. – Nemanja Trifunovic Nov 08 '11 at 13:46
  • 4
    @NemanjaTrifunovic: You can't compare C++/CLI to Java or C#, it was designed almost for the express purpose of interoperating with unmanaged C/C++ code; it's more like an unmanaged language that happens to give access to the .NET framework than vice versa. – Aaronaught Nov 08 '11 at 20:58
@Nemanja Trifunovic: If you program in embedded Java you can use scoped memory which operates exactly like this. – Tim Williscroft Nov 08 '11 at 22:12
  • 1
    @Aaronaught: I learned a little about the original design goals of C++/CLI from some of its authors, but it is not the topic here. What I am trying to show is that GC and RAII are orthogonal features - C++/CLI is just one example of how it can be done. – Nemanja Trifunovic Nov 09 '11 at 13:56
  • 3
    @NemanjaTrifunovic: Yes, C++/CLI is one example of how it can be done in a way that is *totally inappropriate for normal applications*. It's *only* useful for C/C++ interop. Not only should normal developers *not* need to be saddled with a totally irrelevant "stack or heap" decision, but if you ever try to refactor it then it's trivially easy to accidentally create a null pointer/reference error and/or a memory leak. Sorry, but I have to wonder if you've ever actually programmed in Java or C#, because I don't think *anyone* who has would actually *want* the semantics used in C++/CLI. – Aaronaught Nov 09 '11 at 14:05
  • 3
    @Aaronaught: I've programmed with both Java (a little) and C# (a lot) and my current project is pretty much all C#. Believe me, I know what I am talking about, and it has nothing to do with "stack vs. heap" - it has everything to do with making sure that all your resources are released as soon as you don't need them. Automatically. If they are not - you *will* get into trouble. – Nemanja Trifunovic Nov 09 '11 at 14:19
  • @Aaronaught: And I am afraid I have to rest my case now and do some real work :) – Nemanja Trifunovic Nov 09 '11 at 14:20
  • 4
    @NemanjaTrifunovic: That's great, really great, but both C# and C++/CLI require you to explicitly state when you want this to happen, they just use a different syntax. Nobody's disputing the essential point that you're currently rambling about (that "resources are released as soon as you don't need them") but you're making a gigantic logical leap to "all managed languages should have automatic-but-only-sort-of call-stack-based deterministic disposal". It just doesn't hold water. – Aaronaught Nov 09 '11 at 15:14
  • @Nemanja Trifunovic: Common Lisp has a very simply idiom for this (see e.g. `with-open-file` at http://cl-cookbook.sourceforge.net/files.html). So maybe RAII can be achieved with a generic method without introducing extra syntax and semantics into Java or C#. I am not sure of this but I would like to see this point clarified. – Giorgio May 10 '13 at 08:23
  • with-open-file looks like you could substitute Java's try-with-resources and not notice the difference. – Tim Williscroft May 12 '13 at 02:49
5

In C++, you use more general-purpose, lower-level language features (destructors automatically called on stack-based objects) to implement a higher-level one (RAII), and this approach is something the C# / Java folks seem not to be too fond of. They'd rather design specific high-level tools for specific needs, and provide them to the programmers ready-made, built into the language. The problem with such specific tools is that they are often impossible to customize (in part that's what makes them so easy to learn). When building from smaller blocks, a better solution may come around with time, while if you only have high-level, built-in constructs, this is less likely.

So yeah, I think (I wasn't actually there...) it was a conscious decision, with the goal of making the languages easier to pick up, but in my opinion, it was a bad decision. Then again, I generally prefer the C++ give-the-programmers-a-chance-to-roll-their-own philosophy, so I'm a bit biased.

imre
  • 7
    The "give-the-programmers-a-chance-to-roll-their-own philosophy" works fine UNTIL to you need to combine libraries written by programmers who each rolled their own string classes and smart pointers. – dan04 Nov 08 '11 at 00:18
@dan04 so the managed languages give you pre-defined string classes, then allow you to monkey-patch them, which is a recipe for disaster if you're the kind of guy who can't cope with a different own-rolled string class. – gbjbaanb Mar 09 '13 at 18:17
-2

You already called out the rough equivalent to this in C# with the Dispose method. Java also has finalize. NOTE: I realize that Java's finalize is non-deterministic and different from Dispose, I am just pointing out that they both have a method of cleaning resources alongside the GC.

If anything, C++ becomes more of a pain, because an object has to be physically destroyed. In higher-level languages like C# and Java we depend on a garbage collector to clean it up when there are no longer references to it. There is no such guarantee that a DBConnection object in C++ doesn't have rogue references or pointers to it.

Yes, the C++ code can be more intuitive to read, but it can be a nightmare to debug, because the boundaries and limitations that languages like Java put in place rule out some of the more aggravating and difficult bugs as well as protect other developers from common rookie mistakes.

Perhaps it comes down to preferences: some like the low-level power, control and purity of C++, whereas others like myself prefer a more sandboxed language that is much more explicit.
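For illustration, a minimal sketch of the finalize fallback mentioned above (LeakyHandle and the path parameter are hypothetical names for this sketch; finalize() is deprecated in current Java releases): the finalizer may run long after the object becomes unreachable, or never, which is why it cannot substitute for an explicit close()/Dispose().

import java.io.FileInputStream;
import java.io.IOException;

class LeakyHandle {
    private final FileInputStream in;

    LeakyHandle(String path) throws IOException {
        in = new FileInputStream(path);
    }

    @Override
    protected void finalize() throws Throwable {
        try {
            in.close(); // best effort only: the GC decides if and when this runs
        } finally {
            super.finalize();
        }
    }
}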

maple_shaft
  • 12
    First of all Java's "finalize" is non-deterministic... it is *not* the equivalent of C#'s "dispose" or of C++'s destructors... also, C++ also has a garbage collector if you use .NET – JoelFan Nov 07 '11 at 15:20
  • @JoelFan Maybe I wasn't too clear. I wasn't trying to say that finalize was equivalent to Dispose or to RAII, I was merely stating that instead of a true RAII solution, these two languages can *roughly* accomodate a solution of cleaning resources alongside the GC. I would never condone actually using finalize as a catch all solution for closing resources. – maple_shaft Nov 07 '11 at 15:42
  • ... I actually wasn't aware of a GC for C++ on .NET! The last time I developed C++ for Windows was MFC but I think we can all agree that doesn't count as *real* C++ :) – maple_shaft Nov 07 '11 at 15:44
  • @DeadMG You would be surprised some of the code I had to maintain. – maple_shaft Nov 07 '11 at 15:48
  • 2
    @DeadMG: The problem is, you might not be an idiot, but that other guy who just left the company (and who wrote the code that you now maintain) might have been. – Kevin Nov 07 '11 at 16:36
  • 7
    That guy is going to write shitty code whatever you do. You can't take a bad programmer and make him write good code. Dangling pointers are the least of my concerns when dealing with idiots. Good coding standards use smart pointers for memory that has to be tracked, so smart management should make it obvious how to safely de-allocate and access memory. – DeadMG Nov 07 '11 at 17:06
  • 3
    What DeadMG said. There are many bad things about C++. But RAII isn’t one of them by a long stretch. In fact, the lack of Java and .NET to properly account for resource management (because memory is the only resource, right?) is one of their biggest problems. – Konrad Rudolph Nov 07 '11 at 17:57
  • 1
Your argument about C++ with dangling internal pointers is not really valid. A badly written class in any language is badly written no matter what. Admittedly C++ has a steeper learning curve to get it correct, but once you have learned the rule of three you will never fail again. – Martin York Nov 07 '11 at 18:03
  • 8
The finalizer in my opinion is a disaster design-wise, as you are forcing the correct usage of an object from the designer on to the user of the object (not in terms of memory management but resource management). In C++ it is the responsibility of the class designer to get resource management correct (done only once). In Java it is the responsibility of the class user to get resource management correct and thus it must be done each time the class is used. http://stackoverflow.com/questions/161177/does-c-support-finally-blocks-and-whats-this-raii-i-keep-hearing-about/161247#161247 – Martin York Nov 07 '11 at 18:05
  • 1
@maple_shaft Not surprising, C++/CLI never really caught on. Its syntax was less ugly than its predecessor (Managed Extensions for C++); but it remained much more cumbersome to work with than C#. The only advantage it has is being able to call native methods from within managed code without the overhead of marshaling. This makes it valuable if you're calling a native method inside a loop in performance critical code; but otherwise it's almost completely unused. – Dan Is Fiddling By Firelight Nov 07 '11 at 20:32
  • @Joel: `C++ also has a garbage collector if you use .NET` - that's [C++/CLI](http://en.wikipedia.org/wiki/C%2B%2B/CLI), not C++. They are [two separate languages](http://stackoverflow.com/questions/6509146/why-do-net-languages-vary-in-performance/6510733#6510733). – BlueRaja - Danny Pflughoeft Nov 08 '11 at 23:01
  • @DanNeely, "the only advantage is being able to call native methods"... only? What about RAII? – JoelFan Nov 09 '11 at 04:03
  • 2
    Conversely, requiring every object to be physically destroyed guarantees that an object can close its resources while being destroyed. I'd rather write one destructor than copy the exact same code into twenty `finally` blocks, for example. – Justin Time - Reinstate Monica Dec 22 '16 at 05:23
  • @JustinTime Sure, but the copied code is a single line only and copying it twenty times is still much better than hunting a single bug caused by accessing freed memory. While I'd love to have RAII in Java, I can live without it, as there are just a few places where you have to close a resource (one per hundreds or thousands of lines while objects get allocated maybe every other line; YMMV). – maaartinus Oct 02 '17 at 00:25