At most; my largest object would probably be a QString of maybe a
million characters.
sizeof(QString) is 8 bytes on 64-bit systems (a single pointer) no matter how big or small its contents are, because the contents are always allocated on the heap, as is the case with std::vector. QString is effectively just a handle to memory on the heap. So even if you create a QString object on the stack and insert a million characters into it, that's still using only 8 bytes of stack space at most, if it isn't living directly in a register.
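As a quick illustration (a minimal sketch assuming a 64-bit Qt 5 build; the exact handle size can differ across Qt versions):

#include <QString>
#include <iostream>

int main()
{
    // A million characters, all living on the heap; 's' itself
    // is just a small handle on the stack.
    QString s(1000000, QLatin1Char('x'));
    std::cout << sizeof(s) << '\n'; // 8 on a 64-bit Qt 5 build
}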
If you tried to allocate a million Unicode characters directly on the stack, though, you'd be asking for a stack overflow: at two bytes per UTF-16 character that's roughly 2 MB, which exceeds or approaches the default stack size on many systems (commonly 1 MB on Windows, 8 MB on Linux).
In most everyday C++ code, you generally don't need to think much about the stack-vs-heap distinction, since the standard data structures abstract it away from you. The distinction is generally more about whether, in implementing a class, you store members directly or behind, say, a unique_ptr. If you use one, that implies an extra heap allocation, an extra layer of indirection, and some memory fragmentation.
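For instance (a minimal sketch; Widget and BigState are hypothetical names for illustration):

#include <memory>

struct BigState { int values[256]; };

class Widget
{
public:
    // Constructing a Widget now costs a heap allocation...
    Widget() : state_(std::make_unique<BigState>()) {}

    // ...and every access goes through an extra pointer hop.
    int first() const { return state_->values[0]; }

private:
    std::unique_ptr<BigState> state_; // Widget itself stays pointer-sized
};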
If you ever do work with objects that don't manage their own memory separately on the heap, like a plain std::array<T, N> on the stack, then a sane budget (just a rough, crude number to go by) is typically hundreds of bytes up to maybe a couple of kilobytes per function, provided the function isn't called recursively and doesn't call other functions that themselves allocate a lot of stack space. That keeps you reasonably safe against stack overflows, at least on most desktop machines and operating systems. For an array of 32-bit integers, for example, I might switch to the heap beyond 128 elements (more than 512 bytes), at which point you might use std::vector instead. Example:
#include <array>
#include <vector>

void some_func(int n, ...)
{
    // 'std::array' is a purely contiguous structure (not
    // storing pointers to memory allocated elsewhere), so
    // here we are actually allocating sizeof(int) * 128
    // bytes on the stack.
    std::array<int, 128> stack_array;
    int* data = stack_array.data();

    // Only use the heap if n is too large to fit in our
    // stack-allocated array of 128 integers.
    std::vector<int> heap_array;
    if (n > static_cast<int>(stack_array.size()))
    {
        heap_array.resize(n);
        data = heap_array.data();
    }

    // Do stuff with 'data'.
    // ...
}
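Incidentally, this is the same basic idea as the small buffer/string optimizations mentioned below: pay for the heap allocation only when the data outgrows a fixed-size inline buffer.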
As for move semantics, I'd apply a similar rule there, because it's actually pretty cheap to deep copy even 512 bytes when that just means copying memory from stack to stack, or between two regions of memory with high temporal locality. Even if you have a huge class, Foo, with a dozen data members where sizeof(Foo) is a whopping 256 bytes, I wouldn't use that alone as a reason to allocate its contents on the heap if it can be avoided. Typically you're perfectly fine if the decision to use the heap is based on things other than performance: avoiding stack overflows; modeling variable-sized data structures (which imply the heap unless they're optimized for common small-size cases, like small string optimizations that skip the heap allocation for short strings but fall back to it for long ones); allowing shared ownership with shared_ptr; using a pimpl to reduce compile-time dependencies; or allocating subtypes for polymorphism, where avoiding heap allocations can be unwieldy.
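To make that concrete (a minimal sketch; Foo here is a hypothetical trivially copyable type): for such a type, a move is just a copy, since there's no heap resource to steal.

#include <utility>

struct Foo
{
    double a[16];    // 128 bytes
    long long b[16]; // 128 bytes; sizeof(Foo) is 256 on typical 64-bit targets
};

int main()
{
    Foo f{};
    Foo g = std::move(f); // still copies all 256 bytes, and that's usually fine
    (void)g;
}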