I'm trying to understand the motivation for buffering output to standard output.
I've run into this in multiple programming languages: you're trying to trace a bug with print statements, but the output doesn't appear in real time because the buffer hasn't been flushed.
So to get the output actually printed at the point where the call is made, an explicit flush has to be used.
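A minimal Python sketch of the effect I mean, using an in-memory buffered stream to stand in for stdout (the names `raw` and `buffered` are just for illustration):

```python
import io

# A buffered writer over a raw byte sink, analogous to a block-buffered stdout.
raw = io.BytesIO()
buffered = io.BufferedWriter(raw, buffer_size=64)

buffered.write(b"tracing a bug...\n")  # smaller than the buffer, so it is held back
first = raw.getvalue()                 # b"" -- nothing has reached the sink yet

buffered.flush()                       # explicit flush, like sys.stdout.flush()
second = raw.getvalue()                # now the bytes have actually arrived
```

Without the `flush()`, the write only lands in the sink when the buffer fills or the stream is closed, which is exactly why debug prints can seem to vanish until the program exits.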
What is the motivation for such behavior?