From what I've seen, big-O notation and time and memory complexity seem to be emphasized heavily in formal computer science education. Being self-taught, though, I should note that this impression comes from hearing and reading what people with that background say and write.
Though I do believe the general ideas and concepts are important, I don't think the formalization of them (the big-O notation and the associated terminology) matters nearly as much, except for communication. Not knowing the formal notation and terminology doesn't stop someone from seeing how and why one algorithm would be faster than another in a particular case. People can see that the time taken to search a balanced binary tree relates to the base-2 logarithm of the number of nodes without ever studying complexity theory formally, provided they understand how the tree works and have a reasonable grasp of high-school math. What does matter is knowing when to pay attention to complexity and memory use, and remembering to consider typical and worst cases; some people don't. A formal background in the theory might help with that, but not having one doesn't prevent anyone from applying the concepts.
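To give a rough idea of the kind of reasoning I mean (this is purely my own illustration in Python, with made-up names): each iteration of a tree search descends one level, so in a balanced tree the number of levels visited is roughly log2 of the node count, and anyone who can follow the loop can see why.

```python
# A minimal sketch, not taken from any library: searching a binary
# search tree while counting the levels visited. In a balanced tree
# this count is roughly log2(n) for n nodes.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def search(root, key):
    node, levels_visited = root, 0
    while node is not None:
        levels_visited += 1          # descend at most one level per iteration
        if key == node.key:
            return node, levels_visited
        node = node.left if key < node.key else node.right
    return None, levels_visited
```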
The notation and terminology do become important for communication. They give a concise way to quantify the performance of an algorithm for someone else, and because they come up so often in papers and explanations, having at least a rough understanding of them makes that material much easier to follow.
So yes, the concepts are important (though less so when resources and time are plentiful and the data is small). But while the concepts matter, the formalization of them often matters much less -- and it's worth remembering that the notation and terminology are not the same thing as the concepts themselves.
Edit:
I wouldn't claim to understand the concepts in as much detail as someone who has studied them formally, but a lot of the general ideas simply make sense. I do think there's value in formal study, though some of that value can be had without it.
As for introducing the concepts (outside formal study), I think a good start is to encourage people to think about how much memory overhead the data structures have, what steps the algorithms involve, and how these things change with different data.
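As a concrete (and entirely hypothetical) exercise along those lines, one could have people count the steps two search strategies take as the data grows, without mentioning big-O at all. Something like this Python sketch:

```python
# An illustrative experiment, not a benchmark: count how many elements
# a linear search looks at versus how many times a binary search halves
# its range, for inputs of different sizes.

def linear_search_steps(data, target):
    steps = 0
    for value in data:
        steps += 1
        if value == target:
            break
    return steps

def binary_search_steps(data, target):
    lo, hi, steps = 0, len(data), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            break
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return steps

for n in (1_000, 1_000_000):
    data = list(range(n))
    worst = n - 1                    # target at the far end of the sorted list
    print(n, linear_search_steps(data, worst), binary_search_steps(data, worst))
```

Watching one column grow a thousandfold while the other barely moves tends to make the idea stick without any formal notation.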
It also helps to consider hypothetical situations and changes: what happens if a tree is balanced versus as unbalanced as possible, how many levels deep most of the nodes sit, or how many more nodes the tree can hold if the depth is increased by one level. This way of thinking is generally useful for programmers anyway, not just when looking at complexity; and when applied to how algorithms and data structures behave under different circumstances, it naturally points in the same direction as a more formal examination of complexity would.
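To make that tree thought experiment concrete (again, just my own back-of-the-envelope illustration): a full binary tree of depth d holds 2^d - 1 nodes, just over half of them on the bottom level, so adding one level roughly doubles the capacity, while a completely unbalanced tree degenerates into a list whose depth equals its node count.

```python
# Back-of-the-envelope numbers for the balanced-vs-unbalanced thought
# experiment: capacity of a full binary tree at each depth, nodes on
# the bottom level, and the depth a fully degenerate tree would need
# for the same node count.

for depth in range(1, 6):
    capacity = 2 ** depth - 1          # nodes in a full tree of this depth
    bottom_level = 2 ** (depth - 1)    # just over half the nodes are here
    print(f"depth {depth}: full tree holds {capacity} nodes "
          f"({bottom_level} on the bottom level); "
          f"an unbalanced tree with {capacity} nodes is {capacity} levels deep")
```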