The culture of programming has changed considerably since the days when C was the language of choice and the hardware was so wimpy that C was actually a necessity.
Luckily, we do not have to worry about optimization so much nowadays. The general rule is to never optimize unless you have a good reason to do so. And in order to have a good reason, you have to:
Have a pre-established performance requirement for your product, something like "server response time must be less than 200 milliseconds". (Vague requirements like "as fast as possible" are generally frowned upon.)
Measure the actual response time of your server and witness that it is indeed failing to meet the requirement. (A minimal measurement sketch follows this list.)
Exhaust all options for meeting the requirement by reconfiguring your system so that it performs better. (You would be surprised; some people don't know the difference between running a web server in debug mode and in production mode. A configuration sketch appears below.)
Exhaust all options for meeting the requirement by buying better and/or more hardware. Hardware nowadays generally costs far less than developers' salaries.
Throw a profiler at your system and determine that the bottleneck is in fact in code that you are responsible for and have the power to change. (See the profiling sketch below.)
Exhaust all options for algorithmic optimization. (Introducing a caching layer somewhere; restructuring code so that something happens asynchronously rather than synchronously; and so on. A caching sketch closes out the examples below.)
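
To make the measurement step concrete, here is a minimal sketch in Python, assuming a hypothetical endpoint at http://localhost:8000/api/orders and an arbitrary sample count; any proper load-testing tool would do the same job more thoroughly.

```python
# Minimal sketch: time repeated requests against a (hypothetical) endpoint
# and report rough latency numbers. URL and SAMPLES are placeholders.
import statistics
import time
import urllib.request

URL = "http://localhost:8000/api/orders"  # hypothetical endpoint
SAMPLES = 50

timings_ms = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as response:
        response.read()
    timings_ms.append((time.perf_counter() - start) * 1000)

timings_ms.sort()
print(f"median: {statistics.median(timings_ms):.1f} ms")
print(f"~p95:   {timings_ms[int(len(timings_ms) * 0.95)]:.1f} ms")
```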
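On the debug-versus-production point, a sketch assuming a Flask application; the route and the worker count are placeholders, and the only point is that the development server and a production WSGI setup are very different configurations.

```python
# Minimal sketch, assuming a Flask application in app.py. The route name and
# the worker count are placeholders; the point is only that debug mode and a
# production setup are configured very differently.
from flask import Flask

app = Flask(__name__)


@app.route("/api/orders")
def orders():
    return {"status": "ok"}


if __name__ == "__main__":
    # Development only: debug mode enables the reloader and debugger and is
    # explicitly not meant to serve production traffic.
    app.run(debug=True)

# In production, the same app would typically sit behind a WSGI server, e.g.:
#   gunicorn --workers 4 app:app
```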
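For the profiling step, a sketch using Python's standard-library cProfile and pstats modules; handle_request() is a hypothetical stand-in for the code path you suspect. (You can get similar output non-invasively with `python -m cProfile -s cumulative app.py`.)

```python
# Minimal sketch: profile a suspect code path with the standard-library
# cProfile module and print the most expensive calls. handle_request() is a
# hypothetical stand-in for code you actually own.
import cProfile
import pstats


def handle_request():
    return sum(i * i for i in range(100_000))  # placeholder workload


profiler = cProfile.Profile()
profiler.enable()
for _ in range(100):
    handle_request()
profiler.disable()

pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)  # top 10 by cumulative time
```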
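And for the caching example from the last item, a sketch using functools.lru_cache; load_exchange_rate() is a hypothetical stand-in for any expensive, frequently repeated lookup.

```python
# Minimal sketch of an "algorithmic" fix: memoize an expensive, frequently
# repeated lookup instead of redoing it on every call. load_exchange_rate()
# is a hypothetical stand-in for a slow database or network call.
import time
from functools import lru_cache


@lru_cache(maxsize=1024)
def load_exchange_rate(currency: str) -> float:
    time.sleep(0.5)  # simulate the slow call
    return {"EUR": 1.08, "GBP": 1.27}.get(currency, 1.0)


load_exchange_rate("EUR")  # slow: does the real work
load_exchange_rate("EUR")  # fast: answered from the cache
```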
Then, and only then, is it advisable to try your luck by tweaking code to make it run faster. And as you might imagine, we hardly ever reach this stage.