I feel this is premature optimisation because our application is not even released yet. I suggested monitoring for slow queries once we go live and then adding indices accordingly.
You can't treat your end users and production environment like quality assurance. In other words, you're saying that you'll figure it out in production. I don't think that's the right way, and I see that approach go horribly wrong every day.
There is one thing you need to keep in mind, though, because you can't paint this with a broad brush.
What is your common workload?
That might sound obvious or dull, but it's significant in practice. If you have 10 queries that make up 98% of your workload (quite common, believe it or not), my recommendation would be a hard analysis before production. With realistic and representative data, make sure those 10 queries are as good as they possibly can be (chasing perfect is a waste of valuable time, and almost never achievable).
For the other 200 queries that make up the remaining 2% of the workload, those most likely aren't worth a ton of up-front effort, and they will become the corner-case performance troubleshooting oddities in production. That's also a reality, and not a terribly bad thing. But it doesn't mean you should ignore indexing best practices or make blind assumptions about data retrieval.
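To make "as good as they possibly can be" a bit more concrete, here is a minimal sketch of the kind of check I mean. It uses SQLite through Python purely because it ships with the standard library; the orders table, column names, and row counts are made up, and the same idea applies to whatever engine and plan-inspection tool you actually use:

```python
import sqlite3

# Hypothetical example: an "orders" table that a hot query hits by customer_id.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(50_000)],
)

hot_query = "SELECT total FROM orders WHERE customer_id = 42"

# Before indexing: the plan is a full table scan.
for row in conn.execute("EXPLAIN QUERY PLAN " + hot_query):
    print("before:", row)

# After adding an index on the hot predicate, the plan becomes an index search.
conn.execute("CREATE INDEX ix_orders_customer_id ON orders (customer_id)")
for row in conn.execute("EXPLAIN QUERY PLAN " + hot_query):
    print("after: ", row)
```

The point isn't the tool, it's the habit: run the handful of queries that dominate your workload against representative data, look at the plans, and fix what's obviously bad before users ever see it.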
It is common and good practice to figure out database performance prior to production. In fact, there is a relatively common position for this type of thing called a development DBA.
But...
Some take that too far and go crazy adding indexes "just in case". Somebody recommends a missing index? Add it, plus four other variations. That's also a bad idea. You need to think not only about data retrieval but also about data modification: generally speaking, the more indexes you have on a table, the more overhead you incur every time you modify data.
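Here is a rough sketch of that write-side cost, again with SQLite via Python and a made-up table, timing the same batch of inserts as the number of secondary indexes grows. The exact numbers will vary by engine and hardware; the upward trend is the point:

```python
import sqlite3
import time

def timed_inserts(index_count: int, rows: int = 20_000) -> float:
    """Time a batch insert into a table carrying `index_count` secondary indexes."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE t (id INTEGER PRIMARY KEY, a INTEGER, b INTEGER, c INTEGER, d INTEGER)"
    )
    for col in "abcd"[:index_count]:
        conn.execute(f"CREATE INDEX ix_t_{col} ON t ({col})")
    data = [(i, i * 2, i * 3, i * 4) for i in range(rows)]
    start = time.perf_counter()
    with conn:  # every secondary index must also be maintained on each insert
        conn.executemany("INSERT INTO t (a, b, c, d) VALUES (?, ?, ?, ?)", data)
    return time.perf_counter() - start

for n in range(5):
    print(f"{n} secondary indexes: {timed_inserts(n):.3f} s")
```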
Like most things, there is a healthy balance.
As a fun little side note... The pluralization of "Index"
"Indices" are for financial people
"Indexes" are for us