I feel that tests should be able to tell you where you went wrong.
I get your point, and it's a principle I very much agree with. When you open the door to debugging the db, you lower the developers' incentive to put effort into making tests that adequately explain the issue when they fail.
As correct as I think your argument is, you do need to allow for real-world compromises. No test is perfect. If anything happens to the database that is unrelated to and not caused by your code, a test might still report an issue because it detected an anomaly, even though your tests are not at fault. Or maybe your tests simply don't cover an edge case.
When you force your developers to rely purely on existing tests, their work can only be as correct as the tests themselves. If your tests contain any flaws whatsoever, developers won't just be unable to fix them; they might even be sent off in the wrong direction, hunting for a bug they think exists because a flawed test told them so.
This is a tough choice. On the one hand, you want to strongly incentivize developers to rely on automated tests rather than on their own aptitude for debugging by looking at the data.
On the other hand, completely blocking access to the database implicitly assumes that your tests are perfect. It's impossible to definitively know that your tests are perfect. The best you can get is "not having seen a problem yet".
You can create an obstacle instead of a ban, e.g. only allow a subset of the developers access to the database (e.g. the technical leads or seniors). That way, your devs will first have to talk to your experts before they get access, which discourages them from immediately running to the database for every minor issue they encounter.
When a problem arises that genuinely requires access to the database, the experts will be able to provide it. When a problem arises that doesn't require access to the database, they'll be able to block it.
- Should we write our application in a way that a developer can access the db?
There are compromises between no access and full access. But even more importantly, I don't think a dev needs access to the database itself, but rather to the data that was in the database. That's an important distinction to make.
For example, you can dump the database content to a log file when a test error is encountered (make this a configurable option so it only happens when a dev explicitly asks for it).
Just to be clear, "dump the database content" is an overgeneralization. The point I'm trying to make is to log the relevant data. That way, your developers have access to the data (in log format) without needing access to the database itself.
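
As a rough illustration, here is a minimal Python sketch of what such opt-in dumping could look like. It assumes an SQLite connection purely for the sake of a self-contained example; the `DUMP_DB_ON_FAILURE` variable, the function name, and the table list are placeholders for whatever fits your stack:

```python
import json
import os
import sqlite3


def dump_relevant_data(conn: sqlite3.Connection, tables: list[str], log_path: str) -> None:
    """Append the rows of the listed tables to a log file, but only when the
    developer has explicitly opted in via an environment variable."""
    if os.environ.get("DUMP_DB_ON_FAILURE") != "1":
        return  # off by default; a dev has to ask for it explicitly
    conn.row_factory = sqlite3.Row  # so rows can be converted to dicts
    with open(log_path, "a", encoding="utf-8") as fh:
        for table in tables:
            fh.write(f"--- {table} ---\n")
            for row in conn.execute(f"SELECT * FROM {table}"):
                fh.write(json.dumps(dict(row), default=str) + "\n")
```

You could call something like this from your test framework's failure hook, restricted to the tables the failing test actually touches, so the log contains the relevant data and nothing more.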
- Should developers be allowed to run tests across non-dev environments?
No. There is a better alternative: allow developers to copy a database from a non-dev environment to a dev environment, and then they can test what they want.
This may be complicated because of GDPR/privacy rules, but that's a different discussion.
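
To make the "copy a database" route concrete, here is a minimal sketch assuming PostgreSQL and its standard `pg_dump`/`pg_restore` client tools. The host names, database names, and file name are placeholders, and credentials are assumed to be handled separately (e.g. via `~/.pgpass`):

```python
import subprocess

# Placeholder connection details; adjust to your environments.
SOURCE = {"host": "staging-db.example.internal", "dbname": "app"}
TARGET = {"host": "localhost", "dbname": "app_dev"}
DUMP_FILE = "staging_snapshot.dump"


def copy_database_to_dev() -> None:
    """Snapshot the non-dev database and restore it into the dev database."""
    # Take a custom-format dump of the source database.
    subprocess.run(
        ["pg_dump", "--format=custom",
         f"--host={SOURCE['host']}", f"--dbname={SOURCE['dbname']}",
         f"--file={DUMP_FILE}"],
        check=True,
    )
    # Restore it into the dev database, dropping existing objects first.
    subprocess.run(
        ["pg_restore", "--clean", "--if-exists",
         f"--host={TARGET['host']}", f"--dbname={TARGET['dbname']}",
         DUMP_FILE],
        check=True,
    )
```

In practice you'd probably wrap this in a self-service script, possibly with an anonymization step given the privacy concerns above.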
The only reason to look at a non-dev environment is when you're dealing with an environment-specific issue. Require your developers to first prove that they're unable to reproduce the error in the dev environment (with the exact same data) before allowing them to go outside of the dev environment.