Opting to load everything into memory at startup is a premature optimization. You haven't collected statistics on how long your queries take, or determined whether caching would speed things up more than your indexes do, etc.
To answer your specific questions:
Wouldn't this increase the query-count drastically?
That depends on your definition of drastic. Compared to never pulling from the database after startup, yes, you will be accessing the database more regularly.
However, that activity is likely to be well within what your database can handle. You will have to do your own analysis to ensure that.
What is better, after all: keeping things in RAM or querying more often?
That depends entirely on how much memory we are talking about, and whether you can purge items from memory that just aren't needed. While most servers can have a lot of RAM, it is still one of the most expensive places to store data, and keeping everything in memory at once limits the ways you can scale your system.
When profiling your system, you will find that some information just takes up space in memory and is used very sparingly, while other items are used much more often.
Caching systems allow you to keep often used data resident in memory, while allowing the less frequently needed items to be reclaimed by the virtual memory system. With intelligent caching, you will use more memory than blindly hitting the database all the time, but the often requested data is available to speed things up.
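As a rough sketch of that idea (assuming Java; the capacity and the sample keys and values are made up for illustration), an LRU cache built on `LinkedHashMap` keeps the hot entries resident and lets the cold ones fall out:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of an LRU cache: frequently requested entries stay resident,
// while the least recently used entry is evicted once capacity is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true); // accessOrder = true -> iteration order is least-to-most recently used
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the eldest entry when over capacity
    }

    public static void main(String[] args) {
        LruCache<Integer, String> cache = new LruCache<>(2);
        cache.put(1, "alpha");
        cache.put(2, "beta");
        cache.get(1);          // touch 1 so it becomes the most recently used
        cache.put(3, "gamma"); // evicts 2, the least recently used entry
        System.out.println(cache.keySet()); // prints [1, 3]
    }
}
```

On a cache miss (a `get` that returns `null`), you fall back to the database and put the result in the cache, so the expensive query is only paid for data that wasn't already hot.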
By keeping things in memory, you also lose the ability to do complex queries on your data and take advantage of the indexing that the database can do. You will find that your database has a very intelligent caching system built in, so the more often you run the same query, the faster it will run (to a point).
What do I do with the data I've loaded lazily? Do I dispose of everything when the details collapse, or should I store it for some time in case the user opens it again?
If your language supports it, I would use a weak reference to the data set. Weak references are a way to hold on to data that may be used again, while still allowing the garbage collector to dispose of it if loading new data causes memory pressure.
Designing around weak references requires you to check whether the reference is still populated and re-query if it isn't. It's one of the key mechanisms for caching.
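Here is a minimal sketch of that pattern in Java (the `loadFromDatabase` method and its return type are hypothetical placeholders for your own data-access code):

```java
import java.lang.ref.WeakReference;
import java.util.List;

// Holds the lazily loaded details through a weak reference: the GC may
// reclaim the list, in which case we simply re-query the database.
public class DetailCache {
    private WeakReference<List<String>> details = new WeakReference<>(null);

    public List<String> getDetails() {
        List<String> data = details.get();    // null if never loaded or already reclaimed
        if (data == null) {
            data = loadFromDatabase();         // hypothetical data-access call
            details = new WeakReference<>(data);
        }
        return data;
    }

    private List<String> loadFromDatabase() {
        // Placeholder: in a real application this would run the actual query.
        return List.of("detail-1", "detail-2");
    }
}
```

One detail worth knowing in Java specifically: a `WeakReference` can be cleared on any garbage-collection cycle, while a `SoftReference` is kept until the JVM is actually under memory pressure, so soft references are often the better fit for this kind of cache.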