2

Item 55 of Bloch's Effective Java, "Optimize judiciously", extends Jackson's rules on optimization:

Rule no. 1: Don't optimise!

Rule no. 2 (for experts): Don't optimise yet!

Extra Bloch rule 1: Don't optimise until development is finished.

Extra Bloch rule 2: Measure performance before and after implementing an optimisation (you'll be surprised!)

And this is essentially how I approach optimisation. However, I now have a situation where I am using a complicated but accurate OO data structure that reflects the business domain.

I find myself considering introducing a significant amount of denormalisation.

Should I, if everything else is equal?

Is denormalisation in the business layer just another method of optimisation, to which these rules apply?
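
To make this concrete, here is a hypothetical sketch (all names invented) of the kind of business-layer denormalisation I have in mind: an Order that caches a total which is already derivable from its line items, trading an O(1) read for an invariant that must now be maintained by hand.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration: "denormalising" the business layer by keeping
// a second copy of data that the normalised object graph can already derive.
class LineItem {
    final long unitPriceCents;
    final int quantity;
    LineItem(long unitPriceCents, int quantity) {
        this.unitPriceCents = unitPriceCents;
        this.quantity = quantity;
    }
}

class Order {
    private final List<LineItem> items = new ArrayList<>();
    private long cachedTotalCents = 0; // the denormalised copy

    void add(LineItem item) {
        items.add(item);
        // Every mutation must keep the duplicate in sync.
        cachedTotalCents += item.unitPriceCents * item.quantity;
    }

    // Normalised: recompute from the single source of truth.
    long totalCentsComputed() {
        return items.stream()
                    .mapToLong(i -> i.unitPriceCents * i.quantity)
                    .sum();
    }

    // Denormalised: O(1), but a copy that can silently drift.
    long totalCentsCached() {
        return cachedTotalCents;
    }
}
```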

Adam
  • 187
  • 6
  • 1
    I'd guess it depends on how the denormalization is done. I can think of some denormalizations in our app that are (judiciously implemented) optimizations, and others that seem like merely combining data we already had to fetch in previous requests for the convenience/readability of other methods. Perhaps it depends on whether it happens at a layer that's aware of the database structure. Could you give a specific example of one of the denormalizations you're doing? – Ixrec Feb 07 '16 at 14:16

3 Answers

5

Bloch's rules on optimization are just a variation on "premature optimization is the root of all evil" which, correctly interpreted, means "Measure first, before you optimize. Make sure that your optimization is actually going to give you the performance benefit you are seeking, before you spend the time and money optimizing."

I don't necessarily agree with delaying all optimizations until the end of the project. Some optimizations must be done during development; waiting until the end makes such optimizations effectively impossible.
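
As a crude illustration of "measure before and after" (all names below are mine, not a prescribed API), this sketch times a workload around a change. For any real decision, a proper harness such as JMH is the right tool; naive timing like this is easily fooled by JIT warm-up, GC pauses, and dead-code elimination.

```java
import java.util.function.LongSupplier;

// Deliberately crude before/after timing sketch. Prefer JMH in practice.
public class MeasureFirst {
    static volatile long sink; // keeps the JIT from discarding the workload

    static long workload() {
        long sum = 0;
        for (int i = 0; i < 10_000_000; i++) sum += i % 7;
        return sum;
    }

    static double bestOfFiveMillis(LongSupplier w) {
        long best = Long.MAX_VALUE;
        for (int run = 0; run < 5; run++) { // repeat; report the best run
            long start = System.nanoTime();
            sink = w.getAsLong();
            best = Math.min(best, System.nanoTime() - start);
        }
        return best / 1e6;
    }

    public static void main(String[] args) {
        // Measure the baseline BEFORE optimising...
        System.out.printf("baseline: %.2f ms%n", bestOfFiveMillis(MeasureFirst::workload));
        // ...then measure the optimised version AFTER, and keep the change
        // only if the numbers justify it.
    }
}
```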

Robert Harvey
  • 198,589
  • 55
  • 464
  • 673
  • Agreed, if you're building a stock trading platform, you better spend a lot of time optimizing transactions and if you have to redo your reporting design as a result, so be it. Don't have the tail wag the dog. – JeffO Feb 11 '16 at 18:59
  • I would go one step further in the last paragraph: Performance needs to be considered even at the very first steps of *design*. Some of the largest performance gains can be achieved by using the right architecture and data structures. If you postpone thinking about performance until the very end of the development process, you may find that you have to rewrite the whole thing to get decent performance out of it. – cmaster - reinstate monica Feb 14 '16 at 21:08
3

The problem I have with those rules is that they're talking to babies. A time comes when finding speedups is necessary, and when it does, clichés won't help. You've got to know how to do it.

Guessing doesn't find them. If somebody looks at the code and says "maybe this could be done better" - that's guessing.

Measuring doesn't find them. It may tell you there is no problem, but if there is one, it doesn't tell you what it is.

Some people are more helpful, saying "use a profiler". Profilers can give you pretty wallpaper, but they only find a limited class of speedups. If you let the others get away, you'll miss out - big time. Here's why.

There's a method that some people use. It does not give you pretty wallpaper. It finds every speedup a profiler finds, and more that profilers miss.
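
For illustration only: the answer above doesn't name the method, and what follows is one plausible reading of it, namely manual stack sampling (pause the running program a few times and read the whole call stack). In practice you would simply hit "pause" in a debugger or run jstack repeatedly; this sketch does the same thing in-process, with invented names.

```java
import java.util.Map;

// Crude in-process stack sampler: every `intervalMillis`, print the top
// frame of every other thread. Whatever keeps showing up is where the
// time is going, and the full stack tells you why you are there.
public class StackSampler {

    static void startSampling(long intervalMillis, int samples) {
        Thread sampler = new Thread(() -> {
            for (int i = 0; i < samples; i++) {
                try { Thread.sleep(intervalMillis); }
                catch (InterruptedException e) { return; }
                for (Map.Entry<Thread, StackTraceElement[]> entry
                        : Thread.getAllStackTraces().entrySet()) {
                    StackTraceElement[] stack = entry.getValue();
                    if (stack.length > 0 && entry.getKey() != Thread.currentThread()) {
                        System.out.println(entry.getKey().getName() + " @ " + stack[0]);
                    }
                }
                System.out.println("---- sample " + (i + 1));
            }
        });
        sampler.setDaemon(true);
        sampler.start();
    }

    public static void main(String[] args) {
        startSampling(200, 5);
        double x = 0; // a deliberately wasteful hot loop to be "caught"
        for (long i = 0; i < 2_000_000_000L; i++) x += Math.sqrt(i);
        System.out.println(x);
    }
}
```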

Mike Dunlavey
  • 12,815
  • 2
  • 35
  • 58
2

I am writing this to emphasize that normalization is not just for optimization. There are cases where an RDBMS is not the correct solution for the problem at hand, and document databases or even file systems could be used to store your information; however, it is not the intention of this answer to go into that.

Normalization promotes optimization, but it is not just for optimization. Normalization (when done right) results in consistent information at the data-store level. This kind of consistency is highly desirable in many OLTP systems. It comes at a cost for large, data-heavy BI systems, so alternative approaches such as the star schema exist for those.

For example, in a web-based CMS application, when a customer changes her address, all references to that customer at the database level will point to the one (or most recent) address record.

In addition, SQL works in complete harmony with normalization to deliver this kind of consistent information. You can take full advantage of aggregate functions, joins, and so on with a normalized structure, and hence reduce your code complexity.
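
For illustration, here is a minimal sketch of that harmony (table and class names invented; it assumes the H2 in-memory database is on the classpath). Because the address lives in exactly one row, a single UPDATE is seen by every query that joins to it:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Invented schema: the address is stored once, so one UPDATE fixes
// every reference, and a plain JOIN always returns the current value.
public class NormalisedAddressDemo {
    public static void main(String[] args) throws Exception {
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement s = c.createStatement()) {
            s.execute("CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(50))");
            s.execute("CREATE TABLE address (customer_id INT REFERENCES customer(id), city VARCHAR(50))");
            s.execute("INSERT INTO customer VALUES (1, 'Ada')");
            s.execute("INSERT INTO address VALUES (1, 'London')");

            // The customer moves: one row changes, nothing else to chase.
            s.execute("UPDATE address SET city = 'Cambridge' WHERE customer_id = 1");

            // Every join now sees the new address; no stale copies exist.
            try (ResultSet r = s.executeQuery(
                    "SELECT c.name, a.city FROM customer c JOIN address a ON a.customer_id = c.id")) {
                while (r.next()) {
                    System.out.println(r.getString(1) + " lives in " + r.getString(2));
                }
            }
        }
    }
}
```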

Report generation tools also work very well with normalized structures.

For enterprise applications, normalization also matters because it makes integrating systems easier.

In summary, normalization has benefits beyond optimization.

NoChance
  • 12,412
  • 1
  • 22
  • 39