Does semi-normalization exist as a concept? Is it "normalized"?
- by Gracchus
If you don't mind, here's a quick tl;dr first.
My experience (tl;dr)
I have an application that deals heavily with uncertain data, which is a bane to database design.
I normalized it as best I could within the capabilities of my database of choice, but even a "simple" read query took 50 ms.
NoSQL appeals to me, but I can't trust myself with it, and besides, normalization has cut down my debugging time immensely, time and again.
Instead of 100% normalization, I made semi-redundant 1:1 tables with very wide primary keys and matching foreign keys (sketched below). Read times dropped to a few ms, and write times barely degraded.
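To make that concrete, here's a minimal sketch of the shape I mean; the table and column names are invented for illustration, not my real schema:

```sql
-- Illustrative only: the base table carries a wide composite key, and each
-- group of uncertain/optional attributes lives in its own 1:1 side table
-- whose primary key repeats that wide key and doubles as the foreign key.
CREATE TABLE reading (
    site_id   INTEGER   NOT NULL,
    device_id INTEGER   NOT NULL,
    taken_at  TIMESTAMP NOT NULL,
    PRIMARY KEY (site_id, device_id, taken_at)
);

CREATE TABLE reading_estimate (
    site_id     INTEGER   NOT NULL,
    device_id   INTEGER   NOT NULL,
    taken_at    TIMESTAMP NOT NULL,
    value       NUMERIC,            -- may be unknown
    error_bound NUMERIC,            -- may be unknown
    PRIMARY KEY (site_id, device_id, taken_at),
    FOREIGN KEY (site_id, device_id, taken_at)
        REFERENCES reading (site_id, device_id, taken_at)
);
```

The redundancy is that the wide key columns are repeated in every 1:1 side table instead of being replaced by a narrow surrogate key.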
The semi-normalized point
Given this reality, which anyone who has tried to rely on views over fully normalized data will recognize, is this concept codified anywhere?
Is it as simple as having wide unique keys and matching foreign keys, or are there hidden subtleties to this technique? Or is uncertainty merely a special case with such limited application that it can be left on the ash heap?
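For reference, the kind of read that dropped to a few ms is essentially one equality join on the full wide key, again with made-up names:

```sql
-- A typical read: one equality join on the full wide key, rather than
-- stitching the row back together from many narrow lookup tables.
SELECT r.taken_at, e.value, e.error_bound
FROM reading AS r
JOIN reading_estimate AS e
  ON  e.site_id   = r.site_id
  AND e.device_id = r.device_id
  AND e.taken_at  = r.taken_at
WHERE r.site_id   = 42
  AND r.device_id = 7;
```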