We are developing a data model for a marketing database that will import transaction, customer, inventory, etc. files, and the directive is ONE process that works for every client. We have been told that every client will have different import layouts and different columns identifying each table's primary key.
Our initial idea is to get definitions from each client of which columns make each record unique, store those in a mapping table, and then have lookup tables that automatically translate those primary keys into internal surrogate keys for every destination table, so that every table conforms to an integer primary key no matter how many columns/types make up the real PK.
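A rough sketch of what we have in mind, assuming SQL Server (we are an SSIS shop); all table and column names below are made up for illustration:

```sql
-- Which client columns make up the natural key of each destination table.
CREATE TABLE ClientKeyMapping (
    ClientId         int     NOT NULL,
    DestinationTable sysname NOT NULL,   -- e.g. 'Customer', 'Transaction'
    KeyOrdinal       tinyint NOT NULL,   -- position within the composite natural key
    SourceColumn     sysname NOT NULL,   -- client column that forms part of the natural key
    PRIMARY KEY (ClientId, DestinationTable, KeyOrdinal)
);

-- One lookup table per destination table: natural key -> internal surrogate key.
CREATE TABLE CustomerKeyLookup (
    CustomerSK int IDENTITY(1,1) PRIMARY KEY,   -- internal surrogate key
    ClientId   int           NOT NULL,
    NaturalKey nvarchar(400) NOT NULL,          -- client key values concatenated into one string
    CONSTRAINT UQ_CustomerKeyLookup UNIQUE (ClientId, NaturalKey)
);

-- The destination table then keys off the surrogate only.
CREATE TABLE Customer (
    CustomerSK int NOT NULL PRIMARY KEY
        REFERENCES CustomerKeyLookup (CustomerSK),
    Email nvarchar(255) NULL   -- attributes here are just placeholders
);
```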
The first major problem I saw was when/if we have to map back through the lookup tables to get data that we did not store in the main data model, but I have been assured that anything we ever want to query will be duplicated in both the lookup and main tables, so that should not be a concern.
This kind of flexibility seems like it will impose some serious limitations on:
Technology stack (there is no way to dynamically map these import files in SSIS, so we would need a lot of dynamic SQL or Java/C#; see the sketch after this list)
Scalability (based on the previous concern and our initial testing, this would be difficult to scale without performance problems)
Complexity (for example, we are already running into complex code changes when we try to implement all of these tables with historical change logging while maintaining the mapping to the lookup tables)
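To illustrate the first point: because the key columns are only known at run time, even a basic lookup load ends up as dynamic SQL assembled from the mapping table. A rough sketch against the illustrative tables above (SQL Server 2017+ syntax; staging.CustomerImport is a made-up staging table):

```sql
DECLARE @ClientId int = 42;
DECLARE @keyExpr  nvarchar(max);

-- Build the natural-key expression for this client from the mapping table,
-- e.g. CONVERT(nvarchar(100), s.[AcctNo]) + N'|' + CONVERT(nvarchar(100), s.[Region])
SELECT @keyExpr = STRING_AGG(
           N'CONVERT(nvarchar(100), s.' + QUOTENAME(SourceColumn) + N')',
           N' + N''|'' + ') WITHIN GROUP (ORDER BY KeyOrdinal)
FROM   ClientKeyMapping
WHERE  ClientId = @ClientId
  AND  DestinationTable = N'Customer';

-- Insert any natural keys we have not seen before, generating new surrogate keys.
DECLARE @sql nvarchar(max) = N'
INSERT INTO CustomerKeyLookup (ClientId, NaturalKey)
SELECT @ClientId, ' + @keyExpr + N'
FROM   staging.CustomerImport AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   CustomerKeyLookup AS k
                   WHERE  k.ClientId = @ClientId
                   AND    k.NaturalKey = ' + @keyExpr + N');';

EXEC sp_executesql @sql, N'@ClientId int', @ClientId = @ClientId;
```

Every import, every change-log write, and every query that reaches back to client data needs a variation of this pattern.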
My question - is this feasible, or is there another obvious solution we are missing?