I'm working with a SQL Server database with 1000+ tables, another few hundred views, and several thousand stored procedures. We are looking to start using Entity Framework for our newer projects, and we are working on our strategy for doing so. The thing I'm hung up on is how best to split the tables into different models (EDMX or DbContext if we go code first). I can think of a few strategies right off the bat:
Split by schema
We have our tables split across probably a dozen schemas. We could do one model per schema. This isn't perfect, though, because dbo still ends up being very large, with 500+ tables / views. Another problem is that certain units of work will end up having to do transactions that span multiple models, which adds to complexity, although I assume EF makes this fairly straightforward.
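For what it's worth, I'm assuming a cross-model transaction would look roughly like the sketch below, wrapping two contexts in one TransactionScope. SalesContext, BillingContext, Order, and Invoice are made-up names purely for illustration (EF6-style code first):

```csharp
using System.Data.Entity;
using System.Transactions;

public class Order   { public int Id { get; set; } public decimal Total { get; set; } }
public class Invoice { public int Id { get; set; } public int OrderId { get; set; } }

// Hypothetical schema-scoped contexts, one per schema/model.
public class SalesContext : DbContext   { public DbSet<Order> Orders { get; set; } }
public class BillingContext : DbContext { public DbSet<Invoice> Invoices { get; set; } }

public static class CrossModelUnitOfWork
{
    public static void PlaceOrderWithInvoice()
    {
        using (var scope = new TransactionScope())
        using (var sales = new SalesContext())
        using (var billing = new BillingContext())
        {
            var order = new Order { Total = 100m };
            sales.Orders.Add(order);
            sales.SaveChanges();

            billing.Invoices.Add(new Invoice { OrderId = order.Id });
            billing.SaveChanges();

            // Both SaveChanges calls commit together, or neither does.
            // Caveat: two separate connections inside one scope can escalate
            // this to a distributed (MSDTC) transaction.
            scope.Complete();
        }
    }
}
```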
Split by intent
Instead of worrying about schemas, split the models by intent. So we'd have different models for each application, project, module, or screen, depending on how granular we want to get. The problem I see with this is that there are certain tables that inevitably have to be used in every case, such as User or AuditHistory. Do we add those to every model (which I think violates DRY), or do they go into a separate model that every project uses?
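If we go code first, the best I've come up with so far for the shared-table problem is something like the sketch below: the entity class and its mapping live in one shared assembly, and each bounded context just registers that mapping. UserMap, OrderingContext, and ReportingContext are made-up names, and I'm assuming EF6 fluent configuration:

```csharp
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;

// Shared entity, defined once in a common assembly.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Shared mapping, also defined once.
public class UserMap : EntityTypeConfiguration<User>
{
    public UserMap()
    {
        ToTable("User", "dbo");
        HasKey(u => u.Id);
    }
}

// Each intent-specific context reuses the same class and mapping.
public class OrderingContext : DbContext
{
    public DbSet<User> Users { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Configurations.Add(new UserMap());
    }
}

public class ReportingContext : DbContext
{
    public DbSet<User> Users { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Configurations.Add(new UserMap());
    }
}
```

But I'm not sure whether duplicating the registration per context is considered acceptable, or whether the shared tables really belong in their own model.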
Don't split at all - one giant model
This is obviously simple from a development perspective, but from my research and my intuition it seems like it could perform terribly at design time, at compile time, and possibly at run time.
What is the best practice for using EF against such a large database? Specifically, what strategies do people use to design models against this volume of DB objects? Are there options I'm not thinking of that would work better than the ones above?
Also, is this a problem in other ORMs such as NHibernate? If so, have they come up with any better solutions than EF has?