I've been reading about NHibernate and Microsoft's Entity Framework for performing Object-Relational Mapping in my data access layer. I'm interested in the benefits of using an established ORM framework, but I'm curious about its performance cost compared with my current approach of plain XML serialization and deserialization.
Right now, I develop stored procedures in Oracle and SQL Server that use XML types for input or output parameters and return or shred XML as needed. I use a custom database command object that uses generics to deserialize the XML results into a specified serializable class.
By combining generics, XML (de)serialization, and Microsoft's Data Access Application Block (DAAB), I've got a process that's fairly simple to develop against regardless of the data source. Moreover, since I exclusively use stored procedures to perform database operations, I'm largely insulated from changes to the underlying data structures.
Here's an over-simplified example of what I've been doing.
static void Main() {
    testXmlClass test = new testXmlClass(1);
    test.Name = "Foo";
    test.Save();
}
// Example Serializable Class ------------------------------------------------
[XmlRootAttribute("test")]
public class testXmlClass {
    [XmlElement(Name = "id")]
    public int ID { get; set; }

    [XmlElement(Name = "name")]
    public string Name { get; set; }

    // parameterless constructor required by XmlSerializer
    public testXmlClass() { }

    // create an instance of the class loaded with data
    // (id would be passed to the procedure as a parameter; omitted here)
    public testXmlClass(int id) {
        GenericDBProvider db = new GenericDBProvider();
        testXmlClass loaded = db.ExecuteSerializable<testXmlClass>("myGetByIDProcedure");
        ID = loaded.ID;
        Name = loaded.Name;
    }

    // save the class to the database...
    public void Save() {
        GenericDBProvider db = new GenericDBProvider();
        db.AddInParameter("myInputParameter", DbType.Xml, this);
        db.ExecuteSerializableNonQuery("mySaveProcedure");
    }
}
// Database Handler ----------------------------------------------------------
class GenericDBProvider {
    public T ExecuteSerializable<T>(string commandText) where T : class {
        XmlSerializer xml = new XmlSerializer(typeof(T));
        // connection and command code are assumed for the purposes of this
        // example; commandResults is the XML stream returned by the procedure.
        // The final result basically just comes down to...
        return xml.Deserialize(commandResults) as T;
    }

    public void ExecuteSerializableNonQuery(string commandText) {
        // once again, connection and command code are assumed...
        // basically, just execute the command along with the specified
        // parameters, which have already been serialized.
    }

    public void AddInParameter(string name, DbType type, object value) {
        // handle serialization for serializable classes
        if (type == DbType.Xml && !(value is string)) {
            StringWriter w = new StringWriter();
            XmlSerializer x = new XmlSerializer(value.GetType());
            x.Serialize(w, value);
            w.Close();
            // store the serialized object in a DbParameterCollection
            // accessible to the other methods.
        } else {
            // handle all other parameter types
        }
    }
}
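Stripped of the database plumbing, the core of what my command object does is just an XmlSerializer round trip: serialize a class to the XML a procedure expects, and deserialize the XML a procedure returns. Here's a self-contained sketch of that round trip (class and element names are illustrative, not from my real code):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

[XmlRoot("test")]
public class Test {
    [XmlElement("id")]
    public int ID { get; set; }

    [XmlElement("name")]
    public string Name { get; set; }
}

public static class XmlRoundTrip {
    // serialize an object into the XML string a stored procedure would receive
    public static string ToXml<T>(T value) {
        var serializer = new XmlSerializer(typeof(T));
        using (var writer = new StringWriter()) {
            serializer.Serialize(writer, value);
            return writer.ToString();
        }
    }

    // deserialize the XML a stored procedure would return
    public static T FromXml<T>(string xml) where T : class {
        var serializer = new XmlSerializer(typeof(T));
        using (var reader = new StringReader(xml)) {
            return serializer.Deserialize(reader) as T;
        }
    }

    public static void Main() {
        var original = new Test { ID = 1, Name = "Foo" };
        string xml = ToXml(original);
        Test copy = FromXml<Test>(xml);
        Console.WriteLine(copy.ID + " " + copy.Name); // 1 Foo
    }
}
```

Every trip to the database pays this reflection and serialization cost on top of the query itself, which is part of what I'm trying to weigh against an ORM.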
I'm starting a new project that will rely heavily on database operations. I'm very curious to know whether my current practices will hold up in a high-traffic situation, and whether I should switch to NHibernate or Microsoft's Entity Framework to perform what essentially seems to boil down to the same thing I'm currently doing.
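For comparison, my understanding is that the equivalent load-modify-save cycle in NHibernate would look roughly like the sketch below. This assumes a mapped Test entity and an already-configured ISessionFactory; it's my reading of the API, not code I've run:

```csharp
// assumes a Test entity with a mapping file and a configured ISessionFactory
using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction()) {
    Test test = session.Get<Test>(1);  // SELECT generated by NHibernate
    test.Name = "Foo";
    session.SaveOrUpdate(test);        // UPDATE generated at flush/commit
    tx.Commit();
}
```

The obvious difference from my approach is that the SQL is generated rather than going through my stored procedures, which is part of what I'm unsure about.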
I appreciate any advice you may have.