In enterprise application development, business logic and data access logic are often defined in separate tiers. This presents developers with the challenge of exchanging data between these tiers.
There are various options available to programmers: XML, DataReaders, DataSets/DataTables, and custom objects (business objects and data transfer objects).
Let's explore the pros and cons of these approaches below.
Data Transfer Techniques
XML is data-source independent, and relational data is easy to represent in the XML format. The .NET Framework provides support for quite a few XML technologies, and XPath queries can be used in code to query data from the XML document. However, this approach poses a lot of challenges in terms of parsing the XML document. Business components end up containing more XML parsing logic than business logic, which becomes a maintenance overhead.
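As a rough sketch, assuming an orders.xml document with Order elements carrying Id and Status attributes (file and element names are illustrative only), an XPath query might look like this:

using System;
using System.Xml;

XmlDocument doc = new XmlDocument();
doc.Load("orders.xml");

// Select all pending orders with an XPath expression.
XmlNodeList pendingOrders = doc.SelectNodes("/Orders/Order[@Status='Pending']");
foreach (XmlNode order in pendingOrders)
{
    Console.WriteLine(order.Attributes["Id"].Value);
}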
DataReaders are data-source dependent, even though they provide a fast, forward-only, read-only API for reading data. Attaching business rules to the columns (ordinals) in a DataReader is not possible, and the connection to the data source is kept open until the reader is closed.
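A minimal sketch of the forward-only, read-only DataReader API (connection string, query, and column names are illustrative only):

using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT CustomerId, Name FROM Customers", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Columns are read by ordinal; the connection stays open until the reader is closed.
            int id = reader.GetInt32(0);
            string name = reader.GetString(1);
        }
    }
}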
DataSet / DataTable provides a container for storing data in a tabular format. It is an in-memory object that provides data retrieval and filter operations. Data aggregation, an expression language, and computed columns are built in, and these are the strong points of the DataSet. It is designed to represent not only database (relational) data but also any data source: the file system, external services, and in-memory data. It also supports defining hierarchical data. A DataSet may contain one or more DataTables, and we can create relations between these DataTables at design time.
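A minimal sketch of building an in-memory DataSet with two related DataTables (table and column names are illustrative only):

using System.Data;

DataSet ds = new DataSet("Sales");

DataTable customers = ds.Tables.Add("Customers");
customers.Columns.Add("CustomerId", typeof(int));
customers.Columns.Add("Name", typeof(string));

DataTable orders = ds.Tables.Add("Orders");
orders.Columns.Add("OrderId", typeof(int));
orders.Columns.Add("CustomerId", typeof(int));
orders.Columns.Add("Total", typeof(decimal));

// Parent/child relation between the two tables.
ds.Relations.Add("CustomerOrders",
    customers.Columns["CustomerId"],
    orders.Columns["CustomerId"]);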
We can create dynamic views of the data stored in the DataSet. A DataView helps in creating different sort orders and filtering data by expressions.
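A short sketch of a DataView applying a filter expression and a sort order over the orders table from the previous sketch:

using System;
using System.Data;

DataView view = new DataView(orders);
view.RowFilter = "Total > 100";   // filter by expression
view.Sort = "Total DESC";         // dynamic sort order

foreach (DataRowView row in view)
{
    Console.WriteLine(row["OrderId"]);
}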
Even though the DataSet brings a lot of good features for quick-to-market enterprise applications, it also comes with its own overheads. Let's explore them below.
DataSets come in two flavors: untyped and typed.
Consumers of untyped DataSets must know the schema at design time, so they become very hard to maintain when the schema of the underlying data source changes. An untyped DataSet merely represents data; it does not represent any domain objects or business entities, so we need additional classes to manipulate it. Adding business-rule methods is not feasible with untyped DataSets.
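With an untyped DataSet, columns are addressed by string names and values need manual casting, as in this small sketch (assuming the ds variable and "Orders" table from the earlier sketch; names are illustrative only):

DataRow row = ds.Tables["Orders"].Rows[0];
decimal total = (decimal)row["Total"];     // fails only at run time if "Total" is renamed or retyped
string id = row["OrderId"].ToString();     // no compile-time check of column names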
Alternatively, typed DataSets provide the flexibility of defining the columns and their types at design time using an XML Schema (XSD). This gives better performance than untyped DataSets, because type checks and conversions are not needed at run time. A typed DataSet inherits all the members of DataSet. Since the typed DataSet is generated from an XML Schema, it is easier to give friendlier names to the columns. DataSets cannot hold specific business rules by themselves, but we can create a partial class with business-rule validation to encapsulate the typed DataSet in the business layer. Each typed DataSet generates a partial class for the DataSet, a partial nested class for each table in the DataSet, and a partial class for each table's DataRow. This is how we create business-aware typed DataSets.
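A sketch of attaching a business rule through a partial class, assuming a typed DataSet named OrdersDataSet generated from an XSD with an Orders table; all names are illustrative and this only compiles alongside the generated classes:

public partial class OrdersDataSet
{
    public partial class OrdersRow
    {
        // Hypothetical business rule added to the generated row class.
        public bool IsValid()
        {
            return Quantity > 0 && UnitPrice >= 0m;
        }
    }
}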
A DataSet can be serialized and sent over the network or persisted to the file system. The .NET 1.1 Framework introduced XML streaming for serializing the data in XML notation; these XML streams carried the data padded with schema information, so performance suffered as the size of the DataSet grew. The .NET 2.0 Framework introduced a binary format for the serialization, which saves a lot of bandwidth. We can further improve performance by excluding the schema.
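A rough sketch of opting into the binary format introduced in .NET 2.0 before serializing (the helper returning the populated DataSet is hypothetical):

using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

DataSet ds = GetOrdersDataSet();                 // hypothetical helper returning a populated DataSet
ds.RemotingFormat = SerializationFormat.Binary;  // binary instead of XML when serialized

using (FileStream fs = new FileStream("orders.bin", FileMode.Create))
{
    new BinaryFormatter().Serialize(fs, ds);
}

// XML alternative that excludes the schema information:
ds.WriteXml("orders.xml", XmlWriteMode.IgnoreSchema);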
Now let’s see how we can encapsulate data using custom objects.
One option is to encapsulate both data and business methods in a single business object. A commonly made design mistake is to pass these business objects directly into the data access layer so they can be populated there. The business object library already references the data access layer assembly to make method calls into it, so the data access component must not also reference the business component; doing so would lead to a circular reference.
Another option is to encapsulate data in a separate custom class called a Data Transfer Object (DTO). DTOs, formerly known as value objects, are a design pattern used to transfer data between software application subsystems. DTOs are often used in conjunction with data access objects to retrieve data from a database. The difference between data transfer objects and business objects or data access objects is that a DTO has no behavior except for the storage and retrieval of its own data (accessors and mutators). Custom objects should implement collection interfaces (IEnumerable) to store multiple values, and should implement the ISerializable interface if the object has to be serialized.
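A minimal DTO sketch, holding only state plus accessors/mutators with no behavior, and marked serializable so it can cross layer and process boundaries (the OrderDto name and its properties are illustrative only):

using System;

[Serializable]
public class OrderDto
{
    public int OrderId { get; set; }
    public int CustomerId { get; set; }
    public decimal Total { get; set; }
}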
DTOs are designed to move data between the different layers of an enterprise distributed application. They can be created in a separate project/assembly and referenced from the Presentation, Business, and Data Access layers. We can also use any of the ORM tools to construct the DTOs.
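A sketch of a data access method mapping query results into DTOs so that the upper layers never see ADO.NET types (connection string, query, and column names are illustrative only):

using System.Collections.Generic;
using System.Data.SqlClient;

public static List<OrderDto> GetOrders(string connectionString)
{
    List<OrderDto> orders = new List<OrderDto>();
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT OrderId, CustomerId, Total FROM Orders", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Map each row into a DTO that can be passed to the business layer.
                OrderDto dto = new OrderDto();
                dto.OrderId = reader.GetInt32(0);
                dto.CustomerId = reader.GetInt32(1);
                dto.Total = reader.GetDecimal(2);
                orders.Add(dto);
            }
        }
    }
    return orders;
}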
Conclusion
Both typed DataSets and custom entities can be used when building systems, and both accomplish the same objective. DataSets are best suited for small applications or prototyping; for a small application, business entities add complexity. Custom objects may be the best choice for large and complex business applications: they are easier to maintain and improve the readability of the code.
Reposted from: http://www.dotnetfunda.com/articles/article603-data-transfer-between-business-and-data-access-components-in-enterprise-app.aspx