I am planning to standardize the way we create solutions for our new projects.
Currently we are using a 3-tier architecture, where we have a ClassLibrary project that includes our Data Access Layer and Business Layer.
Something like:
Solution ClassLibrary
  > ClassLibrary (Project)
      > DAL (folder)
          > DAL Classes
      > BAL (folder)
          > BAL Classes
This Class Library DLL is then referenced by our presentation layer projects, which are the applications (web/desktop).
Something like:
Solution WebUniversitySystem
  > Libraries (folder)
      > ClassLibrary.dll
  > WebUniversitySystem (Project)
      > References ClassLibrary.dll
      > Pages etc...
Now what I am planning to do is something like:
Solution WebUniversitySystem
  > DataAccess (Project)
  > BusinessLayer (Project)
      > References DataAccess
  > WebUniversitySystem (Project)
      > References BusinessLayer
      > Pages etc...
Is this OK? Or is there a better approach that we could follow?
That’s a pretty good approach. You’ve separated the concerns into projects and referenced the projects where they are used. That’s good.
Since you’ve tagged this Visual Studio, I’m going to share how I structure my .NET software projects.
Typically I’ll have these three projects, each contained within a single Visual Studio solution (.sln):
PointOfSale
|
|--PointOfSale.Domain (*A class library project. Typically Entity Framework models etc.*)
|
|--PointOfSale.WebUI (*The UI project. Could be MVC3, or WPF, or whatever. Presentation.*)
|
|--PointOfSale.Tests (*The unit testing project. I place all my tests in here.*)
As far as references between the projects go, the .Domain project is referenced in both the UI project and the Tests project.
My main critique of your approach would be: don’t make your BL depend directly on your DAL. Put a layer of indirection in there, where the BL receives the DAL via IoC (or IPC, which is basically a form of IoC). Then the consumers of your BL get to choose which DAL it gets. If the consumer is a unit test, it hands the BL a mock DAL; if the consumer is that one annoying client who for some reason uses a CSV DAL you hate maintaining, it hands it that; most consumers give the BL the full enterprise DAL; and the mobile presentation layers (perhaps 20% of the consumers you implement) give the BL an in-memory DAL.
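To make that concrete, here is a minimal sketch of constructor injection; the names (IUserDal, InMemoryUserDal, UserService) are mine for illustration, not anything prescribed. The BL only ever sees the interface, and the consumer picks the implementation:

using System.Collections.Generic;

// Data model
public class User
{
    public string Name;
}

// The DAL contract the BL depends on; never a concrete DAL.
public interface IUserDal
{
    User GetUser(string name);
}

// One possible DAL: in-memory, handy for unit tests or a mobile presentation layer.
public class InMemoryUserDal : IUserDal
{
    private readonly Dictionary<string, User> _users = new Dictionary<string, User>();

    public void Add(User user) { _users[user.Name] = user; }

    public User GetUser(string name)
    {
        User user;
        return _users.TryGetValue(name, out user) ? user : null;
    }
}

// The BL receives whatever DAL its consumer chooses (constructor injection, a simple form of IoC).
public class UserService
{
    private readonly IUserDal _userDal;

    public UserService(IUserDal userDal)
    {
        _userDal = userDal;
    }

    public bool IsKnownUser(string name)
    {
        return _userDal.GetUser(name) != null;
    }
}

A SQL-backed DAL, the CSV one, or a test mock can each be handed to UserService without touching its code.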
I’m a personal fan of having a shared block of data models, two BLs, one DAL, and a presentation layer in the MVC pattern.
Why two BLs? There is always a body of logic at the DAL level for translating the requested data into a data model, and even more complexity going the other way, especially if you have any form of caching whatsoever. Usually people embed all of this in their DAL, but then they pay a maintenance penalty they wouldn’t have to pay if they made a separate layer above the DAL that receives DAL components via IoC and carries out that logic.
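As a rough sketch of what I mean by that DAL-facing BL (the names IOrderDal, OrderRow, OrderDalBl and the caching policy are mine, purely for illustration), it receives the DAL at construction and owns the row-to-model translation and caching that would otherwise end up buried inside the DAL:

using System.Collections.Generic;
using System.Linq;

// What the storage layer hands back: a raw, storage-shaped record.
public class OrderRow
{
    public int Id;
    public string TotalAsText;   // e.g. numbers stored as text in a legacy system
}

// The data model the rest of the application works with.
public class Order
{
    public int Id;
    public decimal Total;
}

// DAL interface: side effects (database, files, network) live behind this.
public interface IOrderDal
{
    IEnumerable<OrderRow> GetOrderRows(int customerId);
}

// The DAL-facing BL: given the DAL via IoC, it translates and caches.
public class OrderDalBl
{
    private readonly IOrderDal _dal;
    private readonly Dictionary<int, Order[]> _cache = new Dictionary<int, Order[]>();

    public OrderDalBl(IOrderDal dal)
    {
        _dal = dal;
    }

    public Order[] GetOrders(int customerId)
    {
        Order[] cached;
        if (_cache.TryGetValue(customerId, out cached))
            return cached;

        // translate storage-shaped rows into data models
        Order[] orders = _dal.GetOrderRows(customerId)
                             .Select(r => new Order { Id = r.Id, Total = decimal.Parse(r.TotalAsText) })
                             .ToArray();

        _cache[customerId] = orders;
        return orders;
    }
}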
The other BL, in my opinion, should then be idempotent and written in a functional style, with minimal (if any) side effects; side effects are what the DAL is for.
So the dependency graph would be (layer: things it depends on):
DataModels: nothing
DAL: nothing
DALInterfaces: nothing
DALBL: DALInterfaces, DataModels
DALBLInterfaces: DALInterfaces, DataModels
BL: DataModels
BLInterfaces: DataModels
Presentation: DataModels, *Interfaces, some IOC framework or service client framework that turns the interfaces into concrete implementations.
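Collapsed into one file, a compile-only skeleton of that graph might look like the following; in practice each group below would be its own project, and every name here is illustrative rather than prescriptive:

using System.Threading;

// DataModels: depends on nothing
public class User       { public string Name; }
public class MenuOption { public string Caption; }

// DALInterfaces: depends on nothing
public interface IMenuDal
{
    string[] GetMenuKeys(string userName);   // raw, storage-shaped result
}

// DALBLInterfaces: DALInterfaces, DataModels
public interface IUsersDalBl
{
    MenuOption[] GetMenuOptionsForUser(User user);
}

// BLInterfaces: DataModels
public interface ISecurityBl
{
    User GetUserFromThread(Thread thread);
}

// DALBL: given a concrete DAL at construction, it translates the DAL's output into data models
public class UsersDalBl : IUsersDalBl
{
    private readonly IMenuDal _dal;

    public UsersDalBl(IMenuDal dal) { _dal = dal; }

    public MenuOption[] GetMenuOptionsForUser(User user)
    {
        string[] keys = _dal.GetMenuKeys(user.Name);
        var options = new MenuOption[keys.Length];
        for (int i = 0; i < keys.Length; i++)
            options[i] = new MenuOption { Caption = keys[i] };
        return options;
    }
}

// Presentation: sees only DataModels and the interfaces; an IoC container (or plain
// hand wiring in a composition root) turns ISecurityBl / IUsersDalBl into concrete types.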
Then, in your presentation layer, the use would look something like:

public MenuOption[] GetMenuOptions(string username, string password)
{
    DataModels.User currentUser = _securityBl.GetUserFromThread(Thread.CurrentThread);
    // remember: no side effects in the BL layer, it just works from its inputs,
    // so it needs to be told the thread

    return _usersDalBl.GetMenuOptionsForUser(currentUser);
    // this DAL-BL was given a concrete DAL at construction; again, no *direct* side
    // effects here, only the DAL it accesses has the code with side effects
}
Note that this makes the topmost layer (which should be the thinnest and have the least substance) the most highly dependent. I think that makes sense, though: its main purpose is to tie together whatever pieces of functionality it needs, while all the real functionality lives in the underlying libraries.
The interface projects exist for unit testing as much as anything, though the modularity may always be needed later on. My preference is that IPC is used for access to both BLs at least, and for the DAL as well, though that should be a more secured IPC interface. The form of IPC is irrelevant; it could just be a memory-mapped file or anything. I just think it’s an important boundary to maintain, for that time later on when someone writes a better DAL than yours and all you have to do is run their DAL process instead of yours. Though maybe I’m just being a touch nutty about IPC; it’s hardly necessary if you know you’ll be able to maintain the boundaries between layers without needing any protections to safeguard against accidents, mistakes, or other people.