I have been looking for advice for a while on how to handle a project I am working on, but to no avail. I am on my fourth iteration of improving an “application”: the first two iterations were in Excel, the third in Access, and now it is in Visual Studio. The field is manufacturing.
The basic idea is that I take read-only data from a massive Sybase server, filter it, and create much smaller tables in Access daily (using delete and append queries), and then do a bunch of processing. More specifically, I use a series of queries to either combine data from multiple tables or group data in specific ways (aggregate functions), and then place the results into a table so I can sort and manipulate the data using a DAO.Recordset and run multiple custom algorithms. This process is repeated throughout the database until a set of relevant tables has been created.
Many times I will create a field in a query with a placeholder value such as 1.1, so that after appending to a table I can store the output of the algorithms in that field. So as the process continues, the number of fields in the tables changes.
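For illustration, here is roughly what one of these daily delete-and-append steps would look like translated to SQL Server syntax (all table and column names are made up for the example, including the placeholder field that my algorithms later overwrite):

```sql
-- Replaces the Access "delete query" step.
TRUNCATE TABLE dbo.DailyMachineSummary;

-- Replaces the Access "append query" step: filter and aggregate
-- the raw Sybase extract into a much smaller working table.
INSERT INTO dbo.DailyMachineSummary
        (MachineId, RunDate, TotalUnits, ScrapRate, AlgorithmResult)
SELECT  r.MachineId,
        CAST(r.ReadingTime AS date),
        SUM(r.UnitsProduced),
        SUM(r.UnitsScrapped) * 1.0 / NULLIF(SUM(r.UnitsProduced), 0),
        1.1   -- placeholder value, overwritten later by a custom algorithm
FROM    staging.SybaseReadings AS r
WHERE   r.ReadingTime >= DATEADD(day, -1, SYSDATETIME())
GROUP BY r.MachineId, CAST(r.ReadingTime AS date);
```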
The overall application consists of 4 “back-end” databases linked together on a shared drive, with various outputs (either front-end Access applications or Excel).
So the question is: is this a common way for data-driven applications to work?
Each back-end database is updated with fresh data daily; updating takes around 10 seconds for three of them and about 2 minutes for the fourth.
Project objectives: I want to move to SQL Server soon. The front end will be a web application (I know basic web development and like the administrative flexibility), and Visual Studio will be the IDE with C#/.NET.
Should these algorithms be run “inside the database,” or as a series of C# functions on each server request? I know you are not supposed to store data in a database unless it is an actual data point, and in Access I have many columns that just hold calculations from algorithms in VBA.
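To make the dilemma concrete: this sketch (with made-up names) shows the kind of derived value I currently store in a column, computed on read through a view instead, so that only actual data points live in the table:

```sql
-- dbo.DailyMachineTotals holds only raw data points.
-- ScrapRate is derived at query time rather than stored by an algorithm.
CREATE VIEW dbo.vMachineSummary AS
SELECT  MachineId,
        RunDate,
        TotalUnits,
        UnitsScrapped,
        UnitsScrapped * 1.0 / NULLIF(TotalUnits, 0) AS ScrapRate
FROM    dbo.DailyMachineTotals;
```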
The truth is, I have seen multiple professional Access applications, and I have never seen one with this complexity, or that does anything close to what mine does (for better or worse). But I know some professional software applications are 1000 times better than mine.
I have been completely on my own and need some guidance on how to approach this project the right way. What would be a good approach to follow?
In short, it will really depend on how end users will use the application, and on the requirements.
However, you may benefit strongly from two general approaches that are part of enterprise-level application development, which I have experienced so far. Namely, you may build a multi-tier application with 3 layers: a Presentation Layer, a Business Layer and a Data Layer. Or, as an alternative, you may have 2 layers, where you shift the Business Layer into the Data Layer.
- Presentation Layer: responsible for the “User Interface Process Components”
- Business Layer: responsible for Business Components, Workflows, Entities and Service Interfaces
- Data Layer: mainly responsible for accessing data through Data Access Components, Service Gateways and Foundation Services (like Security, Operational Management and Communication).
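In the 2-layer alternative, where the Business Layer moves into the Data Layer, the business rules typically live in stored procedures that the presentation layer calls directly. A minimal sketch (procedure, table and column names are hypothetical):

```sql
-- The presentation layer calls this procedure and never touches
-- the underlying tables; the rule "one summary per run date"
-- is enforced here, in the data layer.
CREATE PROCEDURE dbo.GetDailySummary
    @RunDate date
AS
BEGIN
    SET NOCOUNT ON;
    SELECT MachineId, TotalUnits, ScrapRate
    FROM   dbo.DailyMachineSummary
    WHERE  RunDate = @RunDate;
END;
```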
As a universal truth, one diagram is worth several pages of description.
References:
There are several places where you can get some thoughts on how to design your application in a scalable way. One reference point is Enterprise Solution Patterns Using Microsoft .NET.
As this sounds like a reporting application, another approach to consider, to simplify the database side, is to use some of the principles of data warehousing. This basically means denormalizing the database into “facts” and “dimensions” (a star schema). This makes it simpler to generate reports and to handle changes in the source database (e.g. just add another dimension), and it also opens things up to using OLAP tools for reporting. It could also reduce the number of databases/tables you have to worry about, as they do not need to be customized.
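For example, a minimal star schema for this kind of production data might look like the following (all names are illustrative, not taken from the question):

```sql
-- Dimension tables describe the "who/what/when" of each measurement.
CREATE TABLE dbo.DimMachine (
    MachineKey  int IDENTITY PRIMARY KEY,
    MachineId   varchar(20) NOT NULL,
    WorkCenter  varchar(50) NOT NULL
);

CREATE TABLE dbo.DimDate (
    DateKey     int PRIMARY KEY,      -- e.g. 20240131
    FullDate    date NOT NULL,
    FiscalWeek  int NOT NULL
);

-- The fact table holds only the numeric measurements.
CREATE TABLE dbo.FactProduction (
    MachineKey    int NOT NULL REFERENCES dbo.DimMachine (MachineKey),
    DateKey       int NOT NULL REFERENCES dbo.DimDate (DateKey),
    UnitsProduced int NOT NULL,
    UnitsScrapped int NOT NULL
);

-- A report is then a simple join across the star:
SELECT d.FiscalWeek, m.WorkCenter, SUM(f.UnitsProduced) AS Units
FROM   dbo.FactProduction f
JOIN   dbo.DimMachine m ON m.MachineKey = f.MachineKey
JOIN   dbo.DimDate    d ON d.DateKey    = f.DateKey
GROUP BY d.FiscalWeek, m.WorkCenter;
```

Adding a new reporting need then usually means adding a dimension or a measure, rather than reworking a chain of custom tables.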