In our company we have two large systems: a scoring system and a CRM. Both have large SQL Server databases running on Windows servers on the intranet. These databases are managed by another department, with which we have direct contact.
Neither of those systems has any external API except for some stored procedures (hereinafter "SPs"), which we use to input all the data from our (web-dev team's) two applications: a public web site and a web-based CRM.
There are problems which make this kind of integration a nightmare:
- There are no proper copies of the databases for testing. There are some occasional copies, which may or may not be up to date and are synced manually, which makes them practically useless in "ASAP situations" (i.e., all the time).
- All sorts of modifications to stored procedure code are made straight on the production databases (!!!), which obviously causes us a lot of problems.
- The stored procedure code is undocumented, poorly commented, and mostly not well written. Suffice it to say that I've spent the last two days "decrypting" two SPs, one of which is ~2400 SLOC long and is full of magic numbers and variables like `@str1`.
To make things easier and safer:

- We built a kind of "API server": a SOAP service that calls the SPs and returns whatever the queries return, together with some nice status messages. This helps us decouple our apps from the chaos, but it barely simplifies our task when it comes to writing and testing new functionality.
- We store some IDs and other constants from those databases in our own databases to avoid magic numbers in our code wherever possible, but we still have to collect and fill in all the data manually with `sqlcmd` (which is very uncomfortable because of the Windows `cmd` window's inability to handle copying, pasting and displaying text).
I wish we could have a proper process for testing our interaction with those databases, but all we have today is yet another copy of the CRM database, which we have to sync manually somehow.
My questions are:
- How can we make all changes made on the production databases propagate automatically into testing copies? I can think of something like a `.bat` script, but maybe there are better tools. I doubt (based on previous experience) that we can ask the other department to set up replication, so I'm asking for something we can build ourselves and convince the other department to use.
- How can we easily sync some of the data without copying all of it? For example, we need an up-to-date copy of all dictionaries/taxonomies, but not the client data (which is a lot of data and would take significant time to copy). Again, I'm thinking of a script, but maybe there is a tool with just a bunch of checkboxes to select from.
- What is the best approach to testing the interactions that go through our "API server"? I mean, suppose I made a SOAP request with some fake data and no errors occurred – how can I be sure the data is actually there in that external DB and is consistent? It is a sort of black-/grey-box testing, and I don't know how to deal with it.
> How can we make all the changes made on production databases automatically propagate into testing copies? I can think of something like a `.bat` script, but maybe there are better tools. I doubt (based on previous experience) that we can ask the other department to set up replication, so I'm asking for something we can build ourselves and convince the other department to use.
Use the standard backup/restore mechanisms provided by your database vendor. If the other department already makes an automatic daily backup of the production database, they can surely produce a daily copy for testing using the same mechanism. See http://msdn.microsoft.com/de-de/library/ms187510.aspx for information about the options for backing up an MS SQL Server.
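As a minimal sketch of what "something we can make ourselves" could look like: a small script that generates the T-SQL for restoring a nightly production backup over the testing copy, which you then hand to `sqlcmd` from a scheduled task. The database and file names here are placeholders, not anything from your environment.

```python
# Sketch: generate a T-SQL batch that restores a production backup over a
# testing database. All names are placeholders -- adjust to your setup.

def build_restore_script(db_name: str, backup_file: str) -> str:
    """Return T-SQL that restores `backup_file` over the test DB `db_name`."""
    return f"""
ALTER DATABASE [{db_name}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE [{db_name}]
    FROM DISK = N'{backup_file}'
    WITH REPLACE;
ALTER DATABASE [{db_name}] SET MULTI_USER;
""".strip()

if __name__ == "__main__":
    sql = build_restore_script("CRM_Test", r"\\backupserver\sql\CRM_full.bak")
    # Write it to a file so a scheduled task can run:
    #   sqlcmd -S testserver -i restore_crm.sql
    with open("restore_crm.sql", "w") as f:
        f.write(sql)
    print(sql)
```

If the other department can only be convinced to drop a `.bak` file on a share each night, this is enough to keep the testing copy fresh without any replication setup on their side.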
> How can we easily sync some data without copying all the data?
Well, the optimal way depends a lot on the data model, the number of records and the complexity of the schema. Copying only a subset of the records may require a specifically designed program that transfers the data in the correct order, without violating constraints. Perhaps it is easier to do a full backup/restore first, and afterwards delete everything you don't need in the copy, keeping the rest for further processing. That may be too slow in your case, or fast enough – impossible to know without further details. Another idea is to use differential backups; see the SQL Server documentation on differential backups for further details.
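For the "only the dictionaries/taxonomies" case, one simple approach is to maintain a whitelist of the small lookup tables and generate `bcp` export/import command pairs for them. The table and server names below are made-up examples, not names from your databases.

```python
# Sketch: build bcp command lines that export a whitelist of small
# dictionary/taxonomy tables from production and load them into the test
# copy. Server, database and table names are placeholders.

DICTIONARY_TABLES = ["dbo.Countries", "dbo.Currencies", "dbo.ProductTypes"]

def sync_commands(src_server, src_db, dst_server, dst_db, tables):
    """Return (export, import) bcp command pairs, one pair per table.

    -S = server, -T = trusted (Windows) connection, -n = native format.
    """
    pairs = []
    for table in tables:
        datafile = table.replace(".", "_") + ".dat"
        out_cmd = f"bcp {src_db}.{table} out {datafile} -S {src_server} -T -n"
        in_cmd = f"bcp {dst_db}.{table} in {datafile} -S {dst_server} -T -n"
        pairs.append((out_cmd, in_cmd))
    return pairs

if __name__ == "__main__":
    for out_cmd, in_cmd in sync_commands(
            "prodsrv", "CRM", "testsrv", "CRM_Test", DICTIONARY_TABLES):
        print(out_cmd)
        print(in_cmd)
```

Since the dictionaries are small, re-copying them wholesale every night is cheap, and the whitelist doubles as documentation of which tables you actually depend on. (You may need to truncate the target tables first, and mind foreign keys between dictionary tables.)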
> but still we have to collect and fill in all the data manually with `sqlcmd` (which is very uncomfortable because of the Windows `cmd` window's inability to handle copying, pasting and displaying text)
If `cmd` is not powerful enough, why don't you use something more capable, like PowerShell?
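Or sidestep the console entirely and drive `sqlcmd` from a script, writing results straight to a file so nothing ever needs to be copied out of a `cmd` window. A minimal sketch, with placeholder server/table names:

```python
# Sketch: build a sqlcmd invocation that writes query results to a file.
# -S = server, -d = database, -Q = run query and exit, -o = output file,
# -W = trim trailing spaces, -s = column separator.
# Server, database and query below are placeholders.

def sqlcmd_args(server, database, query, outfile):
    """Return the argument list for a non-interactive sqlcmd run."""
    return ["sqlcmd", "-S", server, "-d", database,
            "-Q", query, "-o", outfile, "-W", "-s", ";"]

if __name__ == "__main__":
    args = sqlcmd_args("prodsrv", "CRM",
                       "SELECT Id, Name FROM dbo.Countries",
                       "countries.csv")
    print(" ".join(args))
    # When sqlcmd is on PATH, execute it with:
    #   import subprocess; subprocess.run(args, check=True)
```

The output lands in a semicolon-separated file you can open in any editor or spreadsheet, which also makes collecting those IDs and constants repeatable instead of a manual copy/paste exercise.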
> suppose I made a request via SOAP with some fake data and no errors occurred – how can I be sure the data is there in that external DB and it is consistent
Simple – query for that data, either via your SOAP API or via an SP added to your test database especially for testing purposes. Of course, deploying that SP code must be reproducible, so that you can apply it automatically to your testing copy every time you refresh it from the production DB.
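The round-trip pattern looks roughly like this. In the sketch below, `sqlite3` stands in for the external SQL Server and the `ApiServer` class is a stand-in for your real SOAP "API server"; only the pattern – write through the API, then verify through an independent query – is the point.

```python
# Sketch of black/grey-box round-trip testing: push fake data through the
# API, then read it back via a separate query path and assert consistency.
# sqlite3 is only a stand-in here for the external SQL Server.
import sqlite3

class ApiServer:
    """Stand-in: in reality this would call the SP via your SOAP server."""
    def __init__(self, conn):
        self.conn = conn

    def create_client(self, client_id, name):
        self.conn.execute(
            "INSERT INTO clients (id, name) VALUES (?, ?)",
            (client_id, name))
        self.conn.commit()
        return {"status": "OK"}

def test_create_client_round_trip():
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE clients (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
    api = ApiServer(conn)

    # 1. Write known fake data through the API.
    result = api.create_client(42, "Test Client")
    assert result["status"] == "OK"

    # 2. Verify independently: query the database directly, not via the API.
    row = conn.execute(
        "SELECT name FROM clients WHERE id = ?", (42,)).fetchone()
    assert row is not None, "record never reached the database"
    assert row[0] == "Test Client", "record is inconsistent"

if __name__ == "__main__":
    test_create_client_round_trip()
    print("round trip OK")
```

Against the real system, step 2 would be a read-only SP (or a direct `SELECT` through `sqlcmd`) on the testing copy – the important thing is that the verification path is independent of the write path, so a bug in the API server cannot hide itself.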