What I am doing is creating a DB project, importing a DB, and then creating data scripts for each table in the post-build (so that the post-build can populate the tables after the schema is generated).
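For context, what gets generated looks roughly like this (the file, table, and column names here are just placeholders, not my real ones):

    -- Script.PostDeployment.sql: pulls in one data script per table
    :r .\Data\Populate_Address.sql
    :r .\Data\Populate_Customer.sql

    -- Data\Populate_Address.sql: one INSERT per row, tens of thousands of lines like these
    INSERT INTO dbo.Address (AddressId, Street, City) VALUES (1, '1 Main St', 'Springfield');
    INSERT INTO dbo.Address (AddressId, Street, City) VALUES (2, '2 Main St', 'Springfield');
    -- ...and so on for the rest of the table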
The problem is that some of these tables contain 50,000 rows, and when you put a 50,000-row insert into one .sql file for the post-build, it fails because the file is too big. So:
-
Is there a better way to achieve this (getting the schema and data of a DB created as part of a DB project's Build/Publish)?
-
If this approach is valid, is the only option to create another post-build process that manually breaks the one big .sql file into smaller .sql files (so it can build/publish)?
We had a similar situation (we were actually using Entity Framework to create and populate the database) where we were populating a table of address data. Basically, we exported the address data into a CSV file and used the SQL BULK INSERT command to load it, which took seconds instead of minutes. Possibly doing the same within the post-build of your DBProj would fix your problem.
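A minimal sketch of what that could look like in a post-deployment script, assuming the CSV is shipped alongside the deployment; the table name, SQLCMD variable, and file name below are placeholders, not anything your project already has:

    -- Hypothetical: load the exported address CSV straight into the table.
    -- $(DataFilePath) is a made-up SQLCMD variable for the folder holding the CSV;
    -- adjust the terminators to match however the file was exported.
    BULK INSERT dbo.Address
    FROM '$(DataFilePath)\AddressData.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2,   -- skip the header row
        TABLOCK                -- table lock so the load can be minimally logged
    );

One thing to keep in mind: BULK INSERT resolves the file path on the SQL Server machine itself, so the CSV has to sit somewhere the server can read.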