Sometimes it is necessary to move large amounts of data into or out of a database. These operations are called bulk import and export operations.
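To make the idea concrete (a minimal sketch only, not tied to any particular tool — the table contents and file format are made up for the example), a bulk export simply streams every row out to a flat file, and a bulk import reads it back:

```python
import csv
import io

def bulk_export(rows, fileobj):
    """Write all rows (a list of dicts) out to a CSV file object."""
    writer = csv.DictWriter(fileobj, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)

def bulk_import(fileobj):
    """Read the rows back in from a CSV file object."""
    return list(csv.DictReader(fileobj))

# Hypothetical table contents; a real bulk operation would use a tool
# such as bcp or BULK INSERT against an actual table and file.
rows = [{"id": "1", "name": "Ann"}, {"id": "2", "name": "Bob"}]
buf = io.StringIO()
bulk_export(rows, buf)
buf.seek(0)
assert bulk_import(buf) == rows  # round trip preserves the data
```

In practice, SQL Server ships dedicated tools for this (the bcp utility, BULK INSERT), which handle batching, data types, and error rows far better than a hand-rolled script.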
ApexSQL Diff and ApexSQL Data Diff provide a user-friendly GUI for SQL comparison and synchronization of database schemas and database objects. They both include a command line interface that provides exactly the same options, which you can use to schedule unattended comparisons and synchronizations.
But what can you do when the features the GUI and CLI provide are simply not enough? In such cases, a more flexible solution is needed, and the good news is that there's a programmable API.
Applications used by travelling sales representatives and other field workers – delivery drivers, visiting nurses, etc. – are designed to collect data in remote locations and then send it to a data center. Occasionally, data from the data center also needs to be sent back to these remote locations, to keep them up to date.
For example, whenever nurses visit a patient, they enter the information about the visit into the database on their mobile devices. At the end of the day, all the entries created during the day are sent to the central database in the hospital. After that, the nurses can synchronize their mobile devices with the database in the hospital data center, so they get new information about their patients, along with the details of any new visits they need to make the next day.
In a scenario such as this, there's a constant need to synchronize information between a mobile device and a central database.
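The core of such a refresh can be sketched as a "newest timestamp wins" merge keyed on a record ID. This is a deliberately simplified illustration (real replication and sync frameworks handle conflicts, deletions, and clock skew far more carefully), and the record contents are invented for the example:

```python
def merge_newest_wins(central, device):
    """Merge two {record_id: (timestamp, payload)} stores.
    For each record, the copy with the higher timestamp wins."""
    merged = dict(central)
    for rec_id, (ts, payload) in device.items():
        if rec_id not in merged or ts > merged[rec_id][0]:
            merged[rec_id] = (ts, payload)
    return merged

# Hypothetical data: the nurse's device has a newer note for record 1
# and a brand new record 3; the central store keeps record 2 as-is.
central = {1: (10, "old visit note"), 2: (12, "scheduled visit")}
device  = {1: (15, "updated visit note"), 3: (14, "new patient entry")}
synced = merge_newest_wins(central, device)
```

The same merge, run in the other direction, is what brings the device up to date with the central database the next morning.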
The time when source control systems were used only for code development is gone. Today, database versioning is quite common, even in small companies.
As there are many different source control systems, with different features and options, there are no strictly defined rules on how to use them. Each company can create its own standards and rules for using a source control system for database versioning, following best practices and recommendations.
There are certain situations when creating a build script from a database is necessary – you might need a specific object (table, view, stored procedure, user, etc.) recreated or rolled back to an earlier state. If the online database no longer contains this object, or contains a newer version of it, the only places to look are full database backups. If your database is versioned under source control, you can easily find the object there, but let's assume you only have SQL backups.
Sometimes the best test data is the data you have in your live, production SQL Server database. Since using production data directly is most often not an acceptable option, this requires retrieving the data from a live database table and inserting it into a new table. The challenge arises if you don't want to insert all records from the original table, but only the records that comply with certain conditions.
How to create an INSERT INTO script that moves only specific data from one table to another
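In T-SQL this boils down to an `INSERT INTO target SELECT ... FROM source WHERE ...` statement; when you instead need a portable script file, the same filtering can be done while scripting the rows as INSERT statements. Here is a rough Python sketch of that scripting step — the table name, columns, and rows are made up for the example:

```python
def script_inserts(table, rows, predicate):
    """Return INSERT statements only for the rows matching the predicate.
    rows is a list of dicts; string values are quoted, others passed through."""
    def fmt(value):
        if isinstance(value, str):
            return "'" + value.replace("'", "''") + "'"  # escape single quotes
        return str(value)

    statements = []
    for row in rows:
        if predicate(row):
            cols = ", ".join(row)
            vals = ", ".join(fmt(v) for v in row.values())
            statements.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return statements

# Hypothetical rows; only customers from Boston make it into the script.
rows = [
    {"id": 1, "city": "Boston"},
    {"id": 2, "city": "Chicago"},
]
scripts = script_inserts("dbo.Customers", rows, lambda r: r["city"] == "Boston")
```

The predicate plays the role of the WHERE clause: it decides, row by row, what ends up in the generated script.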
When work on a database in a development environment is completed, developers are required to send the new version in for testing. If the testing is done on one or two machines – this is not a problem. However, if you need to test the new version on 10 or 15 machines (with different operating systems, SQL Server versions, or service packs installed), this can be tedious.
Another situation where scripts need to be executed on multiple servers is when you have many databases, for different clients, that were all derived from the same database and share many common objects. Any change you make in development must be pushed to all of those databases.
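A common lightweight approach is to loop over the target instances and run the same script against each with the sqlcmd utility. The sketch below (in Python, with hypothetical instance and file names) only builds the command lines; the actual execution is shown commented out so the example stays safe to run anywhere:

```python
# import subprocess  # uncomment to actually run the commands

def build_commands(servers, database, script_path):
    """Build one sqlcmd invocation per target server.
    -S server, -d database, -i input script, -E Windows authentication."""
    return [
        ["sqlcmd", "-S", server, "-d", database, "-i", script_path, "-E"]
        for server in servers
    ]

# Hypothetical test instances and deployment script
servers = ["TEST01", "TEST02\\SQL2019"]
commands = build_commands(servers, "ClientDb", "upgrade_v2.sql")
# for cmd in commands:
#     subprocess.run(cmd, check=True)  # stop at the first failing server
```

Stopping on the first failure (rather than continuing blindly) keeps the fleet of databases from drifting into inconsistent versions.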
SQL developers and DBAs are increasingly required to include data management and SQL Server auditing in their custom applications. For companies that have to comply with a compliance policy and obey security and privacy rules and standards, this is a must.
One of the ways to audit a SQL Server instance is by mining database transaction logs. To be able to do this, the database has to be in the full recovery model, and regular transaction log backups need to be created. This way, there will be enough information about the transactions, and the online database transaction log will not grow too large.
There are numerous scenarios where data synchronization between two databases is needed: distributing data to different locations, integrating data from different locations into a central repository, or simply synchronizing your test and production databases.
The recommended solution is to use database replication.
However, even with database replication there’s a need to check whether all the data is synchronized and to synchronize it if needed.
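That check boils down to classifying rows by primary key into three buckets: rows missing from the target, rows that differ, and rows the target has but the source doesn't. A minimal pure-Python sketch, with `{key: values}` maps standing in for the two tables (the sample data is invented):

```python
def diff_rows(source, target):
    """Classify rows, keyed by primary key, into missing / different / extra."""
    missing   = {k: v for k, v in source.items() if k not in target}
    extra     = {k: v for k, v in target.items() if k not in source}
    different = {k: source[k] for k in source
                 if k in target and source[k] != target[k]}
    return missing, different, extra

# Hypothetical production and test contents of the same table
prod = {1: ("Ann", "Boston"), 2: ("Bob", "Chicago")}
test = {2: ("Bob", "Denver"), 3: ("Eve", "Austin")}
missing, different, extra = diff_rows(prod, test)
# To bring test in line with prod: INSERT the missing rows,
# UPDATE the different ones, DELETE the extra ones.
```

Comparison tools do exactly this classification (far more efficiently, typically via checksums) and then generate the INSERT/UPDATE/DELETE statements for you.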
It’s common knowledge that running database changes directly in production, without thorough testing first, should be avoided.
If you have enough resources in your environment, you will have at least one test and one production SQL Server instance. However, that introduces another challenge: when everything is set up in the test environment, and runs smoothly and as expected, how can it easily be applied to the production instance?