
Automatically Synchronizing Data in Two SQL Server Databases

This tutorial explains how to automatically synchronize data between two SQL Server databases.

By Jordan Sanders · Apr. 14, 20 · Tutorial

Data in SQL databases often needs to be synchronized in real time. This can be achieved by checking one database for updates and then applying them to another database. In this case, change detection and synchronization should run automatically on a schedule, with no need for outside intervention.

How We Can Achieve This Goal

dbForge Data Compare for SQL Server is an external tool that lets you compare data in SQL Server databases, backups, and script folders. With it, you can schedule near real-time database synchronization.

You can set up the process by following these steps:

  • Run Data Compare, and in the new Data Comparison window, choose the source and target databases in the corresponding tab.

  • If needed, configure comparison settings in the Options tab. In the Mapping tab, you can select which objects should be compared, as well as specify the key columns and the list of columns for comparison.

  • When everything’s ready, press the Compare button in the bottom-right corner to start the comparison process. When the comparison is done, you can review the results in detail.

  • Select all necessary objects using the corresponding checkboxes and click Save. The saved project (dcomp) file will contain all objects and options needed for scheduling data synchronization.

  • When the project (dcomp) file is saved, press the Synchronize button, which opens the Synchronization wizard.

  • Choose ‘Execute the script directly against the target database’ so that the databases are synchronized once all necessary options are set up, and press Synchronize. When the synchronization process is over, you can view the results in the bottom pane of the window.

Automating the Process

As we have already successfully tested the synchronization process in Data Compare and saved the project (dcomp) file, let’s automate the process with a PowerShell script.

Setting Things Up

First, we’ll need to create a function that checks whether the Outputs folder exists; it will be used to store date-stamped output summaries. This way, an easy-to-find log of every synchronization is saved in case we need to troubleshoot in the future:

PowerShell

#checks if the Outputs folder exists. If it doesn't, the script creates it and returns its full path
function CheckAndCreateFolder($rootFolder, [switch]$Outputs)
{
    $location = $rootFolder

    #set the location based on the used switch
    if($Outputs -eq $true)
    {
        $location += "\Outputs"
    }

    #create the folder if it doesn't exist and return its path
    if(-not (Test-Path $location))
    { mkdir $location -Force:$true -Confirm:$false | Out-Null }

    return $location
}

Next, let’s define the root folder and the location for date-stamped output summaries: 

PowerShell

#set the root folder
$rootFolder = "D:\DataSync"

#set the location of output files
$outsLoc = CheckAndCreateFolder $rootFolder -Outputs

Variables and Switches

In this section, we define the application’s location along with the date stamp variable. We also define the variable containing the application’s parameters, such as:

  • the path to the saved project (dcomp) file
  • the /sync switch for direct synchronization of the target database
  • the /rece switch, which returns the ‘100 – No differences detected’ exit code when the data sources are equal
  • a date-stamped output summary

The following script allows us to achieve this:

PowerShell

#define the tool's location, date stamp variable, and the tool's parameters
#(adjust the path to match your dbForge Data Compare for SQL Server installation)
$toolLocation = "C:\Program Files\Devart\dbForge Data Compare for SQL Server\datacompare.com"
$dateStamp = (Get-Date -Format "MMddyyyy_HHmmss")

#output log file path
$logPath = "$outsLoc\DataOutput_$dateStamp.txt"

$Params = "/datacompare /compfile:""D:\DataSync\Project\test_DB_1vstest_DB_2.dcomp"" /rece /log:""$logPath"""
$sync = " /sync"

Execution

The next part of the PowerShell script calls Data Compare from its location with the parameters we specified in the previous step. Then, the return code variable is captured:

PowerShell

#initiate the comparison of data sources
(Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params))
$returnCode = $LASTEXITCODE

$message = ""

The script’s final part creates proper responses for the three possible outcomes:

  • An error occurred (any return code other than 100 or 101), in which case the output summary is opened.
  • Differences were detected (return code 101), in which case synchronization is launched; return code 0 then reports success.
  • No differences were detected (return code 100), in which case the job is aborted.
PowerShell

if ($returnCode -notin (100, 101))
{
    #an error was encountered; write the message to a separate error log
    $logPath = "$outsLoc\DataOutput_error.txt"
    $message = "`r`n $returnCode - An error was encountered"
    $message > $logPath

    #the output file is opened when an error is encountered
    Invoke-Item "$logPath"
}
else
{
    if ($returnCode -eq 101)
    {
        #data differences are detected, so the synchronization is launched
        Clear-Content $logPath
        (Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params + $sync))
        $returnCode = $LASTEXITCODE
    }
    if ($returnCode -eq 0)
    {
        $message = "`r`n $returnCode - Data changes were successfully synchronized"
    }
    elseif ($returnCode -eq 100)
    {
        #there are no data changes
        $message = "`r`n $returnCode - There are no data changes. Job aborted"
    }
    $message >> $logPath
}

Now that the job has been automated, it can be scheduled in any way you prefer, for example, with Windows Task Scheduler, as sketched below.
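
As an illustration, here is a minimal sketch that registers the job with the built-in ScheduledTasks PowerShell module. The script path (D:\DataSync\DataSync.ps1), the task name, and the daily 2 a.m. trigger are assumptions chosen for this example; adjust them to your environment:

PowerShell

#assumed location where the synchronization script above was saved
$scriptPath = "D:\DataSync\DataSync.ps1"

#run the script with powershell.exe, bypassing the execution policy for this task only
$action = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$scriptPath`""

#trigger the job every day at 2:00 AM (adjust to the desired cadence)
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

#register the task; run this from an elevated PowerShell session
Register-ScheduledTask -TaskName "DataSyncJob" -Action $action -Trigger $trigger `
    -Description "Scheduled dbForge Data Compare synchronization"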

Reviewing Results

Once everything is up and running, the output summary can be reviewed at any time. In this example, the location of the output files is defined by the $outsLoc variable, so they will be saved to the Outputs subfolder of the root folder, in this particular case, D:\DataSync\Outputs. A quick way to read the latest summary is shown in the snippet below.
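
For convenience, the following sketch prints the most recent date-stamped summary without browsing the folder manually; the folder path assumes the variables used in the script above:

PowerShell

#pick the newest date-stamped summary in the Outputs folder and print its contents
Get-ChildItem "D:\DataSync\Outputs\DataOutput_*.txt" |
    Sort-Object LastWriteTime |
    Select-Object -Last 1 |
    Get-Content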

If an error occurs while the script is being executed, an error message is displayed to indicate its potential cause. Additionally, a DataOutput_error.txt file with details of the error is created.

Here’s the script in its entirety:

PowerShell

#checks if the Outputs folder exists. If it doesn't, the script creates it and returns its full path
function CheckAndCreateFolder($rootFolder, [switch]$Outputs)
{
    $location = $rootFolder

    #set the location based on the used switch
    if($Outputs -eq $true)
    {
        $location += "\Outputs"
    }

    #create the folder if it doesn't exist and return its path
    if(-not (Test-Path $location))
    { mkdir $location -Force:$true -Confirm:$false | Out-Null }

    return $location
}

#set the root folder
$rootFolder = "D:\DataSync"

#set the location of output files
$outsLoc = CheckAndCreateFolder $rootFolder -Outputs

#define the tool's location, date stamp variable, and the tool's parameters
#(adjust the path to match your dbForge Data Compare for SQL Server installation)
$toolLocation = "C:\Program Files\Devart\dbForge Data Compare for SQL Server\datacompare.com"
$dateStamp = (Get-Date -Format "MMddyyyy_HHmmss")

#output log file path
$logPath = "$outsLoc\DataOutput_$dateStamp.txt"

$Params = "/datacompare /compfile:""D:\DataSync\Project\test_DB_1vstest_DB_2.dcomp"" /rece /log:""$logPath"""
$sync = " /sync"

#initiate the comparison of data sources
(Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params))
$returnCode = $LASTEXITCODE

$message = ""

if ($returnCode -notin (100, 101))
{
    #an error was encountered; write the message to a separate error log
    $logPath = "$outsLoc\DataOutput_error.txt"
    $message = "`r`n $returnCode - An error was encountered"
    $message > $logPath

    #the output file is opened when an error is encountered
    Invoke-Item "$logPath"
}
else
{
    if ($returnCode -eq 101)
    {
        #data differences are detected, so the synchronization is launched
        Clear-Content $logPath
        (Invoke-Expression ("& `"" + $toolLocation + "`" " + $Params + $sync))
        $returnCode = $LASTEXITCODE
    }
    if ($returnCode -eq 0)
    {
        $message = "`r`n $returnCode - Data changes were successfully synchronized"
    }
    elseif ($returnCode -eq 100)
    {
        #there are no data changes
        $message = "`r`n $returnCode - There are no data changes. Job aborted"
    }
    $message >> $logPath
}

Thank you for reading.

Published at DZone with permission of Jordan Sanders. See the original article here.

Opinions expressed by DZone contributors are their own.
