Kindness costs nothing but pays you back in abundance. And it’s a blissful experience when you get to work on a project that revolves around all forms of kindness. We have been working on a non-profit organisation’s project for a year now, one that aims to bring together such organisations from a host of categories. Their core motive is to empower people and communities by bridging the gap between aspiring volunteers and non-profit organisations.
We found the concept really inspiring and interesting, and hence knew we were going to love working on it. The larger job was to build a robust back-end system for their website, but we didn’t mind going the extra mile and sharing ideas that would benefit the website. Let us start from the beginning and share our journey on this project with you!
The Plan
We were asked to create the entire back-end using ASP.NET 4.7. Being one of the most trusted .NET development companies across the globe, we had the client’s full faith in us, and we had to once again prove our worth. We also had to integrate the Entity Framework, HubSpot CRM, deep linking, and custom controllers to make the website robust and highly effective. One-click migration of the database also had to be made possible, in order to avoid a time-consuming and lengthy migration process.
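To give a flavour of the stack, here is a minimal sketch of the kind of Entity Framework 6 setup involved. The entity classes mirror two of the tables mentioned later in this post (AdminUser, SignRole); their properties and the "DefaultConnection" connection-string name are illustrative assumptions, not the client’s actual schema.

```csharp
using System.Data.Entity;

// Minimal EF 6 sketch. Entity properties and the connection-string name
// are assumptions for illustration only.
public class AdminUser
{
    public int Id { get; set; }
    public string Email { get; set; }
}

public class SignRole
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class NonProfitContext : DbContext
{
    // "DefaultConnection" is assumed to point at the Azure SQL database in Web.config.
    public NonProfitContext() : base("name=DefaultConnection") { }

    public DbSet<AdminUser> AdminUsers { get; set; }
    public DbSet<SignRole> SignRoles { get; set; }
}
```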
Actualizing the Plan
Implementing the plan wasn’t a daunting task for our pro team, and we succeeded at each of the pieces, be it the Entity Framework, deep linking, or HubSpot CRM. Everything went smoothly enough until it came to migrating the data.
The Challenge
Migrating the data was the one challenge that took us a while to decode and accomplish. We tried quite a few methods, but since the database was hosted on Azure, the migration just wasn’t working out.
Addressing the Challenge
We had worked on multiple other migration projects before, so we tried applying those tricks to this one. Let us share a few of them with you:
- We initially used a plain cross-database query to read data from one database table into another. But since the database was hosted on Azure, three-part database names were not allowed, so the transfer failed (see the sketch after this list).
- We then tried a static approach: reading the data from each table in the dataset and inserting it, column by column, into the corresponding target table. This was indeed workable, but not advisable. The project was live, and editing existing tables or adding new ones is not a healthy way of doing it; every time the DB structure changes, the migration code has to be updated to adapt to those changes. The entire process is not just cumbersome and lengthy, but also prone to errors.
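For illustration, here is a minimal sketch of the kind of cross-database copy we are describing in the first attempt, assuming hypothetical database and table names (SourceDb, AdminUser). On an on-premises SQL Server such a statement runs fine, but Azure SQL Database rejects the three-part name because cross-database references are not supported there.

```csharp
using System.Data.SqlClient;

// Sketch of the first attempt: a cross-database INSERT ... SELECT.
// "SourceDb" and "AdminUser" are illustrative names.
class CrossDatabaseCopyAttempt
{
    static void Main()
    {
        var connectionString = "<azure-sql-connection-string>";

        const string copySql = @"
            INSERT INTO dbo.AdminUser (Id, Email)
            SELECT Id, Email
            FROM SourceDb.dbo.AdminUser;  -- three-part name: rejected by Azure SQL Database";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(copySql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();  // throws on Azure SQL Database
        }
    }
}
```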
The ‘Triumph’ Moment
Amidst this entire process of trying to find the apt method for the migration, we stumbled upon an idea that had so far rested somewhere in a corner of our minds: the ‘Master Table’ approach.
- In this approach, the tables and columns are read dynamically.
- Any change in a table or its structure can easily be taken care of: only the name and hierarchy level of the table needs to be updated in the master table. That’s all, and the problem of migration is sorted!
Below is the ‘Master Table’ that helped us resolve the issue we faced in the process of migration:
Id   | Table Name | Table No.
1123 | AdminUser  | 1
1313 | SignRole   | 2
Let us also share how this table works to make the migration smooth and quick:
- We first delete the data from the existing database from bottom to top (in descending order, from the last table number to the first), which keeps the references intact.
- Once that is done, we insert the data into the tables from top to bottom (in ascending order, from the first table number to the last), which again helps in managing the references. A sketch of this whole flow follows below.
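Putting the pieces together, here is a minimal sketch of how such a master-table-driven migration could look in C# with ADO.NET. The MasterTable name and its TableName/TableNo columns follow the table shown above; the connection strings, bracketed table names, and the use of SqlBulkCopy are our assumptions about the surrounding code.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Sketch of a master-table-driven migration: the table list and order are
// read at run time, so schema changes only require updating MasterTable.
class MasterTableMigration
{
    static void Main()
    {
        var sourceConnectionString = "<source-connection-string>";
        var targetConnectionString = "<target-connection-string>";

        // 1. Read the master table, ordered by hierarchy level (TableNo).
        var tables = new List<string>();
        using (var source = new SqlConnection(sourceConnectionString))
        using (var command = new SqlCommand(
            "SELECT TableName FROM MasterTable ORDER BY TableNo", source))
        {
            source.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    tables.Add(reader.GetString(0));
            }
        }

        using (var target = new SqlConnection(targetConnectionString))
        {
            target.Open();

            // 2. Delete existing data bottom-to-top (children before parents)
            //    so foreign-key references stay intact.
            for (int i = tables.Count - 1; i >= 0; i--)
            {
                using (var delete = new SqlCommand($"DELETE FROM [{tables[i]}]", target))
                {
                    delete.ExecuteNonQuery();
                }
            }

            // 3. Copy data top-to-bottom (parents before children), reading
            //    each table dynamically.
            using (var source = new SqlConnection(sourceConnectionString))
            {
                source.Open();
                foreach (var table in tables)
                {
                    using (var select = new SqlCommand($"SELECT * FROM [{table}]", source))
                    using (var reader = select.ExecuteReader())
                    using (var bulkCopy = new SqlBulkCopy(
                        target, SqlBulkCopyOptions.KeepIdentity, null))
                    {
                        bulkCopy.DestinationTableName = table;
                        bulkCopy.WriteToServer(reader);
                    }
                }
            }
        }
    }
}
```

Because the list of tables is read at run time, adding or restructuring a table only means updating the corresponding MasterTable row and its hierarchy level; the migration code itself stays untouched.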
The Afterword
The trial-and-error method can be tiring, but one never knows what will work and how! This migration was one such process for us. After multiple attempts at making the migration smooth and quick, when the ‘Master Table’ approach finally worked, it truly was a moment of triumph for us!
If our method of migration helped you in any way, kindly share your response with us. We’d love to hear from you and bond over coding. And if you’ve got a suggestion regarding it, even better! We’ll be all the more eager to hear from you, for we seek new opportunities to learn from all around!
Frequently Asked Questions
What does migrating a database with a Master Table involve?
Migrating a database with a Master Table involves transferring data from one database to another, using a designated master table as a guide. This ensures a smooth and organized transition, maintaining data integrity and accuracy.
Why choose a Master Table approach?
Choosing a Master Table depends on factors like the complexity of your data, the need for a systematic transfer, and the desire for a centralized control point. It's beneficial for projects where maintaining data accuracy and organization is crucial.
Does this approach work across different database types and platforms?
This approach is versatile and works across various database types and platforms. Whether you're migrating between SQL databases, NoSQL databases, or a mix of both, using a Master Table provides adaptability.
What role does the Master Table play during migration?
The Master Table acts as a roadmap, ensuring data is moved accurately and consistently between databases. It is crucial for preventing errors, maintaining relational integrity, and ensuring a successful migration.
What challenges can come up, and how can they be overcome?
Challenges may include handling schema differences and ensuring compatibility between databases. Thorough planning, understanding the data, and employing the right tools can help overcome these challenges.
Can a Master Table be used for ongoing synchronization?
Yes, the concept extends to ongoing synchronization. By regularly updating the Master Table, you can ensure continuous consistency between databases.
Does this approach affect database performance?
When appropriately implemented, the impact on database performance is minimal. Careful mapping and consideration of data volumes are essential for a smooth migration without significant performance issues.
Is this approach suitable for large-scale projects?
It's suitable for both large-scale and smaller projects. The key lies in meticulous planning and execution. For larger projects, optimization strategies like incremental migration and parallel processing may be employed.