Migrating to Snowflake, Redshift, or BigQuery? Avoid These Common Pitfalls

The Drive to Migrate Data to the Cloud

With data now said to be more valuable than oil, many organizations feel pressure to consolidate, store, and use their data in ways that are both innovative and cost-effective. Although most enterprises are aware of big data opportunities, their existing infrastructure isn’t always capable of handling massive amounts of data.

By migrating to modern cloud data warehouses, organizations can benefit from improved scalability, better price elasticity, and enhanced security. But even with all these benefits, many businesses are still reluctant to make the move.

How to Migrate Your Data From Redshift to Snowflake

For decades, data warehousing solutions have been the backbone of enterprise reporting and business intelligence. But, in recent years, cloud-based data warehouses like Amazon Redshift and Snowflake have become extremely popular. So, why would someone want to migrate from one cloud-based data warehouse to another?

The answer is simple: more scale and flexibility. With Snowflake, users can quickly scale data and compute resources independently by automatically adding nodes. Using the VARIANT data type, Snowflake also supports storing richer data such as objects, arrays, and JSON. And as Redshift users know, debugging Redshift is not always straightforward. Sometimes the desire to migrate goes beyond feature differences: maybe your team simply knows Snowflake better than Redshift, or perhaps your organization wants to standardize on one particular technology.
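The VARIANT flexibility mentioned above is easier to see with a small example. The sketch below uses plain Python (no warehouse required) to contrast the two approaches: flattening a nested record into fixed columns up front, as a classic relational schema demands, versus keeping the document whole, as Snowflake's VARIANT column allows. The record and field names are made up for illustration.

```python
import json

# A semi-structured record of the kind a VARIANT column can store as-is.
event = {
    "user": {"id": 42, "name": "Ada"},
    "tags": ["migration", "cloud"],
    "meta": {"source": "app", "version": 3},
}

def flatten(d, prefix=""):
    """Flatten nested dicts into the fixed columns a classic relational
    schema would require (lists are serialized to JSON strings)."""
    out = {}
    for key, value in d.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "_"))
        elif isinstance(value, list):
            out[name] = json.dumps(value)
        else:
            out[name] = value
    return out

row = flatten(event)
# With VARIANT, the document is stored whole and queried with path
# expressions (e.g. event:user.name); without it, every nested field
# must become its own column before loading.
print(sorted(row))
```

The flattened form must be re-planned every time the source schema changes, which is one reason semi-structured support is a frequent motivation for this migration.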

How to Migrate Data From SQL Server to PostgreSQL

Migrating data between different types of databases is not a trivial task. In this article, we will compare several ways of converting from SQL Server to PostgreSQL.

Microsoft SQL Server is a great database engine, but it has drawbacks in some cases.
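One of the first hurdles in such a conversion is translating column types. The snippet below is an illustrative (far from exhaustive) mapping of common SQL Server types to their usual PostgreSQL equivalents; treat it as a starting point for a conversion script, not a complete converter.

```python
# Common SQL Server -> PostgreSQL type equivalents (illustrative subset).
TYPE_MAP = {
    "NVARCHAR": "varchar",          # PostgreSQL text types are Unicode already
    "DATETIME": "timestamp",
    "BIT": "boolean",
    "TINYINT": "smallint",          # PostgreSQL has no 1-byte integer type
    "MONEY": "numeric(19,4)",
    "UNIQUEIDENTIFIER": "uuid",
}

def translate_column(name, mssql_type):
    """Return a PostgreSQL column definition for a SQL Server column;
    unknown types are passed through lowercased."""
    return f"{name} {TYPE_MAP.get(mssql_type.upper(), mssql_type.lower())}"

print(translate_column("created_at", "DATETIME"))  # created_at timestamp
```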

7-Step Data Migration Plan

Data migration is complex and risky — yet unavoidable for most companies' processes. Especially now, amid the mass transition from on-premises systems to the cloud, companies are migrating their data to, or between, Microsoft, Google, and AWS cloud storage.

Regardless of the reasoning behind your data migration, the process and its pitfalls stay the same: downtime, data misplacement, data corruption, loss, leaks, format incompatibilities, and so on. In fact, Bloor’s data migration report shows that 84% of data migration projects overrun time or budget, and 70-90% of migrations don’t meet expectations.

A Tale of Two Migrations

Within an enterprise, there are services (systems, really) that are widely popular, offer just what you need, and are easy to use. There are also systems the organization has been trying to decommission for years, but they have so many applications depending on them, so many strings attached, that it seems impossible. Often, it's the same system at different points in time.

Recently, while exploring a legacy application in order to design its cloud-native replacement, we identified a connection to such a system. We will refer to it as the SAK (aka Swiss Army Knife). We wanted to do our part and remove one more string. The SAK service we consume acts, in essence, as a proxy for a database. After investigating, we found that our application is the only one using this specific data (and thus the service). For the data, imagine a contact list (it's not really a contact list) that facilitates the main business offering of the application. I know I am being vague, but I have to be. The data in question makes the main functionality easier, but its absence does not make that functionality impossible. You could still make calls without your contact list; it would just be a pain. Some clients use the application daily, and some might not use it for months.

How to Migrate Data From Neo4j to Nebula Graph

This article introduces how to migrate your data from Neo4j to Nebula Graph with Nebula Graph Exchange (or Exchange for short), a data migration tool backed by the Nebula Graph team. Before covering how to import data, let’s first take a look at how data migration is implemented inside Nebula Graph.

Data Processing in Nebula Graph Exchange

The name of our data migration tool is Nebula Graph Exchange. It uses Spark as the import platform to support huge dataset imports and to ensure performance. A DataFrame, the distributed collection of data organized into named columns that Spark provides, supports a wide array of data sources. Thanks to the DataFrame abstraction, adding a new data source only requires providing the code that reads its configuration and a Reader that returns a DataFrame.
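Exchange itself runs on Spark and is written in Scala, but the extension pattern described above can be sketched in a few lines of Python: each data source contributes (1) code that reads its section of the configuration and (2) a Reader that returns the rows (in Exchange, a Spark DataFrame). All names here are illustrative, not Exchange's actual API.

```python
# Registry of Reader classes, keyed by the "type" field of the config.
READERS = {}

def register_reader(source_type):
    """Class decorator that plugs a new data source into the registry."""
    def wrap(cls):
        READERS[source_type] = cls
        return cls
    return wrap

@register_reader("csv")
class CsvReader:
    def __init__(self, config):
        # The only source-specific code: parse this source's config section.
        self.path = config["path"]

    def read(self):
        # A real Reader would return a Spark DataFrame; a list of dicts
        # stands in for it here.
        return [{"_src": self.path}]

def load(config):
    """The framework side: look up the Reader and hand back its rows."""
    reader = READERS[config["type"]](config)
    return reader.read()

print(load({"type": "csv", "path": "people.csv"}))
```

Supporting another source (say, Kafka or HBase) means adding one more decorated class; the import pipeline itself stays untouched, which is the point of the design.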

MariaDB SQL Set Operators

Set operators are the SQL operators that combine different result sets in different ways. Say you have two different SELECTs whose results you want to combine into a single result set: that is where the set operators come into play. MariaDB has supported the UNION and UNION ALL set operators for a long time, and these are by far the most common set operators.

But we are getting ahead of ourselves; let me first explain the set operators we have and how they work. If you want to give this a try, you can use your existing deployment of MariaDB Server, or try it out in a MariaDB SkySQL cloud database.

UNION and UNION ALL

The UNION and UNION ALL set operators combine the rows of two or more result sets. Let's start with UNION ALL; UNION is then a variation of it.
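The semantics are easy to demonstrate: UNION ALL appends every row from both results, while UNION additionally removes duplicates. The example below uses Python's built-in sqlite3 module because it runs anywhere; UNION and UNION ALL behave the same way in MariaDB.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE eu_customers (name TEXT);
    CREATE TABLE us_customers (name TEXT);
    INSERT INTO eu_customers VALUES ('Alice'), ('Bob');
    INSERT INTO us_customers VALUES ('Bob'), ('Carol');
""")

# UNION ALL keeps every row, duplicates included.
union_all = con.execute(
    "SELECT name FROM eu_customers UNION ALL SELECT name FROM us_customers"
).fetchall()

# UNION removes duplicate rows (at some extra sorting/hashing cost).
union = con.execute(
    "SELECT name FROM eu_customers UNION SELECT name FROM us_customers"
).fetchall()

print(len(union_all), len(union))  # 4 3
```

'Bob' appears in both tables, so UNION ALL returns four rows while UNION returns three. When you know the inputs are already disjoint, prefer UNION ALL and skip the deduplication work.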

Concise Guide to Data Migration

Migrating data is a challenging but very important process: it’s a fundamental component of upgrading or consolidating servers, performing server maintenance, relocating data to a data center, and adding data-intensive applications like data lakes and warehouses, among other important processes.

Because of the complexity of data migration and risks associated with it, such as costly downtime or corrupted or lost data, understanding the process and having a solid data migration implementation plan is critical. 

How to Approach Data Migration in 3 Stages

Moving data from one system to another can be a complex process. In this post, we have broken down how to approach data migration in 3 stages:

  • Before you migrate
  • Ready to migrate
  • Once you’ve migrated

Your requirements will take into consideration:

Managing Business Risks of Large Scale Cloud Migrations [Webinar Sign-up]

Leading enterprises continue to drive digital transformation and are modernizing their data architecture to take advantage of the many economic and functional benefits of the cloud. While the move to the cloud is making companies more competitive, lean, and nimble, many technical teams are concerned about the complexities and business risks associated with large-scale data migrations.

Join technical experts from Infosys and WANdisco as they share technical insights about the risks and costs associated with large scale data migrations. Learn how technical teams can avoid these business risks by leveraging a LiveData approach using WANdisco solutions, and how Infosys and WANdisco have recently worked together on behalf of a global retailer on a successful 3.5 petabyte business-critical data migration project, completing it in 72 days with minimal business disruption and zero data loss.

10 Best Practices for Data Migration

It’s highly probable that at some point, your business will have to go through a data migration process. Data migration involves moving current data from one storage system or computer to another. 

Data migration is a very complex task. Today we present the best practices that will help you carry out this process properly.

Learn How Data Mapping Supports Data Transformation and Data Integration

Data mapping is an essential component of data processes. One error in data mapping can ripple through the organization, replicating errors and producing inaccurate analysis. So, if you fail to understand the significance of data mapping or how it’s implemented, you reduce your business's chances of success.

In this article, you’ll learn what data mapping is and how it can be done.
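To make the idea concrete, here is a minimal sketch of a data map in Python: for each target field, it records where the value comes from in the source record and how it is transformed along the way. The field names and rules are made up for illustration.

```python
# A data map: target field -> rule for deriving it from a source record.
# Each rule documents, in code, one mapping decision.
FIELD_MAP = {
    "full_name": lambda r: f"{r['first']} {r['last']}",   # combine two fields
    "email":     lambda r: r["mail"].strip().lower(),      # normalize format
    "country":   lambda r: r.get("country", "US"),         # default for legacy rows
}

def map_record(source):
    """Apply every mapping rule to one source record."""
    return {target: extract(source) for target, extract in FIELD_MAP.items()}

legacy = {"first": "Grace", "last": "Hopper", "mail": " GHopper@EXAMPLE.com "}
print(map_record(legacy))
```

Keeping the rules in one table like this makes each mapping decision reviewable on its own, which is exactly what guards against the replicated-error problem described above.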

3 Pitfalls Everyone Should Avoid With Hybrid Multicloud (Part 4)


The daily cloud hype is all around you, yet there are three pitfalls everyone should avoid.

From cloud to hybrid cloud to hybrid multi-cloud, you're told this is the way to ensure a digital future for your business. But the choices you've got to make must not preclude the daily work of enhancing your customers' experience and the agile delivery of those applications.

Company Overview: HYCU

I had the opportunity to meet with HYCU today, the sixth company on IT Press Tour #31. Simon Taylor, CEO, led off the “Leverage to Scale” presentation with an overview of the company's journey from providing purpose-built backup and recovery for Nutanix to providing solutions for the rest of the cloud. They currently have more than 1,000 customers in 52 countries and are as close to Google Cloud Platform (GCP) as they are to Nutanix.

While Veritas was built for Unix, Commvault for Windows, and Veeam for VMware, HYCU is being built for multi-cloud. As budgets move to lines of business (LOBs) with more control and power, hybrid multi-cloud infrastructures are proliferating, while data protection remains siloed and mired in the past.

Mistakes to Avoid When Adopting Salesforce Data Migration

Many of you have the misconception that data migration is nothing more than moving records from one system to another. But what if I told you it is one of the most challenging and time-consuming tasks? It demands ample planning along with a good understanding of the current system.

Implementing software without challenges is a myth, and cloud-based software deployments built around products from Salesforce.com are no exception. Salesforce implementations, additions, and upgrades are on the rise globally. According to IBM, using cloud-based CRM software generates powerful returns: as much as a 50% improvement in productivity and a 65% improvement in sales quotas, while reducing labor costs by an average of 40%.

QA Approach for Data Migration

Many systems are being migrated to the latest technologies these days. Cloud enablement is one of the major reasons for migration; others include cost reduction, productivity improvement, and flexibility in managing data. When a system gets migrated, the data it holds cannot be ignored and needs to be migrated too. The data migration process involves huge risks because of the high volume and criticality of the data to be migrated. The data consists of both business data and customer data. Business intelligence is contained in the business data, which is mainly the rules and parameters used for the successful processing of the application, whereas the customer data includes data such as buyer demographics and consumer ratings. There are several steps involved in the successful migration of data from one system to another.

First and foremost is planning and determining the scope of the migration. Once the scope is identified, we can list out the components with the help of SMEs and people working on the system. Planning should cover resource allocation, including the tools to be used and the phases of data migration. This includes the final step of actually migrating production data, which may affect the availability or downtime of the system. Plan in such a way that business continuity is interrupted as little as possible.
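Once the data has been moved, a common QA step is reconciling the source and target: matching row counts and a content checksum per table. The sketch below uses Python with sqlite3 standing in for both systems; a real migration would run equivalent queries against the actual source and target databases.

```python
import hashlib
import sqlite3

def table_fingerprint(con, table):
    """Row count plus an order-independent checksum of a table's contents."""
    rows = con.execute(f"SELECT * FROM {table}").fetchall()
    digest = hashlib.sha256()
    for row in sorted(map(repr, rows)):   # sort so row order doesn't matter
        digest.update(row.encode())
    return len(rows), digest.hexdigest()

# Two in-memory databases stand in for the source and target systems.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for con in (source, target):
    con.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    con.executemany("INSERT INTO customers VALUES (?, ?)",
                    [(1, "Ada"), (2, "Bob")])

# After a clean migration, the fingerprints should match exactly.
assert table_fingerprint(source, "customers") == table_fingerprint(target, "customers")
print("reconciliation passed")
```

Count-plus-checksum checks catch both missing rows and silently corrupted values; for very large tables, the same idea is usually applied per partition or per key range rather than over the whole table at once.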

You Don’t Have to Be a Big Corporation to Have a Great Database Migration to the Cloud


IT is the heart of every business. Business records, plans, employee records, and so much more are handled via IT. Even the smallest companies rely on computing to handle all the important aspects of their business.

No matter how convenient IT support is, it still has a stranglehold on the budget. Purchasing and repairing servers and hard drives is increasingly expensive, which is problematic for small businesses.