Bad Data Costs Big Money, Part 1


Bad data costs money.  How much money?  In 2016, IDC estimated that bad data costs the U.S. around 3 trillion dollars per year.  That is a staggering amount of data going unused or sitting in a state too poor to trust.  So you should be improving your data right now!  How do you do this?

There are many different options available.  Cloud Performer has engaged in many projects to improve data for our customers.  One thing I am frequently asked about is moving data from a legacy system to Salesforce.  This process seems fairly straightforward until you start looking at how much data needs to be moved or transformed to fit the new Salesforce model.  There is also duplication and unclean data that needs to be corrected before the move.  This is where we start building an ETL (Extract, Transform, & Load) script.  We’ve used many different tools and applications to perform this data work.
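To make the extract-transform-load idea concrete, here is a minimal sketch in Python. The legacy export, the field names (Name, Email), and the choice of email as the dedup key are all illustrative assumptions, not a specific Salesforce mapping or any particular tool we use.

```python
import csv
import io

def extract(csv_text):
    """Extract: parse a legacy CSV export into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: trim whitespace, normalize emails, drop duplicates."""
    seen = set()
    clean = []
    for row in rows:
        email = row["Email"].strip().lower()
        if not email or email in seen:
            continue  # skip blank emails and duplicate contacts
        seen.add(email)
        clean.append({"Name": row["Name"].strip(), "Email": email})
    return clean

def load(rows):
    """Load: here we simply return the records; a real job would push
    them to the target system (e.g., via a data-loader tool)."""
    return rows

# Hypothetical legacy export with a duplicate contact and messy casing.
legacy_export = """Name,Email
Ada Lovelace ,ADA@example.com
Ada Lovelace,ada@example.com
Grace Hopper,grace@example.com
"""

records = load(transform(extract(legacy_export)))
print(len(records))  # duplicates collapsed into one record each
```

Each stage stays independent, so the transform step can grow to handle cleansing rules (formats, picklist mappings, required fields) without touching extraction or loading.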

In the next several blogs I will walk through the ETL process that we use.  I will also discuss data cleansing and data deduplication.  We will go through some of the tools involved and how the data is stored, transformed, and loaded.  Stay tuned for Part 2!