Data Transformation Tools: Which Tool to Choose in 2023

Shawn Fergus

Data transformation tools play a fundamental role in data-driven fields like data analytics, data management, data engineering, and business intelligence.

These tools ensure information ingested from different data sources is well organized, properly formatted, and validated. They also ensure data compatibility between different types of applications and systems, often analytics and business intelligence tools like Pentaho, Domo, and Tableau.

However, despite these common destinations, not all data transformation tools work the same way. This doesn’t mean that one solution is better than another. Rather, the way a specific tool functions will be more or less suited to the needs of a business and its data management teams.

So, let’s break down what these tools do, the factors that inform making the best choice, and cover examples of the most popular data transformation tools of 2023.

What are data transformation tools? (And why do we need them?)

Spoiler alert: Data transformation tools are, at their most basic, mechanisms that transform data. This transformation typically happens as said data is transferred from source locations to a destination repository, like a data warehouse.

Additional benefits associated with this process include these abilities:

  1. Maintain data quality in real time
  2. Consolidate different types of data that may come in many different forms and formats
  3. Ensure the volumes of data being transferred can be accessed and analyzed when delivered as a newly transformed dataset

A good data transformation tool will do the above as quickly and efficiently as possible. And, as a whole, most data transformation tools fall into one of two broad categories: real-time tools and batch tools.

As their names suggest, real-time data processing tools continuously ingest data from sources as it’s transformed and loaded onto a destination repository. Batch data transformation tools ingest large volumes of data during intermittent windows of time and transform it all together before sending it to its destination.
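The difference between the two styles can be sketched in a few lines of plain Python. This is an illustrative toy, not any vendor's implementation, and the record shape is invented:

```python
from datetime import datetime

def transform(record):
    """Normalize one raw record: tidy the name and parse the date."""
    return {
        "name": record["name"].strip().title(),
        "signup_date": datetime.strptime(record["signup_date"], "%Y-%m-%d").date(),
    }

def batch_pipeline(records):
    """Batch style: collect a window of records, transform them together."""
    return [transform(r) for r in records]

def streaming_pipeline(source):
    """Real-time style: transform each record as it arrives."""
    for record in source:
        yield transform(record)  # handed downstream immediately

raw = [{"name": "  ada lovelace ", "signup_date": "2023-01-15"}]
print(batch_pipeline(raw))
print(list(streaming_pipeline(iter(raw))))
```

Both pipelines apply the same per-record transform; what differs is whether records are accumulated into a window first or forwarded one at a time.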

What factors matter most when choosing a data transformation tool?

It’s important to focus on certain factors whenever you’re shopping for a new data orchestration solution. Data transformation tools are no exception, as the right tool for one organization may end up being a bad investment for yours. In general, the following factors should steer you in the right direction as you browse options, tiers, and offers:

Use cases:

  • Be sure you understand who your data consumers are—who exactly will use this new data transformation tool, and what will they try to accomplish in doing so?
  • Do these data consumers all work in the same fields or roles? Or, for example, will you have a mix of data engineers and scientists using the tool?

Your current capabilities and resources:

  • What are the strengths and weaknesses of your existing data infrastructure?
  • How much data does your organization currently store, and how is it stored?
  • How much is your organization prepared to invest in a new data transformation tool? (And how soon will stakeholders in your organization expect to have things up and running?)

Security and compliance:

  • What are the local and regional regulatory standards your data transformation tool will need to comply with?

Integration and training:

  • What integration process will be needed to implement the tool in your existing infrastructure?
  • Will the data consumers identified in your use cases be able to use your data transformation tool on day one? Or will any training or certification be required?

Objective reviews and testimonials:

  • What insights do independent review sites like G2, Gartner, and Capterra provide about potential tool/business fits?
  • Do reviews from actual users align with how the features and benefits of a tool are positioned?

Once you feel like you have a holistic view of what you’re setting out to accomplish, it’s time to see which data transformation tool best aligns with your objectives.

Below, we’ve included some popular tools with links to get you started.

(Note: The tools featured below are listed in alphabetical order.)

Azure Data Factory

Microsoft’s Azure Data Factory offers code-free data flows that provide a combined data integration and transformation layer.

Top features:

  • Allows data engineers and citizen integrators to create and monitor pipelines without the need to code
  • Intelligent, intent-driven mapping automates copy activities, speeding up the transformation process
  • Microsoft’s managed Apache Spark™ service handles code generation and maintenance


Pros:

  • Offers feature-rich APIs
  • Self-hosted integration runtime feature
  • Variety of different connectors available for data sourcing
  • Autonomous ETL can be a time-saver
  • Seamless integration with other organizations on Azure


Cons:

  • May be cost-prohibitive for running long pipelines
  • Resources may be needed to monitor ADF performance
  • Some learning curve is required as this is still a relatively new product


Pricing:

  • Offers a consumption-based pricing model 👇

Azure Integration Runtime Price

  • Orchestration: $1 per 1,000 runs
  • Data Movement Activity: $0.25/DIU-hour
  • Pipeline Activity: $0.005/hour
  • External Pipeline Activity: $0.00025/hour
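As a worked example of how consumption-based rates add up, here is one hypothetical month of usage. The usage numbers are invented for illustration; the rates are the ones listed above:

```python
# Hypothetical monthly usage against the listed Azure Integration Runtime rates.
orchestration_runs = 50_000        # $1 per 1,000 runs
data_movement_diu_hours = 200      # $0.25 per DIU-hour
pipeline_activity_hours = 1_000    # $0.005 per hour

cost = (
    orchestration_runs / 1_000 * 1.00
    + data_movement_diu_hours * 0.25
    + pipeline_activity_hours * 0.005
)
print(f"Estimated monthly total: ${cost:.2f}")  # → $105.00
```

The point of the exercise: with consumption pricing, long or frequently triggered pipelines dominate the bill, which is worth modeling before committing.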


Informatica

Informatica’s data transformation solution is a cloud-based platform. Its data tools are powered by CLAIRE, the company’s proprietary, metadata-driven AI engine.

Top features:

  • Offers over 100 pre-built templates for setting up data pipelines
  • Informatica’s solution handles both ETL and ELT for multi-cloud environments


Pros:

  • Fast data integration and replication
  • Provides users the option for transforming data via scripting (e.g., Python, SQL)
  • Replication capabilities enable efficient information duplication
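To illustrate the kind of SQL-based transform a scripting option makes possible, here is a generic sketch using Python’s built-in sqlite3 module as a stand-in for a warehouse. It is illustrative only, not Informatica’s API, and the table and column names are invented:

```python
import sqlite3

# In-memory database standing in for a cloud warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 1250), (2, 399)])

# ELT-style transform: derive a cleaned table inside the warehouse itself.
conn.execute("""
    CREATE TABLE orders AS
    SELECT id, amount_cents / 100.0 AS amount_usd
    FROM raw_orders
""")
for row in conn.execute("SELECT id, amount_usd FROM orders ORDER BY id"):
    print(row)
```

Because the transform runs where the data already lives, this is the ELT pattern; the same SQL could equally be issued from a scheduled Python script in the ETL direction.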


Cons:

  • Somewhat dated UI
  • Lack of consistency across products
  • Inbuilt scheduler may not handle more complicated scenarios


Pricing:

  • Informatica offers a 30-day free trial of its cloud data integration services.
  • Beyond its free trial, Informatica offers users a consumption-based pricing model


Matillion

Matillion’s ETL software is cloud-enabled, allowing it to integrate with most data sources, and offers pre-built data source connectors for both cloud and on-premises databases.

Top features:

  • Offers pre-built, out-of-the-box data connectors to many popular applications and databases
  • Custom data connectors can be built quickly, without the need for coding
  • Matillion ETL provides users with the ability to build custom data connectors to REST API source systems
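A custom REST connector ultimately boils down to fetching JSON and flattening it into rows a loader can ingest. Here is a minimal, hypothetical sketch in plain Python; the endpoint, payload shape, and field names are invented, and this is not Matillion’s connector framework:

```python
import json
from urllib.request import urlopen

def fetch_json(url):
    """Minimal REST 'connector': GET a URL and decode its JSON body."""
    with urlopen(url) as resp:  # hypothetical endpoint; any JSON API works
        return json.load(resp)

def extract_rows(payload, field="results"):
    """Flatten an API payload into (id, name) rows a loader could ingest."""
    return [(item["id"], item["name"]) for item in payload.get(field, [])]

# Stubbed payload standing in for a live API response:
payload = {"results": [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]}
print(extract_rows(payload))  # [(1, 'alpha'), (2, 'beta')]
```

A no-code connector builder automates exactly these two steps (authentication and pagination aside), which is why it can be generated from a REST endpoint description rather than written by hand.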


Pros:

  • User friendly
  • Allows for high levels of automation
  • Compatible with multiple cloud warehouses
  • Grid iterators are well-thought-out


Cons:

  • No container support at the time of this writing
  • Community experience could be improved


Pricing:

  • Matillion offers a free subscription tier that includes:
      • Standard customer support
      • Unlimited users and up to one million rows per month with Matillion’s Data Loader
  • ETL services are not available with the free subscription tier
  • Matillion user credits are universal and can be used for either Matillion’s Data Loader or ETL
  • Basic, Advanced, and Enterprise tiers are also available, each offering a free trial.


Shipyard

Shipyard’s services are designed to help data engineers and less technical data team members easily build solid, end-to-end data infrastructures.

Top features:

  • Shipyard’s data management platform is built with native code support and requires no proprietary configs to use
  • Multiple pipeline formats are available out of the box, and reusable snippets allow users to keep changes and updates in sync


Pros:

  • Modular workflows can be quickly created and scheduled
  • Enables reusable no-code templates
  • Cloud-native infrastructure
  • Native support for Python, Bash, and SQL scripts
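The kind of plain Python task that native code support can run unchanged might look like the sketch below. The task itself is an invented toy; the point is that it needs no proprietary wrapper or config:

```python
import csv
import io

def run(raw_csv):
    """A plain-Python task: drop zero-quantity rows from a CSV extract.
    An ordinary script like this can run on a platform with native code
    support without any platform-specific boilerplate."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [row for row in reader if int(row["qty"]) > 0]

if __name__ == "__main__":
    sample = "sku,qty\nA,3\nB,0\n"
    print(run(sample))
```

Because the script is standard Python top to bottom, it remains testable and portable outside whatever orchestrator schedules it.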


Cons:

  • Data must pass through Shipyard’s platform, which may not align with organizations that require their data to remain within their own cloud environment at all times.


Pricing:

  • Shipyard’s introductory Developer tier is free and includes the following:
      • One user seat
      • Up to 10 hours of runtime per month
      • 72 hours of logs
      • Access to Shipyard’s blueprint library
      • Unlimited projects, fleets, and vessels

A Team tier begins at $50 USD per month and includes the following:

  • Everything in the free tier
  • Unlimited usage
  • 30 days of logs
  • Run times of up to 4 hours for jobs
  • Version control
  • Webhook triggers
  • API access

Looking for a place to start?

Data transformation tools are nothing less than essential for data-reliant teams, and a growing market makes it easier than ever to find the right fit for your business (and your data teams). Just remember to keep your list of business factors handy as you browse, so the right fit also becomes a wise investment.

However, if you just need a place to start, our Developer plan is free (and always will be). Sign up today and you can be transforming and refining your datasets in workflows within 10 minutes, reducing downtime and speeding up business processes, all without needing a credit card.