In April 2022, I had the opportunity to sit down with Pierre Brunelle from Noteable to talk about data-driven cultures and using data effectively. What follows is an excerpt of the core question: How can you effectively leverage data to take your company to the next level? Read the full interview with Authority Magazine.
1. Centralize Your Data, Then Get Every Tool In Sync
Once you pick a cloud data warehouse to store your data in, your first mission should be to dump raw data there from every tool and system your company uses. Once the raw data exists in a central location, your next step is to sync that data into the SaaS tools you rely on (like Salesforce, Hubspot, Mailchimp, Intercom, etc.). You can accomplish this with various data orchestration or reverse ETL tools.
The goal is to make sure that any data your company generates is accessible in every tool that the business is using. It’s unreasonable for every team to be able to query the warehouse directly, so you have to meet them where they’re at. This setup helps get every team on the same page about customer touchpoints and encourages collaborative thought on what data points can drive the business forward.
For example, you want your sales team to know about marketing touchpoints that a customer has had. You want your support team to know about product interactions a customer has had. You want your marketing team to know about support conversations a customer has had.
At Shipyard, we take all our product-generated data and sync it to our existing sales and marketing tools. This helps us better understand someone’s usage of the platform and the integrations they rely on so we can tailor communication to their unique needs.
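As a rough sketch of what that sync looks like under the hood, the snippet below pulls product-usage rows out of a warehouse table and shapes them into CRM update payloads. The table name, columns, and payload fields are all hypothetical, and a real reverse ETL tool would handle the mapping, batching, and retries for you.

```python
import sqlite3


def build_crm_payloads(conn: sqlite3.Connection) -> list[dict]:
    """Shape warehouse rows into CRM contact-update payloads.

    The `product_usage` table and the payload fields are illustrative,
    not a real schema; a reverse ETL tool does this mapping for you.
    """
    rows = conn.execute(
        "SELECT email, weekly_logins, top_integration FROM product_usage"
    ).fetchall()
    return [
        {
            "email": email,
            "custom_fields": {
                "weekly_logins": logins,
                "top_integration": integration,
            },
        }
        for email, logins, integration in rows
    ]
```

Each payload would then be pushed to the destination tool's contact-update endpoint, ideally in batches rather than one request per row.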
2. Build Real-Time Alerts on the Good and the Bad
When you start storing your data from every SaaS tool that your business uses, you gain the unique opportunity to build job-enhancing alerts into the core of your business. You’ll want to build alerts that look for positive events that should be quickly acted on and negative events that indicate something could be wrong and should be investigated. It’s important to focus on the actionability of an alert. If you can’t do anything about it immediately, don’t create an alert, lest you end up with notification overload.
On the positive side of things, you could use event tracking tools to determine which users are visiting your website. Anytime a prospect with a known email visits your pricing page, you could create an alert to notify the sales team. They can then prioritize conversations with leads that are likely close to converting. Going a step further, you could eventually automate the outreach on behalf of sales!
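To make the positive alert concrete, here is a minimal sketch that filters an event-tracking export down to known prospects hitting the pricing page. The event fields and the `/pricing` path are assumptions, not a specific tracking tool's schema.

```python
def pricing_page_alerts(events: list[dict], known_emails: set[str]) -> list[dict]:
    """Return an alert payload for each known prospect who viewed pricing.

    `events` rows look like {"email": ..., "page": ...}; the field names
    are illustrative. In practice each result would be posted to Slack
    or turned into a CRM task for the sales team.
    """
    return [
        {"email": e["email"], "page": e["page"]}
        for e in events
        if e["page"] == "/pricing" and e["email"] in known_emails
    ]
```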
On the negative end of the spectrum, you could check your payment processing data every day to see whether any subscriptions were invoiced in the last 24 hours, firing an alert when none were. You would almost never expect this alert to fire, but in the rare event that it did, you would know about payment issues almost immediately, resulting in a quicker resolution.
When you start making custom alerts foundational to your organization, you can start building out team processes around them. Which tasks can our team build alerts for? Who is responsible for which alerts? How can we claim the alert so two people don’t address it at the same time? These types of questions greatly accelerate the standardization of work while increasing the output of a team over time.
3. Prioritize Automated Self-Service Reporting
This may seem like the obvious choice, but that doesn’t make it any less powerful. Pulling reports is the bane of most teams’ existence. Hours get spent pulling data from multiple sources so it can be compiled in an Excel sheet or a PowerPoint. As a business, you should focus on how you can turn the time spent pulling reports into time spent analyzing reports more strategically.
Better yet, ditch Excel and train your team to build self-service dashboards for reporting. Not only are they better for pulling in up-to-date data automatically, which reduces the overall manual labor, but they also allow the people closest to the data to explore and gather new insights. Too many businesses put all the dashboarding work on a team that doesn’t understand the ins and outs of what the data represents.
At Shipyard, we make all our data available to Metabase (but any BI dashboarding tool will do). Here, analysts and business users alike can “ask questions” of their data with SQL or with a drag-and-drop builder. These questions can then be published and discovered by anyone on the team. Any user can compile these questions into daily and weekly dashboards that refresh on a schedule and are easily accessible to the rest of the team. As more of the team builds out questions, we gain better discoverability into our data over time.
4. Automate Decisions First. Save Machine Learning for Later.
A lot of organizations jump the gun and try to use their data to solve the most complex problems their organization has with AI and machine learning. However, most data scientists will tell you that it takes a lot of work and experimentation just to move the needle a few percentage points. Even then, it’s not guaranteed that the effort will continue to drive improved results.
Instead of jumping straight to the finish line, take a step back and figure out what tasks people at your organization are doing on a frequent basis that are taking the most time. Make a list of these tasks and identify what data sources are being looked at to achieve the task. Determine which of those data sources can be programmatically exported and stored. Of those that meet the criteria, have your data team focus on automating the tasks with the highest number of hours spent.
For example, your marketing managers may spend a few hours each week sorting through search term reports to find new keywords that they don’t want ads to show against due to low performance. Rather than having them sift line-by-line in the UI or Excel, you could instead store this data in your warehouse and create a query that filters down to any search term that falls below a specific performance threshold. The results of this query can then be added directly to your account’s negative keyword list.
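The filtering step is simple enough to sketch. The field names and thresholds below are illustrative and not tied to any specific ads platform's report format:

```python
def find_negative_keywords(search_terms: list[dict],
                           min_clicks: int = 50,
                           max_conv_rate: float = 0.005) -> list[str]:
    """Flag low-performing search terms for the negative keyword list.

    Rows look like {"term": ..., "clicks": ..., "conversions": ...};
    the field names and thresholds are illustrative placeholders.
    """
    negatives = []
    for row in search_terms:
        if row["clicks"] < min_clicks:
            continue  # too little data to judge performance yet
        if row["conversions"] / row["clicks"] < max_conv_rate:
            negatives.append(row["term"])
    return negatives
```

The returned terms would then be appended to the account's negative keyword list through whatever upload mechanism the ads platform provides.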
Or let’s say that your HR team needs to send follow-up messages to all hourly employees that have yet to submit their timesheets by the end of each week. You would need to access a dataset of employees (filtered to those that are hourly) and a dataset of timesheets (filtered to those submitted in the last week). Thinking of the problem from a data perspective, this is a left outer join between the two datasets where you keep only the employees with no matching timesheet (an anti-join), returning all employees that have yet to submit one. This list of employees could then be looped through, with a message sent to each.
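Sketched with plain dictionaries (the column names are made up for illustration), the join-and-filter step looks like this:

```python
def missing_timesheets(employees: list[dict], timesheets: list[dict]) -> list[dict]:
    """Hourly employees with no timesheet submitted this week.

    This is the join described above reduced to its essence: keep the
    employee rows that have no match on the timesheet side (an anti-join).
    """
    submitted = {t["employee_id"] for t in timesheets}
    return [
        e for e in employees
        if e["type"] == "hourly" and e["id"] not in submitted
    ]
```

Each returned employee would then get the automated follow-up message.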
A lot of teams get caught up in using data to make more efficient or effective decisions, but the simple act of automating a business process with no improvements to the logic can greatly save on employee expenses, freeing up time to focus on higher value initiatives. While you may think of automation as an engineering problem, 90% of the work is a data problem.
5. Track Data Usage with Metadata
This may sound counterintuitive at first. How does tracking data about my data help take my company to the next level? This is one of the most overlooked steps in the data process that determines if your data is useful or not. Most companies focus so much on getting all the data available that they forget to take a step back and verify which data is driving the most value and who is responsible for that value.
When I was at PMG, we started out with ad hoc datasets that anyone could set up themselves, but these got wildly out of hand. This resulted in us developing a set of standardized datasets for every client. Every table would have the same naming structure and the same schema, guaranteed. We built out a process to analyze every query that occurred against the entire database. Who ran the query? What clients did they serve? Was the query run in a SQL editor, Excel, Tableau, or through an automated script?
Our mission was to use this metadata to guide our team’s limited hours of work. If a team was constantly using the standardized data, we could work with them to create internal case studies to share with other teams to increase cross-client collaboration. If a team was overutilizing the ad hoc data, we could have conversations or training sessions with that group to better meet their needs. Metadata could help us identify our power users and points of contact for feedback along the way. Armed with this information, we were able to decrease reliance on ad hoc datasets down to 20% over the course of multiple years.
Metadata can also help you isolate the tables, views, or columns that nobody is accessing. If some data is rarely being used, you need to determine if it’s worth supporting. If some columns go unused, you should assess if that data is actually driving business value. Sure, data storage is cheap, but it’s not free.
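A crude version of that usage tracking can be bootstrapped from a raw query log. This is only a sketch; most warehouses expose proper access history views you should use instead, and the word-boundary matching here is deliberately naive:

```python
import re
from collections import Counter


def table_usage(query_log: list[str], known_tables: list[str]):
    """Count how often each known table appears in logged SQL text.

    Returns (per-table usage counts, set of tables never queried).
    Naive substring matching on raw SQL; real warehouses expose parsed
    access logs that are far more reliable than this sketch.
    """
    counts = Counter()
    for sql in query_log:
        lowered = sql.lower()
        for table in known_tables:
            if re.search(rf"\b{re.escape(table.lower())}\b", lowered):
                counts[table] += 1
    unused = {t for t in known_tables if counts[t] == 0}
    return counts, unused
```

The `unused` set is the starting point for the "is this worth supporting?" conversation, and the counts identify your power users' favorite tables.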
Ready to start taking your data to the next level? Shipyard can help in that journey. With the ability to build powerful data workflows that extract, transform, and load data from your warehouse into any other tool, you can quickly automate any business process. Get started today with our free Developer Plan.
Don't know where to start? Schedule a time with our data experts! We're happy to learn about your unique challenges and figure out how you can start driving action with your data.