Data and automation – either topic on its own, given its breadth, can fuel endless hours of discussion, brainstorming and eureka moments. The world increasingly revolves around these broad terms, and there are different ways to approach either one. The less savvy may already be using them without realising it. Those with a more scientific approach to their routines – seeking greater efficiency, a better ROI or improvement in other measures of success – will feel compelled to tap into both to improve what they already do.
For the sake of this article, let’s combine the two. We will expand on them, but let’s focus on the basics. Not too long ago, my passion for marketing led me into data. From there, the need to build a closer relationship with users made me question how to decode their behaviour at scale.
Was that the first time automation crossed my mind? No, but it does make the story sound nicer. Just like the least savvy of us, I was already doing it, but without seeing the bigger picture, without connecting all the dots or building a proper foundation on which to deliver better. Still, I was on the right path; I had taken the first step. With the sheer volume of profiles, metrics and data – data that was readily available through analytics or other types of platforms – you can find yourself helplessly trying to connect or make sense of it all.
Yet connecting those different points and making sense of it all across an ever more fragmented ecosystem has always been a must.
The truth is that wherever you sit in this three-way match – on the tech, buy or sell side – data is neither standardised nor normalised. It does not follow common naming conventions across the ever-expanding list of ad and marketing technology providers. Investing time in thinking ahead and planning what you want to achieve must start from the basics: the easy, quick wins. You will soon empathise with Neil Armstrong’s quote about small steps, even if in a much smaller context. The more significant leaps will follow.
As you start seeing the correlations between processes and the need for a non-siloed view of everything related to zero-, first-, second- or third-party data or platforms, you will find the need to streamline more processes through automation. For the savvier but niche group of us, understanding the full scope – mainly the vision that eventually becomes objectives – is crucial. I usually use simple real-life scenarios to illustrate processes or to discuss use cases; this time around will be no different.
If you followed the traditional marketing career path like me, we have a common point in our journey: countless hours worked on the buy side, namely at agencies. So let’s talk about one of the most common tasks, reporting. I am sure it also made you wish there was a better way to go about it. Automation, anyone?
I remember spending hours going through report extractions from different platforms that were then cleaned and normalised so they were somewhat ready for slicing and dicing data with our old best friend Excel.
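That clean-and-normalise step can be sketched in a few lines of Python. This is a minimal illustration, not any vendor’s actual schema: the platform names, column headings and mappings below are all hypothetical stand-ins for the kind of mismatched export formats described above.

```python
# Raw rows as they might arrive from two hypothetical platform exports,
# each with its own column names and number formatting.
platform_a_rows = [
    {"Campaign Name": "Spring Sale", "Impr.": "1,200", "Spend (GBP)": "45.50"},
]
platform_b_rows = [
    {"campaign": "Spring Sale", "impressions": "980", "cost": "39.90"},
]

# Per-platform mapping onto one common schema (illustrative only).
SCHEMAS = {
    "platform_a": {"Campaign Name": "campaign", "Impr.": "impressions", "Spend (GBP)": "spend"},
    "platform_b": {"campaign": "campaign", "impressions": "impressions", "cost": "spend"},
}

def normalise(rows, platform):
    """Rename columns to the common schema and coerce numeric strings."""
    mapping = SCHEMAS[platform]
    out = []
    for row in rows:
        clean = {mapping[k]: v for k, v in row.items() if k in mapping}
        clean["impressions"] = int(clean["impressions"].replace(",", ""))
        clean["spend"] = float(clean["spend"])
        clean["source"] = platform  # keep provenance for later slicing
        out.append(clean)
    return out

combined = normalise(platform_a_rows, "platform_a") + normalise(platform_b_rows, "platform_b")
```

Once every source lands in the same shape, the slicing and dicing no longer needs a human with Excel in the loop.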
One of the follow-up steps would be to go about connecting media results from different sources, sometimes going through an analytics platform to generate insights on post-engagement optimisation, its correlation with ROI and – most importantly – the causation of such.
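The connecting step is essentially a join: media rows keyed by campaign and source, matched against revenue pulled from an analytics platform. A minimal sketch, with entirely hypothetical figures and keys:

```python
# Normalised media spend rows (as produced by an earlier cleaning step).
media = [
    {"campaign": "Spring Sale", "source": "platform_a", "spend": 45.50},
    {"campaign": "Spring Sale", "source": "platform_b", "spend": 39.90},
]

# Revenue by (campaign, source), as it might come from an analytics export.
analytics = {
    ("Spring Sale", "platform_a"): 120.0,
    ("Spring Sale", "platform_b"): 60.0,
}

def with_roi(media_rows, revenue_by_key):
    """Attach revenue and a simple revenue/spend ROI to each media row."""
    out = []
    for row in media_rows:
        revenue = revenue_by_key.get((row["campaign"], row["source"]), 0.0)
        roi = round(revenue / row["spend"], 2) if row["spend"] else None
        out.append({**row, "revenue": revenue, "roi": roi})
    return out

report = with_roi(media, analytics)
```

Note that this surfaces correlation only; attributing causation still needs controlled tests, not a join.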
Time was an issue. Reporting insights is one of the most critical steps of delivery, and it is no coincidence that I have chosen it as the primary example. Still, life in our vertical didn’t allow us to give that reporting the work it deserved. On top of that, going in-depth on the plethora of clients considered long-tail wasn’t an option, and because it wasn’t, the agency as a business could be overlooking potential fee increases tied to media spend.
And so the need for automation becomes clear. The way forward was standardising the naming convention across different trafficked lines, creating a taxonomy so that non-normalised fields could be easily ingested through APIs (rather than report extractions) and processed, and linking analytics to read the source-specific results. Easier said than done. Here is where having a clear vision and understanding the bigger picture – to be explained for internal and external stakeholder buy-in – proves to be a step that must be executed flawlessly.
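A naming convention only pays off if something can parse it reliably. The sketch below assumes a hypothetical underscore-delimited taxonomy (market_channel_audience_objective – the field names are mine, not a standard) and flags legacy lines that don’t comply, which is usually the bulk of the early work:

```python
# Hypothetical taxonomy for trafficked line names; real conventions will
# have more fields (dates, formats, buy types) agreed with stakeholders.
TAXONOMY = ["market", "channel", "audience", "objective"]

def parse_line_name(name, sep="_"):
    """Split a trafficked line name into taxonomy fields; flag non-compliant names."""
    parts = name.split(sep)
    if len(parts) != len(TAXONOMY):
        # Legacy or hand-typed names get surfaced for manual clean-up.
        return {"raw": name, "compliant": False}
    record = dict(zip(TAXONOMY, parts))
    record["compliant"] = True
    return record

parsed = parse_line_name("UK_Search_Prospecting_Conversions")
bad = parse_line_name("legacy-line-name")
```

With every line name machine-readable, API-pulled rows can be classified on ingestion instead of being cleaned by hand after extraction.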
Showcasing a proof of concept (POC) aligned with direct and indirect commercial benefits in terms of human resources and hours, company image and efficiency (among other points) will help you. You will be managing above and below you in terms of expectations and (mainly functional) consequences. Managing upwards and downwards is a skill to be mastered and used, always. During the agreement on moving the POC ahead, it is imperative to have the involved stakeholders understand the short-, mid- and long-term objectives and results, and also what is needed to achieve them.
Illustrating the benefits is easy enough, as they follow from the definition of data automation itself. The gain is not only better use of time but also higher quality of delivery. Those two to four man-hours previously spent producing a below-par report with incomplete insights can now be used to change things for the better.
Automating data ingestion, classification and analysis will, in the short term, enable you and your team to effectively analyse and process larger data sets, increase speed and quality of delivery and minimise human errors and interaction in a not-too-nice task.
Mainly, you will allow human ingenuity and imagination to perform where they should, surfacing valuable and actionable insights to guide data-driven decision-making by properly using the most valuable currency there is: time. In a non-poetic sense, it will also ultimately free your team to outperform and work happier instead of smashing keyboards. Machines can do that work for you, without the smashing.
Start sooner rather than later, and tell me where you get to.