When you’ve worked in any industry long enough, you start to notice patterns. Ideally, those are positive patterns, like best practices and proven approaches. On the other end of the spectrum, there are negative patterns as well (sometimes called anti-patterns), like common misunderstandings or antiquated solutions.
On any normal day, I personally prefer to write about the positive sides of the analytics industry. There are many things to be excited about, whether it is new solutions rising to popularity or making the most out of existing tools. Today’s post is going to be a bit of a deviation from that, but not without showing some ways to get into a better position towards the end.
Today I want to talk about one of the most common less-than-best practices in analytics: the urge of companies, teams, and individuals to build custom, internal tools. Ask around in the community and it is hard to find a company that is not relying on individuals to maintain a tool that was built years ago (sometimes by team members who have since left) for a very specific task and has long since outgrown its original use case and user base.
For this post, I’m going to go through some of the reasons why those tools are built, how they grow into a substantial cost factor and risk for their companies, and what can be done instead. Of course, large parts are inspired by what I’ve seen from Accutics’ own customers since I joined the company, and by what I experienced myself on the customer side.
To start, let’s first take a look at...
The temptation to “just build it yourself”
Even in today’s market, with analytics tools like Adobe Analytics or Customer Journey Analytics becoming more and more end-user friendly and less and less technical, digital analytics remains a somewhat technical field. We are constantly dealing with JavaScript in the frontend, eVar expiration and attribution models in the backend, and sometimes even SQL or Python for companies who don’t get Analysis Workspace. Our daily lives can be pretty complicated.
As a natural result of that, we like to staff our teams with people from a technical background. Common ways into our industry include experience in web development, statistics, programming, and other highly specialized areas. We generally like complex environments, figuring out how things work, and finding solutions through everything we bring to the table. And that’s good!
With those common backgrounds and teams commonly featuring skills like this, it’s no surprise that many analysts and analyst-adjacent team members know how to program and enjoy doing it. Practically everyone has built a small website once, built or adapted a small script for some routine job, or worked with the APIs of an analytics tool. That’s good, too!
Where it then starts to get less good is when we let our previous experiences and skills spill into other, less familiar areas. Even though we built a website once, building a technical tool for others to use is an entirely different game. Using an API and coding a script to automate a task is brilliant, but building a large-scale platform to automate diverse tasks for business-critical operations is very much not the same. And while it can be tempting to embark on a skill-expanding adventure every now and then, the ongoing need for support and new features creates quite a bit of stress to keep up with demand, often resulting in substantial follow-up investments and/or frustrated users. That’s less good!
I’ve gone on record in the past complaining about the constant temptation of adding complexity to our daily work lives and companies. Given that we are commonly surfing the edge between boredom and being overwhelmed by new intellectual adventures, I have heard about escalating pet projects and subsequently frustrated stakeholders a few too many times. Considering that working in digital can be even more overwhelming for our business partners, we should remind ourselves to always strive for a reduction of everyday complexity rather than adding to it.
On a way more business-related point, something that is very commonly underestimated is…
The surprisingly high cost of self-built tools
Since I’ve been through the Dunning-Kruger Curve on this topic myself, I want to discuss some of the false assumptions that might drive an effort to build something internally over just buying a solution on the market. One of the biggest misconceptions is that building tools is more affordable than buying. Let’s go over an example.
Marketing and analytics teams are commonly looking for ways to reliably track marketing traffic across channels, brands, and global teams. In the Adobe Analytics world, customers usually leverage Classifications of unique campaign tracking codes to provide metadata for campaigns. Using Classifications, marketing can use URLs like https://www.accutics.com/?cid=1_1234 to pass a tracking code to Adobe Analytics that is later translated into channel, campaign, and other information.
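To make the mechanism concrete, here is a minimal Python sketch of what a Classification conceptually does: a unique tracking code in the URL is looked up against a metadata table. The tracking code, column names, and metadata values are made up for illustration; in reality the lookup table lives inside Adobe Analytics, not in your own code.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical classification table: one row of metadata per unique tracking code.
CLASSIFICATIONS = {
    "1_1234": {"channel": "Paid Search", "campaign": "Spring Launch"},
}

def classify(url: str):
    """Extract the cid tracking code from a landing-page URL and
    return its classification metadata, if the code is known."""
    params = parse_qs(urlparse(url).query)
    code = params.get("cid", [None])[0]
    return CLASSIFICATIONS.get(code)

print(classify("https://www.accutics.com/?cid=1_1234"))
# {'channel': 'Paid Search', 'campaign': 'Spring Launch'}
```

The point of the design is that the URL only carries a short, opaque code, while all reporting dimensions are attached after the fact and can be corrected retroactively.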
In this setup, there are two critical tasks to be done:
- Globally unique campaign codes need to be created to reliably identify each and every creative across all channels and markets
- Campaign codes and their associated metadata need to be ingested into Adobe Analytics as soon as possible to ensure reliable reporting right from when the campaign launches
At the start of that process’s maturity journey, teams commonly rely on manual processes with the tools they have available. Excel is usually the first tool to be used, as it works well enough to collect metadata in a somewhat standardized way. Creating unique campaign codes, however, is very challenging, especially considering that the same Excel file might be used by marketers and agencies across channels, oftentimes copied and sent via email, and edited without proper change management.

After the information is collected, it is oftentimes up to the analytics team to manually convert the Excel file to a CSV file and ingest it into Adobe Analytics, creating delays and even prioritization conflicts. If the analytics team is heavily utilized or hit by the aftermath of a recent Christmas party, marketers are met with the horrible choice of either delaying a campaign launch or risking that traffic is not tracked correctly, compromising the perceived value of their channel. In a worst-case scenario, the person who originally built the Excel sheet might leave the company, leaving everyone with the choice between rebuilding it from scratch or trying to make sense of what has been left behind. “There must be a better way for this, I bet I can build something!” is a common next thought from that one team member who has built a website before.
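That manual conversion step can be sketched in a few lines of Python. The column names and rows below are illustrative assumptions, and the real Adobe Analytics classification import format has its own header requirements; the sketch only shows the core chore of deduplicating codes and producing an upload-ready CSV.

```python
import csv
import io

# Rows as they might come out of the shared Excel file (illustrative values).
rows = [
    {"Tracking Code": "1_1234", "Channel": "Paid Search", "Campaign": "Spring Launch"},
    {"Tracking Code": "2_0007", "Channel": "Email", "Campaign": "Newsletter"},
    {"Tracking Code": "1_1234", "Channel": "Paid Search", "Campaign": "Spring Launch"},
]

def to_import_csv(rows):
    """Deduplicate on the tracking code and emit a CSV string for upload.
    Raises if the same code appears with conflicting metadata."""
    seen = {}
    for row in rows:
        code = row["Tracking Code"]
        if code in seen and seen[code] != row:
            raise ValueError(f"Conflicting metadata for {code}")
        seen[code] = row
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["Tracking Code", "Channel", "Campaign"])
    writer.writeheader()
    writer.writerows(seen.values())
    return buf.getvalue()
```

Even this toy version hints at the real problem: as soon as two people enter conflicting metadata for the same code, a human has to step in and arbitrate.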
Now, let’s say it takes that team member a month or two to build a simple tool that lets stakeholders create campaign codes on their own. While that time estimate is rather optimistic, the initiative will be met with much enthusiasm from the previously bottlenecked stakeholders. They might even contribute some budget to get the issue solved once and for all. And while the resulting tool might get the job done well enough initially, it quickly starts showing its limitations. Tools like this are usually overly technical, cumbersome to use, and don’t meet compliance or accessibility requirements. Even worse, once the tool is to be made available to a larger audience over the internet, a whole new list of requirements from IT, security, and legal starts hammering on the poor team members who were just trying to be helpful. The added complexity leads to slower updates, long processes, and often outages that hinder the marketing teams just as much as the manual process did in the past.
At this point, some might consider handing over the project to IT. Not only is it hard to communicate the exact requirements, feature ideas, and involved APIs to a less-involved developer; the delays and even more cumbersome access processes usually lead to an even lower adoption rate than the less-professional-but-working-fine version had. And if a higher support level is needed, the cost of having IT resources on standby can be substantial.
Let’s try and put some numbers behind this endeavor. Of course, the numbers may vary depending on your exact setup, but the direction should be accurate enough for our discussion today. Here’s what we can assume:
- With the initial, manual process, it’s not uncommon to see analytics team members spending a day per week on managing classification imports, helping troubleshoot errors in the Excel files, and explaining the process and requirements to involved stakeholders. That’s 20% of their time on average (assuming a single person handles the process and never takes time off) for one team member, who earns around 70,000$ per year in London. That’s 14,000$ in cost already.
- With the manual process in place, marketing is oftentimes faced with delays in their operations or decision making, as well as the consequences of suboptimal marketing spend due to traffic misattributed through wrong or missing campaign codes. Even with a small yearly marketing budget of 1,000,000$, a mere 10% loss in performance or added complexity quickly adds up to 100,000$ of wasted budget. It’s crazy how even small inefficiencies drive up the cost as soon as big budgets are involved.
- Building the first iteration of an internal tool for two months adds another ~12,000$ to our bill, even ignoring the next iterations, bug fixes, and feature enhancements. With the constant need for monitoring, hosting, and troubleshooting, this point could quickly become even more expensive.
- As a last resort, the added “professionality” of involving internal IT will quickly drive our bill even higher. Depending on the initial scope, long-term vision, and available resources, the cost could easily go up into the millions of dollars. We’re going to stay in the lower range for this post, assuming a relatively affordable 50,000$.
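Tallying the assumptions above is a quick back-of-the-envelope calculation. One plausible way to count it, assuming two years of the manual process, one year of misattributed spend, one internal build, and one IT handover:

```python
# Back-of-the-envelope totals from the assumptions above (all figures in USD).
analyst_salary = 70_000
manual_process_share = 0.20                                 # one day per week
manual_process_cost = analyst_salary * manual_process_share  # 14,000 per year

marketing_budget = 1_000_000
inefficiency = 0.10
wasted_spend = marketing_budget * inefficiency               # 100,000

initial_build = 12_000                                       # ~2 months of one developer
it_takeover = 50_000                                         # conservative estimate

total = 2 * manual_process_cost + wasted_spend + initial_build + it_takeover
print(f"${total:,.0f}")
# $190,000
```

Shift any of these assumptions slightly, say a second year of wasted spend, and the total crosses well past 200,000$; the exact number matters less than how quickly the items stack up.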
That’s a lot of money! Over one or two years, a company climbing up the maturity ladder would spend close to 200,000$ on the process of managing Adobe Analytics Classifications with a manual process and later some internal tools. Given the many stakeholders involved and the many decisions made along the line, it will be exceedingly difficult for any single individual in the chain to consider the whole undertaking and see the high overall cost for the company.
Of course, this article would be rather depressing if I only pointed you to the fact that...