When someone talks about “long tail” strategy, you probably think it begins and ends with search engine marketing. But it should be a part of your email campaign planning as well.
If you don’t track your results long after your campaign ends, you’re more likely to under-report your successes! With all the pressure on email marketing to produce results, that could be a major omission.
It’s easy to see why search dominates the long-tail concept, though. The term entered the marketing lexicon in 2004 when author Chris Anderson published “The Long Tail: Why the Future of Business Is Selling Less of More,” about the profit potential in selling less-popular goods with longer shelf lives instead of high-demand, high-cost, impulse-driven merchandise.
Anderson applied the long-tail concept to search marketing in 2006. He argued that less-popular but more specific keywords in both paid and organic search could be more effective than generic or expensive, high-traffic keywords. They might get fewer hits, but they’re more likely to lead searchers to the right results faster, he said.
The long tail of email marketing benefits attribution, too
Email campaigns have always had long tails, although we typically haven’t framed the discussion that way.
The email long tail comes into play when someone acts on your email days, weeks or even months after you send it. It doesn’t matter whether your subscriber opened your email the day you sent it and then went on to something else or let it sit unopened, just waiting for serendipity to strike.
This long tail is one of email’s many benefits, but you probably aren’t accounting for it right now because your campaign reporting window closes too soon. That does a disservice to your email program because it’s likely not getting all the credit it should for driving revenue, engagement and value for your company. Without this credit, your attribution and, in turn, your budget could suffer.
Dig deeper: Email marketing strategy: A marketer’s guide
Why we overlook email’s long tail
It all has to do with the way we report on campaign results. We typically look only at the initial reporting period and then move on to the next campaign. For many understaffed and overworked email teams, there’s only enough time for one glance at the reports before moving on to the next.
Another problem: The reporting period often is set arbitrarily based on how often you send emails, instead of data showing how long it takes for recipients to respond.
However, conversions and revenue happen over a long period, thanks in part to email’s “nudge effect,” which can inspire a customer to open an email campaign weeks after it landed in the inbox.
Low-commitment actions like downloads or time-sensitive actions like registrations will happen quickly. When we’re talking about spending money, long-consideration purchases or date-based bookings, conversions can take longer.
The problem is that we don’t continue to measure performance over the long term. Often, any activity that happens after the reporting window closes — say, a week to a month or even more after the campaign goes out — doesn’t get counted. That’s what I mean when I say our campaigns could have performed much better than our official reports show.
New research from the Data & Marketing Association's Consumer Email Tracker 2023 highlights the need to keep revisiting campaign performance. Here's one finding: 19% of consumers save their discount, offer or sale emails for a later date. They're interested, but now isn't the right time to act.
What you gain when you revisit campaigns
When did you last revisit your results long after the reporting window closed? I wager you will find email activity — opens, clicks and likely conversions. They won’t be as significant in volume as the activity your campaign creates in your initial reporting window, but they all contribute to your campaign performance.
We’ve found that if someone has saved an email to act on it later, their intent is higher; therefore, the conversion rate is higher than in the initial reporting window.
The result is campaign metrics that are likely more positive than you initially thought, especially once conversions are accounted for.
Dig deeper: 7 key email metrics to track beyond opens and clicks
Case study: Finding the right reporting window to measure success
This case study from work I did for a client reveals the need to include the long-tail results in your reports.
Here’s what we found:
Our client sends campaigns twice weekly and tracks them over four days. Using Google Analytics, we delved into their reports and found that a single campaign was under-recording its email-driven revenue by 128%.
This wasn't an anomaly. After checking multiple campaigns, we found similar results across the rest of their program.
Using this data, we were able to define the appropriate reporting period, which resulted in increased attribution to email and an increased budget.
- Original tracking window: March 8 – March 11
  - 114 transactions
  - 1,294 web visits
  - 9% conversion rate
  - £8,326 revenue
- Expanded tracking window: March 8 – May 31
  - 303 transactions
  - 2,317 web visits
  - 13% conversion rate
  - £19,022.30 revenue
We were astonished to discover how many customers who visited the website from this email in April showed high intent. Although the number of visits was lower than the previous 23-day period, their conversion rate was a whopping 37%.
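The uplift in those figures is simple arithmetic, and it's worth sanity-checking your own numbers the same way. Here is a minimal sketch using the case-study values above; the helper name is illustrative, not from any reporting tool:

```python
def conversion_rate(transactions, visits):
    """Conversion rate as a percentage of web visits."""
    return 100 * transactions / visits

# Original window: March 8 - March 11
orig_tx, orig_visits, orig_rev = 114, 1294, 8326.00
# Expanded window: March 8 - May 31
full_tx, full_visits, full_rev = 303, 2317, 19022.30

print(f"Original CR: {conversion_rate(orig_tx, orig_visits):.0f}%")    # ~9%
print(f"Expanded CR: {conversion_rate(full_tx, full_visits):.0f}%")    # ~13%

# Revenue that landed after the original window closed
extra_rev = full_rev - orig_rev
print(f"Unreported revenue: £{extra_rev:,.2f} "
      f"({100 * extra_rev / orig_rev:.0f}% more than first reported)")
```

Running this reproduces the 9% and 13% conversion rates and the 128% revenue gap between the two windows.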
5 steps to set up a long-tail research study
It’s easy to discover if you’re missing campaign activities, especially conversions and revenue. Follow my 5-step procedure for setting up a useful tracking study:
1. Ensure your analytics software is set up correctly and all your email campaigns and programs are tagged to track email success.
2. Check your dashboard: In your Google Analytics dashboard, choose 15 to 20 campaigns that you sent out within the last year.
3. Look for campaign activity: For each campaign, review all activity up to the present day or until nothing new registers. Note how many weeks pass before activity stops. This will give you an idea of your long-tail potential.
4. Expand your monthly reporting: Don't stop at the regular reporting on that period's campaigns. Go back a month and pull updated metrics for those earlier campaigns as well. If you discover long-tail activity from that period, add a page to your new report with the refreshed metrics.
5. Find your cutoff: You will have to decide when to stop looking for activity. Remember, you need to recoup your investment in time, too. So, be realistic with this. But your long-tail research will help you find the right cutoff point.
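Step 3 above is straightforward to automate once you've exported conversion dates from your analytics tool. This is a minimal sketch under that assumption; the function names and sample dates are made up for illustration:

```python
from datetime import date

def weeks_after_send(send_date, event_dates):
    """Count events per whole week elapsed since the campaign went out."""
    buckets = {}
    for d in event_dates:
        week = (d - send_date).days // 7
        buckets[week] = buckets.get(week, 0) + 1
    return buckets

def tail_length(send_date, event_dates):
    """Last week (0-indexed) in which the campaign still saw activity."""
    return max((d - send_date).days // 7 for d in event_dates)

send = date(2023, 3, 8)
conversions = [date(2023, 3, 9), date(2023, 3, 20),
               date(2023, 4, 14), date(2023, 5, 28)]

print(weeks_after_send(send, conversions))  # activity in weeks 0, 1, 5, 11
print(f"Tail length: {tail_length(send, conversions)} weeks")
```

Run this across your 15 to 20 sampled campaigns and the distribution of tail lengths will point you toward a sensible cutoff.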
Implications for A/B testing and long-tail conversions
In a typical A/B testing scenario, you send a control message to 10% of the list, the variant to another 10% and the winning version to the remaining 80%. However, the winner could change once you factor in the long tail.
This standard A/B testing procedure works well enough for testing using opens and possibly clicks as your success metrics, but not for conversions. This is because it doesn’t account for conversions that happen long after the short 2- to 3-hour testing period.
To account for this, do a 50-50 split instead: record the initial result, keep updating it over time, judge the winner on conversions rather than opens, and set the test period to match your campaign's active period as shown in your analytics program.
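The effect of the evaluation window on the verdict can be sketched in a few lines. This assumes a 50-50 split and a conversion log with days-since-send per conversion; the field layout and figures are hypothetical:

```python
from collections import Counter

def pick_winner(conversion_log, cutoff_days):
    """Tally conversions per variant that landed within cutoff_days of send."""
    tally = Counter()
    for variant, days_after_send in conversion_log:
        if days_after_send <= cutoff_days:
            tally[variant] += 1
    return tally

# (variant, days between send and conversion) -- illustrative data
log = [("A", 0), ("B", 0), ("A", 1), ("B", 12), ("B", 30), ("B", 45)]

print(pick_winner(log, cutoff_days=1))   # short window: A leads 2-1
print(pick_winner(log, cutoff_days=60))  # long tail counted: B leads 4-2
```

The same log produces opposite winners depending on the cutoff, which is exactly why a 2- to 3-hour test window can crown the wrong version.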
Dig deeper: 7 common problems that derail A/B/n email testing success
Different campaigns, different tail lengths
As I noted earlier, some campaigns, brands or products likely don’t have long tails. A daily flash sale or hard-deadline campaign might generate some opens and clicks but no conversions. A season-opening cruise-line campaign could have a much longer active period. But you won’t know until you start checking.
My prediction is you will find more unaccounted-for conversions than you expect. Collecting and reporting that data will give you a truer picture of your email program's actual performance and how it contributes to your company's bottom line, attribution and budget.
Opinions expressed in this article are those of the guest author and not necessarily those of MarTech.