A PPC Case Study That's Actually Good
Published by Spinutech on October 3, 2016
You won’t find a decent PPC case study by searching in Google. The best you’ll find are poorly organized blog posts by agencies aiming for your business.
The good ones are hidden behind paywalls like Stack That Money and Aff Playbook. If you haven’t heard of those forums before, it's safe to say they won’t help you. The advertisers are running in industries you wouldn’t tell your mother about, and from traffic sources beyond AdWords, Bing, and Facebook.
Legitimate PPC case studies don’t exist in clean industries for a few reasons, some of which you may not want to hear.
Reason #1: The agency set it and forgot it.
This happens. It's unfortunate.
As a wise man once said, "That's all I have to say about that."
Reason #2: The campaign has insufficient funds or targeting.
Successful optimization of a PPC campaign requires statistically significant data. This gets bottlenecked by two things:
- Inadequate targeting
- Inadequate budget
If you don’t have a big enough audience or budget to get the data you need, it’s very difficult to make educated decisions.
Reason #3: Data is sacred.
Good case studies are hidden behind paywalls because quality data can make you very wealthy.
On that note, let me clarify what information I will and will not share in this post.
Information I WILL NOT Share
For privacy reasons I cannot share any of the following:
- The Specific Niche
- Account Structure
- Budget
- Bids
- Campaigns
- Ad groups
- Ads
- Keywords
- Geotargeting
Please respect that this is basically our Krabby Patty secret formula. I'm happy to answer questions outside of these things in the comments, but if it has to do with personally identifiable client information or the complete recipe to our secret sauce, I'll have to say, "No comment."
Information I WILL Share
Here's the good stuff:
- Traffic Source: AdWords
- Overarching Niche: Industrial
- Conversion Tracking: Lead Generation (Phone Calls & Form Submissions)
- Duration (So Far): Four Months
Of course, we'll also cover results and how those results came to be.
First Things First: Results
If you've been skimming up until this point, now is the time to pay attention. Here are the results after four months of campaign optimization.
- The total number of qualified leads (including branded) increased 85.71%
- The total number of qualified leads (excluding branded) increased 140.00%
- The average cost per lead (including branded) decreased 31.09%
- The average cost per lead (excluding branded) decreased 45.61%
- The overall average cost per click decreased 45.02%
Those are the facts. Admittedly, the lead increase percentages seem amplified because we're dealing with a relatively small sample size, but we're not selling pizzas. We're working with high margins. Think software contracts – it only takes a few good leads to see big ROI.
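If you want to sanity-check percentages like these yourself, the math is a one-liner. The lead counts and cost-per-lead values below are illustrative placeholders that happen to reproduce two of the figures above – the real numbers are confidential (see the do-not-share list).

```python
def pct_change(before, after):
    """Percentage change from a starting value to an ending value."""
    return (after - before) / before * 100

# Hypothetical lead counts chosen only to show the math;
# the client's actual figures are confidential.
leads_before, leads_after = 7, 13
print(f"{pct_change(leads_before, leads_after):.2f}%")  # → 85.71%

# Made-up cost-per-lead values, same idea.
cpl_before, cpl_after = 100.00, 68.91
print(f"{pct_change(cpl_before, cpl_after):.2f}%")      # → -31.09%
```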
That's it for results! The rest of this case study is primarily for those interested in the analytics. If you're an aspiring Excel wizard, data nerd, or number cruncher, keep on readin' on.
The Details: Setup
Everyone has their own preferences on how to set up and structure an AdWords account. This account could have gotten by with one campaign, but we chose to build out nine campaigns with a shared budget instead. By doing so we could easily segment the budgets at a later date, should campaign performance provide a reason to do so.
In addition, we also utilized the following ad extensions:
- Sitelink Extensions
- Call Extensions
- Callout Extensions
Like many agency agreements, we had a maximum ad spend but no target CPA. In layman's terms, this means "Generate as many leads as possible for under $X,XXX,XXX,XXX.XX" (except the budget might have been slightly less than a billion dollars).
This is an important distinction to note: In a perfect world, all campaigns would have unlimited ad spend as long as ROI remains positive.
Unfortunately, unlimited ad spend is a luxury reserved for a small number of e-commerce businesses and affiliates. The rest of us have to work with set marketing budgets. That's okay. I'm pretty sure Tolkien has a quote on the subject.
The final point I'd like to make before moving on is that I decided against using any third-party bidding tools (e.g. Acquisio or Marin). I truly believe that if you don’t know how to optimize a campaign without tools, you definitely won’t know what to do when – not if – they break.
Month #1: Growing Pains
Month #1 is always about collecting data. You need a solid two to three weeks of data collection before making educated decisions, and often longer if the budget is small. Since our budget wasn’t actually a billion dollars, we made mostly insignificant changes for the first month but gathered a lot of valuable information.
See for yourself.
Cool, right? So what does it mean?
1. Campaign #7 is highly competitive.
Although the image above doesn’t show any dollar signs, you can compare the Cost (%) column of campaign #7 with campaign #2 and campaign #5 to see that these three are the big spenders.
What really separates #7 from #2 and #5 is the number of clicks. We’re getting less than 25% of the volume, which means we’re paying more than four times as much per click. It’s expensive, and it’s not converting.
2. Our targeting is maxed out.
Take a look at the last three columns. Our search impression share (the number of impressions we received divided by the number of impressions we were eligible to receive) is around 75%. The other two columns explain why we aren’t capturing the other 25%.
By looking at the last column, we can see that less than 5% is due to our budget. This means we’re spending about as much money as we can, which is a problem because we’re only spending 70% of the client’s target budget. They want to make the most of their ad spend, and I want to help them do that.
The Search Lost IS (rank) column shows that 20% of our lost impression share is due to rank, or lack thereof. We can handle this a few different ways:
- Increase our bids. We can try to buy the #1 ad slot every time. This is usually a bad idea. Unless you have data that says the top slot converts better than other positions, you're going to blow through a lot of money really fast.
- Change the campaign delivery method from standard to accelerated. This tells Google to serve your ads as quickly as possible instead of pacing them throughout the day. Again, a bad idea unless you have data to back the decision.
- Improve relevancy / quality score. This means making sure our ads closely match the keywords we’re targeting, among other things. We'd get more clicks, and we’d also pay less for them. It’s a win-win – it just takes work.
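For anyone new to these columns: search impression share and the two "lost IS" columns should account for (roughly) all of your eligible impressions. Here's a quick sketch of that accounting, using the approximate figures quoted above rather than exact account data.

```python
# Search impression share accounting: the three columns in the
# AdWords report should sum to roughly 100%. Values below mirror
# the rough percentages in the text, not the actual account.
impressions_won = 750   # impressions our ads received (hypothetical)
eligible = 1000         # impressions we were eligible for (hypothetical)

search_is = impressions_won / eligible           # ~75%
lost_is_rank = 0.20                              # lost to Ad Rank
lost_is_budget = 1.0 - search_is - lost_is_rank  # remainder lost to budget

print(f"Search IS:        {search_is:.0%}")        # → 75%
print(f"Lost IS (rank):   {lost_is_rank:.0%}")     # → 20%
print(f"Lost IS (budget): {lost_is_budget:.0%}")   # → 5%
```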
3. Focus on Campaign #2 and campaign #5.
I know what you're thinking.
“But wait! What about campaign #3? That campaign is CRUSHING IT!”
That’s because it’s a brand campaign, targeting users already searching for the company. It’s supposed to be crushing it.
Aside from the brand campaign, campaigns #2 and #5 are the ones converting and driving substantial volume.
At the end of the month I sent the client a report detailing these key points. We had a phone call at the beginning of month #2 and decided on action items moving forward.
Month #2: Getting Better Leads
Ready for the changes? Here's what we did.
We Shut Down Campaign #7
This was a tough call. If you look back at the picture detailing each campaign’s performance, you can see that we only received 29 clicks. That is not a statistically significant sample, and it does not provide enough information to accurately determine whether the campaign was a success or a failure.
With that said, we spent a lot of money. The average cost per click was much higher than the other campaigns. The decision can be argued both ways, but we ultimately chose to shut it down.
We Increased Our Geotargeting
Pretty self-explanatory. Our targeting was nearly maxed out before shutting off campaign #7. The obvious solution was to cover more ground.
We Launched a New Campaign
In addition to covering more of the map, we launched a new campaign targeting more keywords. Our hope was for this campaign to help supplement the loss in spend from campaign #7.
We Optimized Campaign #2 & Campaign #5
By "optimized", I mean we set up Single Keyword Ad Groups. This takes a good chunk of time to do, so it might not be practical to build every account with SKAGs from the start. However, once you know where the conversions are, it's time to SKAG it up.
After implementing these changes we let the campaigns run for the rest of the month with minimal adjustments. Here are the results.
Notice any big differences? Let’s go over them.
1. The number of leads decreased... right?
What you can’t see here is that we spent much closer to our target budget (93% instead of 70%). If we’re spending more money, we should be getting more conversions, right?
Well, sort of. Look closely at which campaigns are converting and you’ll see that campaign #3 – the brand campaign – didn’t generate a single conversion. With only 22 clicks, it’s hard to determine why. What we do know is that it’s not entirely fair to put the brand campaign in the same category as the others. Brand campaigns are expected to convert. The users clicking on those ads are already familiar with the business.
Simply put, we got fewer conversions, but we got harder-to-get conversions. That's a win.
2. Campaign #10 is a gift from the heavens.
Profitable campaigns aren’t launched. They’re optimized.
Sometimes you get lucky and launch a campaign like this. That’s what happened here. We got lucky.
That’s about it for big changes in month #2. The client and I discussed overall performance and developed a plan for month #3.
Month #3: Trimming the Fat
You’ve probably gathered how this works by now. Here come the changes.
We Shut Down More Than Half of the Campaigns
If the campaign hadn’t delivered a lead in the past two months, we shut it down. Many of the campaigns weren’t driving enough volume to receive statistically significant data. They weren’t necessarily hurting anything, but managing them took time away from managing campaigns that were converting.
We Changed the Landing Pages
With campaign #10 we had tried something different from the other campaigns. Rather than send users to individualized landing pages, we sent them to the homepage. Since it seemed to work so well, we tried the same thing in the other campaigns to see if it increased conversions.
Here's the data from month #3.
These metrics look pretty good, though they're a bit misleading.
1. We nearly DOUBLED our conversions... sort of.
Our total conversions went from 13 to 25. That’s good!
Our non-branded conversions stayed at 13. That’s… the same.
Basically, our branded conversions came back. That’s great, but it’s not a result of our optimization efforts.
2. New landing pages, same performance.
Campaign #10’s performance in month #2 seems to be a small fluke. It’s still a good campaign – just not as good as initial statistics implied. Nothing about the change in landing pages indicates better or worse performance.
Month #4: Ad Scheduling
Ad scheduling shouldn't be your first optimization tactic. It can lead to huge improvements, but keyword optimization should come first.
The problem in this situation is that our search lost impression share due to budget doesn't give us much wiggle room. If we start shutting off high volume keywords, our total spend will drop quickly. Since we're targeting a small number of keywords with little room to spare, each keyword ultimately falls into one of three categories:
- Keywords that are converting
- Keywords that are driving a lot of traffic, but aren't converting
- Keywords that aren't driving enough traffic to accurately determine if they should be removed
For this reason, ad scheduling seemed like the next logical step.
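That three-bucket triage can be expressed as a simple rule. The 50-click significance threshold below is an arbitrary number I picked for illustration – pick your own based on your volume and conversion rates.

```python
def triage(keyword, clicks, conversions, min_clicks=50):
    """Sort a keyword into one of the three buckets described above.
    The 50-click threshold is an arbitrary illustration, not a rule."""
    if conversions > 0:
        return "converting"                    # leave it alone
    if clicks >= min_clicks:
        return "spending, not converting"      # candidate to pause
    return "not enough data"                   # keep it running

print(triage("widget supplier", clicks=120, conversions=3))  # → converting
print(triage("cheap widgets", clicks=200, conversions=0))    # → spending, not converting
print(triage("widget specs", clicks=12, conversions=0))      # → not enough data
```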
Day of the Week
To try and get enough data, I pulled numbers from the last three months. Here's what I found.
Google isn't resting on Sundays. In fact, they're hustling us harder than any other day. We've only seen one conversion on a Sunday in three months. It would be a good idea to stop running ads on Sundays.
Hour of the Day
Trends were also found by hour of the day.
Roughly 80% of our budget was spent between 8:00am and 8:00pm. This is when we saw nearly all of our conversions.
Between 8:00pm and 8:00am, 20% of our budget was spent for two total conversions.
I don't think it's an exaggeration to say we were losing money in our sleep.
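To put a number on "losing money in our sleep," compare cost per acquisition across the two windows. Only the 80/20 budget split and the two overnight conversions come from the data above – the total spend and daytime conversion count below are hypothetical.

```python
# Hypothetical totals; only the 80/20 budget split and the two
# overnight conversions come from the actual account data.
total_spend = 10_000.00
day_spend, night_spend = 0.8 * total_spend, 0.2 * total_spend
day_conversions, night_conversions = 22, 2

day_cpa = day_spend / day_conversions        # ≈ $363.64
night_cpa = night_spend / night_conversions  # = $1,000.00

print(f"8:00am-8:00pm CPA: ${day_cpa:,.2f}")
print(f"8:00pm-8:00am CPA: ${night_cpa:,.2f}")
# Overnight leads cost 2.75x as much – the case for an ad schedule.
```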
Once again these findings were presented to the client. The difference in CPA between peak times and poor times was so large that we chose to shut down ads between 8:00pm and 8:00am entirely. Since this change once again limited our targeting, we expanded our geotargeting to extend our reach.
You Won't BELIEVE What Happened!
Surprisingly, our average cost per click dropped. This isn't what you would expect after reallocating money to more competitive hours. It's very possible our competitors haven't looked at their ad scheduling data.
Beyond that, not much else changed.
1. Conversions changed slightly.
The total number of conversions increased because branded conversions increased. Non-branded conversions actually decreased by one.
Ultimately, the data set is too small to say much of anything.
2. Ad scheduling made no clear changes to conversions.
I believe our decision to change when the ads run was correct and is saving the client money. It didn’t increase non-branded conversions at this time, but it very well may in the future.
Summary and Moving Forward
This account is performing well and will only continue to perform better.
I'll be the first to point out that our approach was unorthodox, mostly due to the fact that we were trying to make statistically significant decisions on data sets not much larger than a survey from Cosmopolitan. Now that a decent amount of our impression share is limited by budget, we can start shutting down keywords that aren’t performing at an ideal level.
But that's agency life. As Tolkien said, you do your best for your clients with the data Google gives you.
Got Questions?
Hit up the comments ↓