You have very likely heard it before - that "A/B Testing" really means "Always Be Testing."
While that is not literally true, it is a memorable mnemonic that reminds businesses that testing is a fundamental building block of digital marketing. That is especially true in paid search, where it is exceptionally easy to churn through money with little to no benefit.
However, I still frequently hear from businesses who let their Google Ads campaigns run on autopilot - essentially trusting their initial setup to perform adequately without daily, weekly, or monthly improvements. The fundamental flaw of that approach is its complete lack of A/B split testing - a crucial component for elevating Google Ads performance from satisfactory to elite.
There are countless ways to A/B split test in Google Ads, and every business should implement them - but the following are critically important.
You can read about landing page best practices in my previous blog post, but even the most attractive and optimized Google Ads landing pages should undergo rigorous testing. Businesses can drastically improve their conversion rates simply by altering or relocating a CTA button, changing headlines, experimenting with new copy, etc.
My recommendation for landing page testing is to run only one experiment at a time. If you turn too many knobs at once, you never actually know which knob "opened the door," so to speak, to higher conversion rates.
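Before declaring a landing page variant the winner, it is worth checking that the difference in conversion rate is not just noise. Here is a minimal sketch of a two-proportion z-test in plain Python - the function name and the traffic numbers are hypothetical, and in practice you would pull real figures from your analytics:

```python
import math

def two_proportion_z(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled conversion rate across both variants.
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    # Standard error of the difference under the null hypothesis.
    se = math.sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    return (p_b - p_a) / se

# Hypothetical test: variant A converts 40/1000 visits, variant B 62/1000.
z = two_proportion_z(40, 1000, 62, 1000)
print(z)  # a z beyond ±1.96 is significant at roughly the 95% level
```

A z-score past ±1.96 suggests the new variant genuinely outperformed the control rather than getting lucky with a small sample.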
Testing new ad copy is the simplest and most common practice among Google Ads A/B split testers. The typical approach is to run a series of ads within a single ad group, pause the ads with the lowest click-through rates, then create new ads using the "winning" ad as a base (with slight modifications).
This process works well; however, I have seen countless businesses (as well as professional marketers) judge the winning and losing ads on click-through rate alone, which is misleading. When judging an ad, it is absolutely crucial to consider the average positions of the ads being compared. Average position affects CTR: an ad with an average position of 1.0 will almost always have a higher CTR than an identical ad with an average position of 2.5. To judge winners and losers fairly, you must account for this caveat.
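One simple way to respect that caveat is to compare CTRs only when two ads served at similar average positions. The sketch below illustrates the idea - the function, the 0.3-position threshold, and the ad stats are all hypothetical, not part of any Google Ads tool:

```python
def ctr_winner(ad_a, ad_b, max_pos_gap=0.3):
    """Pick a CTR 'winner' only when the two ads served at similar
    average positions; otherwise the comparison is apples-to-oranges."""
    if abs(ad_a["avg_pos"] - ad_b["avg_pos"]) > max_pos_gap:
        return None  # positions too far apart to compare CTRs fairly
    ctr_a = ad_a["clicks"] / ad_a["impressions"]
    ctr_b = ad_b["clicks"] / ad_b["impressions"]
    return ad_a["name"] if ctr_a >= ctr_b else ad_b["name"]

# Hypothetical stats pulled from a Google Ads report.
ad_1 = {"name": "Ad 1", "clicks": 120, "impressions": 2000, "avg_pos": 1.1}
ad_2 = {"name": "Ad 2", "clicks": 95, "impressions": 2000, "avg_pos": 2.5}
print(ctr_winner(ad_1, ad_2))  # prints None: the position gap makes raw CTRs incomparable
```

Here Ad 1's higher CTR proves nothing on its own, because it also enjoyed a much higher average position.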
Campaign settings, especially advanced settings, are easily overshadowed by the dozens of in-your-face KPIs that bombard you upon loading Google Ads. However, A/B testing your campaign settings can yield drastically different (and often better) results. I recommend testing the following:
- Turning "Search Partners" on or off
- Adjusting mobile bids
- Adjusting geographic bids
- Trying new bid strategy types altogether
- Changing ad delivery to "accelerated"
- Changing the "location > target" options to "people in my targeted location" and "people in my excluded location," as opposed to people interested in or viewing pages about those locations (the default)
One important thing to remember: there is no such thing as a bad test. For every mistake made, a lesson is learned, and the likelihood of wasting money in the future dwindles.
Now get out there and start testing!