Unlike direct response marketing of yesteryear where tens of thousands of pieces were printed and mailed at great expense, testing variables in electronic marketing is easy and inexpensive. There is no excuse not to test a variety of elements within your campaign. And, there is no excuse not to keep testing even when you find a winning combination.
In traditional direct response marketing, the highest-pulling mail package was used as the control, and new concepts were tested against it. The tradition still lives, but Internet marketing levels the playing field: even small businesses can take advantage of variable testing.
Here are just a few ideas to get your creative juices flowing about what to test. To zero in on which version of a variable is most effective, limit each test to one element. Otherwise, you won't know which variable actually made the difference.
Keep your tests simple so interpretation of the results is obvious. The more complex the testing, the harder it is to figure out what you’ve actually learned.
Email subject lines
After you brainstorm possible subject lines that will entice readers to open your email, pick your best two. Send one subject line to half your list and the other subject line to the other half. Or, split your list based on the characteristics of those on it—for instance, names you purchased versus names you acquired organically.
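The split itself is easy to automate. Here is a minimal Python sketch of a random fifty-fifty split; the function name and sample addresses are my own illustration, not from any particular email platform:

```python
import random

def split_list(subscribers, seed=42):
    """Randomly split a subscriber list into two halves for an A/B test.

    A fixed seed makes the split reproducible; shuffling first avoids
    accidentally grouping readers by signup date or source.
    """
    shuffled = subscribers[:]            # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_a, group_b = split_list(
    ["ann@example.com", "bob@example.com", "cal@example.com", "dee@example.com"]
)
```

Each group then receives one subject line, and everything else about the email stays identical.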
Subject line variables to test include the benefit, the offer, subject line length, sense of urgency, and whether there is a curiosity factor so readers are intrigued to read more.
Email and newsletter formats
Within your format, you can test HTML against plain text, colors, design, the ratio of photos to text, tone of the copy, where key copy points appear…the list is as long as your creativity about what to test.
Number of links
Recently, I tested two event announcements for a client who knew virtually nothing about who was on her list (or how they got there). In the long version with several links, the click-through rate was slightly higher, though fewer people clicked on the actual registration link. In the shorter version, which offered only a registration link, the open rate was slightly lower; however, readers were less distracted and took only the action we offered.
The lesson: though the list had been used numerous times, the audience was clicking on the more informational links in the long version. To me, this meant previous emails were not as informative or useful as the client had hoped. Since the long version also stated the price, we determined the course was priced too high for those on the list. The short version confirmed the point: it offered only the registration link without the price, and once readers clicked through and saw the price, they bounced off the site. We gathered a lot of information just by virtue of the links included in each version.
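When you want to know whether a difference like this is real or just noise, a simple two-proportion comparison helps. This Python sketch uses hypothetical numbers (the send counts and click counts below are made up for illustration, not the client's actual results):

```python
import math

def compare_ctr(clicks_a, sent_a, clicks_b, sent_b):
    """Compare the click-through rates of two email variants using a
    two-proportion z-test (normal approximation).

    Returns each variant's CTR and the z statistic; as a rough rule of
    thumb, |z| above about 2 suggests the difference is not just noise.
    """
    rate_a = clicks_a / sent_a
    rate_b = clicks_b / sent_b
    pooled = (clicks_a + clicks_b) / (sent_a + sent_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return rate_a, rate_b, (rate_a - rate_b) / std_err

# Hypothetical: long version vs. short version, 2,000 sends each
p_long, p_short, z = compare_ctr(120, 2000, 90, 2000)
```

Even a rough check like this keeps you from declaring a winner on a handful of clicks.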
Long copy vs. short copy
Depending on your product or service, determine what works better for your audience—long or short copy. Take it one step further and test when long and short copy each work best. For instance, some products may require long copy at the beginning of the sales cycle and benefit from shorter copy further into the cycle. Single-product emails tend to do better with short copy and a photo or two.
You can also vary copy length if you use a combination of email and landing page. The email provides enough for the reader to want more, which is on the landing page.
Offers and pricing
Try testing free offers, different price points, percentage versus actual dollar savings (10% vs. $200), different payment options and guarantees.
Timing
Send your emails on different days of the week and times of day. Again, split your list and see how timing changes the response rate.
Frequency
Assuming you have high-quality content or great offers to share with readers, test how many emails you can send each month or week to increase response rate and sales. Just because your newsletter is monthly doesn't mean you can't reach out to your email list with special offers throughout the month.
Once you develop a reliable control, use what works best for the majority of your list and continue to test new variables (one at a time) on 10 to 20 percent of your list.
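In practice that means holding out a small random slice for each new test while everyone else gets the proven control. A minimal Python sketch of that holdout split (the function name and addresses are my own illustration):

```python
import random

def control_and_test(subscribers, test_fraction=0.1, seed=7):
    """Hold out a slice of the list (10-20 percent) for testing one new
    variable; the rest of the list receives the proven control."""
    shuffled = subscribers[:]                     # copy; leave the original untouched
    random.Random(seed).shuffle(shuffled)
    cut = max(1, int(len(shuffled) * test_fraction))
    return shuffled[cut:], shuffled[:cut]         # (control group, test group)

control_group, test_group = control_and_test(
    [f"reader{i}@example.com" for i in range(100)]
)
```

If the new variable beats the control on the test slice, it becomes the new control and the cycle repeats.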
Monitor opens, click-throughs, forwards, opt-outs, and the number of times a particular reader comes back to click on links.
What tests have you conducted and what did you find out? Send me an email at email@example.com and let me know what worked for you.