Chapter 9

Understanding The Science Of Inbound Marketing

Consistently Running Experiments Is Key To Producing Results

One of the secrets to getting your inbound marketing to produce results is ongoing optimization. But that optimization needs structure to be productive. Making disconnected adjustments over time with no rhyme or reason is not going to produce results. You're going to be frustrated and your CEO is going to be frustrated. It's going to look like inbound marketing doesn't work, when in reality it's your execution that isn't working, not inbound itself.


Think Like An Inbound Marketing Scientist

The best way to mitigate ineffective optimization is to think like a scientist. Start by asking questions. Scientists ask a lot of questions. Why aren't we getting more visitors to our site from social media? Why isn't this landing page converting at a higher rate? Why are people not downloading this piece of content?

Next, do some research. What information is already available that might answer your question? What information might inform additional questions? It's possible that through your research, you'll find your answer. For example, when it comes to landing pages, one way to improve the conversion rate is to remove the general site navigation from the page. This might be something to test, and it might also be something you simply didn't know before. Gaining knowledge around your questions is a key part of being scientific.

Think like a scientist: Ask Questions, Research, Form Hypothesis, Experiment, Review Results, Repeat

Now you're ready to answer your questions with a hypothesis. Going back to our landing page example, your hypothesis might be, "If we remove the site navigation, our landing page conversion rate will go up by 10%." You might want to create a series of hypotheses. This collection of hypotheses is important because it's where your tests or experiments come into play.

It's now time to design your experiment. Sticking with our example, that might mean launching a new landing page without the navigation, or running an A/B test that serves both versions of the page. You'll run the experiment until you have enough data points to be confident in your findings. Then you'll know whether your hypothesis was correct.
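To make "enough data points" concrete, here's a minimal sketch in Python (standard library only) of a two-proportion z-test comparing the original page against the navigation-free version. The visit and conversion counts are hypothetical placeholders, not real benchmarks.

from math import sqrt, erf

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Return (relative lift, two-sided p-value) for version B vs. version A."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf; the p-value is the two-sided tail probability
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b / p_a - 1, p_value

# Version A keeps the site navigation; version B removes it (hypothetical numbers)
lift, p = two_proportion_z_test(conv_a=120, visits_a=2000, conv_b=150, visits_b=2000)
print(f"Observed lift: {lift:.0%}, p-value: {p:.3f}")

If the p-value is still high, you don't have enough data points yet; keep the test running rather than declaring a winner early.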

Make sure you don't change more than one variable per experiment. If you change the navigation and the headline, and conversions improve, you won't know if it was the navigation change or the headline change that led to the improvement.

Once you have a successful experiment, you might want to try and replicate the results by running a similar experiment on another page. If the results are in line with the first test, you're probably onto something worth sharing.


Create A Backlog Of Inbound Marketing Experiments

Inbound marketing is a little different from most scientific experiments in that your goal is to produce improved results, and quickly. Most scientists are trying to prove theories or ideas. Because you're on the clock, you need a slightly more business-oriented approach.

Create a collection or backlog of potential experiments. You have your tactics and the quantitative performance of each tactic. Go through those and look for areas that could be performing better.

Form a hypothesis as to why they're underperforming or not meeting expectations, and then create the backlog of experiments to confirm or uncover the underlying issues.

Now you can pull from that backlog based on priorities, timing and results. For example, if new client signings are down, you might want to look at the very last step in your sales process. What's preventing people from signing the paperwork? You could run a test with shorter contracts, change the name of the document from "contract" to "agreement," or remove all the legalese from the agreement and see if you can reduce the time from paperwork submitted to signed agreement from two weeks to two days.
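A simple way to keep that backlog honest is to write each experiment down with the hypothesis it tests and the result you expect. Here's a minimal sketch in Python using the paperwork tests above; the field names and wording are illustrative assumptions, not a prescribed format.

# Each backlog entry pairs an experiment with its hypothesis and the target outcome
# that would prove it (illustrative entries based on the sales-paperwork example above).
target = "Paperwork-to-signature time drops from two weeks to two days"

backlog = [
    {"experiment": "Offer a shorter contract",
     "hypothesis": "Less to review means faster signatures", "target": target},
    {"experiment": "Rename 'contract' to 'agreement'",
     "hypothesis": "Softer language reduces hesitation at the final step", "target": target},
    {"experiment": "Remove the legalese from the agreement",
     "hypothesis": "Plain language cuts legal review on the client side", "target": target},
]

for item in backlog:
    print(f"- {item['experiment']}: {item['hypothesis']} ({item['target']})")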

If you could crack that nut, you'd be a revenue superstar at your company, and you'd quickly become the most promotable associate on the block.


Work Through A Testing Methodology

Running experiments is a bit more complicated than perhaps we've alluded to so far. Let's dig into the process a little more so you have the details to run them yourself. We went into greater detail on this in Chapter 8, but here is a high-level overview of a testing methodology that might work for you.

Analyze

The ongoing optimization starts with the "analyze" phase.

Here, we're looking at data from almost every aspect of the program, including keyword data, website data, visitor source data, email marketing data, conversion data and content data.

Review

This helps us move to the "review" phase.

In this phase, we're comparing data and looking for patterns. We're huddling internally to discuss the data and see if there are any obvious answers.

Respond

Now we're ready for the "respond" phase.

This is where we're getting our tests together. We've created a series of experiments we want to deploy to help validate our hypothesis on how to improve results. We've prioritized the tests based on the ones that will have the biggest potential impact and take the least amount of effort.
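As a rough sketch of that prioritization, you can score each candidate test for impact and effort and sort by the ratio. The test names and 1-5 scores below are illustrative assumptions, not real data; the Python is only there to make the idea concrete.

# Score each candidate test 1-5 for expected impact and required effort,
# then run the highest impact-per-effort tests first (illustrative scores).
tests = [
    {"name": "Remove navigation from the landing page", "impact": 4, "effort": 1},
    {"name": "Rewrite the email subject lines", "impact": 3, "effort": 2},
    {"name": "Rebuild the pricing page", "impact": 5, "effort": 5},
]

for test in sorted(tests, key=lambda t: t["impact"] / t["effort"], reverse=True):
    print(f"{test['name']}: priority score {test['impact'] / test['effort']:.1f}")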

Act

Finally, we're on to the "act" phase.

The teams have the tests and they're deploying them. Some might get installed quickly, such as in a day or two. Others might require assets to be adjusted, created or modified, and it might take a week or so to get those tests up and running.

Then we start the cycle over again, analyzing the data, reviewing it to see how it played out, creating new responses and putting those into action.


Share Your Findings (Successes And Failures)

Once you've uncovered something, you should share it, regardless of whether the insights are positive or negative. Sometimes the biggest insights are the result of a failed experiment, so don't feel bad if most of your experiments don't produce positive results.

Learning what not to do is sometimes even more important than learning what to do.

One creative and fun way to share experiment results is to run monthly or quarterly science fairs across the company, its departments and cross-functional teams. This gives you a chance to talk to team members about your experiment, your hypothesis and the results you produced.

It also gives you a chance to get key knowledge out across the organization. If you're having challenges with landing page conversion rates, other people might be having similar challenges. By sharing your results, you could potentially help other people move their projects forward.


Test Everything

One of the major benefits of a scientific approach to sales and marketing is that it eliminates attitudes, opinions, assumptions and guesswork from an area that has relied on them to make decisions for years. Your CEO wants the website button to be blue because he thinks blue will work better. Great, let's test it. Your VP of marketing wants the landing page to have four form fields instead of three. Great, let's test it.

"Let's test it" should be the answer to almost anything.

In the final chapter, we'll look at how to set expectations for meeting and exceeding your goals around website visitors, site-wide conversion rate, lead generation and individual tactical performance areas like email open rates, landing page conversion rate and blog subscribers. You're almost done; just one more chapter to go and you'll be an inbound marketing master.

