About Christine Churchill

Christine Churchill, the President of KeyRelevance, is a pioneer in the field of Search Engine Marketing. She is a strong advocate for ethical search engine marketing, and was a member of the founding Board of Directors of the Search Engine Marketing Professional Organization (SEMPO).

Stretch Your PPC Budget By Optimizing Landing Pages

Marketing dollars are a little tighter this year for many of us, so learning new ways to stretch them is more important than ever. Improving your PPC landing pages so they convert better is one dynamite way to improve your ROI.

With that in mind, here are a few tips for making your landing pages more effective. The goal here is to make your landing pages more persuasive, focused, and complete, and to provide the necessary testing feedback to measure the success of your efforts.

1. Make your landing page mobile friendly. Mobile searches now surpass desktop searches for many industries and demographics. If you are spending money to send people to a page, the page needs to render well on whatever device they are using.

2. Include a call to action and place it ABOVE the fold. Your landing page can be long or short (you’ll need to test the page to know which works best for you), but always include a call to action above the fold.

3. Make the call to action look like a button and make it larger and brighter than you think you need. Text links have their place but they don’t draw the eye as much as a brightly colored button call to action.

4. It is still benefits, not features. Those of us in marketing hear that expression all the time, yet it is amazing how many pages are all about features. Don’t tell me the statistics on a product; tell me how my life will be improved if I buy it. Remember, it’s emotions that really motivate us to buy. We like to have the logic to explain to our friends why we purchased something, but most of us really buy a particular item because it fulfills an emotional need.

Focus. You know the keywords and ad text that led the customer to click on your ad. Now make sure you maintain that focus and lead visitors to the next step in the purchase process on your site. Maintaining continuity by keeping the landing page tightly focused will help keep those visitors on track.

5. Continuity between the ad and the landing page is a must. Inconsistent messaging can confuse a visitor. If they click on an ad with a certain value proposition, the landing page should reinforce the appeal mentioned in the ad copy.

6. Remove unnecessary noise and clutter from the page. Too many bright graphics and excessive bold text distract visitors. Links to other offers can derail customers from their purchase mission. When presented with too many options, visitors often get confused and leave a site, or, if distracted, they may forget why they came to your site in the first place.

7. Ensure your customer contact loop is working. Can the potential customer coming to your page contact you easily? Is the submission form working? How reliable is the VOIP number you are using? If a phone call is the preferred method for customers to contact you, is the phone number present on the page and easy to find? If you are using a phone number on the landing page, do you have a method to track offline conversions? If not, you have a big hole in your analytics.

8. Include trust factors on the landing page. Specialized badges, Better Business Bureau and Chamber of Commerce memberships, testimonials, and a professional-looking web site all convey to visitors that your site is trustworthy. Use trust factors on the page whenever possible.

9. Keep your SSL certificates up to date. Nothing increases the bounce rate on your landing page like serving potential customers an expired SSL certificate notice. Secure Sockets Layer (SSL) certificates are used by ecommerce sites to encrypt sensitive information during online transactions. If the certificate has expired, the visitor is served a frightening warning. Most visitors don’t know what the notice means and leave your page feeling very uncomfortable about doing business with you.
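If you want to catch an expiring certificate before your visitors do, a quick scripted check works. Here is a minimal sketch in Python; the host name is a placeholder, so substitute your own landing page domain:

import socket
import ssl
from datetime import datetime, timezone

host = "www.yourdomainname.com"  # placeholder; use your landing page's domain

# Open a TLS connection and read the certificate the server presents.
# Note: if the certificate is already expired, the handshake itself fails
# with a verification error, which is also a signal that something is wrong.
context = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()

expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
days_left = (expires - datetime.now(timezone.utc)).days
print(f"{host}: certificate expires {expires:%Y-%m-%d} ({days_left} days from now)")

Run it on a schedule and you get advance warning instead of a frightened visitor.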

10. Test your landing page in different browsers. Designers often build landing pages in a hurry without the extensive testing program normally included in a site redesign. This inattention to detail can cause browser compatibility issues to sneak into your landing pages. For optimum user experience, view the landing page in Edge, Chrome, Firefox, Opera, and Safari. A free tool that lets you view how pages render on different browsers is located at BrowserShots.org.

Testing pages under different browsers is an extra step and takes time, but poorly rendering pages make a bad impression and can cost you sales.

11. Make sure analytics is installed correctly and capturing true performance. Analytics is only as good as the data you feed it. If you forget the analytics JavaScript on a landing page, or include it but introduce a typo while dropping it onto the page, the story your analytics tells may be inaccurate. We have seen both happen on big brand sites on more than one occasion. Test that your analytics is capturing conversions correctly, or you may base your marketing decisions on bad information.
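A simple automated spot check can catch a missing or mistyped tag before it skews your data. Below is a rough sketch in Python; the URLs and the tracking ID are placeholders, and a plain string search like this only confirms the ID is present on the page, not that every event fires correctly:

import urllib.request

# Placeholders: substitute your real landing page URLs and your analytics ID.
landing_pages = [
    "http://www.yourdomainname.com/landing-page-a",
    "http://www.yourdomainname.com/landing-page-b",
]
tracking_id = "UA-XXXXXXX-1"

for url in landing_pages:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    status = "tag found" if tracking_id in html else "TAG MISSING OR MISTYPED"
    print(url, "->", status)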

12. Conduct a usability test on the landing page with a live person from the target audience. Many times we put all the best practices to work and find that pages still aren’t converting like we expect. One of the simplest ways to figure out what is going wrong is to pick someone from the target demographic and have them sit down with you and do an old-fashioned usability test.

Give the usability tester a scenario and a mission to buy a product from your website. Tell them to start with the ad and then arrive on the landing page. Have them talk out loud about how they perceive the page.

You can’t have an ego when you’re doing usability testing. Prepare for brutal criticism, because you may find the copy you thought was so compelling is considered drivel by the tester. Or the tester may sit staring at your page, lost about where to go next, because the link you thought was so obvious is invisible to them. Usability testing reveals problems that your analytics may be hinting at but can’t definitively tell you about.

13. Use multivariate testing to test options on the page. You are never done fine-tuning your landing pages. As you put out new pages, apply what worked on earlier pages, but continue to try new things too. In this competitive market, getting even one percentage point better can make the difference between success and failure. Google Website Optimizer is a free tool that lets webmasters perform multivariate testing; software that did this used to be very expensive, but Google provides the tool at no cost and has made it straightforward to use.
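Whatever testing tool you use, the arithmetic behind “is variation B really better?” is straightforward. Here is a back-of-the-envelope sketch with made-up visitor and conversion counts; a real testing tool will handle this, and full multivariate combinations, for you:

from math import sqrt

# Made-up illustration numbers: original page (A) vs. test variation (B).
visitors_a, conversions_a = 1200, 60
visitors_b, conversions_b = 1180, 82

rate_a = conversions_a / visitors_a
rate_b = conversions_b / visitors_b

# Two-proportion z-test: is the difference bigger than chance would explain?
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
std_err = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_b - rate_a) / std_err

print(f"A converts at {rate_a:.1%}, B converts at {rate_b:.1%}, z = {z:.2f}")
print("Difference looks real at ~95% confidence" if abs(z) > 1.96 else "Keep testing; not enough data yet")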

If you put all these tips and techniques into play, I am confident you will improve the overall effectiveness and conversion rates of your landing pages and make your paid search campaigns more profitable.

If you need help with your PPC marketing and landing page optimization, feel free to give KeyRelevance a call at 972-429-1222.

Getting Query Data from Google Webmaster Tools to Correlate with SEO Query Data in Google Analytics

As most marketers are painfully aware, in October 2011 Google stopped providing keyword referral data in Google Analytics for searchers who are logged into Google or who use secure search (which Firefox has more recently adopted as the default for Google searches).

As a result of losing this valuable keyword data source, online marketers have become very creative about finding ways to recapture some detail about the search behavior of these visitors. One approach has been to extract more information from the Google Webmaster Tools (GWT) data, which so far isn’t affected by the “not provided” restriction.

The query data in GWT is limited. The reports provide impressions and click information, but no conversion or other site engagement data. As a result, the data is good for keyword research and to find terms that are relevant to your site, but you’ll likely still have to test the terms in PPC to gain actual conversion information about the terms.

If you have your Google Analytics and Google Webmaster Tools accounts linked, you can view the GWT query information directly within Analytics. Google has a section called Search Engine Optimization that displays data sourced directly from GWT.

Not a Discrepancy, a Filtering Inconsistency

If you have looked at the SEO data in Analytics and then logged into GWT, you have probably noticed that the queries don’t correlate well. The lists differ because the default filtering differs between the two reports.

In Google Webmaster Tools, the queries are filtered by default to show search query impression and click data from Web searches only. In the query report, note the Filters button; the default shown to the right of it is Web.

If you click on Filters you can choose between Web, Mobile, Images, Video, or All. GWT lists queries filtered for Web by default.

If you then go over to Google Analytics, you will notice that the SEO queries are not filtered; you essentially get the All option. If you want to compare against the default list in GWT, you will need to tell Analytics to include “Google Property” and filter for Web.

Then, assuming you have the same date range in both Google Webmaster Tools and Google Analytics and have the data sorted by the same metric, the lists should correlate. Note that reporting latency can keep the data from matching if you select a date range that is too recent.

Once I realized the filtering was different, the query data made a lot more sense. I hope this helps a few readers who, like me, were puzzled by the lack of consistency in the query data.

Understanding Robots.txt

Robots.txt Basics

One of the most overlooked items related to your web site is a small, unassuming text file called the robots.txt file. This simple text file has the important job of telling web crawlers (including search engine spiders) which files they may access on your site.

Also known as “A Standard for Robot Exclusion”, the robots.txt file gives the site owner the ability to request that spiders not access certain areas of the site. The problem arises when webmasters accidentally block more than they intend.

At least once a year I get a call from a frantic site owner telling me their site has been penalized and is now out of Google, when often it turns out they have blocked Google themselves via their robots.txt.

An advantage of being a long-time search marketer is that experience teaches you where to look when sites go awry. Interestingly, people always look for a complex explanation for an issue when, more often than not, it is a simpler, more basic problem.

It’s a situation not unlike the printing press company hiring the guy who knew which screw to turn. Eliminate the simple things that could be causing the problem before you jump to the complex. With this in mind, one of the first things I always check when I am told a site is having a penalty or crawling issues is the robots.txt file.

Accidental Blockage by Way of Robots.txt
This is often a self-inflicted wound that makes many webmasters want to pound their heads into their desks when they discover the error. Sadly, it happens to companies small and large, including publicly traded businesses with dedicated staffs of IT experts.

There are numerous ways to accidentally alter your robots.txt file. Most often it occurs after a site update when the IT department, designer, or webmaster rolls up files from a staging server to a live server. In these instances, the robots.txt file from the staging server is accidentally included in the upload. (A staging server is a separate server where new or revised web pages are tested prior to uploading to the live server. This server is generally excluded from search engine indexing on purpose to avoid duplicate content issues.)

If your robots.txt excludes your entire site, that won’t force pages already in the index to be removed, but it will block polite spiders from following links to those pages and prevent them from parsing the pages’ content. (Blocked pages may still appear in the index if they are linked to from other places.) You may think you did something wrong that got your site penalized or banned, when it’s actually your robots.txt file telling the engines to go away.

How to Check Your Robots.txt
How do you tell what’s in your robots.txt file? The easiest way to view your robots.txt is to go to a browser and type your domain name followed by a slash then “robots.txt.” It will look something like this in the address bar:

http://www.yourdomainname.com/robots.txt

If you get a 404 error page, don’t panic. The robots.txt file is actually optional; most engines recommend it, but none require it.

You can also log into your Google Webmaster Tools account and Google will tell you which URLs are being restricted from indexing.

You have a problem if your robots.txt file says:
User-agent: *
Disallow: /

A robots.txt file that contains the text above is excluding ALL robots – including search engine robots – from indexing the ENTIRE site. Unless you are working on a staging server, you don’t normally want to see this on a site live on the web.

How to Keep Areas of your Site From Being Indexed
There may be certain sections you don’t want indexed by the engines (such as an advertising section or your log files). Fortunately, you can selectively disallow them. A robots.txt that disallows the ads and logs directories would be written like this:
User-agent: *
Disallow: /ads
Disallow: /logs

The disallow statements shown above only keep robots away from the paths listed. Note that the protocol is pretty simplistic: it does a plain text comparison of the URL path against each Disallow: string, and if the front of the path matches the text on a Disallow: line (a “head” match), then the URL is not fetched or parsed by the spider.
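If you want to see exactly what a set of rules blocks before you push it live, Python’s standard library includes a robots.txt parser you can test against. A minimal sketch, using the example rules above and illustrative URLs:

from urllib.robotparser import RobotFileParser

# Illustrative rules matching the example above.
rules = """User-agent: *
Disallow: /ads
Disallow: /logs
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for path in ["/index.html", "/ads/banner1.html", "/ads-archive/page.html", "/logs/2013.txt"]:
    allowed = parser.can_fetch("*", "http://www.yourdomainname.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")

# Because matching is a head match, /ads-archive/page.html is blocked too:
# its path begins with the text "/ads".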

Many errors are introduced because webmasters think the robots.txt format is smarter than it really is. For example, the basic version of the Protocol does NOT allow:

  • Wildcards in the Disallow: line
  • “Allow:” lines

Google has expanded on the original format to allow both of these options, but the expansions are not universally supported, so it is recommended that they ONLY be used in a “User-agent:” record aimed at a crawler run by Google (e.g. Googlebot, Googlebot-Image, Mediapartners-Google, AdsBot-Google).
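For example, a record aimed only at Googlebot using these expansions might look like the following (the paths are hypothetical, and other crawlers may not honor the wildcard or the Allow: line):

User-agent: Googlebot
Allow: /ads/public/
Disallow: /ads/
Disallow: /*.pdf$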

Does the robots.txt Restrict People From Your Content?
No, it only requests that spiders refrain from crawling the pages and parsing the content for their indexes. Some webmasters falsely think that disallowing a directory in the robots.txt file protects the area from prying eyes. The robots.txt file only tells robots what to do, not people (and the standard is voluntary, so only “polite” robots follow it). If certain files are confidential and you don’t want them seen by other people or competitors, they should be password protected.

Note that the robots exclusion standard is a “please don’t crawl and parse this page’s content” standard. If you want content removed from the index, you need to include a robots noindex meta tag (<meta name="robots" content="noindex">) on each page you want removed from the index.

Check robots.txt First
The good news is that if you have accidentally blocked your own site, the solution is easy now that you know to look at your robots.txt file first. Little things matter online. To learn more about the robots.txt file, see http://www.robotstxt.org.

SMX East 2008 – Great show, great people, great content

Great conferences don’t happen by accident. That said, the recent SMX East show rates as fabulous. Danny Sullivan, Chris Sherman and the rest of the ThirdDoorMedia folks did an incredible job of putting together a first class show. How they do it is an art form. First, they entice the best speakers in the industry to come and openly share their knowledge. (Looking over the list of speakers, I feel privileged and humbled that I’m even allowed to participate.) Then Danny and Chris develop a killer agenda that has broad audience appeal yet is balanced enough to offer something for everyone from the novice marketer to the advanced expert. Throw in sponsors and exhibitors to help finance the show and provide the attendees cool stuff like wireless connections (thanks Rand), tee shirts, and light-up promotional items that max out the geek meter. Lastly, you need a hard working staff to run the lights, music, microphones, registration, and all the other behind-the-scenes things that make the show the A+ event it was. Great job to all of you.
The week leading up to the show was tough on me personally. Employee and friend Li Evans unexpectedly lost her father. Another employee had to be rushed to the Emergency Room and spent the week in the hospital undergoing breathing treatments. If that wasn’t enough trauma, during the week, a close family friend succumbed to cancer after a long arduous struggle. While that death wasn’t a total surprise, I found myself emotionally drained. The world felt a little smaller and colder.
Arriving in NYC after such a week meant I really wasn’t in the mood for parties. I was craving quieter, smaller exchanges with close friends. One positive thing about conferences is that they bring old friends together. Conference friends have a special place. They may not physically live near us, but because we share time and adventures in locales far from home, without our usual support networks, a special bonding and closeness occurs. I have conference friends who are like extended family to me. We take turns looking out for each other and we’ve cried on each other’s shoulders on more than one occasion.
This trip my dear friend Scottie Claiborne popped up to NYC to visit our group of friends and stayed with me. A few years ago Scottie had withdrawn from the conference limelight to focus more on kids and a balanced life. Within a few minutes of seeing Scottie my spirits were brighter. Scottie has that effect on me and most people she comes in contact with. It was great catching up with her.
One night during the conference a group of friends assembled in the hotel bar to celebrate Debra Mastaler’s birthday. It was comforting to be in the midst of friends and I was genuinely happy to see them. Debra is a popular lady in search and a dear long-time friend. Some of the many friends who stopped by to wish her well were Jill Whalen, Scottie Claiborne, Mike Grehan, Brad Neelan, Mona Eiesseily, Andrew Goodman, Stacy Williams, Li Evans, Kim Krause Berg and her charming husband Eric, Kevin Newcomb, Simon Heseltime, and many others.
I sat in on a number of sessions at the conference and was delighted with the content. It would be hard to choose which was my favorite this conference, so many were excellent. If I was forced to pick just one, I would have to say I enjoyed Gregory Markel’s presentation on video search engine optimization the best. I’ve known Greg a long time and consider him a friend. I have also learned over the years that embedded in his enthusiastic presentations are really great marketing jewels. You can tell he loves what he does and Greg is very willing to share his knowledge. If you missed his session at SMX, watch for him at another show. I’ve been in the search business for ten plus years, and I walked out of the session with a few new tricks. Thanks Greg.
That leads me to another topic. The search industry moves too fast to rest on your laurels. You have to actively grow and learn new skills… constantly. If you stand still, the industry will pass you by. One of the easiest ways to stay up to date on changes in the industry is to attend conferences. Books in our industry are outdated before they are printed. Attending conferences gives you more current information and is one of the best professional development investments you can make. Sure, it costs money to attend, but if you pick up a couple of nuggets of new information and network with folks who can help you do your job better, it’s worth every cent.
My next conference, SMX London, is another month away. I’m already looking forward to it. Each conference has its own flavor and the London show is a great place to learn about all things search, but especially learn about international marketing techniques.
I’ll be speaking on two panels in London. Dear friend Tor Crockett (who is not only drop dead beautiful, but is a first class marketer) and I will be paired up in a Keyword Research Bootcamp. I’ve spoken on panels with Tor many times and it’s always thrilling to share the podium with someone as knowledgeable and fun as Tor. There is good chemistry between us. Keeping us in line (or trying to) will be moderator and conference co-chair Chris Sherman. Good luck Chris, we outnumber you. 
My other speaking session at SMX London is the Paid Search Checkup panel. Paid search wizard Mel Carson and I will interactively review paid search campaigns and provide constructive advice to improve them. Live clinics are my favorite type of session because you never know what will be thrown at you. They are also where you, as an advertiser, can get free advice from experienced marketers. Even if you are already an expert marketer, it’s nice to get a second opinion when you’re looking for new marketing ideas. The cross-fertilization of tips and experience in the clinic makes for a rich exchange where everyone benefits.
Well, I’ve managed to ramble on a number of topics and even cross the globe in a very short time. You have things to do, so I’ll close by saying I hope to see you at a conference soon. And please, do come up and say hello if you attend. I’m very approachable, human, and always open to make a new friend.

Looking Back At SES Chicago 2009

Keyword Research Session

Despite the cold and the busy holiday season, search enthusiasts gathered in Chicago to attend the Search Engine Strategies conference. This year, I had the honor of giving a solo presentation on keyword research. As long as there are search boxes requiring text queries, keywords will play an important role in being found on the web. Keyword research is a fundamental skill that all successful online marketers must master.

In the keyword session I discussed techniques for finding and evaluating keywords. I also covered a number of the keyword tools available to simplify, organize, and manage keyword research.

One of the main benefits of using keyword tools is that they give marketers insight into the popularity of a keyword phrase, which is another way of saying that they give you insight into the traffic potential of the phrase. Higher popularity in a keyword means there is an opportunity for more visits, but it is often associated with more competition.

Byron Gordon, SEO-PR, talked with me after the keyword session about how to conduct successful keyword research. You can watch our discussion in the video below.

Avoiding Keyword Mistakes

Keyword selection is both an art and a science. One of the common mistakes I see people making with a keyword tool is dumping a keyword list directly from the tool into their online marketing campaigns. The tools are helpful, but for best results there still needs to be a human in the loop reviewing the list for irrelevant or inappropriate words. You need to review your keyword list with several criteria in mind, including relevancy, competitiveness, user intent, popularity, and performance.

While there are several tools on the market (both paid and free) that can assist in developing a list of candidate keywords, it is still crucial that you employ your brain to filter the keywords for maximum effectiveness. Otherwise, the list you develop, while extensive, will lack the necessary focus.
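That first human pass can be partly scripted. Here is a rough sketch of one way to pre-filter a raw keyword dump before reviewing it by hand; the keyword list and filter terms are made-up examples:

# Made-up example data: a raw dump from a keyword tool.
raw_keywords = [
    "red widgets", "buy red widgets online", "free widgets",
    "widget repair jobs", "wholesale red widgets",
]

must_contain = ["widget"]          # basic relevancy check
negative_terms = ["free", "jobs"]  # intent you don't want to pay for

candidates = [
    kw for kw in raw_keywords
    if all(term in kw for term in must_contain)
    and not any(neg in kw for neg in negative_terms)
]
print(candidates)  # ['red widgets', 'buy red widgets online', 'wholesale red widgets']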

There are a number of excellent Keyword tools available to online marketers. Some of the more popular tools available to webmasters include

There are also tools that help you gain insights into your competitors’ keywords. A few of my favorite tools for competitive intelligence are

I’m obviously only scratching the surface of the tools available. In the session I talked about many more and gave demonstrations of some of the tools in use. The important thing is that tools can help you make better keyword decisions and give you a perspective beyond your own analytics. Not every tool “fits” every keyword researcher. Try several of these tools (sometimes in combination) until you find a tailored tool suite that works with the way you think and work.

Keyword Focus

In the keyword session I also talked briefly about doing keyword research for different types of online marketing. For example, if you’re doing keyword research for PPC, you have the luxury of going wide with your keyword list (budget permitting, of course) and targeting more keywords than you would for SEO. SEO requires you to laser-target your keywords, so you really have to cull the list down to a small number of keyword phrases.

SEO versus PPC Session

I was honored to be included in the session entitled “SEO vs. PPC: The Ultimate Battle.” The panel was a mock debate over which marketing technique is best, SEO or PPC. Representing SEO were the always-a-class-act Rand Fishkin, my favorite SEO bad boy David Naylor, and the provocative and insightful SEO rockstar Michael Gray. Representing PPC were myself (Christine Churchill); Karen Weber, VP of E-Marketing at Irwin Union Bank; and our moderator Brian Lewis, VP at Engine Ready.

I need to emphasize that this was a MOCK DEBATE, because it became clear during the session that many in the audience thought we were serious in our debate and that we were actually recommending one form of marketing over the other. In practice, my company KeyRelevance does about a 50/50 mix of SEO and PPC, and the synergy between the two often leads to us doing both for a given client.

To be perfectly clear, one is NOT better than the other. The goal of the session was to highlight the merits and differences of the two techniques and to stimulate thinking about when and where to use each technique. On the panel in our mock debate, panelists were tasked with defending one side or the other. In real life we believe SEO and PPC are complementary, not adversarial forms of marketing. It’s not an either/or decision… both techniques should be in your marketing arsenal. There may be circumstances when one might be more appropriate (like PPC being helpful with a new site or one needing immediate traffic), but many sites would benefit from both methods.

PPC Site Clinic

My final session was a PPC site clinic with Melissa Mackey, the Search Marketing Manager from Fluency Media, and Ayat Shukairy, the co-founder of Invesp Consulting. It was a real pleasure sharing the stage with such accomplished professionals.

Clinics are a chance for companies to get a free review by experts, so they are always a popular event, and many attendees benefit from hearing the points raised about the sites reviewed. If you were to hire an expert to review your ads, it could cost thousands of dollars; if your site is chosen for review in the clinic, you receive valuable, actionable advice for free as part of attending the conference. That alone might be a good argument for convincing your boss that you should attend. It’s a bargain you can’t pass up.

Why should you attend a Search Conference?

Search engine conferences are expensive any way you measure it (travel expenses, time away from work, admission fees, etc.), so you really have to weigh the costs and benefits. Our industry is unique in the volume of change it sees. Reading blogs, articles, and books and watching SEO videos are all good ways of learning about SEO/SEM, but attending a search conference takes your professional development to a deeper level. Attending a conference is like drinking from the search knowledge fire hose: so much information is shared in such a short time that you can’t help but come away with several nuggets of valuable insight you can implement immediately and reap benefits worth many times the cost of the trip.