New Year’s SEO Resolution: Update Your Copyright Statement Dates

In the last few days, I’ve reviewed a few different large websites which have utterly neglected to update their copyright statement dates to reflect the current year.

Copyright statement dates are something I increasingly check on websites I audit for search engine optimization purposes, for a few different reasons.

Copyright as an SEO Ranking Factor

First of all, it’s now established that Google gives special treatment to content dates found on webpages. I’ve written before on the subject of whether dates on pages might be used as a search engine ranking factor. As I wrote previously, Google has been parsing date information out of pages for some time, and they often include these dates in the snippet shown below a page’s listing in search results. They’ve stated that their usability testing established that, for many types of content, consumers would like to see the date. I’ve argued that it could be a ranking factor, but whether it is or isn’t is almost secondary to the positive effect that dates likely have on clickthrough behavior.

One type of date that Google typically does not display in search snippets is the date included with the copyright statement found in most corporations’ webpage footers. However, it’s my belief that Google is likely paying just as much attention to this page element as to content update dates, although for slightly different reasons. Read on and I’ll elaborate. Continue reading
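As a practical aside, the simplest safeguard is to generate the copyright year from your page templates rather than hard-coding it. Here’s a minimal sketch of such a footer, with a hypothetical company name and year range; generating the ending year server-side keeps the date current in the HTML source that crawlers fetch, since a year written in by client-side JavaScript may not be visible to a spider:

    <!-- Hypothetical footer markup; generate the ending year server-side
         (e.g., in your template engine) so the date in the HTML source
         always reflects the current year. -->
    <div id="footer">
      <p class="copyright">Copyright &copy; 2005&ndash;2011 Example Widgets, Inc.
        All rights reserved.</p>
    </div>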

Google Launches hProduct Microformat Support In Time For The Holidays

Google Present for the Holidays

Yesterday, just in time for the holiday shopping season, Google announced that they now support Rich Snippets for shopping sites. What this means is that in certain cases they will call out particular data items from online catalog sites and display them with special formatting in the search results.

This is a particular boon to internet retail sites savvy enough to format their data properly for Google to recognize — a Rich Snippet graphic treatment can be eye-catching, allowing a search result to stand out from the crowd a bit, and this attention-getting display apparently results in a significantly greater click-through rate (if not a higher conversion rate as well).

The new shopping Rich Snippet allows e-commerce sites to display information such as price, availability and product reviews in their search engine results page listings. For instance, here’s how a snippet for Buzzillions appears for a Cabela’s jacket:

Cabelas Jacket - hProduct Rich Snippet

Google is providing a few different methods for structuring e-commerce catalog page data in order to invoke the Rich Snippet treatment. One of the prime methods is to code the catalog page in the hProduct microformat.
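For illustration, here’s a hedged sketch of hProduct markup on a catalog page. The product data is hypothetical, and the class names are recalled from the microformat draft (an “hproduct” container, “fn” for the product name, “photo”, and “price”, plus an hReview-aggregate block for review data), so verify them against the spec and Google’s Rich Snippets documentation before deploying:

    <!-- Hypothetical catalog item marked up in hProduct, with
         hReview-aggregate supplying the review information. -->
    <div class="hproduct">
      <img class="photo" src="/images/mens-down-jacket.jpg" alt="Men's down jacket" />
      <h2 class="fn">Men's Packable Down Jacket</h2>
      <span class="price">$89.99</span>
      <div class="hreview-aggregate">
        Rated <span class="rating">4.5</span> out of 5,
        based on <span class="count">27</span> reviews.
      </div>
    </div>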

We’ve been recommending the use of Microformats as a component of overall search optimization for quite a number of years at this point — our clients and those who’ve heard us speak at search marketing conferences have hopefully benefited from those early recommendations and are already ahead of the curve. Continue reading

What’s Best: Microformats, RDFa, or Microdata?

In a recent post by Mike Blumenthal about Google’s announcement of support for Microformats in local search, Andy Kuiper asked in the comments whether it would be best to go with Microdata, RDFa, or Microformats for marking up local business information. As the number of flavors of semantic markup has grown, I think Andy’s not the only one to wonder which markup protocol might be ideal. Here’s my opinion.

Microformats Logo

When you’re asking “which is better?”, it’s important to know what we’re speaking of, since there are a number of different goals people could be pursuing. For some, this is a question of which is better from an elegance-of-coding perspective (if you’re interested in this, you might read Evan Prodromou’s great article, RDFa vs microformats). For others, the question is focused on what’s best for their site — which solution is the simplest, most cost-effective to apply, and least likely to cause problems. Finally, the question can be seen from the perspective of what’s going to work best for the purposes of search marketing.

It’s this last orientation of the question that I’m focusing upon — which semantic protocol is going to work best for Search Engine Optimization (“SEO”)? Continue reading
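To make the comparison concrete, here’s the same basic bit of data (a business name) marked up under each of the three protocols. The vocabulary URLs and class names below are recalled from the era’s Google Rich Snippets examples, so treat this as a sketch and verify against the current specs:

    <!-- Microformats (hCard): semantics carried by class names -->
    <div class="vcard">
      <span class="fn org">Acme Hardware</span>
    </div>

    <!-- RDFa: namespaced attributes layered onto existing elements -->
    <div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Organization">
      <span property="v:name">Acme Hardware</span>
    </div>

    <!-- Microdata (HTML5): itemscope/itemtype/itemprop attributes -->
    <div itemscope itemtype="http://data-vocabulary.org/Organization">
      <span itemprop="name">Acme Hardware</span>
    </div>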

Resources For Subjects In My MIMA Summit Session

MIMA Summit

During my MIMA Summit 2010 conference presentation today, I’m covering a large chunk of information very rapidly. So, I’m providing this list of links to longer articles which more thoroughly cover the subjects touched upon in my presentation, for those who might wish to dig deeper:

Title Tag Optimization

Image Search Optimization

Benefits of Loose Image Licensing for Image SEO

Video Search Optimization

Online Catalog Optimization

RSS Optimization

Local Search Optimization

Local Search Ranking Factors

Local SEO 101: Choosing Local Domain Names

Ranking of Businesses Without Websites

Avoiding Tracking Phone Numbers

Category Names In Local SEO

hCard Microformat for Local SEO

KML & My Maps For Local SEO

Google’s New Image Search & An SEO Hint

As you may be aware, Google recently rolled out a newer, AJAXified user interface for their image search which features “infinite” scrolling and automatic pagination. The new UI was rolled out at the end of July to a subset of users, and they state that more users will receive the new layout in the coming days.

Beyond the items their blog post outlined, I noticed a couple of other things had changed. First, when comparing the new UI with the old, the order of the search results is a little different, indicating that the algorithm may have been updated. Second, the text associated with each image is different — previously, some visible text from near the image on its native page was shown below the thumbnail in Image SERPs. Now, the new UI displays the image filename instead of a title or caption.


Legacy Google Image Search

New Google Image Search UI


Google has apparently decided that an image’s filename is more important to display to end users than other text — a major paradigm change! If Google has concluded that the filename is a significant usability or user-experience factor, that’s worth noting, because Google likes using such factors in their ranking algorithms.
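If that’s the case, it’s worth auditing your image filenames. A hypothetical before-and-after, showing the same photo referenced two ways:

    <!-- If the filename is what searchers now see beneath the thumbnail,
         the descriptive version makes a far better snippet than the
         camera's default name. Filenames here are hypothetical. -->
    <img src="/images/DSC00217.jpg" alt="" />
    <img src="/images/golden-gate-bridge-sunset.jpg"
         alt="Golden Gate Bridge at sunset, seen from Baker Beach" />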

This could mean Continue reading

Exploring Dates On Pages As A Ranking Factor

During the past year, I became a little excited about one of Google’s many enhancements to the presentation of search results, because I suspected it could hint at a possible new ranking factor. The element in question is a date stamp.

Dates in Google Search Results Page Listing Snippets

You may’ve noticed that in some cases Google will prepend the usual listing snippet text with a date. That change was introduced sometime around late 2008 or early 2009. I noticed the addition of the date with interest, but I became even more interested after I heard Matt Cutts state in a Webmaster Help video that Google considered the date to be helpful to users.

When Google states outright that they consider some element of webpages to be “useful” to searchers, my ears prick up, because Google is so obsessed with Usability that they sometimes use quantifiable elements of user-centered design in their search algorithms, such as their recent introduction of Page Speed as a ranking factor. In this way, Google’s Usability fixation can reveal ranking factors.
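If Google is indeed parsing visible dates out of pages, it seems sensible to make yours unambiguous. One way to do that (a sketch, not a confirmed Google requirement) is the hAtom microformat’s “published” property, which pairs a human-readable date with a machine-readable ISO 8601 timestamp; the post title and date below are hypothetical:

    <!-- A sketch using hAtom classes ("hentry", "entry-title", "published");
         the abbr title attribute carries the machine-readable form. -->
    <div class="hentry">
      <h2 class="entry-title">Example Post Title</h2>
      Published on
      <abbr class="published" title="2010-06-15T09:00:00-05:00">June 15, 2010</abbr>
    </div>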

I wasn’t alone in twigging to the dates in search snippets — Continue reading

Off-Label Use For Google’s Image Labeler?

One of the creative methods Google has used for associating keywords with images is their Image Labeler game, which has been in “beta” for some years. As you may be aware, it takes images from their extensive repository of spidered pictures, and assigns one simultaneously to two different people who opt to play the game. Each participant submits keywords describing the image presented to them, attempting to also match keywords submitted by the other participant.

Google Image Labeler

If you’ve reviewed many websites and webpages, you’ll quickly see that there are a great many cases where Google might spider some images yet have very little data to go on in terms of what each image is all about. Ideally, webmasters add images to webpages with very clear captions right below them, and also use the ALT parameter in the IMG tag to say what the image depicts. (ALT Text or “Alternative Text” is a parameter that allows a designer to supply some metadata with an image — the ALT text describes the image in words, enabling audio browsers to speak it for blind and vision-impaired web users, and the text can also be used by search engines.) Well-optimized sites might even have image filenames that reflect descriptive keywords, too. However, it’s frequently the case that a webpage designer neglects to do these things, leaving search engines to try to decipher how to make the images appear for appropriate keywords.
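Here’s a minimal sketch of those practices taken together — descriptive filename, ALT text, and a visible caption — with all of the specifics hypothetical:

    <!-- An image carrying the three signals described above: a descriptive
         filename, ALT text, and a caption directly beneath it. -->
    <div class="photo">
      <img src="/images/blue-heron-fishing-marsh.jpg"
           alt="A great blue heron fishing in a marsh at dawn" />
      <p class="caption">A great blue heron fishing in the marsh at dawn.</p>
    </div>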

So, Google’s Image Labeler game is one of many methods they’re using to overcome the lack of info they encounter in crawling the web. (They also employ some more sophisticated techniques in combination with this, such as supervised multiclass labeling and optical character recognition (“OCR”).)

It recently struck me that Google could easily make use of the Image Labeler in another way as well — a sort of hidden, “off-label use” of the technology. Continue reading

Facebook SEO Tip: Add Your URL To Your Wall

Here at KeyRelevance we’re researching a number of different avenues of online marketing for our clients, so, along with our bread-and-butter work on Paid Search (PPC) management and Search Engine Optimization (SEO), we’ve done quite a bit of exploration into how to leverage the massive audiences found in social media such as Facebook and Twitter.

Yesterday, I published an article on a somewhat subtle technique which can be used when posting status updates on Facebook in order to increase the number of people who might see each update. However, there are a number of very straightforward things which businesses and organizations can do to extract marketing advantage from Facebook without getting all tricky. Sometimes the most basic steps can give you the greatest advantage, but it’s not always obvious how to go about them.

So, here’s a ridiculously basic tip which I’ve found many businesses have utterly failed to accomplish in setting up their Facebook presence: add your website link to your Facebook wall page!

There are a great many companies, organizations, and small businesses which haven’t figured out how to do this, so you can encounter pages all the time which don’t sport this most basic element of online marketing. For instance, the official Facebook page for the University of Texas at Austin, one of the largest universities in the country, has completely missed the boat by leaving their URL off their Facebook page:

University of Texas at Austin on Facebook

By contrast, their rivals at Texas A&M University have implemented their website URL on their Facebook page:

Texas A&M University on Facebook

(Disclosure: Texas A&M is my alma mater, so I did get a grin when I noticed that the TAMU University Relations Department got this most basic element right while the “Tea-Sips”, as we like to call them, did not.)

Oh, to be certain, I should point out that URLs on Facebook pages are nofollowed (not to mention that they’re apparently written to the page dynamically on load, via JavaScript), so they’re not precisely as optimal as many search engine marketing experts might like. However, there’s much to indicate that Google, if not the other search engines, Continue reading
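For reference, a nofollowed profile link in the served HTML looks something like this (URL hypothetical):

    <!-- rel="nofollow" asks engines not to pass link equity through the
         link; a link written into the page on load via JavaScript may not
         be seen by a crawler at all. -->
    <a href="http://www.example.edu/" rel="nofollow">www.example.edu</a>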

Using Bing’s New Webmaster Tools For SEO

You may be aware that Bing recently released a new version of their Webmaster Tools, which are intended to help webmasters improve their sites’ performance in Bing search. One of Microsoft’s Senior Program Managers and SEOs, Duane Forrester, asked a number of us to give their team feedback on what could be improved about the interface. So, I thought it might be good to provide that feedback openly, via blog post — not to beat up on Bing, but to bounce ideas around the community.

Bing Webmaster Tools

Giving feedback to any of the search engines about their tools for webmasters seems a bit fraught, given the near-futile dichotomy between the desires of Search Engine Optimization experts and the desire of search engineers to provide neutral, fair rankings in search results. However, the exercise of providing a little feedback is worthwhile, because if the tools are useless or pointless to us, then there’s little point in the search engines going to the effort of providing them in the first place.

Having worked in a major corporation before, I almost feel repressed about throwing out suggestions that I know could be deemed no-gos from the point of view of Bing engineers. I tend to self-censor to a degree, because I don’t want to come across as naive about the issues the search engines must take into account in trying to limit the undue influence of those attempting to subvert the SERPs.

Even so, I’m aware of the potentially conflicting considerations, and as I described earlier, it’s an exercise in futility if the tools don’t provide worthwhile functionality to the intended users.

One of the primary problems I see with Bing’s Webmaster Tools is the sense of “keeping up with the Joneses” one gets when reviewing the interfaces. Bing’s development team is in a near no-win situation with whatever they do in this area. On one hand, if they copy the functionality found in Google’s Webmaster Tools, they’ll be accused of being mere imitators — though there are some good elements in Google’s toolset which perhaps really ought to be matched. On the other hand, if they went even further in providing usefulness to webmasters, it could make them more vulnerable to unethical marketing exploits. So there likely weren’t a lot of easy solutions, nor perhaps obvious things which they should have done.

Further, their focus upon their tools versus Google’s tends to be a bit incestuous, and there’s the usual engineer myopia of providing what they think people would need or want, rather than really looking at the problem directly from the point of view of a webmaster. (Now, Duane can’t be accused of this bias in perception, because he was an external SEO prior to working for Microsoft — but there’s a definite sense of this basic utility-design problem inherent in both Bing Webmaster Tools and Google Webmaster Tools.)

Likewise, Google Webmaster Tools suffers a bit from the conflicting goals of the engineers and the needs of the tools’ target audience. So, I’d prefer that none of the search engines look at one another’s offerings when designing such things, but instead focus solely upon providing as much functionality as webmasters might need. As things currently stand, there’s a sensation that all of the search engines are providing something of “placebo utilities” to webmasters — the interfaces offer a confusing melange of features which are ultimately not all that useful, but are instead intended to throw up smoke and mirrors to make it appear that they’re trying to help webmasters optimize their sites.

Moving past my perhaps-unfair assertions, let’s look at what the new Bing tools provide, and what could be done better.

First, a head-nod to Vanessa Fox for her comparison between Bing’s and Google’s Webmaster Tools — as the creator of Google’s Webmaster Tools, she is likely one of the best people around to examine such utilities with a critical eye, and in the best position to know how much info a search engine might realistically be able to provide, and in what format. Likewise, a nod to Barry Schwartz’s post about Bing’s tools.

Both Vanessa and Barry berate Microsoft for requiring Silverlight to view and use the tools. I don’t consider that as big a deal, because I see it as a sort of “religious difference” in how the tools were constructed (those of us who are jaded about how Microsoft has strong-armed proprietary technology in the past might react negatively to Silverlight, as will those who avoid it out of conservative privacy practices).

However, if I’m looking at Bing Webmaster Tools purely from the perspective of how well it does or doesn’t function, I’m not concerned about this tech dependency, since I think the majority of webmasters out there will be unaffected by it. I’m not a fan of MS programming protocols AT ALL, and it may be a bit of my former bias as a technologist within a megacorporation creeping in, but the Silverlight criticism just appears slightly out of sync with the primary issue of whether the tools provide vital functionality or not — and it may not be unfair of Microsoft to decide that if you wish to play in their Bing sandbox, they have the right to promote their proprietary technology. In comparison, I have a friend who is a privacy freak, and he surfs with Flash disabled — Google’s Webmaster Tools requires Flash for one or two graphs, which would be just as irritating to him as Silverlight.

Both Barry and Vanessa mention how Bing’s new interface revoked the backlink reports, and I agree with them both on this point. This was one area where I’d hoped Bing would take the opportunity to be more open than Google. If the engineers looked at competitors’ tools while building Bing’s, they should have tried to recreate the backlink reports that Yahoo! provided in Site Explorer, which seemed to give a more comprehensive picture of backlinks. Since webmasters are told that inbound links are one major criterion for rankings, declining to provide this info leaves a major void.

Bing also obscures the number of pages indexed when one performs a “site:” search by domain, so revoking this functionality, such as it was, from the old interface eroded some of the usefulness. Perhaps their pre-development surveying of webmasters yielded feedback that the earlier backlink report “wasn’t useful”, but that would’ve mainly been because it was less robust than Yahoo’s.

Vanessa mentions that they don’t provide data export features, and I agree completely that this is a major oversight. In fact, as a programmer I happen to know just how easy it is to code a data export in XML or CSV, and considering how long it took to launch the product, it’s sort of shocking they didn’t include this at launch. (You’d think Microsoft would not miss an opportunity to provide a “click to export to Excel” button!)

Vanessa stated that they also ditched the Domain Score, and remarked that this was a good thing. I disagree on this point, because I think any insight into a ranking score that the engines give us is helpful in assessing how effective or important a site or domain is. Was this the same as the small bar scales Microsoft had been providing for a handful of the more important site pages via the interface? Although those graphical page ranking scores appeared entirely derivative of Google Toolbar PageRank, I would’ve preferred they provide even more in that area. Bing’s in a position where they ought to be able to experiment with providing more info than Google does, and see just how dangerous it really is to be more open with marketers!

Vanessa did a great comparison between the analytics Bing provides and those of Google Webmaster Tools and Google Analytics. While analytics from Bing’s perspective are interesting to us all, she notes one aspect that also strikes me as an issue with the graphs: as a webmaster/SEO, when I see indexation decreasing, I’d really like to know why. This is particularly irritating where Bing is concerned, because I think it’s widely felt within the industry that Bing simply indexes a lot less than Google.

Many of my clients want to know what they can do to increase their indexation in Bing. I see the same thing with my test websites: I may have 30,000 discrete pages, and Bing appears to index a sharply lower number than Yahoo! or Google. The feature allowing one to manually submit URLs seems to acknowledge this sad fact — but, in context, it nearly sends the wrong message: “Oh, our spider’s legs get tired out there, so bring your pages directly to us.” Vanessa’s got a point on this score — why should I feel I need to do this if you accept Sitemaps? And if I or my clients have tens of thousands of site pages, submitting fifty pages manually at a time is simply not a sustainable solution. I can understand having an interface to rapidly submit brand-new content pages, but what’s missing is clear communication about what issue is restricting my indexation.

The features showing whether there are robots restrictions, malware, or crawl errors which could impact a site are all great. However, if one already has everything functioning just fine, the tools need to answer further questions: Why isn’t my site crawled more deeply? And: Why don’t my pages rank higher? Ultimately, webmasters ask: What can I do to improve my site’s performance? Understandably, Bing and the other search engines are reticent to provide too much info in this area. However, there are things which they could provide:

  • Possibly a tool where a webmaster could select one of their pages or submit a URL to find out what primary keyword phrase Bing considers the page to be about?
  • Tools which report upon quality issues detected with specific pages. For instance, is the page missing a Meta Description or Title? Are there pages where the Title or Meta Description appears to not be relevant to the page’s content, or out of sync with anchor text used in inbound links? Are there images which could be improved with ALT text?
  • Why not merely inform webmasters that you consider their links and other references to be too few or too low in importance?
  • Bring back the scales showing page scores, and actually go further in providing some sort of numeric values!
  • Actually show us that you spider all pages, even if you opt not to keep all in your active index! This would at least give the impression that you are able to index as deeply as Google, but choose not to display everything for other reasons.
  • How about informing us of pages or types of pages (based upon querystring parameters, perhaps) which appear to have large degrees of duplication going on?
  • Tie our Webmaster Tools in with local listing account management for local businesses, so everything could be done via one interface.
  • Provide means for us to customize the appearance of our SERP listings a little bit, similar to Yahoo SearchMonkey and Google’s Rich Snippets.
  • Provide us with tools that help us improve overall site quality, such as flagging pages with incorrect copyright dates or misspellings, and identifying orphaned pages.
  • Consider providing us with an A/B testing type of tool so that you might inform us about which of two page layouts performs better for Bing search!
  • Inform us if you detect that a site has some sort of inferior link hierarchy — this could indicate usability problems affecting humans as well as spiders.
  • Provide more granular details on how well we perform in Bing Image Search, Mobile Search, Video Search, etc. Currently, I cannot tell how many of my images are indexed in Bing.
  • For that matter, it would be nice to support an Image Sitemap file, like the one Google offers (see the sketch following this list).
  • Finally, for a really pie-in-the-sky request: You operate web search for Facebook — would your contract allow you to tell us how often our webpages appear in Facebook search results and how many clickthroughs we might get from those?
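Regarding the Image Sitemap suggestion above, here’s a sketch of the format as I recall Google documents it, with hypothetical URLs; verify the namespace and tag names against Google’s documentation before relying on it:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>http://www.example.com/products/down-jacket</loc>
        <image:image>
          <image:loc>http://www.example.com/images/mens-down-jacket.jpg</image:loc>
          <image:caption>Men's packable down jacket</image:caption>
        </image:image>
      </url>
    </urlset>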
Anyway, there’s my feedback, criticism, and some ideas for additional features. I’m not trying to beat anyone up, and I’m actually grateful for any feedback we receive from the search engines on our performance within them. I mainly ask that the search engineers keep in mind that we chiefly want to know “What can we do to improve performance?” — and that they provide us with tools to accomplish that, insofar as they’re able to do so without compromising the integrity of the engine.

Where Bing is concerned, I believe it could be possible to be even more open than Google is, in order to further differentiate yourselves and experiment to see what’s really possible in terms of openness!

Keyword Research for Local SEO

Doing keyword research for Local SEO has been somewhat difficult in the past, because many local search phrase combinations have relatively low volume, and the data has been too sparse and granular for the limitations of many keyword research services.

Even just a few years ago, I used to try to research local keyword phrases such as “boston plumbers” in a service like comScore’s qSearch tool, and such phrases would frequently have insufficient search volume for the tool to reflect back any data. Even Google Trends today states that there’s insufficient volume to show graphs for “boston plumbers”.

Such research is important because a site seeking to capture as much qualified traffic as possible from consumers interested in a particular type of business must first know which phrase(s) to focus upon to achieve search engine rankings. It must answer the question of whether consumers are searching for “plumbers”, “plumbing”, “plumber”, or “pipes”. And for local businesses, it’s ideal to match the exact phrases that include local keywords. Do consumers search for “boston plumbers”, “plumbers, boston”, “plumbers in boston”, “plumbers boston ma”, or “plumbers 02118” (a Boston ZIP code)?

There are cases that are even more complex, where an industry may have multiple terms used to find businesses (“accountants”, “accounting”, “tax preparation”, “CPAs”), and cities with multiple name versions and neighborhoods (“New York”, “New York City”, “Manhattan”, “New York, NY”, “NYC”, etc).

The problem has been irritating because there’s been little recourse available for researching this aspect of consumer behavior.

However, various providers have been beefing up the data they make available in order to address this marketing demand. For instance, Google Insights for Search will sometimes show the relative search traffic for phrases where Google Trends will not, such as for the “boston plumbers” example:

One reason why Continue reading