Image Search Vital To Rankings

A MediaPost article by Laurie Sullivan reported on some of the comments from representatives of the Google and Bing search engines at the recent SES conference in San Jose. According to them, consumers rely on images in search results more than previously thought, and knowing this can help SEO professionals better optimize sites.

Nadella shows current popular content in Image Search: Michael Phelps
Microsoft sees increasing importance of images to searchers - and their search engine's homepage design reflects this

The representative from Microsoft Bing stated that, after regular web search, Image Search was their next-most-popular feature. This reflects the same user behavior that Google and Yahoo! have reported in the past (at least until Google purchased YouTube; before that acquisition, Image Search was Google’s second-most-popular feature).

With the advent of “blended search” or “Universal Search”, where images and other vertical search content are mixed into the traditional keyword search results listings, the usage picture becomes a bit more blurred. Users can now find image content in the regular search results, without always having to click into the dedicated image search pages to find and click through to that content. As such, marketers who want to dominate keyword search page “real estate” must seriously consider targeting some image content in order to exploit this channel.

If you’re familiar with me, you’ll know that I was one of the earliest SEO experts to write articles on optimizing images for search, and particularly a pioneer in optimizing images via Flickr and other image sharing services. (See also my Comparison Chart for SEO Value of Image Sharing Sites.) I’ve also spoken numerous times at marketing conferences on the value of Image SEO and how to go about it. It’s safe to say that I’m a proponent!

Google is also promoting image content with regard to search presence. At SES, Google’s representative, R.J. Pittman, stated, “Images are no longer a ‘nice-to-have’, but a ‘must-have’ piece to promote businesses online.”

Google is also continuing to aggressively develop innovations in their image search sophistication. Google is no longer merely focusing upon the contextual text keyword content surrounding images in order to interpret their subject matter; they are now using a number of strategies for actually analyzing the graphic content of images and their relative quality compared with other similar images.

One of the biggest issues that I see facing internet retailer sites, travel portals, and other online commerce sites is that they often incorporate thousands of product images supplied by their providers. Those manufacturer- or content-provider-supplied photos are replicated across many competitor websites, and search engines like Google expend great effort at detecting duplicate content such as this, so that they can offer up a variety of images when their users conduct searches (avoiding a page of search results where every thumbnail reflects the identical image).
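
Google hasn’t published how it does this, but to illustrate the general idea, here’s a minimal “average hash” perceptual fingerprinting sketch in Python (using the Pillow imaging library; the filenames are placeholders). Resized or re-compressed copies of the same manufacturer photo produce identical or nearly-identical fingerprints, making duplicates cheap to flag at scale:

```python
# A sketch of perceptual "average hash" fingerprinting, one way duplicate
# images can be detected. Illustrative only; not Google's actual method.
# Requires the Pillow imaging library (pip install Pillow).
from PIL import Image

def average_hash(path, hash_size=8):
    # Shrink to a tiny grayscale thumbnail; this throws away detail so
    # that resized or recompressed copies of the same photo converge.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each pixel contributes one bit: 1 if brighter than the mean.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    # Count differing bits; a small distance indicates a near-duplicate.
    return bin(h1 ^ h2).count("1")

# Two product shots within, say, 5 differing bits out of 64 are very
# likely the same underlying image.
# print(hamming_distance(average_hash("a.jpg"), average_hash("b.jpg")))
```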

I know a number of ways around the duplicate content filtering in addition to how to optimize for contemporary image ranking factors. If there’s sufficient interest, I might soon provide a list of tips on how to optimize for these paradigms, so leave a comment below if you’d be interested!

Should You Use .TEL Top Level Domains (TLDs)?

.TEL domains

Periodically, someone will launch a new, specialized Top Level Domain (“TLD”), claiming it’s the next big thing on the net. As we’ve seen time and again (such as with the .MOBI TLD), most of these efforts are never going to achieve the same level of recognition or adoption as the .COM and .NET standards, and businesses which muck about with them are likely to expend valuable resources resulting in zero ROI.

Such is likely to be the case with the .TEL top level domain, which launched in March. .TEL, operated by Telnic Limited, is intended to be a sort of domain-based authoritative location for contact information, a grand new evolution of phone directories, white pages, and yellow pages. When you obtain a .TEL domain, you don’t manage it on the servers of your choice; instead, it generates a site hosted on the Telnic service. Justin Hayward, Communications Director for Telnic, is quoted as saying:

“We consider .tel to be the first global live contact site directory. Once contact details are populated in a .tel, anyone can type a known .tel address into any browser or use keywords that describe the person or business they want to find. Keywords are free so the more keywords that are used and the more descriptive they are, the easier it is to be discoverable.”

On the surface, this all sounds good, but the first problem I see with it is one of adoption: even if people know to look for you, they are not likely to type in a .TEL domain name, because everyone looks for .COM. Even if you specifically tell someone about your .TEL URL, you’ll expend extra time explaining that, yes, .TEL *is* a type of web URL, and even then they’re just as likely to type it in as “something.tel.com”, which is operated by the Tokyo Electron company, and not the proper Telnic page URL.

This is the exact same issue faced by .MOBI, which was intended to be used as an authoritative URL for the mobile-friendly versions of websites. Most people don’t know or understand the convention, so they won’t naturally type it in when out and about with their mobile devices (and .MOBI has the additional downside of creating one-letter-longer URLs, which makes them that much more tiresome for someone on a wireless device to type in).

From a marketing perspective, .TEL domains have additional downsides. You don’t appear to be able to control the UI or look-and-feel of the generated contact pages, and slapping on yet another domain can dilute the effectiveness of your natural search engine optimization work. Links pointing at that additional URL will dribble away portions of the PageRank you could be sending to, or keeping for, your primary domain.

And how will search engines treat it? As with many of the lesser top-level domains, they’re likely to be more mistrustful. I see zero toolbar PageRank values for the top-ranking .TEL pages, though this may be due to the domains having launched only recently. But their public statements touting keywords (“…Keywords are free so the more keywords that are used…the easier it is to be discoverable…”) and the fact that domainers seem excited by having another channel to potentially exploit make the TLD concerning in terms of potential search engine performance.

Bucking standards in favor of creating your own proprietary one, and flying in the face of established adoption patterns in internet consumer behavior, is not a good formula for success.

In a very self-serving blog post titled “Why .tel and not a free hcard microformat?“, Telnic CTO Henri Asseily also seems to be taking aim at the increasingly popular hCard Microformat standard in favor of .TEL. What’s funny about this is that it’s comparing apples and oranges, and Telnic has deployed example domains (see emma.tel, henri.tel) which provide a vCard at the bottom of the pages.

It’s absolutely stunning to me that they took the trouble to provide vCards from their contact info pages when they could easily also embed all of the vCard information into the page itself, using semantic markup! Meanwhile, his blog post suggests that .TEL should somehow be used instead of hCard! And I totally fail to see the significance of the privacy settings he refers to for protecting info: are the .TEL pages a publicly-findable directory of contact info, or not?! It’s disappointing that they wouldn’t simply incorporate the hCard and thereby gain additional advantage from the special display treatments that Google has begun applying to microformat-enriched pages.
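
For illustration, here’s roughly what that semantic markup could look like: a minimal hCard marking up contact details directly in the page HTML (the business name, address, and phone number here are hypothetical):

```html
<!-- A minimal hCard; all contact details are hypothetical examples -->
<div class="vcard">
  <span class="fn org">Example Widgets, Inc.</span>
  <div class="adr">
    <span class="street-address">123 Main Street</span>,
    <span class="locality">Dallas</span>,
    <span class="region">TX</span>
    <span class="postal-code">75201</span>
  </div>
  <span class="tel">+1-555-555-0100</span>
  <a class="url" href="http://www.example.com">example.com</a>
</div>
```

Parsers that understand hCard can extract exactly the same fields a downloadable vCard provides, while the content remains visible, indexable page text.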

Telnic is partly promoting their service as a way of providing individuals’ and businesses’ contact info on the internet, “even if you don’t have a website”. Ummm… don’t the online white pages and yellow pages already do this?

For companies considering adopting .TEL for online marketing advantage: you should seriously reconsider. This is not going to become the de facto online standard anytime soon, and expending time playing with this domain is going to take resources away from efforts which are likely to be far more beneficial. At worst, linking to new .TEL domains could also subtract some of your existing PageRank value, to little advantage.

The only case in which a .TEL domain could potentially provide an advantage is in a project to improve online reputation, where you’re looking for additional webpages to come up in SERPs to help push down some sort of negative content which may be ranking for your brand name. However, there are a lot more social media sites, business profile pages, and additional strategies which you should be employing in that case, and the unproven nature of .TEL sites in organic search rankings relegates the new TLD to the bottom of your list of possible online reputation weapons.

Save Yourself A Thousand Dollars On Simba Yellow Pages Report

Simba Information has released a report on the state of the yellow pages industry entitled “The RBOC Bankruptcies 2009: The Impact on the Future of the Yellow Pages Industry” and will offer a webinar this week to those who bought it. At $995, I think the report is likely overpriced, and I thought I’d save you some money if you were tempted to pay that much to find out why some of the major yellow pages publishers are filing for Chapter 11 bankruptcy protection, what this means for the industry, and where things are headed.

Yellow Pages

I guess I’m reacting to the somewhat hyperbolic language found in the press release, which I think was intended to appeal to the fears of yellow pages publishers, possibly the very people who can least afford to pay this much for an analysts’ report.

First of all, I think it’s a stretch to refer to Idearc and R.H. Donnelley as RBOCs. Since Idearc was separated from the telco function of Verizon and then spun off, I don’t believe people really consider it to be an RBOC any longer. I don’t think R.H. Donnelley could ever have been considered an RBOC, even though it acquired directory parts of old RBOC companies. “RBOC” stands for “Regional Bell Operating Company”, describing the companies which originally made up the American Telephone & Telegraph Company (earlier known as Bell Telephone Company) and which were broken apart into separate regional companies as part of antitrust requirements. The main focus of the original splintering of the RBOCs was placed upon the phone services, and the general convention is to consider those telco functions to be the “phone company”; when non-telco portions are spun off, they are no longer referred to as “RBOCs”. This is maybe pedantic of me, but I think such loose descriptive accuracy is inauspicious in an expert report.

Secondly, there’s not a whole lot of mystery about why Idearc and RHD got into financial straits and had to file for Chapter 11. Both were overly debt-heavy, and when the economy turned sour, they could not properly service those debts. I wrote in-depth about Idearc’s case in a post on Search Engine Land originally titled “Idearc’s Chapter 11 Bankruptcy: Who’s Really Responsible?“, and you can see Bloomberg’s and other reports stating that R.H. Donnelley’s bankruptcy was due to overly high debt. Yell Group’s problems also stem from debt. Ambassador Media Group, another well-known yellow pages publisher, filed for bankruptcy protection this week as well.

So, let me save you a thousand dollars with the simple explanation. For a hundred years, the print yellow pages industry was a very profitable business, and a very safe bet as investments go. Such a long-standing business model, “ecologically adapted” to be interdependent with many other businesses, was simply not expected to see any major declines. However, the technological disrupters of first the internet, then Pay-Per-Click advertising, and then the Google search engine had a very unforeseen effect. The directory publishers increased capital investment, expecting long-term wins, but the rapid erosion of print advertising undermined their ability to pay on their loans quickly enough. Even though their internet sides are increasingly profitable in many cases, the volume of internet profit is insufficient to both cancel out the losses in print revenue and simultaneously pay off their loans. In Idearc’s case, I further outlined how they were sandbagged from the very beginning by Verizon offloading an unreasonable debt load upon them.

What does the future hold for Yellow Pages?

In the near term, the companies which are restructuring will come out far stronger. They will be forced to further pare down their print divisions. Print will continue to see erosion in revenues, since overall usage is declining, just as with other print media (I solidly established that the yellow pages industry’s own statistical projections were considerably inaccurate, and that print directory usage likely continues to drop each year).

I’ve also been stating for quite some time that there appear to be too many players in the internet yellow pages (IYP) sector, and that consolidation is likely; we can expect some mergers of these companies in the near future.

From my perspective, these online directories are also continuing to weaken in terms of marketshare. For now, they can be profitable, but I see too much incestuous interselling among the players. It’s possible that once consolidation within this sector occurs, the resultant players left standing may be strong enough to continue competing and to grow. But there is significant cause for concern in the growing local search marketshare being taken over by the major search engines such as Google. If the IYPs cannot improve their game well enough and rapidly enough to compete with the major search engines, then there will continue to be financial instability on the part of the yellow pages companies.

Simba’s press release mentions in passing how “…bloggers jump to their computer keyboard and pound out a call for the outright ban of books for the good of the people whether they want them or not and toss in the good of the environment as well…”, wording that plays very well to those in the YP industry who have been very defensive about the attacks on the printed books. Yet, rather than playing to the anti-environmentalist defensiveness of the YP industry, it’d probably be more productive to resolutely face the difficult current truths. People who don’t use the printed media are increasingly irritated by having books from multiple providers dropped unsolicited on their doorsteps, and environmental progressivism is a popular and rising trend, turning mild irritation into full-frontal attacks. It’s undeniable that in quite a number of markets throughout the U.S. there have been significant movements to restrict directory distribution. Quite simply, this trend is going to continue, and the industry’s thin bandaids are in many cases not going to perform well at resisting the attitudinal change.

Finally, why should you trust my analysis more than Simba’s (even though I’m saving you the thousand dollars)?

For one thing, I was one of the earliest analysts to state that I saw weakness in the yellow pages industry, and later that there were serious problems in store for yellow pages. Quite a number of other research firms and analysts that cater to the yellow pages industry were offended back then by my findings, but it’s now undeniable that print yellow pages have indeed experienced substantial declines. I forecast the decline, I warned of weakening in print, and I stated it out loud, even as other major analysts were dismissive and even angrily reactive. I simply observe the facts and attempt to project realistic possibilities rather than merely catering to popular notions; I’m not afraid to speak the whole truth as I see it.

Interestingly enough, AT&T’s directory division hasn’t been experiencing the same degree of problems seen by other directory publishers, but it has been kept “within the fold” of the overall AT&T telco corporation, which can insulate it from the problems experienced by the standalone directory companies. Simba’s webinar will include Frank Jules, AT&T’s president & CEO of Advertising Solutions, but I’m not at all convinced that AT&T’s yellow pages group will be all that informative beyond speaking to their own directory products.

As I pointed out in my article showing weakening in online searches for the “yellow pages” keyword phrase, online consumers appear to be seeking out yellow pages sites less and less. It stands to reason that as Google’s blended search bubbles up local businesses for users’ keyword searches more simply, there’s less reason for those consumers to seek out business directories. The younger generation is forgetting what a “yellow pages” is altogether, and sites like AT&T’s Yellowpages.com, which have placed all their branding around the eroding concept, stand to lose out.

Simba’s report undoubtedly will have some good information in it. But, will it really be worth a thousand dollars? I seriously doubt it. If you’ve read my blog post here, I think you can safely save yourself the cost.

Is Geotagging Worthwhile for Search Engine Optimization?

I posted an article today on “Should You Geotag Pages For Local SEO?” on Search Engine Land. In it, I describe the cases in which I think you should geotag a webpage.

Geotags

Essentially, I state that locally-oriented webpages for businesses or content pages which have full street addresses should probably be tagged with hCard microformats, as I’ve described before.

Otherwise, if you have a locally-oriented webpage about something which has a place in the physical world, but which is not associated with an actual street address, I believe use of geotagging makes sense. Increasingly, specialized search engines (and even Google Maps, Yahoo! Maps, and Bing Maps) are pinpointing such content and making it readily available for online users.

The other frequently-confusing aspect of geotagging webpages is that there’s no clearly-dominant standard for formatting geotag information. At least four major standards have been deployed out into the wild, with no clear winner! Luckily, one could probably use all four simultaneously on a page without taking extreme measures, and, considering how relatively easy it is to add the geocoding to the page in the first place, I see no reason not to add all four at once if you have a valid reason to geotag the page.
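
The post doesn’t enumerate the standards here, but for illustration, here’s how four commonly-deployed geotagging formats can coexist in one page’s markup (the Dallas coordinates and place names are just example values):

```html
<!-- 1. The ICBM meta tag (used by GeoURL) -->
<meta name="ICBM" content="32.7767, -96.7970">

<!-- 2. The geo.* meta tags -->
<meta name="geo.position" content="32.7767;-96.7970">
<meta name="geo.region" content="US-TX">
<meta name="geo.placename" content="Dallas, Texas">

<!-- 3. The Geo microformat, placed in the page body -->
<span class="geo">
  <span class="latitude">32.7767</span>,
  <span class="longitude">-96.7970</span>
</span>

<!-- 4. Dublin Core coverage metadata -->
<meta name="DC.coverage" content="Dallas, Texas, US">
```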

I believe we’re going to see increasing adoption of geotagging as the major online mapmakers make more geographic information available to map consumers. Google’s recent deployment of “Rich Snippets” is a prime indication that more semantic markup may be leveraged to enrich online users’ search experiences, and local mapping data is one of the prime areas where they’re likely to add more functionality. (See also my article on Optimizing Search Listings for more details about how semantic markup such as hCard Microformats may position your site for greater online success.)

Do Page Load Times Affect Your Search Rankings?

How Fast Does Your Page Load?

As average internet access speeds have improved, many websites have become pretty lazy about paying attention to how fast their pages load, designing bloated content full of heavy images, multiple Javascript and CSS files, and ads or iframes pulling from dozens of sources. This neglect could affect your search rankings, and here’s why.

First of all, Matt Cutts, head of the webspam team at Google, stated in a recent Q/A video that sites’ load times are currently not a ranking factor.

However, there are four reasons to believe that site load times could affect search rankings in the very near future:

  • Matt’s opinion is that it would be a great idea for a ranking factor! And, he leaves open the possibility that it could be used as a ranking factor in the future. He’s influential within Google and is named on some Google ranking patents, so this is significant. Other prominent Googlers have also indicated that this may be an area of increasing importance to them. Larry Page apparently stated that he wanted Google to be central to efforts to make the internet speedier, allowing users to get pages as fast as turning pages in hardcopy books.
  • Google recently released Page Speed, an add-on for Firefox browsers which can diagnose a number of elements which impact page load times, such as Javascript and CSS files, image file sizes, and compression and caching settings (a couple of these checks are sketched in code after this list). (This is also likely Google’s competitive response to Yahoo’s similar tool, YSlow, which even Google recommends for diagnosing speed issues. Combined with the other reasons here, I believe it’s not just a competitive checklist item, but part of their strategy to speed up the internet experience.)
  • Last year, Google introduced Page Load Time as a ranking element in Google AdWords ads.
  • Internal research at Google has shown that slower page delivery times reduce the number of searches users conduct by 0.2% to 0.6%. While this may appear negligible, it undoubtedly adds up to a lot of lost revenue over time for Google, and it proves their point that slowness has a chilling effect on internet use and traffic.
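
As referenced in the Page Speed item above, here’s a rough sketch of checking two of the factors such tools flag, gzip compression and cache headers. This is not how Page Speed itself works; it just uses the Python standard library against a placeholder URL:

```python
# A rough sketch of two checks that tools like Page Speed and YSlow
# perform: is a resource served gzip-compressed, and does it send
# caching headers? Standard library only; the URL is a placeholder.
import urllib.request

def check_resource(url):
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        gzipped = resp.headers.get("Content-Encoding", "") == "gzip"
        cacheable = bool(resp.headers.get("Expires") or
                         resp.headers.get("Cache-Control"))
        transferred = len(resp.read())  # bytes actually sent over the wire
    print(f"{url}: gzip={gzipped} cache-headers={cacheable} bytes={transferred}")

check_resource("http://www.example.com/")
```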

Based on the above reasons, I think page load times are very likely to become integrated into Google’s ranking “secret sauce” soon, and that sites which seriously neglect page load time will find themselves at a disadvantage.

Classic Search Engine Optimization (“SEO”) lists of tricks rarely mention improving page speeds, but Google has steadily been evolving their ranking methods to reduce the impact of technical code tricks and to move toward more human-centered design factors. In fact, one part of their process already includes having their quality team assess the webpages found in search results for many thousands of sample queries. If one of your site’s pages falls into their sample set, the assessor’s rating of the page compared to competitors could result in an average quality score being applied to all the pages on your site.

I’ve believed for some time already that Google applies some automated quality scoring to natural search rankings, similar to how they’ve applied such factors to their paid search ads.

My suspicion is that there will likely be some sort of scale of site loading speeds which might be used to impact rankings in the future. And I’d also suspect that this factor would be used primarily as a negative ranking factor, as opposed to a positive one. By this I mean that a page from a competing site which has all other relevancy ranking elements essentially equal could drop lower in search results if its load time doesn’t meet some minimum standard. Load time might negatively impact a ranking, but it likely wouldn’t help a page rise above one with slightly stronger relevancy/importance factors unless that page had serious slowness itself.
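
To make that hypothesis concrete, here’s a toy sketch of a negative-only load-time adjustment; the threshold and penalty values are invented for illustration and are in no way a published Google formula:

```python
# Toy model: load time only ever subtracts from a page's score, and only
# once it exceeds some minimum acceptable threshold. Pure speculation.
def adjusted_score(relevancy_score, load_time_secs,
                   threshold=3.0, max_penalty=0.1):
    if load_time_secs <= threshold:
        return relevancy_score  # fast enough: no boost, no penalty
    # The penalty grows with slowness, capped so it can demote a page
    # but never zero it out entirely.
    overage = min((load_time_secs - threshold) / threshold, 1.0)
    return relevancy_score * (1.0 - max_penalty * overage)

print(adjusted_score(0.90, 5.0))  # 0.84: more relevant but slow, demoted...
print(adjusted_score(0.85, 2.0))  # 0.85: ...below a faster competitor
```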

I’d further expect that Google would apply some sort of adjustment to try to assess whether one Googlebot visit ran across just a momentary lag condition, versus a page delivery speed that’s always slow. So, I don’t see any reason to freak out if you have experienced a server or application issue for just a brief period!

Even if Site Load Time were not to become an official member of Google’s list of over 200 ranking factors, load time could still indirectly affect your rankings. Avinash Kaushik at Google has strongly encouraged webmasters to pay attention to Bounce Rate (the percentage of site visitors who visit only one page and/or who stay on a page for only a few seconds before hitting the back button).
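
As a quick illustration of that definition, here’s a toy bounce-rate calculation over hypothetical session records of (pages viewed, seconds on site):

```python
# Bounce rate: the share of sessions that viewed only one page or left
# within a few seconds. The session data below is entirely hypothetical.
sessions = [(1, 4), (3, 120), (1, 45), (5, 300), (1, 2)]

bounces = sum(1 for pages, secs in sessions if pages == 1 or secs < 10)
print(f"Bounce rate: {bounces / len(sessions):.0%}")  # 60% for this sample
```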

Google can also easily see if a user immediately backs out of a page they find in the search results, and a high bounce rate may indicate a poor quality result for a particular query. One prime cause of a user hitting the back button is a page that is extremely slow to load. So, if Bounce Rate is a factor affecting rankings, then a page’s load time may impact it, indirectly affecting rankings.

Finally, let’s go back to Google’s original point about why this is important in the first place: good user experience. Along with faster network speeds, sites need to load rapidly for end users in order to provide a positive user experience. Even if load time were never used directly or indirectly by Google in rankings, it would still affect how users experience your site, and that can affect your ultimate conversion rates and repeat visits.

But, Page Load Time / Site Load Time will almost certainly be a direct or indirect ranking factor.

So, how should you prepare for this important and basic factor amongst all your site’s various optimization strategies? Very easily and cheaply, you could get a copy of Google’s Page Speed extension and run it against samples of your site pages to see what speed factors it might recommend you improve upon.

Also, note that this browser-based diagnostic tool does not assess a number of factors which can still affect site load times, such as network connection times and conditions which cause sites to buckle under higher loads.
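
As a rough complement to the browser-based tools, a simple script can repeatedly fetch a page over the network and time the full delivery, connection overhead included. A minimal sketch, using only the Python standard library and a placeholder URL:

```python
# Time complete page fetches over the network, including connection
# setup, to catch slowness a browser-side analyzer may not attribute.
import time
import urllib.request

def time_fetch(url, runs=5):
    timings = []
    for _ in range(runs):
        start = time.time()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # pull the full body, as a browser would
        timings.append(time.time() - start)
    avg = sum(timings) / len(timings)
    print(f"{url}: min={min(timings):.2f}s avg={avg:.2f}s max={max(timings):.2f}s")

time_fetch("http://www.example.com/")
```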

KeyRelevance has long considered site load times to be of prime importance, and we have included a number of factors affecting page load speeds in the web site reviews we provide for clients. In fact, we even provide clients with better-compressed versions of their site images for smaller file sizes. Speed of access has long been important to a site’s overall user experience, and Google’s increasing focus in this area is now making it of central importance to keyword rankings in search results. So, if you want to be at the top of your SEO game, you need to be paying attention to your site’s page delivery speed. Google is!