Embattled Yellow Pages & SEO

Online Yellow Pages

by Chris Silver Smith

My article, “Brave New World For Yellow Pages,” ran today on Search Engine Land. In it, I describe how Google Trends shows that each of the major internet yellow pages sites has taken a sharp dip since the end of last year and early this spring. I diagnosed the cause of this apparent downturn in referred visits from Google as Google’s change to display its local 10-pack in more cases where user queries don’t include geographic modifiers (it now incorporates users’ IP-address geolocations).

There are many variables involved, so others may easily dispute my diagnosis. What is indisputable, however, is that these sites are now getting fewer referral visits from Google.

This isn’t a complete surprise. There have been indications for some time that overall trends could go in this direction, and many of us in the yellow pages industry were concerned about search engine incursions into YP territory from the beginning. I’ve previously pointed out another concerning behavioral change shown by Google Trends – fewer and fewer users are searching for “yellow pages” in Google keyword search over time. That trend is still continuing:

Yellow Pages searches according to Google Trends

John Kelsey recently wrote about how these companies can turn things around, even though he implies their “backs are against the wall.” I agree: from an SEO perspective, it’s nowhere near the end of the line for these companies.

It’s ironic that local search marketing experts all recommend that businesses update and enhance their listings within these websites, in large part for local SEO value. Yet these sites are now struggling with their own SEO.

Almost uniformly, each of them has a huge amount of trust and PageRank from Google. This SEO goodwill can be employed to turn these trends back around, if it isn’t squandered.

In the SEL article, I mentioned Yelp’s success over the same period, and it hints at one of the elements needed. A good user experience and an engaging interface can do quite a bit. Subtle details can also make the difference between a site where people wish to interact and add value and one where people really don’t wish to hang around.

Various SEO improvements should also be made: improvements to the amount of content on pages, the breadth of information about businesses, and the forming of that content into signals which effectively “sing” to the search engines. There are quite a few areas neglected by these sites, with little excuse. For instance, as far back as 2006, I recommended employing Microformats for local SEO value. If all of these sites had been following my recurring recommendations on integrating Microformats, many of them would now be sporting improved display in Google search results, similar to Yelp. When Google rolled out Rich Snippets a few months ago, Yelp listings were suddenly decorated with eye-catching star-rating icons, and stats have shown that these treatments likely increase click-through rates considerably. (Insiderpages was the only other one of these sites which I noticed was using the Microformats, and the only other one which enjoyed the Rich Snippets icon treatment in Google SERPs.)

Microformats are only the tip of the iceberg for most of these sites. Basic building blocks of SEO are lacking in many cases. Search-engine-friendly infrastructure such as bot-friendly URLs, robust linking hierarchies, good page titles, descriptive metadata, and stable URLs which don’t continuously appear and disappear are some of the items these companies struggle to get right.

I believe these companies can turn the trend back around and increase their natural search referral traffic dramatically. But, are they willing to make the changes necessary to do so? It will almost certainly require them to pull out all the stops in taking their SEO games to the next level.

The Associated Press’s News Microformat

The Associated Press (AP) recently announced a semantic markup standard they’d like to see adopted online for news articles: the “hNews” Microformat. The proposed microformat was announced simultaneously with their declaration of a news registry system to facilitate protection and paid licensing arrangements for quoting and using news article material. The overall announcement and news registry system were widely ridiculed in the blogosphere, in part because of a confusingly inaccurate description which stated that the microformat would serve as a “wrapper” for news articles, and because the overall business model and protection scheme seem both naively optimistic and out of touch with copyright “fair use” standards and actual technological constraints. Still, the hNews microformat itself could potentially gain some traction.

So, if you’re an online marketer of a site which publishes large amounts of articles and news stories, is the hNews microformat worth adopting to improve your online optimizations?

AP Protect, Point & Pay Diagram
(AP's Diagram Illustrating "Protect, Point & Pay" System & hNews Microformat)

I’ve long been a proponent of incorporating Microformats within webpages as a component of overall good usability and as potentially valuable formatting for search engine optimization. Microformats can provide enhanced usability for advanced users whose devices or applications can read the information and store it for future use, and they can improve search engines’ ability to understand the content within webpages, which could lend marginally more SEO value.

Both Yahoo! and Google have been sending signals for the past few years that they consider some of the microformats to be potentially useful as well. They’ve both marked up their own local search results with hCard microformatting for end users’ benefit, and they’re both starting to make use of microformatting to give certain types of data special treatment. In the case of Google, they announced that they’d begin displaying some microformat data with slightly different listing layouts in the search results, a treatment that they’ve dubbed “Rich Snippets”. And, they say they’ll be rolling out more treatments based on microformats in the future.

With this background in mind, it’s not surprising that the AP has jumped on the Microformats bandwagon, but it also appears that they’re trying to influence development where news articles are concerned, with a major agenda in mind. They wish to include some sort of web bug in each news story’s markup so that publishers of the content can be tracked more easily: it will be clearer when sites are reprinting news stories, and how frequently those stories are visited and viewed by consumers online.

Other portions of the hNews microformat appear to be more useful from both a search engine viewpoint and a publisher-site aspect. Labelling of items such as keyword tags, headlines, main content, and geographic locations, along with including the author’s vCard info, all appear to be valuable standards.

(I could really criticize their “geo” tagging of articles as inadequate, though. Merely adding a longitude and latitude to an article seems short-sighted, because there needs to be further definition of what is being geotagged. If an article is about multiple locations, it would be ideal to label each geotag to tell what item is being located. It would also be ideal to label the article with the geographic region it should be expected to appeal to. Is it mainly of interest to people within a particular city, state/province, region, or nation, or is it of international interest? Still, having some geotag is better than nothing.)
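To make the discussion concrete, here is a hedged sketch of what hNews markup might look like on an article page. The class names follow the proposed specification (which builds on hAtom), but the headline, byline, dates, coordinates, and URLs here are all invented for illustration:

```html
<!-- Illustrative hNews markup; field names per the proposed spec,
     all content values are hypothetical -->
<div class="hentry hnews">
  <h1 class="entry-title">Storm Damages Local Sports Facility</h1>
  <p class="dateline">Irving, Texas</p>
  <abbr class="updated" title="2009-08-10T14:00:00-05:00">August 10, 2009</abbr>
  <span class="author vcard"><span class="fn">Jane Reporter</span></span>
  <span class="source-org vcard"><span class="org">Example News Wire</span></span>
  <span class="geo">
    <abbr class="latitude" title="32.9126">32.9126</abbr>,
    <abbr class="longitude" title="-96.9638">-96.9638</abbr>
  </span>
  <div class="entry-content">
    <p>Article body text goes here.</p>
  </div>
  <a rel="item-license" href="http://example.com/license">License terms</a>
</div>
```

Note how the geo element illustrates my criticism above: a bare latitude/longitude pair says nothing about whether the coordinates locate the story’s subject, the reporter’s dateline, or the audience the story is aimed at.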

For any marketers out there considering adopting the hNews Microformat standard, I’d advise waiting until the dust settles on this one. Other microformats were developed perhaps more objectively, and there’s a lot of distrust of and disaffection with the heavy news-industry influence involved in this proposed standard. Currently, I’m not convinced that it will be accepted widely enough to become valuable for use. While having all the AP partners adopt the standard might be sufficient to reach a tipping point where many other sites and companies make use of hNews, Google’s public response to it was unusually cold-sounding.

Blogger/reporter Matthew Goldstein quotes Google’s response on the matter: “Google welcomes all ideas for how publishers and search engines can better communicate about their content. We have had discussions with the Associated Press, as well as other publishers and organizations, about various formats for news. We look forward to continuing the conversation.” While sounding predictably neutral and noncommittal, Google is also indicating that this has not been widely accepted by everyone, even within the news industry itself. This, in combination with widespread skepticism within the developer/microformat community and blogosphere, signals that hNews may have a very long way to go before it becomes something worthwhile for optimizing articles on publisher sites.

So, for now I advise avoiding this proposed standard; sit back and see how the dust settles. If you’re already syndicating content via RSS and Atom feeds, then you’re already distributing your content in a manner that’s easily absorbable and readable by search engines.

Do Page Load Times Affect Your Search Rankings?

How Fast Does Your Page Load?

As average internet access speeds have improved, many websites have become pretty lazy about paying attention to how fast their pages load, designing bloated content full of heavy images, multiple Javascript and CSS files, and ads or iframes pulling from dozens of sources. This neglect could affect your search rankings, and here’s why.

First of all, Matt Cutts, head of the webspam team at Google, stated in a recent Q/A video that sites’ load times are currently not a ranking factor.

However, there are four reasons to believe that site load times could affect search rankings in the very near future:

  • Matt’s opinion is that it would be a great idea for a ranking factor! And, he leaves open the possibility that it could be used as a ranking factor in the future. He’s influential within Google and is named on some Google ranking patents, so this is significant. Other significant Googlers also have indicated that this may be a focus area of increasing importance to them. Larry Page apparently stated that he wanted Google to be central to efforts to make the internet speedier, allowing users to get pages as fast as turning pages in hardcopy books.
  • Google recently released Page Speed, an add-on for Firefox browsers which can diagnose a number of elements which impact page load times (such as Javascript and CSS files, image file sizes, etc). (This is also likely Google’s competitive response to Yahoo’s similar tool, YSlow, which even Google recommends as a tool for diagnosing speed issues. Combined with these other reasons, I believe there’s cause to believe it’s not just a competitive checklist item, but part of their strategy to speed up the internet experience.)
  • Last year, Google introduced Page Load Time as a ranking element in Google AdWords ads.
  • Internal research at Google has shown that slower page delivery times reduce the number of searches users conduct by 0.2% to 0.6%. While this may appear negligible, it undoubtedly adds up to a lot of lost revenue over time for Google, and it supports their point that slowness has a chilling effect on internet use and traffic.

Based on the above reasons I outlined, I think page load times are very likely to become integrated into Google’s ranking “secret sauce” soon, and that sites which seriously neglect page load time will find themselves at a disadvantage.

Classic Search Engine Optimization (“SEO”) lists of tricks rarely mention improving page speeds, but Google has steadily been evolving its ranking methods to reduce the impact of technical code tricks and move toward more human-centered design factors. In fact, one part of their process already includes having their quality team assess the webpages found in search results for many thousands of sample queries. If one of your site’s pages falls into their sample set, the assessor’s rating of the page compared to competitors could result in an average quality score being applied to all the pages on your site.

I’ve believed for some time already that Google applies some automated quality scoring to natural search rankings, similar to how they’ve applied such factors to their paid search ads.

My suspicion is that there will likely be some sort of scale of site loading speeds used to impact rankings in the future. I’d also suspect that this factor would be used primarily as a negative ranking factor rather than a positive one. By this I mean that a page could drop lower in search results if its load time doesn’t meet some minimum standard, when competing pages have all other relevancy ranking elements essentially equal. Load time might negatively impact a ranking, but it likely wouldn’t help a page rise above one with slightly stronger relevancy/importance factors unless that page had serious slowness itself.

I’d further expect that Google would apply some sort of adjustment to try to assess whether one Googlebot visit ran across just a momentary lag condition, versus a page delivery speed that’s always slow. So, I don’t see any reason to freak out if you have experienced a server or application issue for just a brief period!

Even if Site Load Time were not to become an official member of Google’s list of over 200 ranking factors, load time could still indirectly affect your rankings. Avinash Kaushik at Google has strongly encouraged webmasters to pay attention to Bounce Rate (the percentage of site visitors who visit only one page and/or who land on a page for only a few seconds before hitting the back button).

Google can also easily see if a user immediately backs out of a page they find in the search results, and such a high bounce rate may indicate a poor-quality result for a particular query. One prime cause of a user hitting the back button is a page that is extremely slow to load. So, if Bounce Rate is a factor affecting rankings, then a page’s load time may impact it, indirectly affecting rankings.

Finally, let’s go to Google’s original point about why this is important in the first place: good user experience. Along with faster network speeds, sites need to load rapidly for end users in order to provide a positive user experience. Even if load time were never used directly or indirectly by Google in rankings, it would still affect how users experience your site, and that can affect your ultimate conversion rates and repeat visits.

But, Page Load Time / Site Load Time will almost certainly be a direct or indirect ranking factor.

So, how do you prepare for this important and basic factor amongst all your site’s various optimization strategies? Very easily and cheaply: get a copy of Google’s Page Speed extension and run it against samples of your site pages to see what speed factors it recommends you improve upon.

Also, note that this browser-based diagnostic tool does not assess a number of factors which can still affect site load times, such as network connection times and conditions which cause sites to buckle under higher loads.
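As a rough illustration of the markup-level issues these tools commonly flag, here is a hedged before-and-after sketch. The file names and paths are placeholders, and this only covers front-end factors, not the server and network conditions noted above:

```html
<!-- A hedged sketch of common markup-level speed fixes; all file
     names here are hypothetical placeholders -->
<head>
  <!-- One combined, minified stylesheet instead of many small CSS files
       reduces the number of HTTP requests the browser must make -->
  <link rel="stylesheet" href="/css/site.min.css">
</head>
<body>
  <!-- Declaring image dimensions lets the browser lay out the page
       before images finish downloading; compress the files themselves, too -->
  <img src="/images/logo.jpg" width="200" height="60" alt="Company logo">

  <!-- Loading scripts at the end of the body keeps them from blocking
       the rendering of visible page content -->
  <script src="/js/site.min.js"></script>
</body>
```

None of these changes alter what the visitor sees; they only change how quickly the visible content arrives, which is exactly the kind of improvement Page Speed and YSlow are designed to surface.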

KeyRelevance has long considered site load times to be of prime importance and includes a number of factors affecting page load speeds in the website reviews we provide for clients. In fact, we even provide clients with better-compressed versions of their site images for smaller file sizes. Speed of access has long been important to a site’s overall user experience, and Google’s increasing focus in this area is now making it of central importance to keyword rankings in search results. So, if you want to be at the top of your SEO game, you need to be paying attention to your site’s page delivery speed, because Google is!

Optimize Your Search Engine Listing for Improved CTR

Earlier this month when I spoke at SMX Advanced on the topic of “Beyond the Usual Link Building”, one of the suggestions I made in the presentation was about how to improve how your listings appear within the search engine results.

There are a lot of people I’ve met who tend to be hyperfocused on whether their pages rank, and don’t spend as much attention on how those pages’ entries appear within the search results pages.

It seems like common sense that if the entry looks like what a user is seeking, they’d be more likely to click upon it. Therefore, if you were to improve your search engine results page entries, you’d also likely improve your click-through rate, increasing your traffic.

Compare these listings on Google for a search for “Seattle indie records shop“:

Seattle Indie Music Shops Listings in SERP

You can see that the star ratings and review on the listing for “Easy Street Records” are slightly more eye-catching. If you were a record shop aficionado, the stars, the dollar-sign price range, and the easy-to-read sample review text would give it an advantage over the listing for the record shop below it. A consumer who is rapidly scanning and clicking to find what they want is going to be more likely to click here.

How much more likely is such a listing to gain clicks? According to Vanessa Fox, Yahoo! has reported a 15% click-through-rate (CTR) increase on similar types of listing treatments! Their results were based upon comparing the CTR of typical search result listings with CTR of listings sporting their special treatments developed through SearchMonkey. The customized listings really stand out from the other listings, drawing the eye and clicks, too.

Yet, before these research results were released, I’d already seen how merely fine-tuning the listing text alone could improve both CTR and rankings. Using savvy methods for forming TITLEs and Meta Descriptions on pages, one can improve keyword relevance, ranking, and click-through-rates.

Now that Google has launched its own type of enhanced listing treatment, dubbed “Rich Snippets,” there are starting to be even more options for optimizing listings in search results. The first special treatments they’ve enabled are the ones for reviews and ratings, and it seems clear that they intend to launch more, particularly ones related to the use of Microformats such as hCalendar, hCard, and hProduct.

One person at SMX who liked this concept of “optimizing listings” for improved CTR was Matt Cutts, who Tweeted out a mention of it:

Matt Cutts’ Tweet re Rich Snippets

While these tactics likely have no direct effect on search engine keyword rankings, I’ve theorized for some time now that they could have an indirect effect upon rank. Google’s frequently-discussed patent for “Information Retrieval Based On Historical Data” includes within its descriptions of ranking methods (“scoring”) the possibility that pages might be ranked according to how often they’re clicked upon when they appear within particular searches. The patent states:

“…scoring the document includes assigning a higher score to the document when the document is selected more often than other documents in the set of search results over a time period…”

Very loosely interpreted, this means that if your page’s listing is clicked upon at a better rate than other pages appearing for the same keyword search, that click-frequency or CTR could actually affect that page’s future rankings for that keyword.

It’s long been controversial as to whether Google implemented many of the methods outlined in various patents like this one, but you already have a good excuse to fine-tune your listings: regardless of theoretical impact on rankings, it could easily improve your click-through rate, improving your site’s qualified traffic!

Quick Tips on Optimizing Listings:

  • Title should be brief and state what the page is about, and who you are.
  • Meta description should be brief and expand upon what the page is about or how it may be better than others listed for the same keyword search.
  • Currently, mentioning deals/discounts/rebates may improve CTR since the economy has pushed people to be more price-conscious.
  • Implementing Microformats now on your site for appropriate types of content will likely position you to take advantage of future rollouts of “Rich Snippets” treatments in Google results.
  • Building a search application with Yahoo!’s SearchMonkey platform will help you understand how Google is developing similar types of listing enhancements.
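The first two tips above can be sketched in markup. Here is a hedged, hypothetical example of a title and meta description for the record shop scenario discussed earlier (the shop name and details are invented):

```html
<!-- A hypothetical title/description pair following the tips above;
     the business name and offer are invented for illustration -->
<title>Vintage Vinyl - Independent Indie Record Shop in Seattle, WA</title>
<meta name="description"
      content="Family-owned Seattle record shop with thousands of rare and
               used indie LPs. Weekly in-store shows and 10% off your first
               order.">
```

The title states what the page is about and who you are; the description expands on that and adds a differentiator (the discount) aimed at price-conscious searchers.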

Good listing engineering is a complex task involving semantic tagging, taxonomic research and development, good copywriting, and SEO knowledge. Don’t make guesses when doing this; use a good expert if you don’t have experience with it.

Optimize your snippets and SERP listings, and improve your CTR and performance!

Why Use Microformats?

Microformats Logo

I’ve written numerous times about how and why to code Microformats into the webpages of local businesses (see here, here and here), yet the question keeps coming up: “Should I spend the time and effort on integrating Microformats into my site’s pages?”

Just during the past couple of weeks, the question has arisen yet again, and along with it there was an additional development which further emphasizes why it’s a good thing and why webmasters should be incorporating the protocol sooner rather than later. More on this in a minute.

I believe I was likely the first to propose using hCard Microformats as a component of local search engine optimization, back in 2006 (see: Tips for Local Search Engine Optimization). Back then, I had seen how Microformats were beginning to take off, and I saw indications of converging trends: the sharp interest from the major search engines in local search and yellow-pages-like functionality; the increasing uses for open formats and extensible semantic tagging; and, most telling of all, the involvement of a number of key technologists from within Yahoo! in the Microformats movement.

I knew that as search engines attempt to match up websites which they crawl with more formal, local business listing data, they would encounter some difficulties in using algorithms to interpret the data properly. Questions such as: What is the street address of this business webpage? What is the Business Name vs. other text on the page? What is the Street Name vs. the City Name? Other questions arise as well, since website designers mostly design towards their human audience rather than algorithms attempting to interpret meaning from raw data. For instance, what Business Category should this local business website be associated with?

Like other forms of semantic markup, Microformatting labels webpage content behind the scenes, specifically telling what each piece of data is while still displaying the webpage normally for human users. If webpages of local businesses were to incorporate hCard Microformatting, I reasoned, then search engines would have an easier time of associating the sites with map locations and business directory listings. Further, if a site contained such markup, the search engine could have a higher degree of confidence in accurately normalizing their data and matching up with user queries, so such pages could potentially rank better in the future.
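As a concrete sketch, here is what hCard markup might look like on a local business page. The class names (vcard, fn, org, adr, tel) come from the hCard specification, while the business itself and its details are hypothetical, echoing the ambiguous example discussed below:

```html
<!-- A minimal hCard for a hypothetical local business; the markup
     labels each field so an algorithm needn't guess which part is
     the name, street, or city -->
<div class="vcard">
  <span class="fn org">Houston's Restaurant</span>
  <div class="adr">
    <span class="street-address">123 Dallas Street</span>,
    <span class="locality">Paris</span>,
    <span class="region">TX</span>
    <span class="postal-code">75460</span>
  </div>
  <span class="tel">(903) 555-0123</span>
</div>
```

Rendered in a browser, this displays as an ordinary address block; the labeling is visible only to parsers and crawlers.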

However, when I introduced the idea, I was not aware of any search engine that was specifically seeking out this type of semantic data. While some Yahoo! personnel were throwing support behind the movement, there was no clear indication that their search engine would seek out specially labeled data fields nor treat them any differently.

Still, there were additional reasons for using the Microformats: they provided additional functionality for some devices and for users who installed special applications to read such content out of pages in order to easily make use of it. A great example would be the Operator Toolbar for Firefox, which allows a user to easily copy the contact details from a webpage and save them into an address book, quite similar to how vCard electronic business card info can be transferred and harvested easily from email notes (vCard is supported by such mainstream applications as Microsoft Outlook).

The Yahoo! Local search team obviously believed that people could find Microformatting potentially useful, because they incorporated it into their Local Search results earlier in 2006.

Further supporting my prediction that this was an important and growing protocol, Google subsequently imitated Yahoo! by incorporating hCard Microformats in Google Maps search results in 2007.

Meanwhile, at conferences and via email, many individuals asked me whether Google Maps was “reading” Microformats from webpages. I’d spoken with a few Google engineers during this period, and they answered pretty uniformly: if sufficient numbers of sites made use of it, they’d almost certainly incorporate it as yet another signal in local search data. I knew that there really weren’t sufficient numbers of sites incorporating it yet, but I continued to see indications that the protocol was growing as a trend, and a number of other optimization experts also threw their weight behind supporting it as a component of good local site design. So, I’d still have to truthfully answer, “No, it’s not any sort of factor that will directly make your pages rank any higher, BUT you should make use of it anyway!” For most of the local info pages I analyzed on the web, integrating the Microformats seemed relatively low-impact in terms of development effort required.

SearchMonkey Logo

In the spring of last year, Microformats may have finally achieved a tipping point when Yahoo! announced the release of their innovative SearchMonkey development platform. SearchMonkey allowed developers to somewhat customize the display of their sites’ listings when they appeared in Yahoo!’s keyword search results. More to the point, SearchMonkey showed us that Yahoo!’s bot and content processing systems could and did read Microformats from webpages, along with other structured data protocols including RDF and DataRSS. While this did not prove that Microformatting influences rankings in Yahoo! Local, it showed that an important step had been reached in a major search engine, one which could enable the protocol to be a ranking and normalizing factor in local search.

Now fast-forward to the present in 2009, and the question of whether to use Microformats is still being posed to search marketing experts. On May 4th, someone asked well-known SEO Michael Gray whether hCard and other Microformats matter for SEO. I think Michael gave a pretty well-reasoned answer overall, although I believe the Microformat protocols are considerably simpler than he represents, and I think there are good reasons not to be quite as conservative about using them as he suggests.

First of all, I believe the main advantages to using Microformats are:

  • They can help search engines identify Business Name, Address, Phone, and Categories on webpages. Variations in formatting on various sites can contribute to misassociation of data elements. Imagine “Houston’s Restaurant on Dallas Street in Paris, Texas”. If an algorithm is attempting to interpret this in order to index the business/site, how does it know for certain which element is the name vs. street address vs. city?
  • They can help search engines associate the website with its listings within the engine’s directory content, a vital step in “canonicalizing” business information. Google gets business listings from data aggregators and business directory partners, and they have to associate all the various sources of data for a particular business location with a single business listing. This is not a simple activity! Differences in the way a business name is spelled, different ways addresses are written, and different phone numbers can all result in businesses’ listings getting duplicated and diluted in ranking ability within Google Maps. So, having Microformats on your business webpage could help it get properly associated with directory listings already within Google.
  • Microformats make it easy for users to copy a business’s contact information to store in their address books and elsewhere.
  • Microformats could also help open up content for use by other developers in unforeseen and advantageous ways. For instance, by including the longitude and latitude of your business address in your pages, you let others easily port the precise location over to the mapping app of their choice; if left to just the street address, mapping systems can frequently make significant errors.
  • It’s just not all that hard to add them to sites which display addresses of local places. Some very simple development and coding, which could be done within just an hour or two, is all that’s required for most sites.

Google actually does a pretty good job of “canonicalizing” classic business listing data from local business websites. So, even if my theories on why Microformats could be beneficial for SEO are correct, there are a lot of sites where they likely wouldn’t have much impact on performance even if/after Google begins recognizing them as a local search signal. Microformats could help Google collapse duplicate listings down to a single one, which could boost that listing’s ranking weight. But for businesses with already easy-to-interpret addresses, or where Google hasn’t had difficulty grouping related listings together, they likely wouldn’t have any ranking effect whatsoever.

As of just last week, there’s an even more compelling reason to incorporate Microformats, though: Google is following close upon the heels of Yahoo! again and has announced that they’re introducing “Rich Snippets” in Google search results pages. Essentially, Rich Snippets are enhanced search result listings, allowing the display of star ratings and the numbers of reviews for content on the pages. Similar to Yahoo!’s SearchMonkey, which allowed some customization of search listings, Google is allowing this special content display initially for pages which incorporate the hReview Microformat.

Google SERP listing for Yelp with Rich Snippets
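For sites with review content, a hedged sketch of hReview markup might look like the following. The class names follow the hReview draft, while the reviewer, rating, and date values are invented for illustration:

```html
<!-- An illustrative hReview block; all content values are hypothetical -->
<div class="hreview">
  <span class="item"><span class="fn">Easy Street Records</span></span>
  Rating: <span class="rating">4.5</span> out of 5, reviewed by
  <span class="reviewer vcard"><span class="fn">Sample Reviewer</span></span>
  on <abbr class="dtreviewed" title="2009-06-01">June 1, 2009</abbr>.
  <p class="description">Great selection of indie vinyl and friendly staff.</p>
</div>
```

Pages marked up this way are the ones currently eligible for the star-rating treatment shown in the Yelp screenshot above.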

Many of us theorized that Yahoo!’s SearchMonkey could be potentially advantageous to sites, since search result listings which look different can stand out from the crowd, attract more users’ notice, and therefore have a greater chance of being clicked upon. Indeed, subsequent research showed that SearchMonkey’s special listing treatment could increase CTR by 15%!

There’s every reason to believe that display enhancements likely could improve CTR within Google search results as well, so there are great incentives to adopt the hReview protocol for those sites which have reviews and ratings content. This is only the first stage of Google’s work in Rich Snippets, however, and it’s pretty certain that Google will introduce more types of structured data into special display within search result listings. hCard and hCalendar content are some top candidates poised for imminent introduction when Google expands this.

We’re now seeing adoption of hCard in even some high-popularity sites such as Twitter, so it may be time to actually declare Microformats to be “mainstream”!

So you see, there are compelling reasons to use Microformatting in the here-and-now, rather than putting it off. It’s generally not difficult to implement, it enhances site functionality for good user-experience, it generally won’t interfere with existing graphic layout, it could eventually help in rankings, and it might soon help in terms of click-through rates or overall conversions.


Dallas Cowboys Practice Field Disaster – Citizen Reporting & Photos

I’m saddened to report that there was a terrible disaster in my neighborhood today – the Dallas Cowboys’ indoor practice field in Valley Ranch was hit by the strong winds in the violent Texas storm that blew through this afternoon, and the lightweight structure collapsed under the wind strain.

Firemen look over Dallas Cowboys Practice Facility Wreckage

It was just shortly after the storm passed, while I was listening to the TV in my home with half an ear, relieved that my trees didn't fall on my house, that I heard the news that the Dallas Cowboys' facility had collapsed. The place is just a few blocks south of my home, and I got really familiar with the location when my kid sister moved out of my house (she's about half my age and lived with me when she started college in Irving); she lived literally right across the street from the giant structure.

I call it a gym, but the place was really an indoor field — a large, inflated roof covered the whole thing, much like the roofs over some sports domes. I knew immediately that the storm must’ve really hit the roof hard, and finally gotten ahold of it and ripped it off, similar to how the Superdome roof was ripped off in New Orleans, back during Hurricane Katrina.

I’m an amateur photographer, and I couldn’t resist jumping straight outside to snap a few pics of the collapsed building.

Wreckage and Emergency Personnel - Dallas Cowboys

I speak at internet marketing conferences and write about optimizing websites for search, and one area I’ve often spoken upon is how to leverage photos to get links and to drive traffic to websites. There’ve been a number of occasions when journalists have contacted me, asking to use my photos to illustrate their stories, and I almost always allow them to do so for free, so long as they give me a link back in return. A link is the online, technical equivalent of a by-line or credit-line, and it’s only fair that I get credit for my work.

Bloggers often ask me how to promote their blogs, and this is an example of how to go about it. Most of us see or attend various news-worthy or interesting events in our lives, and it doesn't take much to snap photos of them and provide them for others to use in return for a link back.

So, if you’re a journalist or blogger interested in writing on the Dallas Cowboys facility’s collapse, you’re welcome to use any of my photos – click on the ones in this story and they’ll take you to my Flickr account where you can find more, and you can see instructions on how to cite me as the photographer.

My heart goes out to the players who were injured today, and to their families. I really hope that everyone will be okay!

(* I’m right now weathering the second strong storm moving through the area – I sure hope my home and trees survive it, too!)

Why Free Photos = Good SEO

I’ve written articles and spoken at conferences on the subject of using images for search engine optimization for a number of years now, and one concept that many individuals and corporations miss is the idea that looser copyright restrictions can often equate with wider promotional value and greater SEO power.

Many companies are still operating under “Business 1.0” mindsets in this “Business 2.0” world, and that failure to adapt is often resulting in very real lost market share potential.

Photographic images are a type of content that is still sometimes hard to come by. If you have images of subjects that could be of interest to someone out there, then you can leverage this demand to obtain additional links to your sites. And links to your site are still valuable and worthwhile in terms of increasing your chances to rank higher in search results for keywords that are important to you. A greater number and variety of links equals a greater chance to rank higher than your competition.

But, if you’ve slapped all sorts of restricting copyright notices and language to all of your photos, then you’ve caused a real chilling effect in terms of the links you could be getting.

I post a lot of my pictures to the image sharing service, Flickr, and while I often have each photo’s permissions set to display “© All rights reserved”, I have placed a notice on my profile page that I typically allow journalists to use my photos if they will give me a credit line when stories are posted online, and link my name back to my homepage. On images that I think are particularly newsworthy, I’ll even mention these terms in the description below the image.

The Grapevine Sun offices

Just today, this tactic paid off again: it seems that Belo Corporation is closing the small-town newspaper in Grapevine, Texas. The Grapevine Sun has been in operation for something like 114 years, and now it's closing down like many other newspapers around the country. A journalist contacted me about my photo of the Grapevine Sun office, requesting permission to use it to illustrate their article about the closing. Just as per my terms, they used the photo and linked it back to my site homepage.

This is really a win-win scenario. If I were all uptight about restricting my photos overmuch and forcing people to pay high fees for usage, it would put up all sorts of barriers to distribution for me. It might be one thing if I had some sensational photo of a celebrity or political figure doing something fantastic, but for most of my photos the popularity factor just isn't high enough to warrant pretending I'm the next big photographer of the century.

The journalist got a photo to raise the human-interest feel of their story, and I got a small amount of link promotion value out of my picture. It's not precisely a "free use" of my image, but it's close enough from the journalist's viewpoint, and my granting permission quickly is an acknowledgement of their story deadline pressure.

News sites and blogs are treated very well in terms of link value by Google's algorithms, so providing your images in ways that could facilitate bloggers and journalists in finding the images and making use of them can help ensure that you get more inlinks than you otherwise would.

By stating clearly on your photo pages that you’ll allow news and blog stories to use photos in return for a link back to you, you use a very mild and benign form of social engineering to increase the chances that you’ll get some links for your site.

A couple more notes on permissions: most companies and PR departments are too restrictive. It's understandable to fear that someone might use your images to illustrate stories that could be negative about you, but it's important to keep sight of the big picture: disallowing photos for this use likely will not stop the story from happening, and even links from negative articles can help in your overall rankings. So, it's better for you to provide the photos for open use regardless of whether you really like the story or not. It's completely valid to state that the images may be resized but the content within the image cannot be altered.

Even better, using Creative Commons licensing can help encourage more use, and will allow you to specify terms of use that are standard and more easily understood.
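Creative Commons terms can also be declared in machine-readable form by marking the license link with `rel="license"`, which helps tools and crawlers recognize the terms of use. A minimal sketch (CC BY is just illustrative; choose the license that matches your actual terms):

```python
# Hypothetical machine-readable license notice for a photo page.
# The rel="license" attribute is the standard rel-license convention
# Creative Commons recommends for marking a license link.
CC_NOTICE = (
    '<a rel="license" href="http://creativecommons.org/licenses/by/3.0/">'
    "This photo is licensed under CC BY 3.0; please credit the "
    "photographer and link back.</a>"
)
print(CC_NOTICE)
```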

In the Business 2.0 world, companies which are not providing easy-to-find and easy-to-use press kits on their corporate websites, complete with lots of photos of their products, services and people, are really behind the times. You too can easily gain valuable inlinks from blogs and newspapers, just as I have.

Why Flogging is a Bad Idea for Companies

There are a lot of definitions that float around about what a "Flog" is. Basically, when it comes down to it, Flogs are fake blogs. How they are fake can be a matter of debate. However, when it comes down to the bottom line, if you are the owner (or are being portrayed as the owner by your agency) and you aren't contributing to the blog itself and the community you are trying to speak to with your blog posts, it's a Flog. Agencies that set up these types of blogs, with or without their client's knowledge, are doing a disservice to their clients and could possibly harm the brands.

Take, for example, the infamous Walmart Flog from 2 1/2 years ago, "Walmarting Across America". When it was outed and hit the front page of MSNBC back in October of 2006, a firestorm of ethics questions erupted for both Walmart and for the company who started the Flog, Edelman. A sister of an Edelman employee and her photographer boyfriend were responsible for the posts and photos; the problem was that they weren't "real" in the sense of being the typical people who would RV across America using Walmart as a rest stop.

Granted, it is now over 2 1/2 years later, but people still point to this as the quintessential example of a flog. Walmart wasn't the first to be outed for Flogging, though; Mazda seems to get that honor for its Flogging attempt back in November of 2004.

Media conglomerate and electronics manufacturer Sony has also tried its hand at flogging for retail promotions. Their "All I Want for XMas is a Sony PSP" blog didn't get more than a few posts published before it was outed as a very poor marketing piece put out by fake bloggers. The agency who set this flog up wasn't even smart enough to put the domain's registration under Sony's name or a different one. When this broke, a few weeks after Walmart's, it raised the ire of transparent and truthful bloggers across the globe.

Agencies that set up blogs without being transparent that it isn't the real company writing the content walk a really thin ethical line. Hiring writers who don't work for the company to write content exclusively for the blog, with the mindset that "it's just content, content will rank," isn't the real purpose of a blog. The idea behind a blog is conversation and building communities. If you are just setting up a blog to gain a foothold in the search engine results on Google, Yahoo, or MSN for your client, and have no intention of having a discussion about what has been written, then essentially this, too, is a flog. What agencies and companies who wander down this "fake blog" path tend to forget is that when a flog is outed, it's just as bad for the Walmart-sized brands as it is for the mom & pop retailer online; it just manifests in different ways.

For Walmart, it was losing the domain "Working Families for Walmart" (the name under which the RVing Flog was registered, which they hadn't registered to start off with) to the union group trying to unionize Walmart workers. For Sony, it was lower sales of the PSP that Christmas. What can it be for other companies? Well, readers of blogs and other bloggers have become increasingly savvy over the last 2 years. They can usually spot a fake a mile away. Blogs with no comments, blogs with no readily identifiable authors, and blogs with writers who don't interact are usually outed in some fashion on another blog. The audience of the blog that did the outing now has "fake", "untrustworthy", or an even worse label, "Spammer", in mind for the outed blog. Once those labels are applied, word of mouth usually spreads, and the once-promising "content" blog the agency launched in hopes of gaining a foothold in the search engine results dies a pretty slow and painful death.

So is building a flog worth the time and effort for the temporary boost you might get in the search engine rankings? Maybe at first you could be fooled into thinking so. However, once readers and active community participants realize that the blog is consistently about the same topic and the same products, and that the writers aren't listening to the community and responding, sure enough, the efforts of the fake blog will be for naught.

Google Profiles & Online Reputation Management

A few weeks ago Google launched "Google Profiles". Looking at how Google Profiles works, it's reminiscent of an online dating site, minus the creepy old guy who could be my grandfather sending me winks. With that said, Google Profiles can be a powerful tool in online marketing, especially when it comes to online reputation management. Already, Google profiles are showing up in the search engine results. They may not be showing up at number 1 for all vanity searches, but they definitely have the power to rank in the top 20 and the potential to rank even higher. Why? Well, I guess Google must really trust itself.

I created a Google profile early last week. This morning I decided to test and see how it was affecting searches for "Liana Evans". While not in the top spot for my name, the Google profile is now ranking in the top ten, along with several other profiles and videos from social media sites. So keep in mind that it's not just your profile on Google that has the potential to rank and usurp static websites; it's your profiles on just about any social site. Take a look at the screen shot below:

The social media profiles and videos I've highlighted in red boxes are all ranking for "Liana Evans" near the bottom of the first page of the search engine results for my name. Except for the Google profile, which has the power of Google behind it, the other profiles don't rank "just because". They rank because they are my more "social" profiles. What that means is that it's not just because it's "Twitter" or it's "FriendFeed"; I'm actually social on those platforms. I hold conversations, I have "friends", I comment, I share, I watch videos other than my own, etc., and that's what gives these profiles their ability to rank. They also rank because I make sure they are properly optimized for "Liana Li Evans", incorporating both my real name and my nickname. While being social is the primary key, you also need to remember how you want people to relate to you in these social settings, and make sure your profiles reflect that.

Now, before anyone screams "Google Conspiracy" about Google having all your information from your profile, there's one thing to remember: you do not have to fill it out completely. In other words, you choose what you want to share in your profile. I don't share all my contact information, just general information about myself and what I do, and my Flickr photos, which are already visible through my public Flickr stream.

If you or your company is actively pursuing reputation management, establishing a Google profile might be a wise step in that campaign effort. If you are monitoring your CEO, CMO or any other prominent names that matter to your company, you should be encouraging them to fill out a Google profile with the information related to your business. There are some sacrifices: you are giving Google a little bit more information about yourself; however, again, you choose what information to give. The individual is the primary owner of that Google profile and can choose what information to share, but as an online marketer you can guide the person on how to make sure they are presenting the information in a manner that positively affects the reputation management efforts you are undertaking.

URL Shorteners That Frame Websites Hijack Your Content

By Liana “Li” Evans

With the rise of Twitter and its limit of 140 characters (250 if you turn off javascript), when it comes to maximizing space to get your message across, every character counts. With that fact in mind, URL shorteners are cropping up all over the place. There are some great URL shortening services: Tweetburner, Bit.Ly, TinyURL and Cli.gs are a few that will actually track your click-throughs.

Then we have another new crop of URL shorteners appearing. These "frame" your content underneath their own branded bar. Digg, of course, is the best-known implementer of this kind of bar; there are several others that do this as well, Ow.Ly and BurnURL being just two. So what's the big deal, why all the fuss? What could be wrong with what Digg's doing; after all, they are still sending you traffic, right? Well, to start with, some of these services have the potential to play havoc with some analytics code. Then there's the whole "hijacking" of your URL, which is likely the one thing surfers on the internet are trained to remember. This is essentially hijacking your content for their own benefit: increasing the number of uses of their service.

What's the difference between what Cli.gs does and what Digg does? Well, Cli.gs does a 301 redirect straight to your content when someone shortens your URL; therefore, when people click on a URL shortened by Cli.gs, they end up on the content and see the true URL. What Digg does is put your content under their bar, with their own URL. The visitor NEVER, EVER sees your full URL. Sure, some of these services allow people to click out of the bar and show a truncated URL to click on, but it's certainly not the same as someone looking at the address bar and seeing your site's URL.
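The difference between the two behaviors can be sketched as raw HTTP responses (the URLs here are hypothetical, and this is a simplified illustration, not either service's actual implementation): a Cli.gs-style shortener answers with a 301 so the browser lands on the real address, while a Digg-style shortener answers 200 and frames the target under its own URL.

```python
TARGET = "http://example.com/my-article"  # hypothetical destination page

def redirect_response(target):
    """301 redirect: the visitor's address bar ends up showing the true URL."""
    return f"HTTP/1.1 301 Moved Permanently\r\nLocation: {target}\r\n\r\n"

def framing_response(target):
    """200 + frameset: the shortener's own URL stays in the address bar."""
    body = (
        "<frameset rows='30,*'>"
        "<frame src='/toolbar.html'>"   # the service's branded bar
        f"<frame src='{target}'>"       # your content, framed beneath it
        "</frameset>"
    )
    return f"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n{body}"

print(redirect_response(TARGET).splitlines()[0])
print(framing_response(TARGET).splitlines()[0])
```

In the framing case the browser never navigates to `TARGET` at all, which is exactly why bookmarks and copied links end up pointing at the shortener instead of the publisher.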

What happens when visitors want to bookmark your site but entered through Ow.Ly, BurnURL or Digg's bar? The shortened URL is what gets bookmarked, not your site's URL; it doesn't matter whether they are bookmarking in their browser or on a social bookmarking site like Delicious or even StumbleUpon. Again, these services are hijacking your content by keeping the framed bar with their URL in the address bar instead of 301 redirecting like the other URL shortening sites do!

Sure, some of these URL shorteners that put frames around content can say, "Oh, we make it easy to share with our pull-down menu." Well, here's the thing: people are already "trained" to bookmark or stumble through the toolbars they have installed in Firefox or IE; that's where they are going to go first, not to a pull-down on a frame. It's tough to retrain people who've been stumbling or bookmarking for well over two years to use some "framed bar" from a new service that isn't familiar to them; they are going to go with what they trust.

Then let's look at the whole "oh, I found this, I want to blog about it" piece of the marketing and social media puzzle. Someone who finds some great content via one of these framing URL shortening services and isn't quite tech savvy pulls the shortened URL from the address bar. Guess what: your site doesn't get the credit for that link, the shortened URL does. Again, this is basically like hijacking your content.

These URL shorteners claim they make it easier for your content to go viral. Personally, in my honest opinion, that's a load of bunk. It isn't the tool that makes the content go viral; it's the perceived value of the content itself. Then stop and think: what is the sense of your content going viral if the visitors viewing it can't even see your URL? What is the sense if they can't properly share it with their own communities like StumbleUpon, Delicious or Magnolia? Your URL is how people remember you, and a lot of sites don't put their URL in their graphics or headings; they rely on it always being in the address bar.

I've been having discussions on Twitter about this, and one person claimed I was afraid of them stealing my "Google Juice". I had to suppress a laugh at that term. I guess because I came into the industry as an SEO, some people will assume I "want my Google Juice," darnit! It's not about Google Juice at all; at the end of the day, this is about who owns the content. The publisher owns the content, not these framed URL shortening services that are hijacking URLs. It's about the content's perceived value to the visitor, and if the visitor perceives its value to be great, shouldn't the original publisher get that credit, not these framing URL shorteners?

Here are some other great reads on this subject: