Domain Moving Day the Key Relevance Way

So, you’re gonna change hosting providers. In many cases, moving the site’s content is as easy as zipping it up and unzipping it on the new server. But there is another aspect of the move that many people overlook: DNS.

The Domain Name System (DNS) is the translation service that converts your domain name (e.g. keyrelevance.com) to the corresponding IP address. When you move hosting companies, it’s like changing houses: if you don’t set up the Change of Address information correctly, some visitors will keep showing up at the old address for a while. Proper handling of the changes to DNS records makes this transition time as short as possible.
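
If you’re curious what your domain currently resolves to before touching anything, a one-line lookup will tell you. Here’s a minimal Python sketch using only the standard library; keyrelevance.com is just standing in for your own domain.

    import socket

    # Ask the operating system's resolver for the IPv4 address of the domain.
    ip = socket.gethostbyname("keyrelevance.com")
    print("keyrelevance.com currently resolves to", ip)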

Let’s assume that you are changing hosting, and the new hosting company is going to start handling the Authoritative DNS for the domain. The first step is to configure the new hosting company as the authority. This is best done at least a couple of days before the site moves to the new location.

What does “Authoritative DNS” mean?
There are a double-handful of servers (known as the Root DNS servers) whose purpose is to keep track of who is keeping track of the IP addresses for a domain. Rather than handling EVERY DNS request themselves, they only keep track of who the authoritative publisher of the DNS information is for each domain. In other words, they don’t know your address, but they can tell you who does know it.
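
If you want to see which servers are currently listed as the authority for a domain, look up its NS records. This is a rough sketch assuming the third-party dnspython package (version 2.x); again, the domain is just an example.

    import dns.resolver  # third-party package: dnspython (2.x assumed)

    # The NS records name the servers that publish the authoritative DNS data.
    answer = dns.resolver.resolve("keyrelevance.com", "NS")
    for record in answer:
        print("Authoritative name server:", record.target)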

If we tell the Root-level DNS servers that the authority is changing, that information may take up to 48 hours to propagate throughout the internet. By changing the authority without changing the IP addresses, any requests made during this transition get the same answer from both the old authority and the new authority (so no traffic gets forwarded before you move).

Shortening the Transition
The authoritative DNS servers want to minimize their load, so every time they send out an answer to an address request for a given domain, they put an expiration date on it. This is called the “Time To Live”, or TTL. By default, most DNS servers set the domain TTL to 86,400 seconds, which equals 1 day (thanks, Andrew). Thus, when a visitor requests the address from the authoritative DNS, it returns the IP address and says “and don’t bother asking again for 24 hours.” This can cause problems during the actual transition, since the old address might continue to be accessed for a whole day after the address has changed.
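
You can check what TTL is being handed out for your A record right now with a quick query. A sketch, again assuming dnspython 2.x; note that if you ask a caching resolver, the TTL you see is the time remaining on its cached copy rather than the configured value.

    import dns.resolver  # third-party package: dnspython (2.x assumed)

    answer = dns.resolver.resolve("keyrelevance.com", "A")
    # rrset.ttl is the number of seconds this answer may be cached.
    print("IP address(es):", [record.address for record in answer])
    print("TTL (seconds):", answer.rrset.ttl)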

The Day Before the Move
Since the new hosting company is now the authority, they can shorten the TTL considerably. We recommend 15 minutes (900 seconds) as a good compromise TTL value during the transition.
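
To confirm the shortened TTL actually took effect, ask one of the new authoritative servers directly rather than your local caching resolver. A hedged sketch with dnspython; the 203.0.113.53 address is a placeholder for the IP of your new hosting company’s DNS server.

    import dns.resolver  # third-party package: dnspython (2.x assumed)

    resolver = dns.resolver.Resolver(configure=False)
    # Placeholder: replace with the IP of one of the NEW authoritative DNS servers.
    resolver.nameservers = ["203.0.113.53"]

    answer = resolver.resolve("keyrelevance.com", "A")
    print("TTL served by the new DNS server:", answer.rrset.ttl)  # expect 900 the day before the move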

Moving Day
When you are ready to make the switch, have the new DNS servers change the IP information to the new address(es). Since the TTL was set to 15 minutes, the other DNS servers on the ‘net will very quickly be asking for the domain’s IP address again. They will be given the new address, and the switchover will happen much more quickly than if the authority had not changed. Once the new site is live and you have verified nothing is horribly wrong, you can change the TTL on the new DNS servers back to 1 day. If, on the other hand, something IS wrong with the new site, you can change the DNS back to the old IP address, and within 15 minutes most if not all traffic should be back on the old servers. We also recommend changing the old DNS info to point to the new IP address as a precaution, but if you follow these steps, most of the traffic should have already transitioned to the new DNS servers.
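
While the cutover is underway, it’s reassuring to watch what the old and the new name servers are each handing out. The sketch below, again assuming dnspython 2.x, queries both directly; the two server IPs are placeholders, and once both return the new address (and cached answers expire), the move is effectively complete.

    import dns.resolver  # third-party package: dnspython (2.x assumed)

    DOMAIN = "keyrelevance.com"
    SERVERS = {
        "old DNS server": "198.51.100.53",  # placeholder IP
        "new DNS server": "203.0.113.53",   # placeholder IP
    }

    for label, server_ip in SERVERS.items():
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [server_ip]
        answer = resolver.resolve(DOMAIN, "A")
        addresses = sorted(record.address for record in answer)
        print(label + ":", addresses, "(TTL", answer.rrset.ttl, "seconds)")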

A Bug in BIND
There is a bug in some versions of the BIND software (which performs the DNS resolution). The bug causes a DNS server to keep asking the same authoritative DNS server for the information for as long as that server is willing to provide it. To complete the transition cleanly, you need to turn off the DNS records for the domain on the old DNS servers. This will cause them to return an error, which in turn will cause the requesting DNS server to ask the Root-level servers for the new authority. Until you make this change, there is still a chance that some traffic will continue to be directed by the old DNS servers.

Change of Address Forms
The USPS offers a Change of Address kit to help make moving your house easier. Below is the Key Relevance Change of Address Checklist that may make your site’s transition painless.

Key Relevance Domain Change of Address Checklist

2+ Days Pre-Move
Set up new DNS servers to serve up the OLD IP addresses

  • Handle old subdomains
  • Handle MX records

Once that is complete, change the Authoritative DNS records to point to the new DNS servers.

1 Day before move
On new DNS servers, shorten TTL to 15 min (900 sec)

Moving Day
On New DNS Servers

  • Change IP Addresses to the new server
  • Change TTL back to 1 day (86,400 sec), or whatever the default TTL is, once you are sure all is OK

On Old DNS Servers

  • Change IP Addresses to the new server to catch DNS stragglers

2 Days Post Move (or when convenient)

  • Remove DNS records from the OLD DNS servers (assuming they are still up)

Google Now Imitates AOL With New Page Speed Service!

Google’s announcement of their new Page Speed Service was so very expected by me that it nearly didn’t form a blip on my radar screen when it flew by in my streams today! It’s a sort of combination of a Content Delivery Network (“CDN”) and an automatic page code optimizer which will allow them to make your webpages more efficient at delivering and rendering in browser windows, and it will allow them to cache your site content on servers deployed around the world so that your content won’t have to travel as far through the network to reach anyone at the moment it’s requested. It’d be super-cool, except this kind of technology was first invented by AOL! Let me explain. Continue reading

12 Tips To Optimize For Google Instant Previews

Earlier today, I outlined how Google’s Instant Preview doesn’t display Maps, Flash, YouTube, AJAX, and lots of other rich media commonly found on webpages. If your site’s pages or homepage have this stuff on them, chances are your Instant Preview image is less-than-stellar and may actually reduce your CTR.
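
One quick way to spot trouble is to scan a page for the embedded-media tags that the preview renderer tends to skip. This is only a rough sketch, assuming the third-party requests and BeautifulSoup (bs4) packages; the tag list is my own approximation of what commonly carries Flash, video, and script-driven widgets, not anything Google has published.

    import requests                # third-party package: requests
    from bs4 import BeautifulSoup  # third-party package: beautifulsoup4

    url = "http://www.example.com/"  # placeholder: your page
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Tags that typically hold Flash, video, or other rich media.
    for tag in soup.find_all(["object", "embed", "iframe", "canvas"]):
        source = tag.get("src") or tag.get("data") or ""
        print("May not render in the preview:", tag.name, source)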

There are a lot of professional websites which have “borked” Instant Previews. For example, check out this Los Angeles dentist’s homepage, which appears with this jaunty giant jigsaw puzzle piece taking up most of its space:

Los Angeles Dentist Website Instant Preview

Google has said that the Instant Previews were found to improve their users’ satisfaction with search results significantly during internal testing prior to rolling out the feature. Users can rapidly glance at the preview images to see if the webpages might hold what they’re looking for, increasing their confidence and helping them select webpages to click on that are more likely to hold what they want, while avoiding clicking on stuff they don’t.

If that’s true, then the opposite is likely also to play into users’ behavior: if a preview image looks bad and doesn’t look like what they’d expect or want, they might avoid clicking on it.

For anyone who has a site which doesn’t look right in Google Instant Preview mode, this is alarming, since the introduction of this feature could wrongly reduce your clickthrough rates. Even if you’re not worried about the collective effect over time, you’re still likely not thrilled that the image representing you may not reflect a true picture of your site nor show it in the best light.

I’ve been asked before on how to optimize for Google Instant Previews, so here are a few tips I’ve put together: Continue reading

Google Sneakily Ignored Noarchive With Instant Previews

Google rolled out Instant Previews three months ago in November. After I looked over the new utility, it struck me as very odd that I found pages in Google search results that had no cached view of the indexed content, while they did have this new viewing option of a screengrab of the page.

For instance, quite a number of newspaper websites choose to disable the cached views of their pages. Just search for “newspaper”, and you can immediately see an example such as the New York Times:

Instant Previews view of New York Times homepage

As you can see, the listing in the Google search results for The New York Times has no link under it for “Cached”. However, it does have a magnifying glass — the Instant Previews button — which, when clicked, reveals a screengrab of the NYT homepage from Google’s copy of the page when they last spidered it.

The reason the New York Times doesn’t have a “Cached” link is that they purposefully set up Continue reading

A Few Interpretations of Google’s Response to DecorMyEyes.com

All the recent hubbub over DecorMyEyes.com, and their claim that treating customers poorly in order to obtain more negative reviews resulted in better Google rankings, has left a small cloud of confusion. The ruckus was sufficient to get Google’s interest, and motivated them to react to it, but what they may have done is worth considering, not least because their statement around it has caused part of the confusion, perhaps purposefully.

First, Vitaly Borker, the offensive proprietor of Decor My Eyes, is likely not some stealth marketing genius. Rather, according to the NYT article about him, he sounds more like someone who rationalizes bad behavior in a variety of ways, and one of his prime beliefs is that negative ratings have helped his Google rankings. His supposed reasons for this were probably wrong to some degree, but he may’ve accidentally derived some benefit from the practice without understanding the actual causality.

What makes him more important is that he got Google’s attention, and caused them to react — or claim they’ve reacted — by making some changes to their algorithms. It’s possible that Google responded mainly out of concern over negative press. It’s also possible that they may’ve said they’ve made a change but did not, but it seems equally possible that they could have indeed tweaked their algorithm. The incident really seems to call for us to consider that “where there’s smoke, there may be fire.” Continue reading

Facebook SEO Tip: Add Your URL To Your Wall

Here at KeyRelevance we’re researching a number of different avenues for online marketing for our clients, so, along with our bread-and-butter work on Paid Search (PPC) management, and Search Engine Optimization (SEO), we’ve done quite a bit of exploration of ideas on how to leverage the massive audiences found in various Social Media such as in Facebook and Twitter.

Yesterday, I published an article on a somewhat subtle technique which can be used when posting status updates on Facebook in order to increase the number of people who might see each update. However, there are a number of very straightforward things which businesses and organizations can do to extract marketing advantage from Facebook without getting all tricky. Sometimes the most basic steps can give you the greatest advantage, but it’s not always obvious how to go about it.

So, here’s a ridiculously basic tip which I’ve found many businesses have utterly failed to accomplish in setting up their Facebook presence: add your website link to your Facebook wall page!

There are a great many companies, organizations, and small businesses which haven’t figured out how to do this, and so you can encounter pages all the time which do not sport that most basic element of their online marketing. For instance, the official Facebook page for the University of Texas at Austin, one of the largest universities in the country, has completely missed the boat by leaving their URL off their Facebook page:

University of Texas at Austin on Facebook

By contrast, their rivals at Texas A&M University have implemented their website URL on their Facebook page:

Texas A&M University on Facebook

(Disclosure: Texas A&M was my alma mater, so I did get a grin when I noticed that the TAMU University Relations Department did this most basic element correctly while the "Tea-Sips", as we like to call them, did not.)

Oh, to be certain, I should point out that URLs on Facebook pages are nofollowed (not to mention that they're apparently dynamically written to the pages onload, via Javascript), so they're not precisely as optimal as many search engine marketing experts might like. However, there's much to indicate that Google, if not the other search engines, Continue reading

Using Bing’s New Webmaster Tools For SEO

You may be aware that Bing recently released a new version of their Webmaster Tools which are intended to help webmasters in improving their sites’ performance in Bing search. One of Microsoft’s Senior Program Managers and SEOs, Duane Forrester, asked a number of us to give feedback to their team on what could be improved about the interface. So, I thought it might be good to provide that feedback via blog post, openly — not to beat up on Bing, but to further bounce ideas among the community.

Bing Webmaster Tools

Giving feedback to any of the search engines about their tools for webmasters seems a bit fraught with the near-futile dichotomy between the desires of Search Engine Optimization experts and the desire of search engineers to neutrally provide positive/fair rankings of search results. However, the exercise of me providing a little feedback is worthwhile, because if the tools are useless or pointless to us, then there’s little point in the search engines going to the effort of providing them in the first place.

Having worked in a major corporation before, I almost feel repressed about throwing out suggestions that I know could be deemed no-gos from the point of view of Bing engineers. I tend to self-censor to a degree because I don’t want to be interpreted as naive of the issues the search engines must take into account in trying to limit undue influence of those attempting to subvert the SERPs.

Even so, I’m aware of the potentially conflicting considerations, and as I described earlier, it’s an exercise in futility if the tools don’t provide worthwhile functionality to the intended users.

One of the primary problems I see with Bing’s Webmaster Tools is the sense of “keeping up with the Joneses” one gets when reviewing their interfaces. Bing’s development team is in a near no-win situation with whatever they do in this area. On one hand, if they copy the same functionality found in Google’s Webmaster Tools, they’d be accused of being mere imitators. However, there are some good elements in Google’s toolset which perhaps really ought to be provided. On the other hand, if they went even further in providing usefulness to webmasters, it could make them more prone to unethical marketing exploits. So, there likely were not a lot of easy solutions, nor perhaps obvious things which they should have done.

Further, their focus upon their tool versus Google’s tends to be a bit incestuous, and there’s the usual engineer myopia in providing what they think people would need/want versus trying to really look at the problem directly from the point of view of a webmaster. (Now, this bias in perception can’t be pinned on Duane, because he was an external SEO prior to working for Microsoft — but there’s a definite sense of this basic utility design problem inherent in both Bing Webmaster Tools as well as Google Webmaster Tools.)

Likewise, Google Webmaster tools suffers a bit from the conflicting goals of the engineers and the needs of the tools’ target audience. So, I’d prefer that none of the search engines look at one another’s offerings when designing such things, but instead try to focus solely upon providing as much functionality as webmasters might need. As things currently stand, there’s a sensation that all of the search engines are providing something of “placebo utilities” to webmasters — the interfaces have some confusing melange of features which are ultimately not all that useful, but are instead intended to throw up some smoke and mirrors to make it appear that they’re trying to help webmasters with the optimization of their sites.

Moving past my perhaps-unfair assertions, let’s look at what the new Bing tools provide, and what could be done better.

First, a head-nod to Vanessa Fox for her comparison between Bing’s and Google’s Webmaster Tools — as the creator of Google’s Webmaster Tools, she is likely one of the best people around to examine such utilities with a critical eye, and in the best position to know how much info a search engine might realistically be able to provide, and in what format. Likewise, a nod to Barry Schwartz’s post about Bing’s tools.

Both Vanessa and Barry berate Microsoft for building the tools to require Silverlight technology to view/use them. I don’t consider that much of a big deal, because I consider it a sort of “religious difference” in how the tools were constructed (most of us who are jaded about how Microsoft has strong-armed proprietary technology in the past might react negatively against Silverlight, as well as those who avoid it out of conservative privacy practices).

However, if I’m looking at Bing Webmaster Tools purely from the perspective of how well it does or doesn’t function, I’m not concerned about this tech dependency built into it, since I think the majority of webmasters out there will be unaffected by this. I’m not a fan of MS programming protocols AT ALL, and it may be a bit of my former bias as a technologist within a megacorporation creeping in, but the Silverlight criticism just appears slightly out of sync with the primary issues of whether the tools provide vital functionality or not — and, it may not be unfair of Microsoft to decide that if you wish to play in their Bing sandbox, they have the right to promote their proprietary technology to do so. In comparison, I have a friend who is a privacy freak, and he surfs with Flash disabled — Google’s Webmaster Tools requires Flash for one or two graphs, and would be equally irritating to him as Silverlight.

Both Barry and Vanessa mention how Bing’s new interface revoked the backlink reports, and I agree with them both on this point. This was one area where I’d hoped Bing would take the opportunity to be more open than Google. If the engineers looked at competitors’ tools while building Bing’s, they should have tried to recreate the backlink reports that Yahoo! provided in Site Explorer — which seems to give a more comprehensive picture of backlinks. Since webmasters are told that inbound links are one major criterion for rankings, avoiding providing this info is a major void.

Bing obscures the numbers of pages indexed when one performs a “site:” search by domain, too, so revoking this functionality, such as it was, from the old interface eroded some of the usefulness. Perhaps their pre-development surveying of webmasters resulted in feedback that their earlier backlink report “wasn’t useful”, but that would’ve mainly been because it was less-robust than one like Yahoo’s.

Vanessa mentions that they don’t provide data export features, and I agree completely that this is a major oversight. In fact, as a programmer I happen to know just how relatively easy it is to code data to export in XML or CSV, and considering how long it took to launch the product it’s sort of shocking they didn’t include this upon launch. (You’d think Microsoft would not miss an opportunity to provide a “click to export to Excel” button!)
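
For a sense of how small that feature really is, here is a minimal Python sketch that writes a few made-up report rows to a CSV file Excel can open directly.

    import csv

    # Hypothetical report rows: (URL, impressions, clicks).
    rows = [
        ("http://www.example.com/", 1200, 85),
        ("http://www.example.com/about/", 310, 12),
    ]

    with open("bing_report.csv", "w", newline="") as report_file:
        writer = csv.writer(report_file)
        writer.writerow(["url", "impressions", "clicks"])
        writer.writerows(rows)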

Vanessa stated that they also ditched the Domain Score, and remarked that this was a good thing. I disagree on this point because I think any insight into a ranking score that any of the engines give us is helpful in assessing how effective/important a site or domain is. Was this the same as the small bar-scales Microsoft had been providing for a handful of the more important site pages via the interface? Although these graphical page ranking scores were entirely derivative of Google Toolbar PageRank, I would’ve preferred they provide even more in that area. Bing’s in a position where they ought to be able to experiment with providing more info than Google does, and see just how dangerous it really is to be more open with marketers!

Vanessa did a great comparison between the analytics Bing provides versus Google Webmaster Tools and Google Analytics. While analytics from Bing’s perspective are interesting to us all, she notes one aspect that also strikes me as an issue with the graphs: as a webmaster/SEO, when I see indexation decreasing, I’d really like to know why. This is particularly irritating where Bing is concerned, because among the industry I think it’s widely felt that Bing simply indexes a lot less than Google.

Many of my clients want to know what they can do in order to increase their indexation with Bing. I see the same thing with my test websites. I may have 30,000 discrete pages and Bing appears to index a sharply lower number than Yahoo or Google. The feature allowing one to manually submit URLs seems to acknowledge this sad fact — but, in context, it’s nearly sending the wrong message! “Oh, our spider’s legs get tired out there, so bring your pages directly to us.” Vanessa’s got a point on this score — why should I feel I need to do this if you accept Sitemaps? And, if I or my clients have tens of thousands of site pages, fifty pages to be manually submitted at a time is simply not a sustainable solution. I can understand having the interface to rapidly submit brand new content pages, but what’s missing may be some clear communication as to what issue is restricting my indexation.

The features showing whether there are robots restrictions, malware, or crawl errors which could impact a site are all great. However, if one already has everything functioning just fine, the tools need to answer further questions: Why isn’t my site crawled more deeply? And: Why don’t my pages rank higher? Ultimately, webmasters ask: What can I do to improve my site’s performance? Understandably, Bing and other search engines are reticent to provide too much info in this area. However, there are things which they could provide:

  • Possibly a tool where a webmaster could select one of their pages or submit a URL to find out what primary keyword phrase Bing considers the page to be about?
  • Tools which report upon quality issues detected with specific pages (a rough sketch of this sort of check follows this list). For instance, is the page missing a Meta Description or Title? Are there pages where the Title or Meta Description appears not to be relevant to the page’s content, or out of sync with anchor text used in inbound links? Are there images which could be improved with ALT text?
  • Why not simply inform webmasters that you consider their links and other references to be too few or too low in importance?
  • Bring back the scales showing page scores, and actually go further in providing some sort of numeric values!
  • Actually show us that you spider all pages, even if you opt not to keep all in your active index! This would at least give the impression that you are able to index as deeply as Google, but choose not to display everything for other reasons.
  • How about informing us of pages or types of pages (based upon querystring parameters, perhaps) which appear to have large degrees of duplication going on?
  • Tie-in our Webmaster Tools with local listing account management for local businesses, so everything could be done via one interface.
  • Provide means for us to customize the appearance of our SERP listings a little bit, similar to Yahoo SearchMonkey and Google’s Rich Snippets.
  • Provide us with tools that help us improve overall site quality, such as if you see pages on our site with incorrect copyright, misspellings, or orphaned pages.
  • Consider providing us with an A/B testing type of tool so that you might inform us about which of two page layouts performs better for Bing search!
  • Inform us if you detect that a site has some sort of inferior link hierarchy — this could indicate usability problems affecting humans as well as spiders.
  • Provide more granular details on how well we perform in Bing Image Search, Mobile Search, Video Search, etc. Currently, I cannot tell how many of my images are indexed in Bing.
  • For that matter, it would be nice to enable an Image Sitemap file, like what Google offers.
  • Finally, for a really pie-in-the-sky request: You operate web search for Facebook — would your contract allow you to tell us how often our webpages appear in Facebook search results and how many clickthroughs we might get from those?
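
As a rough illustration of the page-quality checks suggested in the second bullet above, here is a sketch that flags a missing Title, Meta Description, or image ALT text for a single page. It assumes the third-party requests and BeautifulSoup (bs4) packages, and the URL is only a placeholder.

    import requests                # third-party package: requests
    from bs4 import BeautifulSoup  # third-party package: beautifulsoup4

    url = "http://www.example.com/"  # placeholder: the page to audit
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    description = soup.find("meta", attrs={"name": "description"})

    if not title:
        print("Missing <title>")
    if not (description and description.get("content", "").strip()):
        print("Missing or empty meta description")
    for img in soup.find_all("img"):
        if not img.get("alt"):
            print("Image missing ALT text:", img.get("src", ""))
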
Anyway, there’s my feedback, criticism, and some ideas for additional features. I’m not trying to beat anyone up, and I’m actually grateful for any feedback we receive from all of the search engines on our performance within them. I mainly ask that the search engineers keep in mind that we mainly want to know “What can we do to improve performance?” and provide us with tools to accomplish that, in as much as they’re able to do so without compromising the integrity of the engine.

Where Bing is concerned, I believe it could be possible to be even more open than Google is in order to further differentiate yourselves and experiment to see what’s really possible in terms of openness!

Without Usability, You’re Not Doing Advanced SEO

My article covering how Google’s fixation on Usability reveals local search ranking factors was published yesterday on Search Engine Land. In it, I described a number of common website elements which few-to-no marketers have ever cited as ranking signals. Some of these elements, such as whether or not a site has employee profile pages, or whether a site displays prices for products and services offered, might be controversial in search engine marketing circles.

CNN’s homepage checked with Google Page Speed

CNN’s homepage checked with Google Page Speed - Google introduced Site Speed as a new ranking factor in 2010, and provided tools like this Page Speed extension for Firefox to assist webmasters with Usability improvements.

Other elements I described have been cited by other experts as beneficial for search marketing, even though they may’ve recommended them for reasons other than those I outlined. Inclusion of images, maps and locations pages make sense for multiple reasons in local business websites.

The thought and methodology behind coming up with these factors are sound, and have allowed me to successfully predict present and future search engine optimization factors where others have not. It makes logical sense that since Google is interested in Usability, they will seek ways to quantify and measure it on websites, just as they have done with Site Speed. And some very easy usability elements to quantify include common website elements such as the About Us, Contact Us, and Locations pages.

Back in 2006, I began predicting that the practice of Search Engine Optimization might be replaced by Usability. Unquestionably, this change is occurring to some degree right now.

I’ve known a lot of top corporations which are involved in very sophisticated paid search marketing and search engine optimization, but few of them are also including usability testing and user-centered design considerations when performing a site redesign. Google has tried to make the importance of user-experience abundantly clear by actually going public with their adoption of page load times in determining search result rankings, but many companies are still not connecting the dots.

Here at KeyRelevance, we have long prioritized usability in our assessments of web sites’ design. When companies contract with us to audit their websites, we offer both a Technical Website Review as well as a Usability Review. However, many companies eschew our Usability Reviews or dismiss them as less-important.

For some reason, people often react to usability recommendations from experts in an emotional way, rather like how a portion of the population avoids going to their doctors for a yearly physical. For some companies, there are already so many dependencies and requirements going into web design projects that they can’t include more without losing impetus. For others, individuals with authority over projects have egos which do not want to lose discretionary control over project decisions which could be altered if usability research ran counter to what they desire to do.

Usability testing can be the difference between a design that becomes highly popular versus one which is rapidly forgotten. Google itself is an example of how user-centered design will translate into success. More design options can be scientifically decided, honing down to interfaces which will maximize ease-of-use and enjoyment-of-use. Instead of being avoided, usability testing should be embraced — after all, in the business world we’re looking to increase the potential for success in our company projects, right?

Knowing Google’s heavy focus upon usability factors, consider that if you’re not doing iterative Usability testing and adjustment for User-Experience, you really may not be doing “Advanced SEO”.

If you’d like a thorough Usability Audit of your site, contact Key Relevance today to schedule our review and get a report of items to consider before your next sitewide redesign is completed.

Also, check out some of the free tools that Google has been providing to help you with portions of usability analysis. Try out Google Browser Size, Google Page Speed, and look at the Site Speed reports in Google Webmaster Tools for your website.

Town & City Name Sponsorships

I just wrote an article which published at Search Engine Land yesterday on the subject of some innovative and occasionally guerrilla marketing tactics that might be used to display advertising promotion via Google Maps. (See: Six Odd Tactics For Getting Ads Into Google Maps)

One aspect the article touches upon is how some smaller towns and cities might find it attractive to sell the rights to their names in return for sponsor dollars. I find this concept interesting, particularly as many municipalities have begun considering flogging the rights to name all sorts of things from auditoriums to subway stations to city service departments.

In the article I mentioned “DISH, Texas”, which sold its name a few years ago to a satellite dish company in return for free satellite TV service for all of its residents. While this is one of the more recent examples of “City Name Sponsorships”, it’s not the first. My coworker, Mike Churchill, alerted me to the fact that the small town of “Truth or Consequences, New Mexico” actually changed its name from “Hot Springs” back in 1950 in order to win a radio contest.

Truth or Consequences, NM

The NBC radio program “Truth or Consequences” offered to broadcast the show from the first town that renamed itself after the program.

In American history, quite a number of towns and cities went through various name transitions over time, but most of these monikers were inspired by people’s names or were descriptive in some way. These days, I suspect that most larger cities would find a lot of resistance to selling off their names — and for well-known cities they’d be losing a lot of “brand equity” if they dropped a well-known name. But, for small towns, there could potentially be a lot of places which might find large corporate investment attractive enough that they could overcome constituents’ resistance to name-change.

Selling a placename is bound to create controversy whenever it happens. Winnipeg’s plans to sell off naming rights on everything from parking meters to bus tickets and even city services have apparently drawn significant criticism.

Kalle Lasn, founder and editor-in-chief of Adbusters magazine, says selling off naming rights to city services is an example of backward and unimaginative thinking.

“It’s really depressing … They should learn how to be a little bit more innovative. There are ways of cutting back and ways of generating revenue that don’t include selling your soul to corporations.”

(Adbusters is famous for helping promote “Buy Nothing Day” and other anti-commercialism and anti-advertising philosophies.)

Regardless of the controversy, the prospect of abruptly having some saleable assets available is likely to prove too attractive to resist for many city managers during these cash-strapped times. I expect we’ll see some more instances of corporate-sponsored city names appearing in online mapping systems like Google Maps.