
Posts from the ‘Google PageRank’ Category


Making Your World Flat Again: Much Ado About No Follow

PageRank Sculpting Dead? (Not Exactly)

With all the chatter over the last week and a half about “rel=nofollow” and Matt Cutts’ statement about the link attribute, I think it’s time to weigh in. There are a few great posts that should be considered required reading before this one:

Is “nofollow” Worth Using Anymore?

After listening to and reading hours of material from everyone and anyone, my answer is no.  Additionally, I’ve started to notice that “nofollow” has stopped its LinkJuice-drafting ways on numerous sites I work on. Not to mention that Matt sounds quite serious about levying penalties against sites employing nofollow to sculpt and/or restrict content as a means to “manipulate” PageRank. And, remember, you are an “enemy” to the engines.

What Matt Cutts Has To Say About Nofollow

Q: Does this mean “PageRank sculpting” (trying to change how PageRank flows within your site using e.g. nofollow) is a bad idea?
A: I wouldn’t recommend it, because it isn’t the most effective way to utilize your PageRank. In general, I would let PageRank flow freely within your site. The notion of “PageRank sculpting” has always been a second- or third-order recommendation for us. I would recommend the first-order things to pay attention to are 1) making great content that will attract links in the first place, and 2) choosing a site architecture that makes your site usable/crawlable for humans and search engines alike.

There may be a minuscule number of pages (such as links to a shopping cart or to a login page) that I might add nofollow on, just because those pages are different for every user and they aren’t that helpful to show up in search engines. But in general, I wouldn’t recommend PageRank sculpting.

Additionally, there’s that nifty tag Google created a while back, “rel=canonical”, which should help the majority of sites out there with duplicate content issues.

How To Sculpt Without Nofollow:

Rand posted some good methods for sculpting, which work, but they feel unnecessarily sneaky to me.  A better way is, as Matt Cutts suggests, to let your PageRank flow freely.  Interconnectivity (internal site link structure) is a key element here. You want your site to present a “flat architecture,” even if it isn’t actually flat.  It’s really a play off the Bruce Clay method of siloing, ensuring that major sections of the site become visible through the internal linking structure. This allows those pages, if the practice holds, to share and share alike the PageRank while giving only minimal amounts to other site areas.

And, yes, removing the nofollows will be a pain in the ass, but it’s one that’s worth it. And, yes, I really don’t like that Google went back on its word with this attribute. But to put it in perspective: “It’s Google’s world. I just live in it.” And that’s a fact. I’m an enemy combatant.


Get Keyword Rank Data from Google Webmaster Tools

Matt McGee’s post on Search Engine Land last Wednesday “Google’s New Referrer String Includes Ranking Data (At Least For Now)” made me think about a lot of the SEOs I know. It’s a great post, with a lot of valuable information. Not to mention the great comment from WebShare, with detailed instructions how to set up an advanced GA (Google Analytics) filter to track the Rank Data in the referrer string. (I’m testing out the filter with a couple of heavily trafficked clients to see if it lives up to its claims.)
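For those who want to poke at it themselves, the ranking data rides along in the referrer’s `cd=` parameter. Here’s a minimal sketch of pulling it out with Python’s standard library; the sample URL is a simplified stand-in (real Google referrers carry many more parameters):

```python
from urllib.parse import urlparse, parse_qs

def rank_from_referrer(referrer):
    """Pull the query (q=) and SERP position (cd=) out of a Google referrer URL."""
    params = parse_qs(urlparse(referrer).query)
    keyword = params.get("q", [""])[0]
    # cd= is only present on the new-style referrers Matt McGee describes
    position = int(params["cd"][0]) if "cd" in params else None
    return keyword, position

# Simplified example of the new referrer form
ref = "http://www.google.com/url?sa=t&q=belt+conveyors&cd=6"
print(rank_from_referrer(ref))  # → ('belt conveyors', 6)
```

That’s essentially what the GA filter is doing under the hood: capturing `cd=` from the referrer and stashing it in a custom field.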

How Many SEOs Really Check Their Web Logs?

The problem is that most SEOs and SEMs I know:

a) have no idea how to check their web logs, and have probably never checked that raw data. Ever.

b) have installed Google Webmaster Tools, if for no other reason than to submit a complete XML sitemap to Google.

The short answer is that not many do. Perhaps Matt’s post will motivate a few more to open up that data, or at the very least find it. And if you still don’t have any desire to open up the Web Logs, then I offer another solution.

Use Google Webmaster Tools “Top Search Queries” to Get Rank Data

It’s all there, you just have to take the time to sort it out (literally) in Excel.

1) Find the “Top Search Queries” link in Webmaster Tools:


2) The Raw Webmaster Tools Data File (Un-Sorted)


Looks nearly unusable, right?  And, to be truthful, it can be a bit intimidating unless you know what you’re looking for in that rat’s nest of data.

3) Get Your Sort On

It depends on what target audience you’re looking to identify, but to keep it simple we’ll sort for “WEB SEARCH” and in the “United States”.
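If you’d rather skip Excel, the same sort is a few lines of Python. The rows below are stand-ins for the exported table, and the column names are my assumptions, not Google’s exact labels:

```python
# Stand-in rows for the exported "Top Search Queries" data;
# the column labels in your actual download may differ.
rows = [
    {"query": "water brake dyno", "type": "Web Search", "market": "United States", "position": 6},
    {"query": "dyno parts", "type": "Image Search", "market": "United States", "position": 14},
    {"query": "engine dyno", "type": "Web Search", "market": "United Kingdom", "position": 9},
    {"query": "chassis dyno", "type": "Web Search", "market": "United States", "position": 2},
]

# Keep only Web Search / United States, best-ranking terms first
us_web = sorted(
    (r for r in rows if r["type"] == "Web Search" and r["market"] == "United States"),
    key=lambda r: r["position"],
)
for r in us_web:
    print(r["position"], r["query"])
```

Same idea as the Excel sort, just repeatable when the export changes next month.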

4) Finding Percentage of Clicks and the Position in the SERPs


While it’s much more “manageable” now, that still leaves a Da Vinci-esque code to be broken.  What does all that data mean, and more importantly, how do I find out?

Here’s a sample string from the spreadsheet above (marked to distinguish):

[water brake dyno (a), 2% (b), 6 (c)]

a) is the keyword term searched and/or clicked on by the user

b) I’m still on the fence about this stat. It could be the percentage of searches the term receives (which doesn’t seem likely), or it could be the term’s share of clicks relative to the other keywords in the grouping.  You’ll notice that all the percentages add up to 100%, which leads me to believe the latter is more than likely correct.

c) is the position in the SERPs.  Yes, it’s true.  Test it for yourself: open up the data in Webmaster Tools and, without being logged into your Google account, search for the term in question.  You’ll find, 9 times out of 10, this is exactly where the term ranks*.

*My only caveat to (c) is that it seems to be taken as a snapshot. On the terms I’ve looked at, the position may be off by one in either direction. But other than that, it’s fairly accurate.
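To make the (a)/(b)/(c) reading concrete, here’s a quick sanity check on a made-up group of rows (only “water brake dyno” comes from the real export above). If (b) really is each term’s share of the group, the percentages should total 100:

```python
# (keyword, click-share %, SERP position) -- a sample group; only the
# first row comes from the actual export discussed above.
rows = [
    ("water brake dyno", 2, 6),
    ("engine dyno", 38, 3),
    ("chassis dyno", 60, 1),
]

total_share = sum(share for _, share, _ in rows)
print(total_share)  # → 100, supporting the "share of the group" reading
```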

So, if you don’t feel comfortable checking web logs, or just don’t want to go through the hassle, Google Webmaster Tools will also provide the same data.


Is it the Links, the Traffic, or What?

Is Linking the End-All-Be-All of SEO?

We all know that Google Personalized Search is coming: Search 4.0 (if you’re keeping count).  And we are already seeing instances of it surfacing within the SERPs.  If you need the skinny on it, check out Danny’s post on Search Engine Land.

Knowing all of this, SEOs must still work to optimize client sites.  In a bear economy, companies have begun shrinking marketing budgets as well as personnel, giving SEOs, in most cases, smaller budgets to work with.  Our management and maintenance fees aren’t shrinking, so we’re left to find the few essential services that must be done to keep our clients converting and feeding the funnel of potential conversions.

The list for most of us looks something like this:

  1. On-Site Optimization: meta-data, optimized URLs, link juice sculpting, and optimized anchors
  2. Link Building: via directories, link baiting strategies, social media linking strategies
  3. Reporting

It’s pretty safe to say that #1 and #3 are indispensable, but is link building really the answer to the SERP ranking question?  Is it really a major part of the algorithm and a determining factor of placement?  For years we’ve professed that a solid linking strategy will, in fact, create solid rankings within the result pages, because the Google algorithm, which revolutionized the SERPs with this factor of trust, spawned copycats at Yahoo, Live, and the majority of other engines.  But there are factors outside of on-site optimization that we also believe to be at work; factors that may actually play a larger role in determining where you place in the SERPs:

A) Site Traffic and Bounce Rate

B) Domain Age

With the advancement of the SEO tools available, we’re able to see exactly what’s driving search results for particular queries (assuming you don’t have personalized search turned on) and get glimpses into the algorithm.  Of course there are other factors that influence a site’s SERP position, trust, and relevance, such as the number of 301 redirects, C-class IP diversity, and others, but we want to look at the major influences for this session.  That’s not to say those factors couldn’t cause a site’s SERP position to crash dramatically, but they’re less of an issue for the majority of sites.

That said, let’s look at some data I pulled for what I think would be traditionally consumer and B2B searches.

B2B Data

Search: “belt conveyors”

SERP Page 1 and Data for each Site:


When you examine the data, the links don’t really give us an indication that they have influenced the position of a particular company; for example, the 10th position company has quite a few more links than any other company on this SERP (excluding, of course, the directory sites).  It would seem reasonable then that this company should be in the first position, or at least in the Top 5.

However, it could be that this site’s on-site effort is poor and the linking effort alone has catapulted it to the first page (there are subjective items that cannot be captured in this data). Still, the data does not clearly indicate linking as the sole factor for position, so we move on to the traffic data to determine whether that is what placed the company on top.

Traffic Data (per site): 709 | 960,230 | 3,312 | 98,740

This does not really clear up the SERP position picture either. Based on linking data and traffic data, there is no reason the top-ranked site should retain the top position for “belt conveyors”.  The only advantage this site has over the other companies (not the directory sites) is that its domain is older, and not by much.

The answer comes when we look at the site.  The entire site is dedicated to “belt conveyors”, multiple types of belt conveyors.  So the site’s content and keyword density for the term “belt conveyor” make it the obvious choice.

QC Industries Text Only


Based on the data above, it can be concluded that linking and traffic did not directly influence this site’s SERP position. And I think that with more targeted, keyword-rich optimization, this site could “box out” competitors for this keyword for a long time to come.  Let’s check the B2C search term.

B2C Data:

Search Term: “search engine marketing services”

SERP Page 1 and Data for each Site:

“search engine marketing services” SERP

When you examine the data for the “search engine marketing services” SERP, on the surface the links appear to be a guiding indicator. But then the simple theory of “more links = better rank” fails us again. One site on the page has the oldest domain on the list, the most links (outside of Google itself), and the most high-quality links.  Everything says this should be the most trusted site for this particular query, right?  Yet it doesn’t hold the top spot. Well, then it must be a question of traffic; that site must not get nearly the traffic the top-ranked site gets.  Let’s see.

Traffic Data (per site): 65,392 | 98,723 | 111,266 | 6,481 | 102,679

The traffic does not really clear up the SERP position either. Based on linking data and traffic data, the top-ranked site should not be in the top position for “search engine marketing services”.  It has no comparable advantage over the sites listed below it.  Could this be another case of onsite optimization?



In my opinion, the onsite optimization leaves something to be desired as well.  So if it’s not the links, the traffic, or the onsite optimization efforts, what could possibly have this site ranking as well as it does?

In my opinion, it is a test slot for a company or URL.  I have seen this happen with several other highly competitive search terms, which leads me to believe this is the case.  In this manner Google can “see” whether the site is worthy of the number 1 spot for this highly competitive search term.  Is it possible that this site has hit the precise number of backlinks, the exact right keyword density, and the right amount of traffic to warrant a first-position slot for this search term?  The odds are astronomically against it.

Overall Conclusions:

The data above leads me to believe that onsite optimization is inherently the most important thing a site can do to increase its visibility within the search engines, particularly Google.  For highly competitive terms, I would still recommend link building as an essential task.  Having said that, competitive research must be done to gauge the number of links needed to enter the first SERP.

It makes very little sense to pile on links for keywords if your competition has weak link building efforts.  The site being optimized should maintain a “comfortable gap” of link separation from its competitors.  There’s no need to get 10,000 additional links if the competition is holding steady around 500 – 600 links.  In this case, it might look more “normal” to Google if your client site built 1,000 links over the next 6 to 12 months.  It will have the same effect, i.e. showing site relevance and trust, without drawing attention to the site.
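As a back-of-the-envelope check on that pacing, spreading the same 1,000 links over 6 versus 12 months works out to:

```python
# Hypothetical pacing from the example above: ~1,000 links,
# spread out over time instead of dumped all at once.
target_links = 1000
for months in (6, 12):
    per_month = round(target_links / months)
    print(f"{months} months -> ~{per_month} links/month")
```

Either rate looks far more organic to an engine than 10,000 links landing in a single month.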

Traffic may certainly play a factor within the SERPs, but it does not seem to affect a site already within the top 10 positions.  Traffic may be a factor for sites on the cusp of the first SERP, but once there, it seems to have very little direct implication.  The only thing that seems to matter is that the traffic stays steady: no large dips and peaks, just a nice, steady upward slope.

I think link building should still be a recommended measure within any optimization campaign, but don’t expect it to be the savior.  It can be concluded from the data above that inbound links do affect placement within the SERPs; however, the main focus should remain on onsite optimization of targeted content, link juice sculpting, and optimized link anchor text.
