Showing posts with label Link Building Strategy. Show all posts

Saturday, August 31, 2013

We all know it’s not worth our time to build sub-par links anymore. It’s time to get innovative with our link building tactics. We previously shared a few lesser-known link building techniques and post-Panda/Penguin era link acquisition strategies here, but we are of the opinion that you can’t have too many.

So, here are a few more techniques that will help you push the envelope and approach link building with the mindset of a serious marketer:

1. Coin a Phrase and Set Up Alerts

The idea behind this strategy is to invent a new buzzword, try to get it to catch on, and then capture links as a result. Since not everybody who uses the phrase is going to send a link your way, you can set up Google Alerts to capture mentions of the phrase. If it’s clear from context that they’re talking about the buzzword you coined, this can be a great opportunity to build a link.
Now, if your site doesn’t quite have the exposure to get a buzzword out there, this might seem like an exercise in futility. However, all it takes is $50 to get your article in front of 1,000 people on StumbleUpon.
If your campaign is targeted toward the right people, and the phrase is catchy enough, this could well be enough to start getting the phrase in use by many in the online community.
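Google Alerts can deliver results as an Atom feed, which makes the monitoring step easy to automate. Here's a minimal Python sketch that scans such a feed for mentions of a coined phrase; the feed contents, URLs, and the buzzword itself are made-up sample data, not output from a real alert.

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

def extract_mentions(feed_xml, phrase):
    """Return (title, link) pairs for feed entries mentioning the phrase."""
    root = ET.fromstring(feed_xml)
    mentions = []
    for entry in root.iter(ATOM_NS + "entry"):
        title = entry.findtext(ATOM_NS + "title") or ""
        content = entry.findtext(ATOM_NS + "content") or ""
        link_el = entry.find(ATOM_NS + "link")
        link = link_el.get("href") if link_el is not None else ""
        # Case-insensitive match across title and body text
        if phrase.lower() in (title + " " + content).lower():
            mentions.append((title, link))
    return mentions

# Hypothetical sample feed with one matching and one non-matching entry.
SAMPLE_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <entry>
    <title>What is linkbait surfing?</title>
    <link href="http://example.com/post-1"/>
    <content>A look at the new buzzword linkbait surfing.</content>
  </entry>
  <entry>
    <title>Unrelated post</title>
    <link href="http://example.com/post-2"/>
    <content>Nothing to see here.</content>
  </entry>
</feed>"""

for title, link in extract_mentions(SAMPLE_FEED, "linkbait surfing"):
    print(title, "->", link)
```

Each match is a candidate for outreach; the alert only surfaces the mention, and the link still has to be earned by hand.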

2. Produce a Resource and Set Up Alerts

Similarly, you can put together a video, white paper, or infographic, set up Google Alerts on the topic, and start contacting people. Any time a question about the topic comes up online, this is an opportunity to answer their question with a link to your resource. This is a great option because the answer is completely on-topic.
If you use this tactic, be sure to be as helpful as possible. The resource itself should be tremendously helpful, of course, but if it doesn’t answer every aspect of the question, make sure you address those directly in your comments, emails, etc. Don’t make it feel like a sales pitch, offer a genuinely useful answer.

3. Update Somebody Else’s Content

You know those pieces of content that seem to just keep on giving? This is the content that you want to keep investing in with updates, corrections, etc. to keep it relevant. It’s common practice to revisit your best content, keep promoting it, and keep improving it.
But fixing up old content can also be a chore, and that’s where you come in. Instead of submitting a guest post, why not contact a blogger with an interesting fact or update that will help keep one of their top posts fresh and interesting?
Try doing a search for some of the more broad terms related to your keywords, and visiting some of the blog posts you come across. Focus on the ones that already seem to have massive appeal. Read through, and catch yourself if you start thinking this reminds me of…
As soon as that happens, get in touch and let them in on a surprising piece of information that’s relevant to the article. This can be a great opportunity to earn a link.

4. Customize a Widget

You might not be a coding master with the ability to put together a widget that everybody’s going to want to download. However, odds are pretty good you have the coding or design skills necessary to customize a widget so that it fits a site’s branding and appearance.
Try seeking out blogs in your niche, or peripheral niches, that use widgets you recognize. Give the widget a tweak so that it fits the site better, place a link, and offer the new take on the widget to the blogger. This is especially powerful for popular blogging topics that aren’t centered around the tech industry.
That said, be ethical, and make sure the blogger is aware of the link back to your site. There’s no reason to hide this. As long as you understand how to work with people amicably it shouldn’t be a problem.

5. Buy Display Ads

Carson Ward from Distilled recently ran an experiment where he purchased ads as a method of building links. He discovered that, by far, the most effective ads were display ads. (AdWords ads, on the other hand, were the worst.)
With display ads, it’s possible to target sites where bloggers and site owners are most likely to lurk. This way, you can put your site in front of the audience that has the most impact on links and online influence. You only need to pay when the visitors actually click on the ads, and each visit from this type of visitor is much more likely to result in a link than a visit from the search engines.
Many SEOs are averse to spending money on ads because they generally only produce short term benefits, but if you can use ads to build links this objection doesn’t make sense anymore. The only question you should be asking is whether your time is better invested in paying for ads or outreach (or some combination of both).

6. Work with Experts

There’s no reason to produce content in a vacuum. In fact, most of the best content is the result of collaboration. Involve experts in the creation, fact-checking, and refinement of your content before it goes live. The more experts you work with, the more opportunities you have for additional links.
Don’t try to scale this too much. The more people you try to involve, the less commitment you can get from each of them, especially if you are automating your outreach. Instead, customize your outreach emails and be clear about why you are contacting them. Don’t ask for too much from them, and make it clear that they will be getting something out of the exchange as well.

7. Get In Business Directories

Most “link” directories are useless (though exceptions like DMOZ and AllTop are worth your time). However, getting added to relevant business directories is certainly a worthwhile effort, since these links are from reputable organizations and are a good sign of trust. These kinds of links can come from:
  • The Better Business Bureau
  • The Chamber of Commerce
  • Your local library
  • Other relevant city and state government resources
  • Accrediting organizations
  • Business memberships
Focus on links from reputable business lists that people actually use and care about. Avoid directories that exist simply to provide links for search engine authority, since these are the least likely to offer any real search engine authority.
Can you think of other techniques? If you can, pass them along, and if you liked this post, be sure to pass it along as well.

Sunday, August 25, 2013

Negative SEO: How to Uncover an Attack Using a Backlink Audit

Ever since Google launched the Penguin update back in April 2012, the SEO community has debated the impact of negative SEO, a practice whereby competitors can point hundreds or thousands of negative backlinks at a site with the intention of causing harm to organic search rankings or even completely removing a site from Google's index. Just jump over to Fiverr and you can find many gigs offering thousands of wiki links, or directory links, or many other types of low-quality links for $5.
By creating the Disavow Links tool, Google acknowledged this very real danger and gave webmasters a tool to protect their sites. Unfortunately, most people wait until it's too late to use the Disavow tool; they look at their backlink profile and disavow links after they've been penalized by Google. In reality, the Disavow Links tool should be used before your website suffers in the SERPs.
Backlink audits have to be added to every SEO professional's repertoire. These are as integral to SEO as keyword research, on-page optimization, and link building. In the same way that a site owner builds links to create organic rankings, now webmasters also have to monitor their backlink profile to identify low quality links as they appear and disavow them as quickly as they are identified.
Backlink audits are simple: download your backlinks from your Google Webmaster account, or from a backlink tool, and keep an eye on the links pointing to your site. What is the quality of those links? Do any of the links look fishy?
As soon as you identify fishy links, you can then try to remove the links by emailing the webmaster. If that doesn't work, head to Google's disavow tool and disavow those links. For people looking to protect their sites from algorithmic updates or penalties, backlink audits are now a webmaster's best friend.
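A first pass over a backlink export can be automated. Below is a Python sketch that triages a list of backlink records using a few illustrative heuristics (non-indexed source domain, a watch-list of TLDs, an exact-match money anchor); the field names, watch-lists, and sample rows are assumptions for illustration, not the export format of any particular tool.

```python
# Illustrative watch-lists; tune these per audit.
SPAMMY_TLDS = (".pl", ".info", ".biz")
MONEY_ANCHORS = {"credit cards", "payday loans"}

def flag_fishy(backlinks):
    """Split backlink dicts into (fishy, healthy) lists of (domain, reasons)."""
    fishy, healthy = [], []
    for link in backlinks:
        reasons = []
        if not link.get("indexed", True):
            reasons.append("source domain not indexed in Google")
        if link["domain"].endswith(SPAMMY_TLDS):
            reasons.append("spammy-looking TLD")
        if link.get("anchor", "").lower() in MONEY_ANCHORS:
            reasons.append("exact-match money anchor")
        (fishy if reasons else healthy).append((link["domain"], reasons))
    return fishy, healthy

# Hypothetical sample rows, loosely modeled on the audit described below.
sample = [
    {"domain": "spamfarm.pl", "anchor": "credit cards", "indexed": False},
    {"domain": "news-site.com", "anchor": "Acme Finance", "indexed": True},
]
fishy, healthy = flag_fishy(sample)
```

Anything flagged still deserves a manual look before it goes anywhere near a disavow file; heuristics like these only prioritize the review queue.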
If your website has suffered from lost rankings and search traffic, here's a method to determine whether negative SEO is to blame.

A Victim of Negative SEO?

Google Analytics 2012 vs 2013 Traffic
A few weeks ago I received an email from a webmaster whose Google organic traffic dropped by almost 50 percent within days of Penguin 2.0. He couldn't understand why, given that he'd never engaged in manipulative SEO practices or link building. What could've caused such a massive decrease in traffic and rankings?
The site is a 15-year-old finance magazine with thousands of news stories and analysis, evergreen articles, and nothing but organic links. For over a decade it has ranked quite highly for very generic informational financial keywords – everything from information about the economies of different countries, to very detailed specifics about large corporations.
With a long tail of over 70,000 keywords, it's a site that truly adds value to the search engine results and has always used content to attract links and high search engine rankings.
The site received no notifications from Google. They simply saw a massive decrease in organic traffic starting May 22, which leads me to believe they were impacted by Penguin 2.0.
In short, he did exactly what Google preaches as safe SEO. Great content, great user experience, no manipulative link practices, and nothing but value.
So what happened to this site? Why did it lose 50 percent of its organic traffic from Google?

Backlink Audit

I started by running a LinkDetox report to analyze the backlinks. Immediately I knew something was wrong:
Average Link Detox Risk: 1251 (Deadly Risk)
Upon further investigation, 55 percent of his links were suspicious, while 7 percent (almost 500) of the links were toxic:
Toxic Suspicious Healthy Links
So the first step was to research those 7 percent toxic links, how they were acquired, and what types of links they were.
In LinkDetox, you can segment by Link Type, so I was able to first view only the links that were considered toxic. According to Link Detox, toxic links are links from domains that aren't indexed in Google, as well as links from domains whose theme is listed as malware, malicious, or having a virus.
Immediately I noticed that he had many links from sites that ended in .pl. The anchor text of the links was the title of the page that they linked to.
It seemed that the sites targeted "credit cards", which is very loosely in this site's niche. It was easy to see that these were scraped links to be spun and dropped on spam URLs. I also saw many domains that had expired and were re-registered for the purpose of creating content sites for link farms.
Also, check out the spike in backlinks:
Backlink Spike
From this I knew that most of the toxic links were spam, and links that were not generated by the target site. I also saw many links pointing to other authority sites. It seems that this site was classified as an "authority site" and was being used as part of a spammer's strategy of adding authority links to their outbound link profile.

Did Penguin Cause the Massive Traffic Loss?

I further investigated the backlink profile, checking for other red flags.
His Money vs Brand ratio looked perfectly healthy:
Money vs Brand Keywords
His ratio of "Follow" links was a little high, but this was to be expected given the source of his negative backlinks:
Follow vs Nofollow Links
Again, he had a slightly elevated number of text links as compared to competitors, which was another minor red flag:
Text Links
One finding that was quite significant was his Deep Link Ratio, which was much too high when compared with others in his industry:
Deep Link Ratio
In terms of authority, his link distribution by SEMrush keyword rankings was average when compared to competitors:
SEMrush Keyword Rankings
Surprisingly, his backlinks had better TitleRank than competitors, meaning that the target site's backlinks ranked for their exact match title in Google – an indication of trust:
Penalized sites don't rank for their exact match title.
The final area of analysis was the PageRank distribution of the backlinks:
Link Profile by Google PageRank
Even though he has a great number of high quality links, the percentage of links that aren't indexed in Google is substantial: close to 65 percent of the site's backlinks aren't indexed in Google.
In most cases, this indicates poor link building strategies, and is a typical profile for sites that employ spam link building tactics.
In this case, the high quantity of links from pages that are penalized, or not indexed in Google, was a case of automatic links built by spammers!
As a result of having a prominent site that was considered by spammers to be an authority in the finance field, this site suffered a massive decrease in traffic from Google.

Avoid Penguin & Unnatural Link Penalties

A backlink audit could've prevented this site from being penalized by Google and losing close to 50 percent of its traffic. If a backlink audit had been conducted, the site owner could've disavowed these spam links, performed outreach to get the links removed, and documented his efforts in case of future problems.
If the toxic links had been disavowed, all of the ratios would've been normalized and this site would've never been pegged as spam and penalized by Penguin.

Backlink Audits

Whatever tool you use, whether it's Ahrefs, LinkDetox, or Open Site Explorer, it's important to run and evaluate your links on a monthly basis. Once you have the links, make sure you have metrics for each of them so you can evaluate their health.
Here's what to do:
  • Identify all the backlinks from sites that aren't indexed in Google. If they aren't indexed in Google, there's a good chance they are penalized. Take a manual look at a few to make sure nothing else is going on (e.g., perhaps they just moved to a new domain, or there's an error in reporting). Add all the N/A sites to your file.
  • Look for backlinks from link or article directories. These are fairly easy to identify. LinkDetox will categorize those automatically and allow you to filter them out. Scan each of these to make sure you don't throw out the baby with the bathwater, as perhaps a few of these might be healthy.
  • Identify links from sites that may be virus infected or have malware. These are identified as Toxic 2 in LinkDetox.
  • Look for paid links. Google has long been at war with link buying, and it's an obvious target. Find any links that have been paid for and add them to the list. You can find these by sorting the results by PageRank, descending, and evaluating the high-PR links, as those are the most likely to have been purchased. Look at each of the high quality links to assess how it was acquired; it's almost always obvious whether a link was organic or purchased.
  • Take the list of backlinks and run it through the Juice Tool to scan for other red flags. One of my favorite metrics to evaluate is TitleRank. Generally, pages that aren't ranking for their exact match title have a good chance of having a functional penalty or not having enough authority. In the Juice report, you can see the exact title to determine if it's a valid title (for example, if the title is "Home", of course the page won't rank for it, penalty or not). If the TitleRank is 30+, review that link with a quick check, and if the site looks spammy, add it to your "Bad Links" file. Do a quick scan of other factors, such as PageRank and DomainAuthority, to see if anything else seems out of place.
By the end of this stage, you'll have a spreadsheet with the most harmful backlinks to a site.
Upload this disavow file to make sure the worst of your backlinks aren't harming your site. Then make sure you upload the same disavow file when performing further tests in Link Detox, as excluding these domains will affect your ratios.
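Once the bad links are in a spreadsheet, producing the disavow file is mechanical. The sketch below renders the plain-text format Google's Disavow Links tool accepts: "#" comment lines, "domain:" lines for whole domains, and bare URLs for individual pages. The helper function and sample domains are hypothetical.

```python
def build_disavow_file(bad_urls, bad_domains,
                       note="Spam links identified in backlink audit"):
    """Render a disavow file: comments, domain: lines, then single URLs."""
    lines = ["# " + note]
    lines += ["domain:" + d for d in sorted(set(bad_domains))]
    lines += sorted(set(bad_urls))
    return "\n".join(lines) + "\n"

# Hypothetical audit output.
content = build_disavow_file(
    bad_urls=["http://spamfarm.pl/credit-cards-page"],
    bad_domains=["spamfarm.pl", "expired-linkfarm.com"],
)
print(content)
```

Disavowing the whole domain is usually the safer call for link farms, since the same network tends to keep generating new URLs.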

Don't be a Victim of Negative SEO!

Negative SEO works; it's a very real threat to all webmasters. Why spend the time, money, and resources building high quality links and content assets when you can work your way to the top by penalizing your competitors?
There are many unethical people out there; don't let them cause you to lose your site's visibility. Add backlink audits and link profile protection as part of your monthly SEO tasks to keep your site's traffic safe. It's no longer optional.

Thursday, August 22, 2013

Finding and analysing website backlinks is an effective way to inform your own link building: the more backlink reports you gather, the more data you have about a site's links. Backlink checker tools are important, and you should use them the right way, because you need information about the links pointing to your competitors' websites. They also help with your own website, because you need to know where your backlinks come from, what anchor text links to your site, how many links there are, and what type of backlinks you're getting.
There are online web applications, built by professional teams, that let you search the backlinks of a website and get free reports. Before we find all of a website's backlinks, let's clarify the concept of a quality backlink.

What are Quality Backlinks?

If you have thousands of inbound links that aren't quality, relevant links, they won't be as valuable as a single quality, relevant backlink. Backlinks are quality links if they are relevant. For example, a blog publishing "Free Online Tools" will have good backlinks if they come from relevant blogs, i.e. blogs about online tools.
If those blogs also have higher PageRank, the links will be quality links too.

Finding and Evaluating backlinks

Finding backlinks is easy; once that's done, the final phase is evaluating them and gaining similar links for your own website.
We'll use backlink checker tools to find all the backlinks of a website. Note that we can look up not just our own website's backlinks but also a competitor's. Checking our own backlinks is important for gathering link data; checking a competitor's is useful too, because we also want backlinks from the websites our competitor already has.
Once we have a list of all of our website's backlinks, we need to evaluate their type and how they are linked.
For example, when you find a backlink, it's important to know whether it is nofollow or dofollow, and what anchor text (keyword) is linked.

The more quality backlinks you have, the better you'll rank in Google Search. Finding and checking backlinks is essential, because it's an effective way to evaluate each link's type and get backlink reports.

Free Backlink checker

Now we're going to use online tools to check website backlinks and then analyse them.
There are numerous tools available online to search backlinks; here are some of them.

  1. Ahrefs - Site Explorer and backlink checker tool that gives you free reports of all backlinks pointing to your website. The best thing about this tool is that it presents all the data in diagrams, for example a graph of how many backlinks you had on a given date, whether they were increasing or decreasing, and the type of backlinks, i.e. how many are nofollow vs. dofollow. The thing I like best in Ahrefs is the anchor text cloud, which gives an overall picture of the most linked keywords (anchor text), along with a graphical representation of phrases.
  2. Open Site Explorer - Developed by SEOmoz (now just Moz). Most of the features and reports are similar to Ahrefs; it lets you check backlinks for free and get analysis reports. An advanced tool helps you find the backlinks pointing to your own website; you can do the same for a competitor and benefit by getting links from the websites your competitor gets them from.
  3. Backlink Watch - Another backlink checker, built on the Ahrefs API (the world's largest backlink index). It has a simpler interface, gives you a list of your website's backlinks, and shows the top domains linking to your site as well as the anchor text that is linked.
  4. Small SEO Tools - Compared to other backlink checkers, the Small SEO Tools backlink check is a bit slow. The best thing is that it's free and gives you a report of all your backlinks, though some backlinks may not show because its database isn't updated frequently.
  5. Majestic SEO - The best one in terms of efficiency and reports: simply search your homepage URL and get all backlinks by page, anchor text, and profile. To access the advanced features you have to subscribe and pay.
  6. Analyze Backlinks - Lets you filter the reports with checkboxes such as backlinks from the homepage only, specific anchor text, total links, and outbound links.
  7. WebConfs - The downside of this tool is that it shows only the dofollow backlinks of your website and won't repeat links from multiple pages of the same site; if you're getting many links from one blog, it counts them only once. It also shows the PageRank of the website linking to your blog.
  8. Rank Signal - A free backlink checker that lets you easily check a site's backlinks with analysis. It also shows the rank of the site linking to yours, and whether your site is linked via image or text; if it's anchor text, it will also show the most-used anchor text pointing to the website.
  9. Web SEO Analytics - The slower one; the free version has limited features, but you can still check backlinks for free by entering your website's URL. Compared to the other link checkers in this post, the Web SEO Analytics tool doesn't show graphical representations, so it's less convenient. It's best if you want to copy all of a website's backlinks and then analyse them further.
  10. Monitor Backlinks - Another free backlink checker that gives you backlink reports and also monitors them (monitoring requires upgrading to the pro version). If you're getting more than one link from a website, it shows them only once.
So these were some of the free backlink checker tools. I hope you find them useful. If you want to check your website's social media links, you can use SEOCentro.

Saturday, August 17, 2013

As SEOs, we’ve all used common anchor text and more generic links in the past: it was, after all, just how things were done a year or two back.  However, things have now changed, and we’ve all got to focus on throwing a bit more variation and invention into our lead generation campaigns.  If the old style links aren’t proving effective anymore, then where should we be focusing our effort?

Co-occurrence.  It’s been voiced (most notably by Bill Slawski) that ranking for a term could now occur without the main keyword being used at all, simply because the link has a direct relevance to it.  For instance, creating content based around Bishop’s Finger or Old Peculiar could lead to a site being ranked for the phrase ‘real ales’, simply because the two former terms are individual brands of a drink.  Enough relevant terms that aren’t direct variations of the keyword could be a great way to build reputation whilst avoiding the dreaded link penalties.

Synonyms.  Sure, the word might sound like the evil company from a dystopian sci-fi novel, but focusing on synonyms can give your link building campaign a real sense of variation.  For instance, if you’re building a football blog, then build up links based on ‘soccer’ too.  If you’re creating an image database, then include links focused on ‘pictures’ and ‘pics’, as well as image formats like ‘JPGs’.  You’re still building a reputation for the same things, but adding far more depth to your reputation.  You’ll also pull in a lot more searches when you begin to rank for all the terms.

Partially matching.  As far as we’re concerned, the very best way to include natural links is to write your content first, then worry about the links afterwards.  If you’ve written a top-drawer, 1,000-word article that would grace the New York Times, then added a link on the phrase ‘locksmith’ (with your site’s main keyword being ‘locksmiths based in NY’), you’re going to see far better results than having two exact-keyword links inside a 200-word knockoff article.  Remember: article first, link second!

Link out.  I’m always amazed that, in 2013, SEOs still don’t see the value in linking externally to authority sites.  Outgoing links, when relevant, are a serious way of building authority.  If you were discussing the merits of the war in Iraq (to use a random example), then you would likely cite arguments given by other, higher authorities who agreed with your cause (‘Well, Christopher Hitchens said…’ etc).  This is just as applicable when applied to building good content.  If you’re writing a piece about Michael Jordan, then link to videos of him playing.  If you’re writing about Hemingway, then link to some of his greatest quotes.  It’s relevant to your content, so it’s good for your reputation.

Thursday, August 8, 2013

Think Beyond Links, Build Compelling Sites with Rich UI and UX
No serious SEO can ignore the elephant in the room: black hat works. For every post informing us that we need to create compelling content in order to rank, there are hundreds or thousands of junky posts ranking on purchased links, private link networks, and tiered linking schemes. Black hat doesn’t work for long, but it does work.
So why are we sitting here telling you to “think beyond links?” Are we just industry sheeple, regurgitating Google propaganda about how you need “great content” in order to show up in the SERPs?
We’re here to tell you that compelling sites take you further than rankings alone, and that UI and UX are musts if you want to optimize any relevant KPIs, as opposed to just optimizing for search. If you still aren’t convinced, take a look at our full explanation.
Today, we’re going to talk about how to do it.

1. Embrace Responsive Design

Responsive Design
At the end of 2012, Mashable redesigned their entire site. The content now resizes itself if you adjust the size of your browser. They’ve done this to respond to their changing user base. As more people switch over to tablets and mobile phones, fewer of them are willing to put up with sites that force them to use horizontal scrolling or zooming just to use the site.
Mashable is far from alone in recognizing this need.
2013 is the first year in which PC sales are projected to be lower than the previous year’s. One hundred million tablets are expected to be sold by the end of the year, and smartphones already outnumber traditional cell phones.
Meanwhile, very few businesses will be able to design apps accessible from every device. Apps are certainly a good way to put yourself in view among your core customer base, but it’s much more difficult to convince a user to install an app than it is to convince them to visit a website. Add to that a huge number of conflicting platforms and it starts to get very difficult to reach as wide an audience with apps alone.
Using media queries, you can determine the resolution of the user’s browser and adjust the presentation accordingly. Some screens are too narrow to accommodate side bars or large images.
In addition to responsive design, “adaptive design” is another important element in the future of site development. With adaptive design, you detect the type of device and adjust features accordingly. You may want to enable swiping, or adjust button sizes to accommodate fat fingers as opposed to tiny cursors.
Simply saying “we need responsive design” isn’t enough, though. You need to eliminate age-old habits you didn’t even know you were using. You can’t just create a mockup of what you want the site to look like and let the designer make all the calls. You’ll need to do things like:
  • Assign a hierarchy to each page element
  • Assign a hierarchy to each part of the content (because lorem ipsum just isn’t going to cut it)
  • Decide how navigation will change on smaller screens
It’s not just a matter of making things smaller so they fit the screen.
Smashing Magazine recommends starting with a mobile wireframe design that makes the hierarchy and priorities immediately clear. You can then hand this mobile mockup to a designer. A good designer should be able to immediately envision what the desktop size version should look like. (Here is their example, starting with a PDF of the mobile version and ending with the hi-res desktop version.)
It’s much easier to design for mobile and envision how things will adjust when the screen gets larger than it is to go in the other direction. This is central to effective responsive design.

2. Understand Split Testing

Split Testing
You’re not going to maximize conversions, user engagement, or create the most positive user experience if you don’t perform split tests, or don’t perform them correctly.
Danny Inny at CopyBlogger compared split testing to sex in high school: everybody says they’re doing it, most of them aren’t, and most of those who really are probably aren’t doing it right. Truer words have rarely been spoken.
Here are a few of the biggest mistakes people make when they split test their pages:
Statistical significance is not some vague feeling that you’ve run the test long enough to verify that one page works better than the other. When you ignore statistical significance, you surrender to statistical flukes and your own biases. You need to reach 90 or 95 percent confidence before you decide that one page is working better than the other. If you don’t know how to do that, you can take advantage of the free tool from Firepole Marketing, learn what a two-sample t-test is, use Google’s own split tester right in Google Analytics, or use the Premise WordPress plugin right from the WordPress menu.
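If you’d rather compute significance yourself than lean on a tool, the standard approach for comparing two conversion rates is a two-proportion z-test. Here is a self-contained Python sketch using the pooled normal approximation; the traffic and conversion numbers are hypothetical.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value).

    Uses the pooled-variance normal approximation, which is fine
    for typical split-test sample sizes.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf; p-value is the two-tailed area beyond |z|.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: variant A converts 120/1000, variant B 90/1000.
z, p = two_proportion_z(120, 1000, 90, 1000)
```

With these made-up numbers the p-value comes out under 0.05, so you could call the result significant at 95 percent confidence; with a p-value above your threshold, you keep the test running.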
You need a lot of traffic to spot small changes. 100 impressions is only enough to spot a 20% difference between two results. It takes 10,000 impressions before you can spot a 2% difference between options. That’s why, unless you can afford to pay for the traffic, you should start with major changes like:
  • Entire landing page concept
  • The headline
  • Price
  • Content
  • Images
Some will argue with me on the first point, but I think that’s unwise. I maintain that changing the entire page is still “changing one thing at a time,” and it’s probably the very first thing you should test. The landing page concept contains within it hidden assumptions about how your audience acts. You need to test what kind of audience you’re working with before you test individual things.
Testing individual page elements will help you get the best version of a page, but it will not get you to the best page concept.
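The traffic figures quoted above can be sanity-checked with a standard power calculation. This Python sketch uses the usual normal-approximation formula at 95 percent confidence and 80 percent power; the baseline conversion rate and lifts are hypothetical, and dedicated planning tools may give somewhat different numbers.

```python
from math import ceil

Z_ALPHA = 1.96  # two-sided 95% confidence
Z_BETA = 0.84   # 80% power

def visitors_per_variant(p_base, lift):
    """Rough visitors needed per variant to detect an absolute
    conversion-rate lift (planning approximation, not exact)."""
    p_new = p_base + lift
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return ceil((Z_ALPHA + Z_BETA) ** 2 * variance / lift ** 2)

# Hypothetical 5% baseline: a 1-point lift needs far more traffic
# to detect than a 5-point lift.
small_lift = visitors_per_variant(0.05, 0.01)
big_lift = visitors_per_variant(0.05, 0.05)
```

The quadratic dependence on the lift is the key takeaway: halving the effect you want to detect roughly quadruples the traffic you need, which is why low-traffic sites should start with big, page-level changes.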

3. Behaviors Speak Louder than Words

Behaviours Speak Louder than Words
Split testing is great for maximizing conversions, time on site, etc., but there is another kind of testing that is absolutely vital for UI and UX: usability testing. It’s very important to understand that usability testing is pretty much nothing like split testing.
The purpose of usability testing is to see how users actually use the interface. It’s user-centered design at its best, and it helps you build an interface that accommodates the way people actually expect it to work. This results in an intuitive user experience that flows naturally.
So how does usability testing work?
In usability testing, you present users with the interface and ask them to perform a series of steps. Their interaction with the interface is recorded, observed, and documented.
Unlike split testing, usability testing needs only a very small number of users to arrive at conclusions. The reality of the situation is that most people, especially those relatively inexperienced with technology, will respond to the interface the same way. It almost never takes more than ten users to spot a design flaw or opportunity, and it typically takes fewer than that.
User testing is not market research. It is not about asking users what would make them like the interface more or asking for feedback. It is entirely about observing behavior. The truth is, people rarely know what they want or what would make the interface work better…at least not consciously. It is only by observing their behavior that you can learn what the product is missing, or where opportunities lie.
Jakob Nielsen of Sun Microsystems popularized the concept of a large number of very small usability tests during development in the ‘90s. All that is typically necessary is five or six random users. This has come to be called “hallway testing” because of the implication that the test users will be random people pulled from the hallway.
Put simply, as soon as you spot two or three people struggling with the interface, you don’t gain very much by watching hundreds or thousands of other people go through the same problem.
Statistically speaking, samples this small aren’t really representative of the general population, but that’s not the point. You can almost always spot problems of some kind with samples this small, and it’s pointless to test further until the problem is fixed. Testing larger samples is unnecessary until no problems can be detected with such small samples.
The point is to focus on iteration. Usability testing is much less scientific than split testing, and is driven primarily by the experience and intuition of the designers, combined with the behavioral data of users. With enough iterations, the overall sample size does become large, but the benefit of using this method is that initial problems are resolved early on and never encountered again, so that smaller problems can be uncovered by subsequent tests.
Usability testing can be made more effective and efficient by using tracking software, heat maps, and eyeball tracking tools.
It doesn’t have to be expensive, however. One method often advocated for startups is paper prototype testing, and it’s exactly what it sounds like. It involves a user, who interacts with a paper mockup (which may even be hand-sketched); a facilitator, who takes notes and delves into the problems with the user; and a “human computer,” who is familiar with the interface and manipulates the mockup to simulate the final interface.
Such paper prototype testing has been working quite effectively since the ‘80s and is unlikely to go out of fashion any time soon.
While usability testing isn’t the same as market research or qualitative questionnaires, it can be useful in some cases to mix them together. Face to face interviews tend to be most useful, particularly right after a user testing session. These interviews can help uncover the root issue behind the behaviors. They can be very informative, as long as you remember that behaviors speak louder than words.

4. Build Consumer Psychology into the Site

User experience is all about what is happening mentally, so it’s important to understand a few things about consumer psychology when you design the interface and decide on the content. We can start with Robert Cialdini’s six principles of influence:
1. Reciprocity – Humans, in general, are more inclined to do favors for people who have already done favors for them. The more value your website offers to its users, the more inclined they are to offer value of their own in return.
2. Commitments – People are more likely to follow through on something if they make a commitment, no matter how small, and even if it is a mental commitment. For example, you are more likely to get a positive result in most circumstances if you ask “Will you…” instead of “Please…” Small commitments also open up the door to larger commitments in the future.
3. Authority – We tend to trust people with credentials and expertise more than people without them. A seal of approval, a group membership, or an endorsement from a trusted authority can go a long way. This is also the entire premise of content marketing: that you can become a trusted authority by offering valuable, helpful content on subjects that matter to your target audience.
4. Social Proof – We’re more likely to be influenced by somebody who is trusted by others, especially if they happen to be popular in groups that we associate with. This is even truer when somebody that we personally know endorses a product, organization, or person.
5. Scarcity – The more rare something is, the more valuable we tend to think it is. Unfortunately, this tactic has been used so often, and has become reminiscent of so many “don’t miss your chance” infomercials, that consumers rarely trust it anymore. It’s best to allow them to come to the conclusion that your brand is rare on their own, by producing content, products, and an online experience that is hard to come by.
6. Rapport – This is where market research and user targeting can get especially helpful. The more consumers feel like they have in common with you, the more likely they are to trust you, and to eventually buy from you.
In addition to these six principles, we can add three cognitive biases and heuristics that affect the way we think:
7. Loss Aversion – Generally speaking, the fear of loss is more influential than the promise of reward. We’re more willing to take a risk to avoid a loss than to earn a reward, and this is true even when the outcome is exactly the same. This has been scientifically validated many times. It’s the reason we fear the unknown, and it’s why stock market players hold on to losers too long and let go of winners too early.
8. Status Quo Bias (Default Bias) – If we’re given a series of options, we tend to choose whichever one seems to be the “default.” Anybody who’s ever been overwhelmed with decisions by the restaurant waitress (or the confusing menu) knows what I’m talking about. We like options, but don’t like being forced to make a decision. This is why smart restaurant menus highlight just a few dishes with special coloration and pictures, and why smart site designers make it clear where they want users to start.
9. Anchoring – Humans don’t think in absolutes. They think in relative terms. We tend to anchor things on our first impression or our most memorable (probably most emotional) one. As an example, if we see the highest price first, the lowest price will tend to seem lower. If we start with the lowest price, the highest price will seem much higher. The same goes for any other quality we might evaluate. This is why it’s important to be very careful with first impressions, and how we move forward from there.


Hopefully this introduction has given digital marketers something to think about. The careful balance between UI and UX is every bit as important as off-site SEO, and it is a crucial part of the buying cycle. Master these basics and you will be worlds ahead of the competition.
What are your thoughts on UI and UX strategy?

Friday, June 21, 2013

Responding to the Link Devaluation “Google Update”

It’s been almost half a month and Google still denies that an algorithm update occurred on January 17th, 2013. Even so, SEOmoz’s Dr. Pete noticed that several sites were no longer ranking for branded terms. Within a day of the suspected update, several webmasters contacted us with concerns about big drops in their rankings. We noticed a common narrative in their stories.

Very few of them had seen a site-wide drop in rankings.
Whether the drop in rankings was large or small, the sites were only seeing a loss in rankings for some of their keywords. We discovered that many of them were still enjoying top positions for some of their other keywords.

After analyzing the affected sites in depth, we reached the conclusion that some sort of targeted link devaluation was underway. Some pages had dropped a few pages in the results, others had plummeted out of the search results entirely, and still others were completely unaffected.

We’ve been tracking the rankings of a wide variety of sites over the past several months, and we find ourselves in agreement with what Branded3 has to say on the matter. We’re seeing Google moving in the direction of devaluing links on a continuous basis, as they are crawled, rather than eliminating them in large chunks with one-off updates like Penguin.

At this point, we’re fairly certain that the January 17 event was the result of continuous link devaluation, rather than a single update.

There was already some talk of an update on January 15, and certainly no shortage of grumblings before then. It’s our belief that January 17 was merely the point where this process reached critical mass. If Google is crawling bad links and using them as seeds to detect other bad links, then at some point this process will explode exponentially, which we feel is exactly what happened.

So in that respect, what Google is saying is true. There was no update on January 17. The changes started several months earlier.

Rather than delve into every nook and cranny of these link devaluation algorithms, we thought it would be more useful to offer you a guide to recovery, in a similar vein to our Ultimate Guide to Advanced Guest Blogging for SEOmoz. So let’s take a look at how to recover from link devaluation, and how to prevent it in the first place.

What is Link Devaluation?

Link devaluation itself is nothing new. Google has never released an “official” update on the matter, but it has been happening for quite some time. Any webmaster who has witnessed a drop in rankings that had nothing to do with a penalty, increased competition, or a Google update has experienced link devaluation. It is simply the process whereby Google disregards a link, treating it as though it does not exist.

What is new is the continuous devaluation of links. In the past, Google employees manually devalued links, or used a combination of manual selection and algorithmic extrapolation to find and devalue links. Now, it appears that Google is devaluing links as they are crawled and indexed, rather than removing them in large chunks.

Google has many incentives to move in this direction. Penalties are post-hoc and selective. They are usually based on sites surpassing a certain threshold of suspicious links or on-site activity. Penalties, in general, target sites or individual pages rather than the link graph itself. In short, penalties only put a band-aid on the larger problem of webspam.

In contrast, link devaluation cleans up the entire link graph, rather than targeting individual sites.

Were Your Lost Rankings the Result of Link Devaluation?

If you are seeing a slow decline in your rankings rather than a sudden drop, this is almost certainly the result of link devaluation (if it’s not due to higher competition or fewer searches in the first place). But a relatively swift drop can also be the result of link devaluation if it only seems to be affecting specific pages, or if you are still seeing traffic even after the drop.

This is also true for some other updates and penalties, however, so you’ll want to consider the following:
  1. Has there been a Google update? Wait a few days and check the top industry sites to see if any updates have been announced around the time you saw a drop in traffic. Check with MozCast to see if there were any major fluctuations in rankings around that time. If an update has occurred, you will want to look into it before pursuing anything else.

  2. Take a look at the total “Links to your site” in Google Webmaster Tools to see if this number is dropping. If so, link devaluation is almost certainly the issue, since Google doesn’t typically omit links from webmaster tools for no reason. It is a good idea to create a spreadsheet and record your links over time (or use a tool to do this for you), so that these changes are more obvious.

  3. Identify the landing pages that have seen the largest drop in search traffic. If the quality of links is lower than usual and the quality of the content is average for your site, it’s unlikely to be Panda. If there hasn’t been a Penguin update, this means it is probably link devaluation.
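The spreadsheet suggested in step 2 is trivial to automate. Here is a hypothetical sketch that logs the “Links to your site” total over time and flags any snapshot-to-snapshot drop above a threshold (the dates, counts, and 10% threshold are all illustrative assumptions):

```python
def flag_link_decline(history, drop_threshold=0.10):
    """Given (date, total_links) snapshots in chronological order,
    return the dates where the link count fell by more than
    drop_threshold relative to the previous snapshot."""
    flags = []
    for (prev_date, prev), (date, cur) in zip(history, history[1:]):
        if prev and (prev - cur) / prev > drop_threshold:
            flags.append(date)
    return flags

# Hypothetical weekly exports of the Webmaster Tools link total
history = [("2013-06-01", 5200), ("2013-06-08", 5150),
           ("2013-06-15", 4300), ("2013-06-22", 4280)]
print(flag_link_decline(history))  # -> ['2013-06-15']
```

A flagged week tells you when devaluation likely kicked in, which you can then line up against the landing-page traffic drops from step 3.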

Misconceptions About Link Devaluation

It’s easy to conflate all the various aspects of Google’s algorithm, so it’s important to clarify the following:
  • Link devaluation is not Panda – Panda is designed to target low quality content. It is not based on link metrics. However, links from these affected pages are devalued, and this can indirectly affect your rankings.
  • Link devaluation is not Penguin – Penguin targets entire sites that either use manipulative links to rank their own site or other sites. However, the links from these affected sites are devalued, and this is an effect that you may notice even if your site is not directly hit by Penguin.
  • Link devaluation is not the unnatural links penalty – The unnatural links penalty was a precursor to Penguin that completely deindexed several sites that people were using to manipulate their rankings. Once again, links from these penalized sites are devalued, which can indirectly impact your rankings.
The important thing to understand about link devaluation is that it is not a penalty in the true sense. Devalued links simply don’t count, or don’t count as much as they used to. Most people who are impacted by Google updates aren’t actually directly affected. Instead, they are affected by the devalued links from the sites that are directly affected.

Now that links are being devalued on a continuous basis, you can be impacted even in the absence of a Google update. Do not confuse devalued links with penalties.

Responding to Link Devaluation: Do’s and Don’ts

It’s easy to do more harm than good by overreacting to a devaluation of your links (or any update or penalty, for that matter). Here are a few things to keep in mind to stay on the right track.

Do:
  1. Revise your link building strategy by putting a focus on high quality links. We’ve written extensively about how to do that at Search Engine Journal with three posts on the subject.
  2. Use an anchor text strategy built for the modern world.
  3. Focus on content marketing as a link earning strategy, rather than merely building links. Take a look at our guide on the subject to get a handle on how to approach this.
  4. Approach SEO from a brand-building perspective, not a ranking perspective.

Don’t:
  1. Do not waste time removing low quality or spam links from your profile. The bad ones have already been devalued and aren’t being counted anymore. They don’t count against you if this is genuinely a link devaluation.
  2. Do not use the Google link disavow tool. In general, you shouldn’t use this tool unless you have received a warning or a notice of action directly from Google, as we’ve previously discussed. At best, you’ll only disavow the links that Google has already devalued. More likely, you’ll disavow links that Google hasn’t devalued and end up shooting yourself in the foot.
  3. Do not use any link building technique that allows you to personally build a large number of links quickly.
  4. Do not build links using easily produced or automated content. Build links using content that attracts attention.
  5. Avoid links that don’t make sense in the modern era, like the ones we talked about in this SEJ post.

5 Reasons the Push Toward Link Devaluation is Actually a Good Thing

If Google’s new emphasis on continuous link devaluation sounds scary to some SEOs, here are a few reasons to see the change as a positive one:
  1. Devalued links don’t count against you, so there is no reason to spend time removing all the suspected links yourself.
  2. Devalued links don’t cause your site’s traffic to plummet overnight in the majority of cases, which gives you time to adjust your strategy.
  3. You will generally still see some of your pages unaffected after link devaluation occurs, unless a large number of devaluations causes your entire domain authority to start sinking.
  4. You can focus all of your efforts on building high quality links, rather than being concerned about weeding out bad ones.
  5. Spammers will be less likely to see results even in the short term, as opposed to the repeated process of success then penalty over and over again. Similarly, you will get more consistent feedback from your rankings about whether what you are doing is working or not.


Link devaluation is not easy to identify unless you analyze your rankings in depth or enlist professional help. Identifying the true cause of a drop matters more than rushing into action; if you can’t pin down the cause, I highly recommend getting professional help, because moving in the wrong direction can dig you into deeper trouble and leave your rankings unrecovered for a long time.

If you come across suspicious behavior on your website and are not sure about the cause, feel free to get in touch with us; we’d be glad to take a look and provide our suggestions.
Do you have any other ideas or suggestions for recovering from link devaluation?
