
Saturday, August 24, 2013

What is the Google Penguin 2.0 Update?

Webmasters had been watching for Penguin 2.0 to hit the Google search results since Google's Distinguished Engineer Matt Cutts first announced in March that a next generation of Penguin was on the way. Cutts officially announced late Wednesday afternoon on "This Week in Google" that Penguin 2.0 was rolling out.
"It's gonna have a pretty big impact on web spam," Cutts said on the show. "It's a brand new generation of algorithms. The previous iteration of Penguin would essentinally only look at the home page of a site. The newer generation of Penguin goes much deeper and has a really big impact in certain small areas."
In a new blog post, Cutts added more details on Penguin 2.0, saying that the rollout is now complete and affects 2.3 percent of English-U.S. queries, and that it affects non-English queries as well. Cutts wrote:
We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.
This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.
Webmasters first got a hint that the next generation of Penguin was imminent when back on May 10 Cutts said on Twitter, “we do expect to roll out Penguin 2.0 (next generation of Penguin) sometime in the next few weeks though.”
[Image: Matt Cutts' tweet about Google Penguin]
Then in a Google Webmaster Help video, Cutts went into more detail on what Penguin 2.0 would bring, along with what new changes webmasters can expect over the coming months with regards to Google search results.
He detailed that the new Penguin would specifically target black hat spam, but would have a significantly larger impact on spam than the original Penguin and its subsequent updates had.
Google's initial Penguin update originally rolled out in April 2012, and was followed by two data refreshes of the algorithm last year – in May and October. 
Twitter is full of people commenting on the new Penguin 2.0, and there should be more information in the coming hours and days as webmasters compare SERPs that have been affected and what kinds of spam specifically got targeted by this new update.
Let us know if you've seen any significant changes, or if the update has helped or hurt your traffic/rankings in the comments.
Google has released a series of seven videos designed to help webmasters resolve specific types of spam issues that have been identified on their site. With Google Webmaster Tools offering more specific details about why a website might be penalized, these videos are designed to help you know exactly what kind of manual spam action your site has been impacted by, and the specific steps you can take to correct the issues in Google's eyes.

What is Pure Spam?

Google considers pure spam to be anything that anyone with a bit of tech savviness can tell is spam. Often called "black hat", Cutts said this includes such things as “auto generated gibberish, cloaking, scraping, throwaway sites, or throwaway domains, where someone is more or less doing churn and burn where they are creating as many sites as possible to make as much money as possible before they get caught.”
Cutts said this is the type of spam that Google takes the most action against. He added that it's rare for people to actually file reconsideration requests for sites that are classified as pure spam, because many webmasters treat these sites as churn and burn.
For example, here's an image of an auto-generated spam site that Cutts included in a blog post a number of years ago:
[Image: an auto-generated spam site. Image credit: Matt Cutts' blog]

Fixing Pure Spam on a New Website

Sometimes there are legitimate cases where site owners purchase a domain only to discover a huge amount of spam in the domain's history, making it difficult for the new owner to create something legitimate there. Anyone can look up a domain's history on archive.org and see what kind of spam issues had been happening, so it becomes a priority to ensure that the new owner starts with a clean slate, with none of the spam content anywhere to be seen.
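To illustrate the archive.org check mentioned above, here is a minimal sketch (not an official workflow) that queries the Internet Archive's public Wayback "availability" API to see whether a domain has an archived history worth reviewing before you buy it. The domain is a placeholder.

```python
# A minimal sketch: ask the Wayback Machine whether an archived copy of a
# domain exists, so you can review its history before purchasing it.
import json
import urllib.request

def closest_snapshot(domain: str) -> dict:
    """Return the closest archived snapshot for `domain`, or {} if none."""
    url = f"https://archive.org/wayback/available?url={domain}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data.get("archived_snapshots", {}).get("closest", {})

snap = closest_snapshot("example.com")  # placeholder domain
if snap:
    print("Archived copy to review:", snap["url"])
else:
    print("No archived history found for this domain.")
```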
If this sort of thing has happened to you, you must take special care to ensure that the new site you're putting on the previously spammy domain is high quality and nothing that could be remotely confused with spam. You essentially need to take actions within the site that give Google signals that the site is now trustworthy and should be included in the index.

Fixing Pure Spam on an Existing Website

If your site has been flagged as being pure spam, this is probably one of the more difficult spam flags to overcome, because it is reserved for the spammiest of websites. That means, when you file your reconsideration request, you need to ensure that there is nothing anywhere on the site that could be remotely considered spam.
When you're trying to clean up, ensure everything that violates the Google webmaster guidelines has been removed, and that the quality guidelines are being followed to the letter. You should look at it from the perspective of building an entirely new site with new quality content.
Cutts said it's important for webmasters trying to clean up after a pure spam warning to document everything they do, whether that's purchasing the domain from a previous owner, discovering and then removing spam you didn't realize existed on your site, or simply not knowing better when you created what you thought was a fabulous auto-generated site.
When you finally file a reconsideration request, be sure that you include the steps you took to clean it up and when, so that Google can investigate and decide if the site has really turned over a new leaf.

Sunday, August 11, 2013

Four SEO Tactics That Still Work After Google Penguin Update


Have your websites been penalized by Penguin 2.0? You are not alone. An important fact to know about this Penguin update is that it is an algorithm change, which means SEO tactics that had been working before might not be working now.
“About 2.3% of English-US queries are affected to the degree that a regular user might notice.” – Matt Cutts
So, what SEO tactics are still working? Here, I will discuss four SEO tactics that I am still using to maintain my page 1 rankings for my website.

1. Press Releases

Press releases are an old school SEO tactic that has been working for ages: articles that get syndicated or distributed across multiple news websites. The cool thing about press releases is that every article you write is allowed a backlink, and as your article gets distributed among other news websites, your backlinks increase as well.
There are multiple ways to distribute press releases; one of them is a service called PRWeb, which distributes your article across 100+ different websites, including Google News (PR 8), USA Today (PR 8) and MarketWatch (PR 8).
The best thing about these press releases is not only the PageRank of the host websites but also the traffic you will get from them, which can potentially run into the thousands.

2. Guest Blogs

Guest blogging is another SEO tactic that still works wonderfully well after Penguin. Even though Google has targeted guest post spammers in the past, you can still rank your website with guest blogs alone. One advantage of guest blogs is the relevancy of the links you get, and after Penguin 2.0, link relevancy is a very important Google ranking factor.
There are a few ways you can easily find guest blog opportunities. The first is the manual way: searching Google for terms like “become a contributor” and “write for us” + (your keyword).
For example: “write for us” + travel, “write for us” + SEO
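As an illustration of the manual approach, the sketch below simply combines the two footprints above with placeholder niche keywords to produce ready-to-paste Google queries.

```python
# A minimal sketch: generate guest-post search queries by pairing the
# footprints from the article with your own niche keywords.
footprints = ['"become a contributor"', '"write for us"']
keywords = ["travel", "SEO"]  # placeholders; swap in your own niche terms

for keyword in keywords:
    for footprint in footprints:
        print(f"{footprint} + {keyword}")
```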
An easier way to get guest blogs is to use guest blogging platforms like MyBlogGuest and BloggerLinkUp. These platforms connect bloggers who are looking for more posts with guest bloggers, which makes it much easier to get a guest post going than contacting each website owner from the search results one by one.

3. Relevant Blog Comments

Most blog comment tactics have been destroyed by Google over the past year with the previous Panda and Penguin updates. This is because link builders were using automation tools like Scrapebox to spam the whole web with their backlinks.
Google has put a tight filter on blog comments as a ranking factor. However, relevant blog comments still work in ranking your website.
A good way to get relevant blog comments is to go to Alltop.com then find your niche websites there and finally find pages to comment on.
Another way to find relevant websites to post on is by searching Google with this query: intext: leave a reply + SEO
Leaving just five relevant blog comments linking to your website every day can do wonders for its rankings.

4. Diversified Anchor Text

The fourth SEO tactic I use is not a link building tactic but an SEO strategy. For every backlink I place, I diversify the anchor text, meaning I won't use the same anchor text across multiple websites.
For example, when placing blog comments, 20% of the time I will use “link building”, 30% “backlink building”, 20% “learn more here” and the remaining 30% “link building after penguin”.
After Penguin 2.0, Google has placed heavy spam filters on websites that have the exact same anchor text linking back to them over and over. Therefore, be sure to use this anchor text strategy for the backlinks you are going to build.
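As an illustration of that 20/30/20/30 split, here is a minimal sketch that picks each new link's anchor text with a weighted random choice so that no single phrase dominates your profile.

```python
# A minimal sketch: choose anchor text according to the weights given in
# the example above, so the resulting link profile stays diversified.
import random

anchor_weights = {
    "link building": 0.20,
    "backlink building": 0.30,
    "learn more here": 0.20,
    "link building after penguin": 0.30,
}

texts = list(anchor_weights)
weights = list(anchor_weights.values())

# Pick an anchor text for each of the next ten links you plan to place.
for _ in range(10):
    print(random.choices(texts, weights=weights, k=1)[0])
```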

Wrapping Up

Press releases, guest blogs, relevant blog comments and diversified anchor text are the SEO tactics I am using to maintain my page 1 rankings after Penguin 2.0. The best thing about these tactics is that they have been working for a long time now, unlike most of the other SEO tactics you hear about.

Tuesday, July 9, 2013



10 Dead SEO Tactics (And Their Alternatives)


SEO is a constantly changing industry, and much of what you read in a passing introduction to the subject is outdated. We’d like to take this time to discuss several SEO tactics that are dead or dying. We’re not necessarily claiming that these tactics never work (although that’s true for a few of them). We’re simply pointing out that these tactics are losing value and place the future of your site at risk. All of these tactics are frowned upon by Google, and many of them can get you penalized. Some are more “gray” and still work, but can’t be expected to last long enough to be worth the investment.


Tactic #1 – Exact Match Domain Names


Like most of the tactics mentioned here, this can still work, but only if none of the other search results have anything going for them. There was a time when your domain name could give you a boost over your competitors, but this tactic was so abused that Google released an algorithmic update just to target exact match domains. (Here’s our infographic on the updates that came out in 2012.)

Today, exact match domains only really matter for branded searches. If somebody searches for Amazon, they’re going to get Amazon.com. Google is getting increasingly skilled at telling the difference between a branded search and a phrase search. If your domain is built to target a specific phrase, Google can usually tell, and this will only get more important in the future.

Alternative

Instead of choosing a domain name that matches a commonly searched for phrase, choose a domain name that will stick in people’s heads. Look at all the successful tech companies and you’ll see that very few of them are keyword-driven. Google, Facebook, Kickstarter, Amazon, Zappos: these names are designed to be remembered, not to inform.


It’s also important to realize that the end of exact match domains doesn’t mean the end of keyword research. It simply means that keyword research is far more important for individual pages than it is for domain names. This actually leads us right into our next point.

Tactic #2 – Exact Match Keywords in Titles and Meta Tags


Keyword-stuffed meta descriptions are as dead as an SEO tactic can be. The meta description carries no ranking value whatsoever; its only value is to encourage a click-through from the SERP. We can't stress this enough: it's good practice to get a call-to-action into your meta description, and that's the only reason you should be using it. Stuffing it with keywords is only going to scare away users.

Exact match keywords in titles are a grayer area. If you can fit an exact keyword into the title, it's still worth doing, but that's not really what we're talking about here. We're talking about cutting and pasting a keyword out of the AdWords Keyword Tool and giving no thought to your title. This should be avoided almost every time.

It’s not hard for Google to tell if every one of your page titles is simply stripped right out of the keyword tool, and users subconsciously pick up on it as well. This gets especially bad when the keyword phrase you’re targeting isn’t grammatically complete.

Alternative

Work a variation of the keyword phrase into a title that encourages click-through and social activity. Put the focus on grabbing attention and encouraging clicks, rather than on the keyword.


You should still try to work the keyword phrase, or a subtle variation of it, into the title, but not at the expense of a memorable and eye-grabbing title. Don't concern yourself with getting the exact phrase into the title. Adjust the tense and conjugation of your words as needed, and please don't be afraid to add punctuation or adjust the order of the words.

If you pay attention to the search results, you’ll notice that exact match titles don’t show up as often as they used to. Instead, the keywords increasingly show up scattered throughout the title and the body. Google is getting better at interpreting the meaning of your query, and your approach should reflect this change.

Tactic #3 – Meta Keywords

I hope everybody reading this already knows this, but meta keywords have absolutely no influence on search results whatsoever. This has been true since the end of the ‘90s. It can be good practice to include a few keywords in case your site gets scraped by some tool that still uses the meta keywords, but as far as SEO goes, this one is just plain useless.

Alternative

Focus on increasing your percentage of repeat visitors from the search engines. Google can measure how often users return to your site, and they use this information to determine how relevant your content is. I sometimes think of this as the present-day stand-in for meta keywords, since links are more accurately thought of as authority signals than as relevancy ones (though this isn’t completely true).


Analyze Google Analytics to find the pages on your site with the highest percentage of repeat visitors (the lowest percentage of new visitors). These are the pages that you want to promote and emulate the most. (Make sure that you are filtering yourself out of Google Analytics before taking this data too seriously).
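As a rough illustration, the sketch below assumes you've exported a Google Analytics page report to CSV with hypothetical columns "page" and "percent_new_visits" (your export's column names may differ) and surfaces the pages with the most repeat visitors.

```python
# A minimal sketch: sort an exported GA page report so the pages with the
# lowest share of new visitors (i.e. the most repeat visitors) come first.
import csv

with open("ga_pages_export.csv", newline="") as f:  # placeholder filename
    rows = list(csv.DictReader(f))

# Ascending by % new visits puts the most-revisited pages first.
rows.sort(key=lambda r: float(r["percent_new_visits"]))

print("Pages with the highest share of repeat visitors:")
for row in rows[:10]:
    returning = 100.0 - float(row["percent_new_visits"])
    print(f'{row["page"]}: {returning:.1f}% returning')
```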

Tactic #4 – Building Links with Duplicate Content


This one is not only dead, it never worked very well to begin with. Google can clearly identify when a piece of content is copied, and if you submit multiple pieces of content it will, at best, ignore all but one of them.
In the past, this may have been a decent way of capturing a link from the highest PageRank site willing to post your duplicate content, but today it’s actually just a good way to get yourself penalized. Google knows that spammers are trying to use this tactic in order to manipulate their rankings, and it very rarely works.

A variation on this tactic that has also worked in the past is to “spin” an article by running it through an algorithm that creates automated variations of it. This process results in very low quality articles that are no longer exact copies. Anybody who tries to read the articles can immediately tell that something is wrong with the grammar and that bizarre synonyms have been chosen. Google is no longer fooled by these tactics and can usually tell the difference between content written by a human and content written by an algorithm.

Alternative

It’s still possible to get multiple links from a single piece of content, and the best way to do it has always been the best way to do it: post your content somewhere where it will reach a massive audience and attract natural links as a result.


It no longer makes sense to create and submit a piece of content if the only benefit is a single link back to your site for each publication. This has never been the kind of link Google wanted to count, and it's never been the best way to grow your link profile.

To build the highest quality links, and grow your link profile at the fastest rate, you need to get published in a place where you’ll actually get read. What is the SEOmoz or SearchEngineJournal of your industry? That’s where you need to be published, because it sends referral leads directly to your site, and creates a ton of natural links in the process.

Don’t waste time on content submission that only results in a single link back to your site under your own control. It’s just not worth it anymore.

Tactic #5 – Keyword Density


This is another outdated tactic that, much like meta keywords, actually hasn’t worked for many years. There is no specific number of times you should use a keyword on your page, or a certain percentage of your content that it should make up.

As with many of the other tactics on this list, too much emphasis on keyword density can actually end up hurting your rankings. Thinking about keyword density while writing also makes it very difficult to write worthwhile content. The more time you spend pondering how to fit a keyword into a piece of content, the less time you’ll spend thinking about how to write sentences that keep users engaged and informed.
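For clarity, this is everything keyword density ever was: the share of your word count taken up by one phrase. The minimal sketch below computes it; chasing a target number here is exactly the habit this section warns against.

```python
# A minimal sketch: compute the classic "keyword density" metric, shown
# here only to make clear what the outdated tactic was optimizing for.
def keyword_density(text: str, phrase: str) -> float:
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words) if words else 0.0

sample = "Link building after Penguin means earning links rather than building them"
print(f"{keyword_density(sample, 'link building'):.1f}%")  # 18.2%
```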

Alternative

Instead of worrying about keyword density, you can capture a much larger space in the search engines by bringing in long-tail traffic. The more in-depth your content goes, the more long-tail traffic you'll bring in.


Research has a way of expanding the vocabulary of your content. As you dig deeper into the problems your users want solved, and mine the tough sources for original information, you’ll also serendipitously discover and use keyword phrases you would never have found in the keyword tool. Since they’re naturally attached to the subject at hand, they’re also commonly searched for along with your core keywords.

In addition, you can use the keyword tool to find related keywords, and discuss them within your content. When you do this, however, it should serve the end user. Don’t go out of your way to use several variations on your core keyword. Use the keyword tool to find related topics, and discuss them naturally within your blog posts.

Tactic #6 – “Unique” Content that Serves No Real Purpose


We’re currently exiting the period where “unique” meant little more than “not plagiarized.” Your content needs to be consistently creative, and there are things you can do to make that happen.

There was a time when it was easy to rank content as long as it was technically unique, in the sense that those exact words hadn’t been said elsewhere. All it took was enough links. This is still possible, but it rarely lasts long. It’s not sustainable.

Google’s Panda algorithm is designed to identify how well a piece of content serves its purpose. Leaked guidelines for Google’s quality raters make this abundantly clear.  If your content doesn’t meet its intended purpose for the user, or it has no purpose to begin with, it will eventually be taken down by an algorithm update or a human quality rater.

Alternative

Focus on the true meaning of unique, as in “unique selling proposition.” Your content must be designed to fill a hole in the idea marketplace. It must add value that no other prominent piece of content on the subject adds. That value can come from the research, the tone, the target audience, the medium, the personality, the user experience, or one of many other ways to differentiate yourself.


The point is to focus on unique value, as opposed to merely unique words. You accomplish this by identifying the specific problem you are trying to solve, and solving it for your target audience better than any other piece of content on the web can. If you can’t do that, you need to choose a different topic, because you’re chasing a fool’s errand.

Tactic #7 – Exact Match Anchor Text Links


The anchor text of a link hasn’t lost all of its value, but it’s no longer the most important ranking signal out there, and if you pursue anchor text overzealously you will end up with nothing more than a penalty to show for it.

Over-optimized anchor text sends a very clear message to Google: you have direct control over the links pointing toward your site, so they are not natural. In their eyes, this also means that they are irrelevant as a sign of your authority on the web.

Alternative

When you build backlinks yourself, you should move away from anchor text and start focusing on your conversion rates. Place links where they are more likely to be clicked, and use anchor text that’s more likely to result in a click-through. You can test this a bit by experimenting with AdWords or a different text link ad service if you wish.


The anchor text isn’t completely ignored, and it’s still used to find clues regarding what the linked page is about, but you should avoid too much emphasis on it. It’s generally best to just discus the linked content in the way that makes the most sense, and then attach the hyperlink to the part of the sentence that’s most likely to get clicked on.

In addition, it’s wise to use bare URLs, branded links, links containing the exact title of the linked page, and partial match anchor text.

At the same time, you shouldn't spend too much time trying to make your links appear “natural.” If you're doing things right, most of your link profile should already be natural, as we'll see at the end of the article.

Tactic #8 – Keyword Heavy Footer Links


It doesn’t matter whether they’re outbound or internal links, keyword heavy footer links are a bad idea. The footer has been abused as an SEO tool for quite some time, and Google has wised up to the fact. The search engines now place most of the emphasis on main content, and for the most part ignore links in the footer. Excessive keyword use in the footer is just asking for an algorithmic demotion.

Alternative

Use the footer to reduce your bounce rate. Google most likely measures “pogo-sticking” behavior, where a user clicks onto a site, clicks back, and goes to the next site. Users who instead stay on the site, and don’t return to the search results, tend to be more satisfied with that result.


Instead of filling the footer with keyword links, fill it with calls to action for more content that will interest users. Test, test, test and find the links that encourage the highest click-through rate. Keep users on your site so that they are more likely to remember your brand, share you with their friends, subscribe to your newsletter, and ultimately convert. Keep them from returning to the SERP and sending a negative signal to Google in the process.

Tactic #9 – Site-Wide Links


Nearly every client we’ve dealt with who was hit by Penguin or an unnatural link penalty had a problem with site-wide links, either on- or off-site. Whether they’re backlinks or internal links, a link from every single page on any website is generally a bad idea, especially if it’s keyword optimized.

Don’t get us wrong. You want a link back to your home page from every page on your site; that’s just good UI. And if a few sites happen to put you in their blogroll that’s rarely a problem, especially if they used your brand name rather than a keyword.

But if site-wide links end up making up a large part of your backlink profile, or nearly every page on your site links to every other page, you're just asking for trouble.

Alternative

For your internal links, just include a few links in your main body and make a few recommendations at the end of each blog post. This keeps users clicking through and seeing what you have to offer, which is great for engagement. There’s no reason to link to a single page from every other page on your site, unless it’s your home page, or it’s part of a “best of” list in your sidebar.


As for external links, you should essentially never build a site-wide link yourself. Don’t worry too much about site-wide links if they’re natural, but as for your own efforts you should stick to contextual links or calls to action in your signature. Links should look more or less editorial (although not to the point that you’re disguising the fact that it’s your link).

Tactic #10 – Unnatural Links

In general, you should avoid any unnatural linking scheme. What do we mean by this? Well, Google’s terms of service indicate that any link intended to manipulate rankings is a violation. Many SEOs fail to understand this, and mistakenly believe that their links are within Google’s guidelines as long as the quality levels are high.
I suppose we’re pushing things a bit by saying this tactic is “dead,” as it can still work quite well, but consider yourself on thin ice. Even quality links are in Google’s gray area if you built them yourself, in particular if the link offers little or no value outside of SEO. If there’s reason to believe that the link only exists to manipulate your rankings, there’s reason for Google to disregard the link.

Alternative

Stop building links to manipulate your rankings. I know, this is almost heretical in some circles, but it really is the only way to stay within Google’s terms of service, and it’s actually the fastest way to build links.

Instead of focusing on building links to grow your presence in the search engines, switch over to building links for referral traffic. This is the only method defensible as a long term marketing strategy.

How could this be the best path to improving your rankings? It’s simple. The pages that send the most referral traffic are the pages that Google wants to rank in the search engines. Focus on building links from those sites, and you will focus on promoting yourself with the influencers that matter most.

Google does not want to see your pages ranking on backlinks that you built. That’s an inconvenient truth for many SEOs, but there’s no getting around it. If you focus on referral traffic, you end up focusing on tactics that result in natural links. It’s the simple law of numbers. The more often people see your content, the more often they’ll link to it. In the meantime, your hand built links will come from the sites that send the most positive signals to Google, and actually indicate that you do have some influence.
It’s the path of least resistance. Cheating is harder.

Conclusion

There’s no point investing in outdated tactics that can’t be expected to work long term. While some of the tactics we’ve talked about can still be effective, this is only true in the short term. These are SEO strategies that can, at best, give you a false sense of security about the future, and all will eventually leave you dead in the water if you rely on them exclusively.
We realize we’ve taken a few strong stances here, and we welcome all feedback.

Post source: E2msolutions.com

Friday, June 21, 2013

Responding to the Link Devaluation “Google Update”



It’s been almost half a month and Google still denies that an algorithm update occurred on January 17th, 2013. Even so, SEOmoz’s Dr. Pete noticed that several sites were no longer ranking for branded terms. Within a day of the suspected update, several webmasters contacted us with concerns about big drops in their rankings. We noticed a common narrative in their stories.

Very few of them had seen a site-wide drop in rankings.
Whether the drop in rankings was large or small, the sites were only seeing a loss in rankings for some of their keywords. We discovered that many of them were still enjoying top positions for some of their other keywords.

After analyzing the affected sites in depth, we reached the conclusion that some sort of targeted link devaluation was underway. Some pages had dropped a few places in the results, others had plummeted out of the search results entirely, and still others were completely unaffected.

We’ve been tracking the rankings of a wide variety of sites over the past several months, and we find ourselves in agreement with what Branded3 has to say on the matter. We’re seeing Google moving in the direction of devaluing links on a continuous basis, as they are crawled, rather than eliminating them in large chunks with one-off updates like Penguin.

At this point, we’re fairly certain that the January 17 event was the result of continuous link devaluation, rather than a single update.

There was already some talk of an update on January 15, and certainly no shortage of grumblings before then. It’s our belief that January 17 was merely the point where this process reached critical mass. If Google is crawling bad links and using them as seeds to detect other bad links, then at some point this process will explode exponentially, which we feel is exactly what happened.

So in that respect, what Google is saying is true. There was no update on January 17. The changes started several months earlier.

Rather than delve into every nook and cranny of these link devaluation algorithms, we thought it would be more useful to offer you a guide to recovery, in a similar vein as our Ultimate Guide to Advanced Guest Blogging for SEOmoz. So let's take a look at how to recover from link devaluation, and how to prevent it in the first place.

What is Link Devaluation?

Link devaluation itself is nothing new. Google has never released an “official” update on the matter, but it has been happening for quite some time. Any webmaster who has witnessed a drop in rankings that had nothing to do with a penalty, increased competition, or a Google update has experienced link devaluation. It is simply the process whereby Google disregards a link, treating it as though it does not exist.

What is new is the continuous devaluation of links. In the past, Google employees manually devalued links, or used a combination of manual selection and algorithmic extrapolation to find and devalue links. Now, it appears that Google is devaluing links as they are crawled and indexed, rather than removing them in large chunks.

Google has many incentives to move in this direction. Penalties are post-hoc and selective. They are usually based on sites surpassing a certain threshold of suspicious links or on-site activity. Penalties, in general, target sites or individual pages rather than the link graph itself. In short, penalties only put a band-aid on the larger problem of webspam.

In contrast, link devaluation cleans up the entire link graph, rather than targeting individual sites.

Were Your Lost Rankings the Result of Link Devaluation?

If you are seeing a slow decline in your rankings rather than a sudden drop, this is almost certainly the result of link devaluation (if it’s not due to higher competition or fewer searches in the first place). But a relatively swift drop can also be the result of link devaluation if it only seems to be affecting specific pages, or if you are still seeing traffic even after the drop.

This is also true for some other updates and penalties, however, so you’ll want to consider the following:
  1. Has there been a Google update? Wait a few days and check with the top sites to see if any updates have been announced around the time you saw a drop in traffic. Check with MozCast to see if there were any major fluctuations in rankings around that time. If an update has occurred around that time, you will want to check into it before pursuing anything else.

  2. Take a look at the total “Links to your site” in Google Webmaster Tools to see if this number is dropping. If so, link devaluation is almost certainly the issue, since Google doesn't typically omit links from Webmaster Tools for no reason. It is a good idea to create a spreadsheet and record your link counts over time (or use a tool to do this for you; see the sketch after this list), so that these changes are more obvious.

  3. Identify the landing pages that have seen the largest drop in search traffic. If the quality of links is lower than usual and the quality of the content is average for your site, it’s unlikely to be Panda. If there hasn’t been a Penguin update, this means it is probably link devaluation.
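As an illustration of the record-keeping suggested in step 2, the sketch below appends each manually read “Links to your site” total to a CSV so that a gradual devaluation shows up as a visible downward trend. No Webmaster Tools API is assumed; the count is read from the dashboard by hand.

```python
# A minimal sketch: log the "Links to your site" total over time so a
# steady decline (a sign of link devaluation) becomes easy to spot.
import csv
from datetime import date
from pathlib import Path

LOG = Path("link_counts.csv")

def record_link_count(total_links: int) -> None:
    """Append today's total link count to the log, creating it if needed."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "total_links"])
        writer.writerow([date.today().isoformat(), total_links])

record_link_count(14230)  # placeholder figure from your own account
```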

Misconceptions About Link Devaluation

It’s easy to conflate all the various aspects of Google’s algorithm, so it’s important to clarify the following:
  • Link devaluation is not Panda – Panda is designed to target low quality content. It is not based on link metrics. However, links from these affected pages are devalued, and this can indirectly affect your rankings.
  • Link devaluation is not Penguin – Penguin targets entire sites that either use manipulative links to rank their own site or other sites. However, the links from these affected sites are devalued, and this is an effect that you may notice even if your site is not directly hit by Penguin.
  • Link devaluation is not the unnatural links penalty – The unnatural links penalty was a precursor to Penguin that completely deindexed several sites that people were using to manipulate their rankings. Once again, links from these penalized sites are devalued, which can indirectly impact your rankings.
The important thing to understand about link devaluation is that it is not a penalty in the true sense. Devalued links simply don’t count, or don’t count as much as they used to. Most people who are impacted by Google updates aren’t actually directly affected. Instead, they are affected by the devalued links from the sites that are directly affected.

Now that links are being devalued on a continuous basis, you can be impacted even in the absence of a Google update. Do not confuse devalued links with penalties.

Responding to Link Devaluation: Do’s and Don’ts

It’s easy to do more harm than good by overreacting to a devaluation of your links (or any update or penalty, for that matter). Here are a few things to keep in mind to keep you on the right track.

Do’s:

  1. Revise your link building strategy by putting a focus on high quality links. We’ve written extensively about how to do that at Search Engine Journal with three posts on the subject.
  2. Use an anchor text strategy built for the modern world.
  3. Focus on content marketing as a link earning strategy, rather than merely building links. Take a look at our guide on the subject to get a handle on how to approach this.
  4. Approach SEO from a brand-building perspective, not a ranking perspective.

Don’ts:

  1. Do not waste time removing low quality or spam links from your profile. The bad ones have already been devalued and aren’t being counted anymore. They don’t count against you if this is genuinely a link devaluation.
  2. Do not use the Google link disavow tool. In general, you shouldn’t use this tool unless you have received a warning or a notice of action directly from Google, as we’ve previously discussed. At best, you’ll only disavow the links that Google has already devalued. More likely, you’ll disavow links that Google hasn’t devalued and end up shooting yourself in the foot.
  3. Do not use any link building technique that allows you to personally build a large number of links quickly.
  4. Do not build links using easily produced or automated content. Build links using content that attracts attention.
  5. Avoid links that don’t make sense in the modern era, like the ones we talked about in this SEJ post.

5 Reasons the Push Toward Link Devaluation is Actually a Good Thing

If Google’s new emphasis on continuous link devaluation sounds scary to some SEOs, here are a few reasons to see the change as a positive one:
  1. Devalued links don’t count against you, so there is no reason to spend time removing all the suspected links yourself.
  2. Devalued links don’t cause your site’s traffic to plummet overnight in the majority of cases, which gives you time to adjust your strategy.
  3. You will generally still see some of your pages unaffected after link devaluation occurs, unless a large number of devaluations causes your entire domain authority to start sinking.
  4. You can focus all of your efforts on building high quality links, rather than being concerned about weeding out bad ones.
  5. Spammers will be less likely to see results even in the short term, as opposed to the repeated process of success then penalty over and over again. Similarly, you will get more consistent feedback from your rankings about whether what you are doing is working or not.

Conclusion

Link devaluation is not easy to identify unless you analyze your link profile deeply or get help from professionals. Identifying the true cause of a drop is more important than simply taking action and moving forward. I highly recommend getting professional help if you are unable to identify the cause, because moving in the wrong direction will put you in trouble, and you may never see your rankings recover.

If you come across suspicious behavior on your website and are not sure about the cause, feel free to get in touch with us; we'd be glad to take a look and provide our suggestions.
Do you have any other ideas or suggestions for recovering from link devaluation?