
27 Things That Help and Hurt SEO


Additionally, factors can be relative.  If factor A is at a certain level, then factors B and C contribute accordingly.  If factor A’s value drops or increases, that causes changes to how the signals from B and C are integrated and thus the final result changes – sometimes minutely, sometimes dramatically.  It's a bit like a manual transmission in a car.  You can travel at 40 mph using 1st gear, 2nd gear or 3rd.  Still the same 40 mph, but due to the different gear ratios engaged with each individual selection (1st, 2nd or 3rd gear), the engine speed will vary, while maintaining that 40 mph.

In fact, it’s worth noting that instead of chasing SEO at the on-page and technical levels, many would be better served by trying to influence people to impact some of those important signals and factors.  Rather than guess at information that is never shared, working on influencing people and developing a repeatable pattern that works is often a better bet.

If you can influence visitors to your site to share a link or tweet about your product by placing relevant social sharing options in the best locations (which you determine through testing, not guessing), you not only get the value of their efforts and actions directly (their friends seeing it and clicking through to you), but you also reap the rewards of how those signals impact the search algorithms.  In the end, you have discovered a process that’s repeatable as well – if you post a positive, funny article, for example, more people share it.  The process then becomes: write positive, funny article, post, use optimized social sharing option placements.
And that is infinitely easier to figure out than to try to guess what’s going on inside an engine’s algorithm.
But what things matter?  Well, let’s look at the common things, many of which are often overlooked or handled with shortcuts.
What to pay attention to:
  • Title tags
  • Meta Descriptions
  • Clean URLs
  • Images and Alt descriptions (also called alt tags)
  • H1 tags
  • Rel=Canonical
  • Robots.txt
  • Sitemaps
  • Social sharing options
  • Unique content
  • Depth of content
  • Matching content type to visitor expectations (text, images, video, etc.)
  • Usability
  • Page load times (to a certain point – faster is great, but not at the expense of usability and usefulness)
  • Crawlability (AKA discoverability, so can we actually get to all your content)
  • News – if you are actually a news site, submit for inclusion
What to skip:
  • Meta Keywords (fill them in if you like, keep it short and relevant, but not a big ranking factor)
  • Duplicate URLs
  • Overly long URLs (no set number, but you’ve all seen these)
  • Cloaking (comes down to your intent, but risky business for sure)
  • Link buying
  • Selling links
  • Link and like farms
  • Three-way links
  • Content duplication
  • Auto following in social media
These lists are, as suggested at the start of this article, not complete.  They are also not in any particular order, as what matters in one instance will differ from another.  Some items will apply to you, some might not.  Some you may know already; others you may need to research to learn more about.  And taking something like H1 tags from the “pay attention to” list and saying “Yeah, we have those” does not mean you’re done.  You have to understand why an H1 tag is important to a reader, and learn how to write optimized content for them.  Just “having” a tag in place doesn’t mean yours are doing everything they can for you.
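Several of the on-page items in the first list can be checked automatically. Below is a minimal sketch, using only Python’s standard library, that flags a missing title tag, meta description, or rel=canonical, and a page with more or fewer than one H1. The warning thresholds are illustrative assumptions, not Google rules.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects a few of the on-page signals listed above from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.meta_description = None
        self.canonical = None
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
            self.title = ""
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")
        elif tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_page(html):
    """Return a list of human-readable warnings for missing on-page elements."""
    checker = OnPageChecker()
    checker.feed(html)
    warnings = []
    if not checker.title or not checker.title.strip():
        warnings.append("missing <title>")
    if not checker.meta_description:
        warnings.append("missing meta description")
    if checker.h1_count != 1:
        warnings.append(f"expected 1 <h1>, found {checker.h1_count}")
    if not checker.canonical:
        warnings.append("missing rel=canonical")
    return warnings
```

A check like this only tells you the tags exist; as noted above, it says nothing about whether they are written well for readers.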

Duplicate Content Like Terms and Conditions Will Not Cause Harm to the Site's Rankings

Duplicate content across many webpages, like the terms and conditions pages that are similar in many respects but legally required, will not hurt your website's rankings in Google. However, duplicate content on other important pages of your site that is not legally required, or that you have scope to change, may affect your website's presence on Google.

Matt Cutts gave a simple explanation: legally required boilerplate pages are ignored by the search engines and do not harm the search rankings of the website. This explanation was necessary because many webmasters were confused after the Panda and Penguin updates, believing that any kind of duplicate content could hurt their website. It was for this reason that webmasters asked Google for a definitive answer. Have a look at this video:


 

Google’s Matt Cutts On Hidden Text Using Expandable Sections: “You’ll Be In Good Shape”

In the latest video answer from Google’s head of search spam Matt Cutts, he answers a question about “hiding” text in JavaScript/AJAX-like expandable menus. Matt said that generally this practice is acceptable within Google’s webmaster guidelines if, and only if, it is being done in a non-spammy way.
Use of expandable content sections is very popular across the Web and used by large e-commerce sites and content sites to make the content on the page less threatening and more welcoming.
Matt said there are very good reasons to implement expandable content sections, but if you are doing it with the intent of tricking Google by hiding content, in that case it would be against their guidelines. Otherwise, “you’ll be in good shape” using this Web design technique.
The example Matt gives is how Wikipedia collapses some of the content on their mobile interface, here is a screen shot:
[Screenshot: Wikipedia's mobile interface with collapsed content sections]
As you can see, for a page with a lot of content, this can be very helpful for users.
Below is the video:
Again, if you do this without the intent of hiding content from Google, you should be okay.

Does SEO Still Matter?

We’ve all heard “SEO is Dead” from alarmists, the uninformed, and in countless link bait articles (irony?). But is there any truth to it?

While Search Engine Optimization may look completely different than it did even a year ago, I firmly believe that SEO is not only alive, but thriving. Here’s why:

Some of SEO is Dead

Many tactics that have fallen under the SEO umbrella can safely be considered dead, either because they don’t work anymore, never worked, or still work but are in violation of Google’s guidelines. I’m not going to spend time discussing why they don’t work or are risky, because that’s not what this article is about, and there has been plenty written about the topic.
Just so we’re on the same page, here are some examples of basic SEO tactics that aren’t worth your time:
  • Keyword stuffing and hiding
  • Buying mass links, directory links
  • Duplicating websites (or categories) on different domains
  • Content spinning, automatic content
  • Optimizing purely for “ranking” outcomes


What SEO is Today

SEO at its core is the art and science of making high quality content easier to find on search engines. The key point being ‘quality content’ that helps customers answer questions that lead to purchase or some other business outcome. Most of Google’s algorithm updates are intended to reward good content and punish spam. While it may not always feel like it, most of Google’s best practices for SEO are really on your side, you just need to learn and master them.
Here are some SEO tactics that are alive and well:
  • Keywords that support customer targeting
  • SEO copywriting and on-page optimization
  • Link attraction
  • Internal link optimization
  • Technical SEO (anything designed to make your site more accessible to search engines)
  • Optimizing for engagement and conversions

Quality Content is Good, Optimized Content is Best

If search engines are just trying to reward high quality content by making it more findable, isn’t it enough to just create great content and call it a day? Unfortunately, no.
While search engines are getting much smarter, more efficient, and overall better at ‘screening’ content, they still pale in comparison to people’s inherent ability to pick out the nuances and meaning of content. So it’s important to send the right signals to search engines and make those signals as easy to understand as possible.
Content quality comes down to relevance for customers and there’s no better way to target customer interests than through keywords. Every search begins with someone typing keywords into a search box, and ends with them clicking on one of the sites listed in the search results. If your site doesn’t include the keywords or closely related phrases on web pages, in meta-data, or inbound link anchor text, you’re not giving the search engines (or buyers) the information they need to understand your site’s relevance for that search query.
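As a trivial illustration of the keyword-presence point above, a crude check like the following can show which target phrases never actually appear in a page’s copy or meta-data. It is a naive case-insensitive substring match, purely for illustration, not a relevance model.

```python
def keyword_coverage(keywords, page_text):
    """Map each target keyword to whether it appears in the page text.

    A naive case-insensitive substring check, for illustration only;
    real keyword analysis would also look at meta elements and anchor text."""
    text = page_text.lower()
    return {kw: kw.lower() in text for kw in keywords}
```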
Optimization of on-page copy and meta elements can have positive effects on search traffic and rankings, in particular for sites that are strong in most other aspects. For example, I have been working with a client in the software industry who has a well designed site that is technically sound, has useful and compelling content, and a strong back-link profile. 
However, competitor keyword research and customer targeting analysis indicated that the keywords most relevant to their audience in the consideration and purchase stages of the buying cycle weren’t being effectively targeted (i.e. they didn’t appear enough, or at all, in on-page copy, meta elements, or cross-linking).
Within 3 months of implementation of basic on-page content optimization, we achieved a 320% increase in organic search traffic, a 15% decrease in average bounce rate and page one rankings in the major search engines for nearly all of our identified target keywords. Better visibility for what customers are actually looking for leads to more traffic and sales. 

Links Still Matter

While Google’s recent announcement about the decreased importance of links is significant, it is far too soon to write off quality links altogether. Crawling links is an important way for search engines to discover content, thus the more links pointing to your site (from relevant, quality sources), the more opportunities the search engines have to find your content.
Don’t fall into the trap of treating links as more important than quality content, or of believing that enough links pointing towards bad content can somehow make it good. This is the definition of misguided effort, as great content will not only attract quality links on its own (with help from effective promotion and social media shares), but is far more likely to increase visitor engagement when it’s found, and result in those all-important conversions.
Social shares are as important as links from other web pages, so ensure your content creation efforts include content promotion efforts through social networks. Grow networks on a regular basis to increase the audience reach of the optimized content you’re promoting too. Google+, Facebook and Twitter are must-haves with any content promotion efforts through social media. Just make sure you’re promoting plenty of other useful content, not just your own.
Increasingly, it has become important to not only acquire quality links, but to monitor and potentially remove low quality links, especially if you have received an unnatural link warning from Google. Regular monitoring and auditing of your site’s link profile is a good preventative measure, as bad links often have a cumulative effect, and can be very difficult to clean up once they become a clear problem.
Recently, a preliminary audit of a new client’s site indicated the prevalence of several nasty kinds of links, including paid site-wide links, and several thousand links from blog networks and link farms. Given the severity of the problem, we prioritized an extensive inbound link audit and disavowal initiative to ensure the quality content being published would not be negatively affected by previous SEO link building efforts.

Technical Problems can Prevent Search Engines (and People) from Finding and Engaging with Your Content

As fast as things change in SEO, the chances that search engine algorithms will start to penalize sites for functioning well from a technical standpoint are slim, and humans are no different. How many times have you wished a site would load slower?
The importance of optimizing your site so that your pages load fast, your content is easily accessible and your navigation is intuitive cannot be overstated. People will leave a site and never return if they get confused or have to wait too long, and search engines will too.
This is one area in particular to keep a close eye on, as small technical issues can have widespread and severe effects on your site’s search engine friendliness. Many companies with large sites that employ digital marketing agencies with strong SEO skills receive their value many times over just from ongoing technical optimization.
For example, unintentionally blocking pages or a whole site from being indexed in search engines via robots.txt is not only an SEO killer but also very easy to do. Often development teams will temporarily block parts of a site when making updates, and unfortunately neglect to restore the robots.txt file afterwards.
As site updates can often introduce indexation as well as other technical website problems, it’s a good idea to include a step for an external team to check for any problems following  a major update, as well as on an ongoing basis.
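A quick automated check can catch the robots.txt mistake described above before it lingers. The sketch below uses Python's standard-library `urllib.robotparser`; the rules and paths are made-up examples.

```python
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_lines, paths, agent="*"):
    """Return the subset of paths that the given robots.txt rules disallow."""
    rp = RobotFileParser()
    rp.parse(robots_lines)  # parse() accepts an iterable of robots.txt lines
    return [p for p in paths if not rp.can_fetch(agent, p)]

# Example: a deploy accidentally left a site-wide block in place.
rules = ["User-agent: *", "Disallow: /"]
print(blocked_paths(rules, ["/", "/products/widget"]))  # both paths are blocked
```

Running a check like this against a list of key URLs after every deploy, as the paragraph above suggests, makes the "whoops, we blocked the whole site" scenario much harder to miss.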

Modern SEO is Alive and Well

By definition, SEO is about an ongoing effort to improve the performance of your website content to be found both by search engines and customers using search engines. What better time is there for your useful content to be found than at the exact moment your customers need it? That’s the value search engine optimization brings to the online marketing mix. As long as people use search engines to find information and businesses have content they want potential customers to see, SEO will be important. I don’t see that changing anytime soon.

Linking all Your Domains Together - Is it Right in the Eyes of Google?

Many large web networks, blog networks, big brands, etc. have more than one website, and a majority of them link all these websites together. Some may even link more than 30 websites from a single domain, resulting in massive interlinking and serious spam issues. However, whether you should link all your domains together depends entirely on user experience, and how well you have kept the user in mind while interlinking your domains.

For Country-Specific TLDs, Interlinking is Fine

For businesses with 10-12 domains or more, all on country-specific TLDs, it is OK to interlink their websites. These businesses should create a separate page listing all their domains and link to it from the home page. This is a much better option than linking all the domains from the footer.

Here is a video from Matt Cutts giving advice on how you should link your domains together:


Google’s Matt Cutts: Duplicate Content Won’t Hurt You, Unless It Is Spammy

Duplicate content is a huge topic in the search engine optimization (SEO) space; heck, we even have a category devoted to the topic. But should we worry about it? Google’s head of search spam, Matt Cutts, said he wouldn’t stress about it — that is, unless it is spammy duplicate content.
In a video posted today, Matt Cutts answers, “How does required duplicate content (terms and conditions, etc.) affect search?”
Matt Cutts said twice that you should not stress about it; in the worst non-spammy case, Google may just ignore the duplicate content. Matt said in the video, “I wouldn’t stress about this unless the content that you have duplicated is spammy or keyword stuffing.”
Google has said time and time again, duplicate content issues are rarely a penalty. It is more about Google knowing which page they should rank and which page they should not. Google doesn’t want to show the same content to searchers for the same query; they do like to diversify the results to their searchers.

Syndication is a Valid Way to Get More Links

Search engine optimizers and internet marketers have long used content syndication as an effective method to increase links to a website. But one question still haunts them: "Is content syndication a valid way to get more links?" Well, the answer is yes. In Matt Cutts' opinion, content syndication used for link building is not against the rules. The point is, you only need to get a few technical things right.

Content Syndication


How to Do Content Syndication the Right Way?


In order to get the full benefit of content syndication, you need to put some mechanical things in place.

Publish the content under your Google Authorship - You need to publish the content either on your main site or on another authoritative site under your Google Authorship. This will help Google identify the original author of the article and pass on full value.

Use rel=canonical - You must use the rel=canonical tag to help Google understand the location of the original content and treat the syndicated copies as unoriginal.

Make Your Website Ready for 2013 Link Building - Infographic

Link building has changed a lot in 2013. The infographic shared below unveils the basics of link building in 2013.

5 Things Google Gives Priority While Presenting the Results


Quality - The overall quality of the site matters a lot. This includes the quality of content and the trust which the audience holds in the content.

Relevancy - Google always picks the most relevant site out of the lot with respect to the search query.

Frequency - The frequency with which a brand receives promotion, including links and the posting of content, holds a certain value in the eyes of Google.

Naturalness - How natural is the link building process? Is the website using unnatural methods to build links? The naturalness with which links are created can make or break a website.

Quantity - The quantity of links and the rate at which content is created on the site both hold value.

Link Building Platforms


Content Marketing - The traditional way of marketing which includes article syndication and guest posting.

Social Media - Promoting a website on major social media platforms like Facebook, Google Plus, Twitter, StumbleUpon etc.

Local Niche Sites - Presence of business details in locally relevant websites.

Geo Local Sites - Presence of business details in geo local sites.

Lead Generation Platforms - These are costly but give good results.

[Infographic: Link building in 2013]

Happy Blogging!

Reasons Why Accepting Guest Posts Could Be A Threat

Guest Posting is no longer a mysterious term. If you are in the field of blogging, then you must be familiar with it. Guest posting is one of the proven methods to improve Search Engine Optimization performance by improving Page Rank. We all know that incoming backlinks are really helpful in increasing Page Rank, while there are lots of other factors that also build Page Rank.
Page Rank is actually named after the co-founder of Google, Larry Page. It’s basically an algorithm that ranks a page out of 10. The concept of Page Rank was introduced in 1996 at Stanford University by Page and Sergey Brin. The concept of guest posting didn’t exist at that time because blogging was not popular and the internet was not available to everyone. This meant that big firms didn’t use the internet to grow their businesses.
Nowadays, the trend is totally different. Almost all the big firms are online, and blogging has grown enormously; a huge share of the internet is built on blogs. Blogging was introduced around 12-13 years ago and nowadays it’s a billion dollar business. As the competition among bloggers and e-commerce firms increased, Google introduced many restrictions on showing pages in search results, and made Page Rank for the purpose of ranking a page. To improve their Page Rank, everyone started trying different practices, and guest blogging came into existence. Afterwards, guest blogging gained much popularity and became a proven method to build Page Rank. It has also shown its importance in building relationships between bloggers.
A few months ago, Matt Cutts, Head of Search Quality Team at Google,  released a statement that hosting guest posts can be harmful for your blog. Even I think that hosting too many guest posts can be harmful and can threaten your blog.
After the Panda and Penguin updates, guest blogging came to be considered the best way to build backlinks to improve Page Rank. But over the past few months I have seen many bloggers stop accepting guest posts on their blogs. There must be reasons for this.

Reasons why accepting too many guest posts could be a threat -
Guest Posting may degrade your page rank -
Yeah, hosting guest posts may degrade your Page Rank. How? Take a look. Suppose you did lots of link building. All that Google data (backlinks, site links, Page Rank, etc.) constitutes what is known as "Google juice". Now consider an example: you have a glass full of that juice, and someone comes and asks you to share it, so you give them some. What happened here? You lost some of your juice. If you keep doing this, how much juice will you have left?
May lead to lot of broken links -
If you are hosting guest posts, then you are going to have outbound links on your blog. What is the guarantee that all those links will keep pointing to a working page? Suppose you hosted 500 guest posts on your blog, and 100 of the pages those posts link to are later deleted from the internet. Boom! You just got 100 broken links on your blog, which may degrade your SEO performance, because Google hates seeing a lot of broken links.
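The broken-outbound-link risk described above is easy to monitor. Here is a rough Python sketch: it extracts anchor hrefs from a post's HTML and flags those whose HTTP status indicates a dead page. The status-fetching function is injected, so in practice you would pass something that issues a real HTTP HEAD request; the URLs in the example are hypothetical.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag in a chunk of HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_broken_links(post_html, fetch_status):
    """Return the outbound links in post_html whose status code is 400 or above.

    fetch_status maps a URL to an HTTP status code; inject a real HTTP
    request in production, or a stub dictionary lookup for testing."""
    parser = LinkExtractor()
    parser.feed(post_html)
    return [u for u in parser.links if fetch_status(u) >= 400]
```

Run periodically over archived guest posts, a check like this lets you fix or remove dead links before they pile up.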
Guest posting can affect your blogging relations -
Here comes the major problem. Although guest posting is one of the tools used to build relations with other bloggers, it is also one of the major things that damage those relations. It’s pretty common that everyone wants to contribute a guest post to the popular blogs, but it’s not possible to host every post, because not every blogger produces high quality content. Getting a guest post rejected creates resentment towards that site. If you lose a writer, then you most likely lose their readership as well.
Guest posting may lead to losing your blogging identity -
It’s true that hosting too many guest posts may cause you to lose your identity in your community and your blog’s voice. Almost all bloggers have their own format for publishing their posts. If you are publishing only guest posts, then when are you going to publish your own creations? Don’t you think that’s going to erode your blogging identity?
Guest posting may lead to spamming -
The most hated thing in the virtual world is spamming. Many guest posters spam by using links that are irrelevant to the site. By allowing this, you may lose important readers.
What you can do -
  • Allow guest post only for limited days.
  • Publish only high quality content.
  • Write very strict guidelines for the guest bloggers to follow.
That’s what I think about hosting guest posts on your blog. I would like to hear from you. Please share your valuable feedback in the comment section below. Keep Blogging!

Black Hat SEO Practices That Can (and Will) Damage Your Reputation

The Internet has changed the way business is handled, and in doing so, it has also changed many of the ways businesses go about building and managing their reputations. A business's online reputation can help make or break its success. With that said, businesses are taking extreme steps to build and protect those reputations.

About Using SEO to Build a Reputation

 
When SEO tactics are used correctly, a website or business gains the favorable attention of search engines. This helps the site move up in search rankings, which means the site will come up in searches more often. In other words, the site listing is more likely to get found when people are searching for terms related to the site. The bottom line is, SEO is designed to increase traffic to a site by earning high-ranking positions on SERPs (search engine results pages).

These SEO techniques are "white hat" best practices. They should be used when working to develop and maintain a business, and also when building an online reputation.

Black Hat SEO


The label "black hat" is used to refer to inappropriate SEO strategies. Black hat SEO methods are those designed to "trick" search engines or damage other websites in some way. The use of these techniques can result in websites dropping in search engine ranks or being completely banned from a search engine. That has a detrimental effect on the business's online reputation.

The goal of black hat SEO is really the same as white hat methods -- to gain search engine juice. But not only are these techniques frowned upon, they can literally ruin a business's overall Internet reputation.

SEO Tactics to Avoid


Hidden Content: Although there are different ways to accomplish this dirty deed, some webmasters will add keywords in a white font to a white background or use black font on a black background, etc. This results in the words being invisible to site visitors, but the keywords are picked up by search engines.
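As a toy illustration, the naive inline-style version of this trick can be caught mechanically. The sketch below flags elements whose inline CSS sets the text color equal to the background color; real detection (external stylesheets, off-screen positioning, tiny fonts, etc.) is far more involved, so treat this as a sketch only.

```python
import re

def suspicious_inline_styles(html):
    """Flag inline styles where the text color equals the background color
    (e.g. white-on-white), the classic hidden-text trick described above.
    Only catches the naive inline-style case; external CSS is out of scope."""
    hits = []
    for style in re.findall(r'style="([^"]*)"', html):
        decls = {}
        for decl in style.split(";"):
            if ":" in decl:
                prop, value = decl.split(":", 1)
                decls[prop.strip().lower()] = value.strip().lower()
        if "color" in decls and decls.get("background-color") == decls["color"]:
            hits.append(style)
    return hits
```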

Link Farming: Link building is an important part of SEO, but links should be relevant, and they should be built over time in a natural way. Joining a link farm will ensure links, but they do not help increase traffic to the site, and they are highly frowned on by search engines.

Worthless Content: The truth is soon discovered about sites that add worthless content just for the sake of building quantity and adding keywords. These sites become known as spam sites that post irrelevant content.

Things to Do Instead


The basics of SEO begin with proper keyword research and learning how and where to use those words and terms. In most cases, a short phrase is more effective than a single word.

The website itself should be user-friendly and easy to navigate. It should be designed in a way that makes it easy for visitors to find what they are looking for. Add quality content with the intent of becoming an authority site. Content should always be properly optimized.

Link-building can be a slow process, but it is necessary. Making use of natural link building methods is the best way to add to your site's reputation. Internal links are also important for SEO purposes. Link to related articles whenever possible.

White hat SEO techniques are best practices that can take any business to a higher level, and when practiced consistently, these methods will protect and maintain the online reputation of the business. 

Google Will Try To Get More Examples Of ‘Bad Links’ In Messages To Webmasters

Google says it will try to get more examples of so-called “bad links” in its messages to webmasters who have submitted reconsideration requests after being hit with webspam penalties.
In a Webmaster Help video today, Google’s Matt Cutts responded to the submitted question:
Client got unnatural links warning in Sept’ 12 without any example links, 90% links removed, asked for examples in every RR but no reply, shouldnt it be better to have live/cached “list” of bad links or penalties in GWT? Think about genuine businesses.
“That’s fair feedback. We appreciate that,” says Cutts. “We’re working on becoming more transparent, and giving more examples with messages as we can. I wouldn’t try to say, ‘Hey, give me examples in a reconsideration request,’ because a reconsideration request – we’ll read what you say, but we can really only give a small number of replies – basically ‘Yes, the reconsideration request has been granted,’ or ‘No, you still have work to do.’ There’s a very thin middle ground, which is, ‘Your request has been processed.’ That usually only applies if you have multiple webspam actions, and maybe one has been cleared, but you might have other ones left. But typically you’ll get a yes or no back.”
He continues, “But there’s no field in that request to say – a live amount of text – to just say, ‘Okay, here’s some more examples. But we will work on trying to get more examples in the messages as they go out or some way where you…for example, it would be great if you could just log into Webmaster Tools and see some examples there.”
“What I would say is that if you have gotten that message, feel free to stop by the Webmaster Forum, and see if you can ask for any examples, and if there’s any Googlers hanging out on the forum, maybe we can check the specific spam incident, and see whether we might be able to post or provide an example of links within that thread,” Cutts concludes. “But we’ll keep working on trying to improve things and making them more transparent.”

5 Important Link Removal Facts Post Penguin 2.0

Penguin 2.0 launched May 22nd, causing many sites to lose vital rankings, visibility, and traffic. This will without a doubt lead to yet another wave of link removal projects, which have been prevalent since Penguin 1.0.
Before diving into your backlink portfolio and attempting a Penguin recovery, here are 5 important link removal facts of which you should be aware.
1)    Matt Cutts recently stated that link removal/disavow needs to be done with a “machete”, not a “scalpel” or “fine-toothed comb”.
Cutts is fairly direct and straightforward. When a site is hit by Penguin there’s no sugar-coating it. Anything under suspicion needs to be removed if there’s to be hope of forward progress. His actual statement:
“Hmm. One common issue we see with disavow requests is people going through with a fine-toothed comb when they really need to do something more like a machete on the bad backlinks. For example, often it would help to use the “domain:” operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links. That’s one reason why we sometimes see it take a while to clean up those old, not-very-good links.”
Personally I can attest to this. Anyone who has spent time working on link removal, disavow, and reconsideration requests knows that Google’s not going to reward half efforts. There needs to be considerable work done, and a true mending of ways. Even a hint of spam will receive nothing more than a vague “At this time…”
2)    There are no guarantees with Google
The first mantra of every SEO’s life should be ‘There are no guarantees with Google’. Before you launch a project, especially link removal, it’s important to stare this statement in the face. Think about it, understand it, and truly accept it.
We’ve had successful link removal campaigns, and we’ve seen recovery from both manual actions and algorithmic penalties. Given enough time, energy, and resources I have no doubt that virtually all recovery campaigns are possible. But, at the end of the day, there are no guarantees with Google.
3)    Link Removal isn’t a small undertaking
Link removal is an exhausting task. To meet Google’s standards there are basically four steps to any link removal campaign:
A)     Backlink portfolio analysis
Here you’ll be performing a complete analysis of your backlink portfolio using Open Site Explorer, Majestic, or Ahrefs. These portfolios can be quite large, potentially with tens of thousands of links or more, and need to be properly categorized.
Again, it’s important to ensure you’re not skimming the worst off the top. You’ll have to dive deep and ensure you’re getting as close to every offender as possible. Specifically:
  • Paid links
  • Link directories
  • Irrelevant links
  • Bad link neighborhoods
  • Site-wide links on low quality sites
  • Spammy blog comments
  • Article directories
  • Link exchanges
  • Etc. etc.
Basically, get rid of any link you wouldn’t want Google to take a look at, or any link you’d have to explain with a conditional statement (“that link is actually good because…”).
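As a rough illustration, a first pass over an exported backlink list can be automated. The keyword patterns below are illustrative assumptions for demonstration only, not Google’s criteria, and every flagged link still needs human review:

```python
# Illustrative sketch: rough first-pass triage of a backlink export.
# The keyword heuristics below are assumptions, not Google's rules --
# manual review of every link is still required.

SUSPECT_PATTERNS = [
    "article", "directory", "links.html", "blogroll",
    "comment", "exchange",
]

def flag_suspect_links(urls):
    """Split a flat list of backlink URLs into (suspect, needs_review)."""
    suspect, needs_review = [], []
    for url in urls:
        lowered = url.lower()
        if any(p in lowered for p in SUSPECT_PATTERNS):
            suspect.append(url)
        else:
            needs_review.append(url)
    return suspect, needs_review

backlinks = [
    "http://free-article-directory.example.com/my-keyword-article",
    "http://industry-news.example.org/interview-with-our-ceo",
    "http://bad-neighborhood.example.net/links.html",
]
suspect, review = flag_suspect_links(backlinks)
print(len(suspect), len(review))  # 2 suspect, 1 for manual review
```

A crude filter like this only surfaces candidates; links from bad neighborhoods or paid placements rarely announce themselves in the URL, which is why the deep manual dive above is unavoidable.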
B)      Find contact information
You’ll need a way to contact all these sites in order to request to have the link removed – a very important part of the link removal process. Google will potentially ignore any disavowed links if there’s been no effort to have the link removed.
C)      Outreach
Such a simple word for an exhausting process. Here you’ll be contacting every site from which you want your link removed – often thousands of sites, depending upon the project.
It’s important to note that contacting them all once isn’t enough. You should contact all the sites at least three times, over the course of a month, in order to prove you’ve made every possible effort.
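A simple log of contact dates makes that three-touch schedule manageable at scale. In this sketch, the 10-day follow-up interval and the field layout are assumptions for demonstration, not a standard Google requires:

```python
from datetime import date, timedelta

# Illustrative sketch of an outreach log. The 10-day follow-up interval
# and three-attempt target are assumptions, not a Google requirement.

FOLLOW_UP_AFTER = timedelta(days=10)
MAX_ATTEMPTS = 3

def due_for_follow_up(log, today):
    """log maps domain -> list of dates contacted; returns domains due for another email."""
    due = []
    for domain, attempts in log.items():
        if len(attempts) < MAX_ATTEMPTS and today - max(attempts) >= FOLLOW_UP_AFTER:
            due.append(domain)
    return due

log = {
    "spammy-directory.example.com": [date(2013, 6, 1)],
    "link-farm.example.net": [date(2013, 6, 1), date(2013, 6, 11), date(2013, 6, 21)],
}
print(due_for_follow_up(log, date(2013, 6, 12)))  # -> ['spammy-directory.example.com']
```

Whatever form your log takes, it doubles as the documentation of effort that Google wants to see in a reconsideration request.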
D)     Disavow
You’ll never be able to have every spammy, low quality, irrelevant link removed. There will be sites that are abandoned, sites with no contact information, webmasters that refuse or ask for money, etc. etc.
Once again, make every effort to have the link removed. Once you’ve had as many links as possible removed, go ahead and disavow the rest, including notes as necessary.
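The disavow file Google accepts is plain text, one URL or “domain:” entry per line, with “#” comments serving as those notes. The domains below are placeholders:

```text
# Spammy article directory - removal requested 3 times, no response
domain:spammy-directory.example.com

# Individual paid link; webmaster asked for money to remove it
http://blog.example.org/sponsored-post/
```

Remember Cutts’s advice above: the “domain:” form is the machete, and it should be your default for sites that are bad through and through.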
E)      Rinse and repeat
The hidden step, you’ll often have to rinse and repeat the whole process depending upon Google’s response. Once again, think machete, not scalpel.
4)    New links are vital
Link building is generally overlooked, or put on pause, during a link removal campaign. And while it logically makes sense to focus all of your energy on link removal, it’s actually more effective to build quality links in conjunction with link removal.
This is true for a variety of reasons.
First of all, building quality links signals to Google that you’ve changed your ways and mended your tune. These newly built, high-authority links will be proof that you’re moving in a new, better direction, which is very important when wrangling with Google.
Secondly, these new links will help lessen the blow of the removal itself. You should be removing a large number of links from your backlink portfolio, and no matter how careful you are (and you shouldn’t be overly cautious), some of the links you remove were passing value. New links of higher quality should ensure a quick recovery from any dip you see as the old ones come down.
5)    Link removal is extremely difficult without tools
Tools are absolutely vital to a successful, effective, and efficient link removal campaign. These projects often have hundreds of hours invested in them, and any tool that provides an edge is important.
At the bare minimum, you’ll need help from a tool that can run a backlink analysis on your site. Some of the top rated:
  • http://www.opensiteexplorer.org/ – from Moz (formerly SEOmoz).
  • http://www.majesticseo.com/ – Majestic SEO
  • https://ahrefs.com/index.php – Ahrefs, a clever play on the HTML <a href> tag
  • http://raventools.com/tools/ – Raven tools
  • http://www.link-assistant.com/seo-spyglass/ – SEO Spyglass
Going beyond that, there are tools specifically developed to help ease the pain of link removal. Some of the top rated:
  • Remove’em – A very comprehensive tool, also the most expensive. Helps keep track of the project and emails, as well as suspicious link discovery.
  • rmoov – Helps identify contact information, create and manage outreach, complete with reminders.
  • SEO Gadget – Automatically rates whether the link is ‘safe or not’. Can do 200 at a time, and will help find contact information as well.
No matter which tools you use, make sure you’re documenting your work. Documentation, documentation, documentation! Not only will it keep the project flowing smoothly and efficiently, but Google’s unlikely to revoke manual actions without proof of effort and change.
Here’s a video from Cutts himself which discusses the unnatural link detection warning as well as a few changes Google’s currently working on:

These five link removal facts will hopefully prove useful as webmasters gear up for lengthy link removal projects, especially since the release of Penguin 2.0. I wish everyone the best moving forward and a speedy recovery.

Google: Guest Blogging For Links? You Better Nofollow Those Links

Marie Haynes spotted two video responses from Google in which Google’s John Mueller said that, in general, it is best to nofollow links in stories you write, especially when those stories are guest blog posts written for the purpose of link building.
In general, that is Google’s advice. If you link to something with the intent that it should help your Google rankings – then nofollow the link. If you write something without that intent and the link is really natural, then there is no reason to nofollow the link.
Here are the video snippets from Google’s John Mueller, start around 49:56 in:
Generally speaking, if you’re submitting articles for your website, or your clients’ websites and you’re including links to those websites there, then that’s probably something I’d nofollow because those aren’t essentially natural links from that website.
If you are a journalist, you might be writing about a web site in a natural way… that is, of course, okay.
With the second video, start around 31:19 in:
Think about whether or not this is a link that would be on that site if it weren’t for your actions there. Especially when it comes to guest blogging, that’s something where you are essentially placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a linkbuilding point of view. I think sometimes it can make sense to guest blog on other peoples’ sites and drive some traffic to your site because people really liked what you are writing and they are interested in the topic and they click through that link to come to your website but those are probably the cases where you’d want to use something like a rel=nofollow on those links.
Google’s head of search spam, Matt Cutts, did post an official video on this exact question in October 2012.
In short, if you are guest blogging just to get links, then to be safe, nofollow the link.
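In markup terms, the safe version of such a link simply carries the rel="nofollow" attribute (the URL and anchor text here are placeholders):

```html
<!-- Followed link: counted by Google for ranking purposes -->
<a href="http://example.com/">My product</a>

<!-- Nofollowed link: tells Google not to count it for ranking -->
<a href="http://example.com/" rel="nofollow">My product</a>
```

The second form still sends readers (and referral traffic) to your site; it only gives up the ranking credit that Google considers unnatural in a guest post.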

Parameterless Searches - Soon to Become a Reality?

Soon you may be able to search with just a shake of your mobile device. These "parameterless searches" would allow you to get instant information without entering any query on Google: shake your phone, and the information is delivered instantly. Searches of this type would use data collected from the time and date, emails, geographic location, travel speed, habitual activity, and so on. So, get ready for parameterless search queries.

Here is the claim (screenshot omitted).

How Relevant Will the Results Be?

Every result provided by Google would be customized to each individual. This customization would be based on the current context of the user’s mobile computing device.

How Will it Work?

Parameterless searches will work based on several factors, as given below:

  • Geographical location of the user
  • Speed of travel
  • Device activity (data collected from the device)
  • Weather conditions
  • Time and date

In short, it works by establishing a connection between three different servers – the data server (a back-end server that stores the information to be presented), the application server (the middleware), and the front-end server (the user’s mobile or digital computing device).
