HOW TO: Quickly Remove Your Unwanted Links from Google's Search Results

Weed Out Broken and Unwanted Links

In the past, requesting that a page be removed from Google was a long and untracked process: you had to send a request through a feedback form. Since 2007, Google has rolled out a new set of tools that speed up URL removal for domain names you control through Google Webmaster Tools.  As not many people know about this process, here is a quick how-to so that your website promotion efforts don't end in a 404 from Google.

Why Proactively Check Your Links in Google Search?

Links are the technology that makes the web what it is, but like the Internet itself, your site can evolve quickly.  Along the way, human errors or necessary changes can cause unwanted or broken links to show up in a Google search of your site's content.

Dead links can negatively impact your website's promotion and brand in several ways:

  • Loss of Reputation: When visitors land on your site from Google and hit a 404 error page, they may immediately dismiss your site and perceive your brand as sloppy or inattentive to detail.
  • Not Search Engine Optimized: Any broken link is a sign that your site is not optimized for search engines and needs to be fixed.
  • Wasted Server Resources: Every time a missing page is requested, the error is appended to your server's log file.  Left unchecked, its growth can bog down your web host's performance (a short log-tallying sketch follows this list).
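
If you want a rough idea of how much 404 traffic your server is absorbing, a short script can tally the offenders from your access log. This is only a sketch: it assumes an Apache/Nginx combined-format log, and the log path is a placeholder you would adjust.

  # tally_404s.py - count "page not found" hits in a combined-format access log
  import re
  from collections import Counter

  LOG_PATH = "access.log"  # placeholder: point this at your server's log file
  # capture the requested path and the status code from each log entry
  LINE_RE = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+" (\d{3})')

  counts = Counter()
  with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
      for line in log:
          match = LINE_RE.search(line)
          if match and match.group(2) == "404":
              counts[match.group(1)] += 1

  # most frequently requested missing URLs first
  for url, hits in counts.most_common(20):
      print(f"{hits:6d}  {url}")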

Isn't it Automated?

You are right that the process is automated: Google will remove your broken links once Googlebot keeps receiving 404 HTTP responses as it refreshes its index.  However, this process is not fast, and a few months may pass before Google is confident enough that a previously indexed page should be removed.  Everyone should check how their site shows up in Google, since broken or unwanted links can slip in for the following reasons (a quick self-check sketch follows this list):

  • Broken: There are several reasons why broken links to your site appear in Google's Search Results.
    • Change of site organization or URL structure: 
      You reorganized your pages under a different URL structure, renamed a page, or purchased a domain name that was previously indexed.
    • Temporary redirect rule: 
      You patched the problem with a temporary (302) redirect, so Google will not remove the old page.
    • Broken and no-crawl: 
      A previously indexed page no longer exists but is also excluded from crawling by robots.txt, or has become an orphan page, so Googlebot never sees the 404.
  • Unwanted: Google will not remove working links on its own, even if you no longer want them listed.
    • Pages with no real content, such as forms and transitional messages.
    • Pages with personal information that you want accessible but not searchable through Google.
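
Before relying on Googlebot to notice these problems, you can spot-check a handful of URLs yourself. Below is a minimal sketch using only Python's standard library; the URLs are placeholders you would replace with links pulled from a site: search of your domain.

  # check_links.py - spot-check a few indexed URLs for 404s and redirects
  import urllib.request
  import urllib.error

  # placeholders: substitute links gathered from a site: search
  URLS = [
      "https://example.com/old-page",
      "https://example.com/renamed-article",
  ]

  for url in URLS:
      request = urllib.request.Request(url, method="HEAD")
      try:
          with urllib.request.urlopen(request, timeout=10) as response:
              final_url = response.geturl()
              if final_url != url:
                  print(f"REDIRECTED ({response.status})  {url} -> {final_url}")
              else:
                  print(f"OK ({response.status})  {url}")
      except urllib.error.HTTPError as error:
          print(f"BROKEN ({error.code})  {url}")   # 404, 410, and similar end up here
      except urllib.error.URLError as error:
          print(f"UNREACHABLE  {url}  ({error.reason})")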

How to Remove Your Unwanted Links from Google's Search Results Using Google Webmaster Tools

Prerequisites for Google's Expedited Link Removal

  • Have or be a webmaster who:
    • Has registered and verified the site using Google Webmaster Tools
    • Knows about bot exclusion rules using robots.txt and meta tags

How to Urgently Remove a Page from Google

  1. Find the links you may want to remove
    • Perform a Google search for site:example.com (using your own domain)
    • Go to the last page of the results and click the link to repeat the search with the omitted results included
    • Return to the first page
    • Manually verify each link by clicking on it, and note down the links you want removed (in a .txt file, Evernote, Google Docs, etc.)
  2. Verify the Removal Eligibility
    • To prevent exploits and search-result shaping, Google requires the page to be genuinely inaccessible
    • The URL must meet one of the following: return a 404 or 410 HTTP status, be blocked from crawling in robots.txt, or carry a "noindex" meta tag (example rules follow this list)
    • Verify that all your unwanted links are blocked from Googlebot
    • Verify that your broken links are not just temporarily down
  3. Submit Removal Request
    • Head to Google Webmaster Tools
      1. On the Webmaster Tools home page, click the site you want.
      2. Under Site configuration, click Crawler access.
      3. Click the Remove URL tab.
      4. Click New removal request.
    • Select whether you want to remove: 
      • Individual URLs: web pages, images, or other files
        Remove outdated or blocked web pages, images, and other documents from appearing in Google search results.
      • A directory and all subdirectories on your site
        Remove all files and subdirectories in a specific directory on your site from appearing in Google search results.
      • Your entire site
        Remove your site from appearing in Google search results.
      • Cached copy of a Google search result
        Remove the cached copy and description of a page that is either outdated or to which you've added a noarchive meta tag.
    • Remember that all the affected pages must meet the removal eligibility requirements (e.g., the subdirectory is excluded using robots.txt)
  4. Make sure they are not re-indexed
    • Note that these pages are removed from the search results for a minimum of 90 days, after which Google might re-index the unwanted pages.  It is therefore important to keep your robots.txt exclusion rules or your meta tags in place for those pages.
  5. Undo if necessary
    • If you made a mistake, you can cancel the request as long as it has not been processed yet
    • Within the 90-day window, you can also re-include the page in the same expedited manner
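
As a reference for steps 2 and 4, here is what the exclusion rules mentioned above might look like. The section and file names are made up for illustration; adapt them to your own site.

A robots.txt entry blocking Googlebot from a retired section and a single page:

  User-agent: Googlebot
  Disallow: /old-section/
  Disallow: /private-draft.html

Or, placed in the <head> of the page itself, a robots meta tag:

  <meta name="robots" content="noindex">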

Removing Cached Pages from Google

The removal of cached pages is slightly different.  Use cache:URL in Google to check for your cached copy, and add a "noarchive" robots or googlebot meta tag so that only the cached copy is removed rather than the page itself.
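
For reference, the tag goes in the page's <head>; the googlebot-specific variant limits the effect to Google's crawler:

  <meta name="robots" content="noarchive">
  <!-- or, to target only Google -->
  <meta name="googlebot" content="noarchive">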

Alternatives to Google Webmaster Tools

There are alternative ways to remove dead links from Google, but they are not guaranteed to be processed quickly.  We will expand on these in an upcoming post; they are:

  • Use a 410 HTTP response, which officially marks the URI as "Gone"
  • Use a 301 redirect, which marks the page as "Moved Permanently" (a server configuration sketch follows)
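
If your site runs on Apache, both responses can be produced with mod_alias directives in your server configuration or .htaccess file. The paths below are placeholders:

  # mark a deleted page as permanently gone (HTTP 410)
  Redirect gone /retired-page.html

  # send a renamed page to its new home (HTTP 301)
  Redirect permanent /old-name.html https://example.com/new-name.html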

As for unwanted links:

  • Add a "noindex" meta tag to the page
  • Add a rule that excludes the URI in robots.txt (a quick verification sketch follows)
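
Once a rule is in place, it is worth double-checking that it really blocks Googlebot before filing the removal request. Here is a minimal sketch using Python's standard-library robots.txt parser; the domain and paths are placeholders:

  # verify_blocked.py - confirm that robots.txt disallows a URL for Googlebot
  from urllib.robotparser import RobotFileParser

  parser = RobotFileParser()
  parser.set_url("https://example.com/robots.txt")  # placeholder domain
  parser.read()

  for url in ["https://example.com/old-section/page.html",
              "https://example.com/private-draft.html"]:
      if parser.can_fetch("Googlebot", url):
          print(f"STILL ALLOWED (fix your rules): {url}")
      else:
          print(f"blocked: {url}")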

As an alternative tool, Google also offers a public web page removal request tool, which can be used to remove personal information displayed on websites you do not own (after those pages have been updated).

Redirection and Link Checkers

Even after a link is removed, you may still have hundreds of visitors reaching it through bookmarks and the like.  Stay tuned: one of our next posts will cover how to properly redirect that traffic, so subscribe to our RSS feed to catch it.

Note that the dead links found using this method are usually the result of pages that once worked.  For pure "File Not Found" errors caused by a typo, or pages that were never linked to, you will need a link checker.  In one of our upcoming posts, we will also review automated link checkers and see how you can 404-proof your site before Googlebot crawls it.



