None of the results provide any location information in the title or snippet, nor do they have a regional TLD (such as .ca for Canada). The only way to find the result you're looking for is to refine your search ([capital city boxing canada] works) or click through the various links to figure it out. Clicking through the first result reveals that there's apparently another "Capital City Boxing" club in Alabama.

Region tags improve search results by providing valuable information about website location right in the green URL line. Continuing our prior example, here's a screen shot of the new region tag (circled in red):



As you can see, the fourth result now includes the region name "Canada" after the green URL, so you can immediately tell that this result relates to the boxing club in Canada. With the new display, you no longer need to refine your search or click through the results to figure out which page is the one you're looking for. In general, our hope is that these region tags will help searchers more quickly identify which results are most relevant to their queries.

As a webmaster, you can control how this feature works by adjusting your Geographic Targeting settings. Log in to Webmaster Tools and choose Site configuration > Settings > Geographic Target. From here you can associate a particular country/region with your site. These settings will determine the name that appears as a region tag. You can learn more about using the Geographic Target tool in a prior blog post and in our Help Center.

We currently show region tags only for certain domains such as .com and .net where the location information would otherwise be unclear. We don't show region tags for results on domains like .br for Brazil, because the location is already implied by the green URL line in our default display. In addition, we only display region tags when the region supplied by the site owner is different from the domain where the search was entered. For example, if you do a search from the Singapore Google domain (google.com.sg), we won't show you region tags for all the websites webmasters have targeted to Singapore because we'd end up tagging too many results, and the tag is really most relevant for foreign regions. For the initial release, we anticipate roughly 1% of search results pages will include webpages with a region tag.
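
As a rough sketch of that display logic (the function and inputs below are purely illustrative, not Google's actual implementation):

    def show_region_tag(site_tld, target_region, searcher_region):
        # Only generic TLDs such as .com and .net get a tag; ccTLDs like .br
        # already imply a location in the green URL line.
        if site_tld not in ("com", "net"):
            return False
        # Show the tag only when the webmaster-supplied region differs from the
        # region of the Google domain the search was made from.
        return target_region is not None and target_region != searcher_region

    # A .com site geo-targeted to Canada, searched from google.com.sg: tag shown.
    print(show_region_tag("com", "Canada", "Singapore"))   # True
    # The same site searched from google.ca: no tag.
    print(show_region_tag("com", "Canada", "Canada"))      # False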

We hope you'll find this new feature useful, and we welcome your feedback.


Promote your site without comment spam

If you want to improve your site's visibility in the search results, spamming comments is definitely not the way to go. Instead, think about whether your site offers what people are looking for, such as useful information and tools.

FACT: Having original and useful content and making your site search engine friendly is the best strategy for better ranking. With an appealing site, you'll be recognized by the web community as a reliable source and links to your site will build naturally.

Google also provides advice to help you improve the crawlability and indexability of your site. Check out our Search Engine Optimization Starter Guide.

What can I do to avoid spam on my site?

Comments can be a really good source of information and an efficient way of engaging a site's users in discussions. This valuable content shouldn't be drowned out by gibberish keywords and links. Fortunately, there are many ways to secure your commenting system and discourage spammers:
  • Disallow anonymous posting.
  • Use CAPTCHAs and other methods to prevent automated comment spamming.
  • Turn on comment moderation.
  • Use the "nofollow" attribute for links in the comment field (see the sketch below).
  • Disallow hyperlinks in comments.
  • Block comment pages using robots.txt or meta tags.
For detailed information about these topics, check out our Help Center document on comment spam.
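
To illustrate the "nofollow" suggestion, here's a minimal Python sketch, using the BeautifulSoup library, that adds rel="nofollow" to every link in a user-submitted comment before it's published. The function name and example comment are ours, and most blogging and forum platforms already offer an equivalent setting.

    from bs4 import BeautifulSoup

    def nofollow_comment_links(comment_html):
        # Add rel="nofollow" to every link in a user-submitted comment.
        soup = BeautifulSoup(comment_html, "html.parser")
        for link in soup.find_all("a"):
            rel = set(link.get("rel") or [])
            rel.add("nofollow")
            link["rel"] = sorted(rel)
        return str(soup)

    print(nofollow_comment_links('Great post! <a href="http://example.com/">cheap pills</a>'))
    # Great post! <a href="http://example.com/" rel="nofollow">cheap pills</a>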

My site is full of comment spam, what should I do?

It's never too late! Don't let spammers ruin the experience for others. Adopt the security measures discussed above to stop the spam activity, then invest some time in cleaning up the spammy comments and banning the spammers from your site. Depending on your site's system, you may be able to save time by banning spammers and removing their comments all at once, rather than one by one.

If I spammed comment fields of third party sites, what should I do?

If you used this approach in the past and want to clean things up, take a look at your incoming links in Webmaster Tools. To do so, go to the Your site on the web section and click Links to your site. If you see suspicious links coming from blogs or other platforms that allow comments, check those URLs. If you find a spammy link you created, try to delete it; otherwise, contact the webmaster and ask them to remove the link. Once you've cleared the spammy inbound links you made, you can file a reconsideration request.

For more information about this topic and to discuss it with others, join us in the Webmaster Help Forum. (But don't leave spammy comments!)

Diagram of serving content from your mobile-enabled site


We work daily to improve search results and solve problems, but because the relationship between the desktop and mobile versions of a website can be nuanced, we appreciate the cooperation of webmasters. Your help will result in more mobile content being indexed by Google, improving the search results provided to users. Thank you for your cooperation in improving the mobile search user experience.


What's our take on watermarked images for Image Search? It's a complicated topic. I talked with Peter Linsley—my friend at the 'plex, video star, and Product Manager for Image Search—to hear his thoughts.

Maile: So, Peter... "watermarked images". Can you break it down for us?
Peter: It's understandable that webmasters find watermarking images beneficial.
Pros of watermarked images
  • Photographers can claim credit/be recognized for their art.
  • Unknown usage of the image is deterred.
If search traffic is important to a webmaster, then he/she may also want to consider some of our findings:
Findings relevant to watermarked images
  • Users prefer large, high-quality images (high-resolution, in-focus).
  • Users are more likely to click on quality thumbnails in search results. Quality pictures (again, high-res and in-focus) often look better at thumbnail size.
  • Distracting features such as loud watermarks, text over the image, and borders are likely to make the image look cluttered when reduced to thumbnail size.
In summary, if a feature such as watermarking reduces the user-perceived quality of your image or its thumbnail, then searchers may select it less often. Preview your images at thumbnail size to get an idea of how users might perceive them (a quick sketch for doing this follows the Q&A below).
Maile: Ahh, I see: Webmasters concerned with search traffic likely want to balance the positives of watermarking with the preferences of their users -- keeping in mind that sites that use clean images without distracting artifacts tend to be more popular, and that this can also impact rankings. Will Google rank an image differently just because it's watermarked?
Peter: Nope. The presence of a watermark doesn't itself cause an image to be ranked higher or lower.

Do you have questions or opinions on the topic? Let's chat in the webmaster forum.
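
If you'd like to follow Peter's advice and preview your images at thumbnail size, here's a minimal sketch using the Pillow imaging library (the file names and the 150x150 box are just examples):

    from PIL import Image

    image = Image.open("photo.jpg")
    image.thumbnail((150, 150))           # shrinks in place, keeping the aspect ratio
    image.save("photo_thumbnail.jpg")     # inspect this to see roughly what searchers would see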


It seems the world is going mobile, with many people using mobile phones on a daily basis and a large user base searching on Google's mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn't easy. Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. As a mobile search engineer, it's clear to me that while many mobile sites were designed with mobile viewing in mind, they weren't designed to be search friendly. I'd like to help ensure that your mobile site is also available to users of mobile search.

Here are troubleshooting tips to help ensure that your site is properly crawled and indexed:

Verify that your mobile site is indexed by Google

If your website doesn't show up in the results of a Google mobile search, even when you use the 'site:' operator, it may have one or both of the following issues:
Googlebot may not be able to find your site
Googlebot, our crawler, must crawl your site before it can be included in our search index. If you just created the site, we may not yet be aware of it. If that's the case, create a Mobile Sitemap and submit it to Google to inform us of the site's existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, just like a standard Sitemap.
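
A Mobile Sitemap is a regular XML Sitemap whose URL entries carry an extra mobile annotation. Here's a minimal Python sketch that writes one for a single placeholder URL (the namespaces follow Google's Mobile Sitemap documentation; the URL and file name are examples):

    MOBILE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
      <url>
        <loc>http://mobile.example.com/article.html</loc>
        <mobile:mobile/>
      </url>
    </urlset>
    """

    with open("sitemap_mobile.xml", "w") as f:
        f.write(MOBILE_SITEMAP)
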
Googlebot may not be able to access your site
Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Our crawler for mobile sites is "Googlebot-Mobile". If you'd like your site crawled, please allow access to any User-agent that includes "Googlebot-Mobile". Be aware that Google may change its User-agent information at any time without notice, so we don't recommend checking whether the User-agent exactly matches "Googlebot-Mobile" (the string used at present). Instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also use DNS lookups to verify Googlebot.
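
Here's a minimal Python sketch of both checks: a substring match on the User-agent header, and a reverse-then-forward DNS lookup to confirm that a request really comes from Google (the helper names are ours):

    import socket

    def is_googlebot_mobile(user_agent):
        # Check for the substring rather than an exact match, since the full
        # User-agent string may change over time.
        return "Googlebot-Mobile" in (user_agent or "")

    def verify_googlebot_ip(ip_address):
        # Reverse DNS should point at googlebot.com or google.com, and a forward
        # lookup of that host name should include the original IP address.
        host = socket.gethostbyaddr(ip_address)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip_address in socket.gethostbyname_ex(host)[2]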

Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile crawls your URLs, we then check whether each URL is viewable on a mobile device. Pages we determine aren't viewable on a mobile phone won't be included in our mobile site index (although they may be included in the regular web index). This determination is based on a variety of factors, one of which is the "DTD (Doc Type Definition)" declaration. Check that your mobile-friendly URLs' DTD declaration is an appropriate mobile format such as XHTML Mobile or Compact HTML. If it's in a compatible format, the page is eligible for the mobile search index. For more information, see the Mobile Webmaster Guidelines.
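
As a quick self-check, here's a minimal Python sketch that looks for a mobile DTD in a page's DOCTYPE declaration (the sample is the standard XHTML Mobile 1.0 DTD; the check is illustrative, not the logic Google actually uses):

    import re

    def has_mobile_doctype(html):
        # Look for a DOCTYPE that mentions a mobile format such as XHTML Mobile
        # or Compact HTML.
        match = re.search(r"<!DOCTYPE[^>]*>", html, re.IGNORECASE)
        if not match:
            return False
        doctype = match.group(0).lower()
        return "xhtml mobile" in doctype or "compact html" in doctype

    sample = ('<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN" '
              '"http://www.wapforum.org/DTD/xhtml-mobile10.dtd">')
    print(has_mobile_doctype(sample))   # True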

If you have any questions about mobile sites, post them in our Webmaster Help Forum, where webmasters around the world, as well as our team, will be happy to help.


As a post-Halloween treat, we're happy to announce a brand new user interface for our Keywords feature. We'll now be updating the data daily, providing details on how often we found each keyword, and displaying a handful of URLs that contain it. The significance column compares the frequency of a keyword to the frequency of the most popular keyword on your site. When you click on a keyword to view more details, you will get a list of up to 10 URLs that contain that keyword.

This will be really useful when you re-implement your site on a new technology framework, or need to identify which URLs may have been hacked. For example, if you start noticing your site appearing in search results for terms totally unrelated to your website (for example, "Viagra" or "casino"), you can use this feature to find those keywords and identify the pages that contain them. This will enable you to eliminate any hacked content quickly.

Let us know what you think!

Update: The described product or service is no longer available.


Just a few weeks ago, we made Google Friend Connect a lot easier to use by dramatically simplifying the setup process. Today, we're excited to announce several new features that make it possible for website owners to get to know their users, encourage users to get to know each other, and match their site content (including Google ads) to visitors' interests.

To learn more about these new features, check out the Google Social Web Blog.


We're convinced that structured data makes the web better, and we've worked hard to expand Rich Snippets to more search results and collect your feedback along the way. If you have review or people/social networking content on your site, it's easier than ever to mark up your content using microformats or RDFa so that Google can better understand it to generate useful Rich Snippets. Here are a few helpful improvements on our end to enable you to mark up your content:

Testing tool. See what Google is able to extract, and preview how pages marked up with microformats or RDFa would look in Google search results. Test your URLs with the Rich Snippets Testing Tool.


Google Custom Search users can also use the Rich Snippets Testing Tool to test markup usable in their Custom Search engine.

Better documentation. We've extended our documentation to include a new section containing Tips & Tricks and Frequently Asked Questions. Here we have responded to common points of confusion and provided instructions on how to maximize the chances of getting Rich Snippets for your site.

Extended RDFa support. In addition to the Person RDFa format, we have added support for the corresponding fields from the FOAF and vCard vocabularies for all those of you who asked for it.

Videos. If you have videos on your page, you can now mark up your content to help Google find those videos.

As before, marking up your content does not guarantee that Rich Snippets will be shown for your site. We will continue to expand this feature gradually to ensure a great user experience whenever Rich Snippets are shown in search results.


Today we're launching two cool features:
  • Malware details
  • Fetch as Googlebot
Malware details (developed by Lucas Ballard)

Before today, you may have been relying on manual testing, our Safe Browsing API, and malware notifications to determine which pages on your site may be distributing malware. Sometimes finding the malicious code is extremely difficult, even when you do know which pages it was found on. Today we're happy to announce that we'll be providing snippets of the code on those pages that we consider to be malicious. We hope this additional information enables you to eliminate the malware on your site very quickly and reduces the number of iterations many webmasters go through during the review process.

More information on this cool feature is available at our Online Security Blog.


Fetch as Googlebot (developed by Javier Tordable)

"What does Googlebot see when it accesses my page?" is a common question webmasters ask us on our forums and at conferences. Our keywords and HTML suggestions features help you understand the content we're extracting from your site, and any issues we may be running into at crawl and indexing time. However, we realized it was important to provide the ability for users to submit pages on their site and get real-time feedback on what Googlebot sees. This feature will help users a great deal when they re-implement their site with a new technology stack, find out that some of their pages have been hacked, or want to understand why they're not ranking for specific keywords.


We're pretty excited about this launch, and hope you are too. Let us know what you think!
