Google aims to provide the highest quality results for any search. As part of this, we take action to prevent what we call “webspam” (content and behaviors that violate our webmaster guidelines) from degrading the search experience. Our efforts help ensure that well under 1 percent of the results users visit are for spammy pages. Here’s more about how we fought webspam in 2018.



Google webspam trends and how we fought webspam in 2018



Of the types of spam we fought in 2018, three continue to stand out:


Spam on hacked websites: We reported in 2017 that we had seen a substantial reduction in spam from hacked websites in search results. This trend continued in 2018, with faster discovery of hacked web pages before they affect search results or put someone in harm’s way. While we reduced how spam on hacked sites affects search, hacked websites remain a major security problem affecting the safety of the web. Even though we can’t prevent a website hack from happening, we’re committed to helping webmasters whose websites have been compromised by offering resources to help them recover. 


User-generated spam: User-generated spam has been a continued focus for us. It includes spammy posts on forums, as well as spammy accounts on free blogs and platforms; none of this content is meant to be consumed by human beings, and all of it disrupts conversations while adding no value to users. In 2018, we reduced the impact of this type of spam on search users by more than 80%. While we can’t prevent websites from being exploited, we do want to make it easier for website owners to learn how to protect themselves, which is why we provide resources on how to prevent abuse of your site’s public areas; a minimal sketch of one such protection appears below.
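
One common protection recommended in those resources is to mark links in untrusted, user-submitted content with rel="nofollow", so spammers gain no ranking benefit from posting them. Here is a minimal TypeScript sketch of that idea; it assumes the third-party cheerio HTML parser, and the function name is illustrative, not part of any Google API.

```typescript
import * as cheerio from 'cheerio';

// Add rel="nofollow" to every link in a user-submitted HTML fragment,
// removing any ranking benefit a spammer might hope to gain.
export function nofollowUserLinks(userHtml: string): string {
  const $ = cheerio.load(userHtml);
  $('a[href]').attr('rel', 'nofollow');
  return $('body').html() ?? '';
}

// Example: a spammy forum post keeps its text but loses its link value.
console.log(nofollowUserLinks('<p>Buy <a href="https://spam.example">pills</a> now!</p>'));
// -> <p>Buy <a href="https://spam.example" rel="nofollow">pills</a> now!</p>
```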


Link spam: We continued to protect the value of authoritative and relevant links as an important ranking signal for Search. We dealt swiftly with egregious link spam and made a number of bad linking practices less effective for manipulating rankings. Above all, we continued to engage with webmasters and SEOs to chip away at the many myths about linking practices that have emerged over the years. Our message remains the same: if you avoid building links mainly as an attempt to rank better and focus on creating great content, you should not have to worry about any of these myths or realities. We think one of the best ways to fight spam of all types is to encourage website owners to simply create high-quality content. Resources such as the SEO starter guide highlight best practices and bust common myths and misconceptions about what it takes to appear well in Google Search results. Reporting link spam is also a great way to assist us in fighting this type of abuse and to help preserve fairness in Search ranking.



Working with users, webmasters and developers for a better web

Everyday users continue to help us find spam, malware and other issues in Search that escape our filters and processes by reporting spam on search, reporting phishing or reporting malware. We received over 180,000 search spam user reports and took action on 64% of the reports we processed. These reports truly make a difference, and we’d like to thank all of you who submitted them. 


We think it’s important to let website owners know when we detect something wrong with their website. In 2018, we generated over 186 million messages to website owners calling out potential improvements, issues and problems that could affect their site’s appearance in Search results. We can only deliver these notifications to site owners who have verified their sites in Search Console, and we successfully delivered 96 million of those messages. The rest of the messages are kept linked with the website for as long as they are relevant, so they can be seen when a webmaster successfully registers the site in Search Console. The majority of these messages welcomed new users to Search Console, and the second-largest group informed registered Search Console users when Mobile-First Indexing became available. Of all messages, slightly over 2% (about 4 million) were related to manual actions resulting from violations of our Webmaster Guidelines. 


High-quality content keeps spam out of search results, and we continued to improve the tools and reports we offer to the webmasters who create that content. Google Search Console was rebuilt from the ground up to provide both new and improved reports (Performance, Index Coverage, Links, Mobile Usability) and brand-new features (the URL Inspection tool and site and user management). The improved Search Console graduated out of beta in 2018 and is now generally available to all registered website owners.
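
The Performance report’s data is also exposed through the Search Console (Webmasters v3) API, so site owners can pull the same numbers programmatically. Below is a minimal TypeScript sketch using the googleapis Node client; the site URL, date range and credentials setup are placeholders you would adapt to your own verified property.

```typescript
import {google} from 'googleapis';

// Fetch the top search queries for a verified property, mirroring
// what the Performance report shows in the Search Console UI.
async function topQueries(siteUrl: string) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
  });
  const webmasters = google.webmasters({version: 'v3', auth});

  const res = await webmasters.searchanalytics.query({
    siteUrl, // e.g. 'https://example.com/'
    requestBody: {
      startDate: '2018-01-01', // placeholder date range
      endDate: '2018-01-31',
      dimensions: ['query'],
      rowLimit: 10,
    },
  });
  return res.data.rows ?? [];
}
```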


We didn’t forget the front-end developers who make the modern web work. We focused on helping them make their sites great for users and search-friendly, whether they are on a CMS, roll their own CSS and JavaScript, or build on top of a web framework. With the new SEO audit capability in Lighthouse, the open-source automated auditing tool for improving the quality of web pages, developers and webmasters can now run actionable SEO health checks on their pages and quickly identify areas for improvement.
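
The Lighthouse Node module can be driven programmatically as well as from Chrome DevTools or the CLI. The following TypeScript sketch, assuming the lighthouse and chrome-launcher npm packages, runs only the SEO category of audits against a page:

```typescript
import * as chromeLauncher from 'chrome-launcher';
import lighthouse from 'lighthouse';

// Run Lighthouse's SEO audits against a URL and print the category score.
async function seoAudit(url: string) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  try {
    const result = await lighthouse(url, {
      port: chrome.port,       // connect to the Chrome we just launched
      onlyCategories: ['seo'], // skip performance, accessibility, etc.
      output: 'json',
    });
    console.log(`SEO score for ${url}:`, result?.lhr.categories.seo.score);
  } finally {
    await chrome.kill();
  }
}

seoAudit('https://example.com');
```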


We also engage directly with website owners to provide help with thorny issues. Our dedicated team members meet with webmasters around the world regularly, both online and in person. We delivered more than 190 online office hours, online events and offline events in more than 76 cities, to audiences totaling over 170,000 SEOs, developers and online marketers. We hosted four search events in Tokyo, Singapore, Zurich and Osaka, as well as an 11-city Search Conference in India. In 2018, we started live office hours in Spanish in addition to English, French, German, Hindi and Japanese; webmasters can find help, tips and useful discussion on our Google Webmaster YouTube channel. Product experts continued to help webmasters find solutions through our official support forums in over a dozen languages. 


We look forward to continuing our work to deliver a spam-free Search experience to all in 2019!


Posted by Juan Felipe Rincón, Webmaster Outreach, Dublin




Google webspam trends and how we fought webspam in 2017



As we continued to improve, spammers also evolved. One of the trends in 2017 was an increase in website hacking, both for spamming search results and for spreading malware. Hacked websites are a serious threat to users because hackers can take complete control of a site, deface homepages, erase relevant content, or insert malware and harmful code. They may also record keystrokes, stealing login credentials for online banking or financial transactions. In 2017 we focused on reducing this threat and were able to detect and remove from search results more than 80 percent of these sites. But hacking is not just a spam problem for search users; it affects the owners of websites as well. To help website owners keep their websites safe, we created a hands-on resource for strengthening website security and revamped our help resources for recovering from a hacked website. The guides are available in 19 languages.

We also recognize the importance of robust content management systems (CMSs). A large percentage of websites run on one of several popular CMSs, and consequently spammers exploit them by finding ways to abuse their provisions for user-generated content, such as posting spam in comment sections or forums. We’re working closely with many providers of popular content management systems like WordPress and Joomla to help them fight spammers that abuse their forums, comment sections and websites.


Another abuse vector is the manipulation of links, one of the foundational ranking signals for Search. In 2017 we doubled down on our efforts to remove unnatural links via ranking improvements and scalable manual actions, and we observed a year-over-year reduction in spam links of almost half.


Working with users and webmasters for a better web



We’re here to listen: Our automated systems are constantly working to detect and block spam. Still, we always welcome hearing from you when something seems … phishy. Last year, we were able to take action on nearly 90,000 user reports of search spam.


Reporting spam, malware and other issues you find helps us protect the site owner and other searchers from this abuse. You can file a spam report, a phishing report or a malware report. We very much appreciate these reports—a big THANK YOU to all of you who submitted them.


We also actively work with webmasters to maintain the health of the web ecosystem. Last year, we sent 45 million messages to registered website owners via Search Console, letting them know about issues we identified with their websites. More than 6 million of these messages were related to manual actions, providing transparency to webmasters so they understand why their sites received manual actions and how to resolve the issues.

Last year, we released a beta version of the new Search Console to a limited number of users, and afterwards to all Search Console users. We listened to what matters most to users and started with popular features such as Search performance and Index Coverage. These can help webmasters optimize their websites' presence in Google Search more easily.

Through enhanced Safe Browsing protections, we continue to protect more users from bad actors online. In the last year, we made significant improvements to our Safe Browsing protection, such as broadening our protection of macOS devices, enabling predictive phishing protection in Chrome, cracking down on unwanted mobile software, and improving our ability to protect users from deceptive Chrome extension installation.
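
Developers can check URLs against the same threat lists through the Safe Browsing Lookup API (v4). A minimal TypeScript sketch follows; the API key and client name are placeholders, and an empty matches array simply means no threat was found for the URL.

```typescript
// Look up a URL in Google's Safe Browsing threat lists (Lookup API v4).
async function checkUrl(url: string, apiKey: string) {
  const res = await fetch(
    `https://safebrowsing.googleapis.com/v4/threatMatches:find?key=${apiKey}`,
    {
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({
        client: {clientId: 'example-client', clientVersion: '1.0'},
        threatInfo: {
          threatTypes: ['MALWARE', 'SOCIAL_ENGINEERING'],
          platformTypes: ['ANY_PLATFORM'],
          threatEntryTypes: ['URL'],
          threatEntries: [{url}],
        },
      }),
    },
  );
  const data = await res.json();
  return data.matches ?? []; // a non-empty array means the URL is flagged
}
```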


We have a multitude of channels to engage directly with webmasters. Our dedicated team members meet with webmasters regularly, both online and in person. We conducted more than 250 online office hours, online events and offline events around the world in more than 60 cities, to audiences totaling over 220,000 website owners, webmasters and digital marketers. In addition, our official support forum answered a high volume of questions in many languages. Last year, the forum had 63,000 threads generating over 280,000 contributing posts by 100+ Top Contributors globally. For more details, see this post. Apart from the forums, blogs and the SEO starter guide, the Google Webmaster YouTube channel is another place to find tips and insights. We launched a new SEO snippets video series that gives short, to-the-point answers to specific questions. Be sure to subscribe to the channel!


Despite all these improvements, we know we’re not yet done. We’re relentless in our pursuit of an abuse-free user experience, and will keep improving our collaboration with the ecosystem to make it happen.



Posted by Cody Kwok, Principal Engineer


In this study we learned that the ‘compare’ functionality was often overlooked. We consequently changed the design so that ‘filter’ and ‘compare’ appear in a unified dialog box, triggered when the ‘Add new’ chip is clicked. We continue to test this design and others to optimize its usability and usefulness.
We incorporated user feedback not only in practical design details, but also in architectural decisions. For example, user feedback led us to make major changes to the product’s core information architecture, influencing the navigation and product structure of the new Search Console. The error and coverage reports were originally separate, which could lead to multiple views of the same error. As a result of user feedback, we united error and coverage reporting into one holistic view.
As the launch date grew closer, we performed several larger-scale experiments. We A/B tested some of the new Search Console reports against the existing reports with 30,000 users. We tracked issue fix rates to verify that the new Search Console drives better results, and sent out follow-up surveys to learn about users’ experience. This feedback confirmed that export functionality was not a nice-to-have but a requirement for many users, and it helped us tune the detailed help pages in the initial release.
We are happy to announce that the new Search Console is now available to all sites. Whether it is through Search Console’s feedback button or through the user panel, we truly value a collaborative design process, where all of our users can help us build the best product.
Try out the new Search Console.
We're not finished yet! Which feature would you love to see in the next iteration of Search Console? Let us know below.

In the next few weeks, we're releasing two exciting beta features from the new Search Console to a small set of users: the Index Coverage report and the AMP fixing flow.

The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, example pages, and tips on how to fix indexing issues. It also enables a simple sitemap submission flow and the capability to filter all Index Coverage data by any of the submitted sitemaps.
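
Sitemap submission can also be done programmatically through the Search Console (Webmasters v3) API, which is handy if you generate sitemaps as part of a build or deploy step. Here is a minimal TypeScript sketch using the googleapis Node client, with placeholder URLs and credentials setup:

```typescript
import {google} from 'googleapis';

// Submit (or resubmit) a sitemap for a property verified in Search Console.
async function submitSitemap(siteUrl: string, sitemapUrl: string) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/webmasters'],
  });
  const webmasters = google.webmasters({version: 'v3', auth});

  await webmasters.sitemaps.submit({
    siteUrl,              // e.g. 'https://example.com/'
    feedpath: sitemapUrl, // e.g. 'https://example.com/sitemap.xml'
  });
}
```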
Here’s a peek of our new Index Coverage report:

The new AMP fixing flow

The new AMP fixing experience starts with the AMP Issues report. This report shows the current AMP issues affecting your site, grouped by the underlying error. Drill down into an issue to get more details, including sample affected pages. After you fix the underlying issue, click a button to verify your fix, and have Google recrawl the pages affected by that issue. Google will notify you of the progress of the recrawl, and will update the report as your fixes are validated.
As we start to experiment with these new features, some users will be introduced to the new redesign over the coming weeks.


You may have noticed the Google Webmaster Central blog has a new address: webmasters.googleblog.com.

That’s because starting today, Google is moving its blogs to a new domain to help people recognize when they’re reading an official blog from Google. These changes will roll out to all of Google’s blogs over time.

The previous address will redirect to the new domain, so your bookmarks and links will continue to work. Unfortunately, as with a custom domain change in Blogger, the Google+ comments on the blogs have been reset.

Thanks as always for reading—we’ll see you here again soon at webmasters.googleblog.com!


2. Understand how your app is doing in search results

How are users engaging with your app from search results? We’ve introduced two new ways for you to track performance for your app deep links:

3. Make sure key app resources can be crawled

Blocked resources are one of the top reasons for the “content mismatch” errors you see in Webmaster Tools’ Crawl Errors report. We need access to all the resources necessary to render your app page so that we can assess whether your associated web page has the same content as your app page; the sketch below shows how a stray robots.txt rule can cause exactly this problem.
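
A quick way to reproduce this kind of problem locally is to test resource URLs against your robots.txt rules. The TypeScript sketch below uses the third-party robots-parser npm package (an assumption, not a Google tool) to show how a Disallow rule can block a stylesheet Googlebot needs for rendering:

```typescript
import robotsParser from 'robots-parser';

// A robots.txt that accidentally blocks the CSS needed to render a page.
const robotsTxt = ['User-agent: Googlebot', 'Disallow: /assets/'].join('\n');

const robots = robotsParser('https://example.com/robots.txt', robotsTxt);

// Googlebot can't fetch this stylesheet, so the rendered page may not
// match what users see, triggering a "content mismatch" error.
console.log(robots.isAllowed('https://example.com/assets/app.css', 'Googlebot')); // false
```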

To help you find and fix these issues, we now show you the specific resources we can’t access that are critical for rendering your app page. If you see a content mismatch error for your app, look out for the list of blocked resources in “Step 5” of the details dialog:

4. Watch out for Android App errors

To help you identify errors when indexing your app, we’ll send you messages for all app errors we detect, and will also display most of them in the “Android apps” tab of the Crawl errors report.

In addition to the currently available “Content mismatch” and “Intent URI not supported” error alerts, we’re introducing three new error types:

In our experience, most errors are caused by a general setting in your app (e.g. a blocked resource, or a region picker that pops up when the user tries to open the app from search). Taking care of that generally resolves the error for all affected URIs.

Good luck in the pursuit of appiness! As always, if you have questions, feel free to drop by our Webmaster help forum.



If you have further interest in why cybercriminals hack sites for spammy purposes, see Tiffany Oberoi’s explanation in Step 5: Assess the damage (hacked with spam).

Tiffany Oberoi, a Webspam engineer, shares more information about sites hacked with spam

And if you’re curious about malware, Lucas Ballard from our Safe Browsing team, explains more about the topic in Step 5: Assess the damage (hacked with malware).

Lucas Ballard, a Safe Browsing engineer, and I pretend to have a totally natural conversation about malware

While we attempt to outline the necessary steps in recovery, each task remains fairly difficult for site owners unless they have advanced knowledge of system administration commands and experience with source code. We'd like to thank the steady members of the Webmaster Forum for helping fellow webmasters through the difficult recovery period. Specifically, in the Malware and hacked sites subforum, we'd be remiss not to mention the amazing contributions of Redleg and Denis Sinegubko.

How to avoid ever needing Help for hacked sites
Just as you focus on making a site that's good for users and search-engine friendly, keeping your site secure (for you and your visitors) is also paramount. When site owners fail to keep their site secure, hackers may exploit the vulnerability, and that's when you might need Help for hacked sites, available at www.google.com/webmasters/hacked. We look forward to not seeing you there!


That’s right, TCs & Google Guides came from all over the world to convene in California.

TCs & Google Guides from Webmaster Central and Search forums after one of the sessions.

After a day packed with presentations and breakout sessions...

...we did what we actually came for...

...enjoyed a party, celebrated and had a great time together.



If your site isn't appearing in Google search results, or it's performing more poorly than it once did (and you believe that it does not violate our Webmaster Guidelines), you can ask Google to reconsider your site. Over time, we’ve worked to improve the reconsideration process for webmasters. A couple of years ago, in addition to confirming that we had received the request, we started sending a second message to webmasters confirming that we had processed their request. This was a huge step for webmasters who were anxiously awaiting results. Since then, we’ve received feedback that webmasters wanted to know the outcome of their requests. Earlier this year, we started experimenting with sending more detailed reconsideration request responses and the feedback we’ve gotten has been very positive!

Now, if your site is affected by a manual spam action, we may let you know if we were able to revoke that manual action based on your reconsideration request. Or, we could tell you if your site is still in violation of our guidelines. This might be a discouraging thing to hear, but once you know that there is still a problem, it will help you diagnose the issue.

If your site is not actually affected by any manual action (this is the most common scenario), we may let you know that as well. Perhaps your site isn’t being ranked highly by our algorithms, in which case our systems will respond to improvements on the site as changes are made, without your needing to submit a reconsideration request. Or maybe your site has access issues that are preventing Googlebot from crawling and indexing it. For more help debugging ranking issues, read our article about why a site may not be showing up in Google search results.

We’ve made a lot of progress on making the entire reconsideration request process more transparent. We aren’t able to reply to individual requests with specific feedback, but now many webmasters will be able to find out if their site has been affected by a manual action and they’ll know the outcome of the reconsideration review. In an ideal world, Google could be completely transparent about how every part of our rankings work. However, we have to maintain a delicate balance: trying to give as much information to webmasters as we can without letting spammers probe how to do more harm to users. We're happy that Google has set the standard on tools, transparency, and communication with site owners, but we'll keep looking for ways to do even better.




If you'd like to know more about the positions available, here's the full list of requirements and responsibilities. Interested candidates can email the recruiter directly.
