At Google, we believe in open access to information, so we try hard to make information from the web available to everyone. We believe that society works best when it provides a space for all voices to be heard, and that people are best served when they have access to a breadth of diverse content from a variety of sources. That’s why we do not remove content from search results – except in very limited circumstances, including legal removals, violations of our webmaster guidelines, or requests from the webmaster who is responsible for the page.
Since our search results reflect content and opinions that are already published on the web, in some instances they may surface content that contains biases, negative societal attitudes and practices, or offensive material. If the language of your search query matches very closely with the language used on a more controversial site, you may see that reflected in your results. Such content does not reflect Google’s own opinions, but our belief in open access to information means that we do not remove links to content simply because it contains views or information many people may disagree with.
While we may not get it right every time, we are constantly working to prevent poor quality or irrelevant content from appearing in your search results. We can’t anticipate every search query that we see – in fact, every day, 15% of the searches we process are ones that we’ve never seen before. The underlying content on the web is also constantly growing and changing, with hundreds of new web pages published every second. Consequently, we try to find algorithmic solutions that can address issues not just for one search results page, but for thousands or millions.
However, there are certain cases where we manually remove content in addition to these algorithmic solutions. For instance, we encourage people and authorities to alert us to content they believe violates the law. For many issues, such as privacy, defamation or hate speech, our legal obligations may vary country by country, as different jurisdictions have come to different conclusions about how to deal with these complex issues. In the case of all legal removals, we share information about government requests for removal in our Transparency Report. Where possible, we inform website owners about requests for removal via our Webmaster Console.
We want to keep people safe and respect the laws and cultural norms of the nearly 200 countries in which we offer services. While we try hard to make information from the web available to everyone, there are a few instances where we will remove content from Search:
We block search results that lead to images of child sexual abuse.
Upon request, we’ll remove personal information from search results if we believe it could make you susceptible to specific harm, such as identity theft or financial fraud. This includes sensitive government ID numbers such as US Social Security numbers, bank account numbers, credit card numbers and images of signatures. In cases where this information is broadly available, like national ID numbers listed on a government website, we generally don’t process removals. We sometimes refuse requests if we believe someone is attempting to abuse these policies to remove other information from our results. We also honour requests from people to remove nude or sexually explicit images and videos shared without their consent (often referred to as 'revenge porn') from Google search results.
Sometimes we remove content or features from our Search results for legal reasons. For example, we’ll remove content if we receive valid notification under the US Digital Millennium Copyright Act (DMCA), or under data protection law in the EU. We also remove content from local versions of Google consistent with local law, when we’re notified that content is an issue. For example, we’ll remove content that illegally glorifies the Nazi party from our German service or that unlawfully insults religion from our Indian service. We scrutinise these requests to ensure that they are well-founded, and we frequently refuse to remove when there is no clear basis in law to do so. When possible, we display a notification that results have been removed and report these removals to lumendatabase.org, a project run by the Berkman Centre for Internet and Society, which tracks online restrictions on speech. We also disclose certain details about legal removals from our Search results through our Transparency Report.
We remove web pages from search results at the request of the webmaster who is responsible for the page.
For certain Search features, like Autocomplete, where we provide information more proactively, we want to help you get to the information you are looking for as quickly as possible, but we also want to be careful not to show potentially upsetting content when you haven’t asked for it. For these features, we have developed policies to exclude things like porn, hate speech or violence from appearing. While we do our best to prevent content that may be offensive from appearing, Search features are algorithmic and we don’t always get it right – we welcome feedback from users, which helps us improve our algorithms and address content that violates our policies.
We also provide SafeSearch to help any user filter sexually explicit results. In addition to setting your personal account preferences in Search settings, parents can turn on the filter for their children’s supervised devices and accounts in the Family Link app, and workplace or school users can turn on the filter at the network level. While SafeSearch filters most explicit results, please be aware that it is not perfect and explicit materials may still appear.