SafeSearch and Content Policies
SafeSearch and content policies affect search visibility differently, and the two are often conflated. SafeSearch is a preference-based filter applied at query time. Content policies govern what Google will index and show regardless of user settings. Each has distinct implications for publishers.
What SafeSearch does
SafeSearch is a filter applied by Google when serving results. When a user has SafeSearch enabled, results containing explicit sexual content or graphic violence are removed from the results page for that query.
The filtering happens at display time, not at indexing time. The underlying pages remain indexed. They are not penalised, not demoted, and not treated differently in ranking for users with SafeSearch off. A page filtered under SafeSearch is fully present in Google’s index.
SafeSearch tiers
Google operates three settings:
- Filter (on): explicit results are hidden entirely. The default for signed-in users under 18.
- Blur (formerly Moderate): explicit images are blurred, but some explicit text results may still appear. The current default for users who have not chosen a setting.
- Off: no filtering applied. Available to signed-in adult users in supported regions.
In some contexts, SafeSearch is locked at the network or account level: school networks, Google Workspace for Education accounts, and accounts belonging to users under 18. A locked setting cannot be changed by the user.
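Network-level locking is typically implemented in local DNS. Google documents a SafeSearch-enforcing virtual IP, forcesafesearch.google.com, that network administrators point Google's search hostname at. A zone-file-style sketch of the override (the exact syntax depends on your resolver; this is illustrative, not a drop-in config):

```
; Local DNS override: answer queries for www.google.com with a
; CNAME to Google's SafeSearch-enforcing VIP. Clients using this
; resolver then get SafeSearch forced on for all Google queries.
www.google.com.  IN  CNAME  forcesafesearch.google.com.
```

Because the override lives in the network's resolver rather than in the user's account, it applies to every device on the network and cannot be disabled from the search settings page.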
How Google classifies content
Google’s SafeSearch classifier evaluates pages at the page level based on:
- Explicit sexual content. The primary category: nudity, graphic sexual depictions, adult content.
- Graphic violence. Depictions of severe violence, gore, or graphic injury.
- Shocking content. Disturbing content that does not fall primarily under the above categories.
A site that publishes both standard and explicit content has individual pages classified separately. The site as a whole is not labelled; individual pages may be filtered.
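For mixed sites, Google's SafeSearch documentation asks publishers to label their explicit pages explicitly rather than leaving classification entirely to the algorithm. The documented mechanism is a rating meta tag on each adult page (shown here in a minimal head fragment):

```html
<head>
  <!-- Mark this individual page as adult content so the
       SafeSearch classifier does not have to infer it. -->
  <meta name="rating" content="adult">
</head>
```

Labelling explicit pages helps keep the classification page-level and accurate, which reduces the risk of the classifier spilling over onto a site's non-explicit pages.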
Restricted indexation
Restricted indexation is distinct from SafeSearch filtering. It is a policy-level action that limits how or whether a site appears in results.
Adult content restriction: sites Google classifies as primarily adult content may be restricted to users with SafeSearch disabled. This is a site-level algorithmic classification, not a manual penalty.
Country-specific restrictions: content that violates local law may be removed from results in that country. The removal applies only in the affected country and only to the offending content, not to the site as a whole.
Manual actions: issued via Google Search Console when a site violates spam or quality policies. Separate from SafeSearch classification. A manual action suppresses rankings globally until resolved and reviewed.
SafeSearch filtering and manual actions are handled differently. Filtered pages are indexed and rank normally for unfiltered users. Manual action targets are actively suppressed.
Google’s content policies
Content policies cover categories of content that Google will not index, or will remove, regardless of user settings:
- Child sexual abuse material (CSAM). Automatic removal. Google scans actively and reports findings to NCMEC. No exceptions.
- Non-consensual explicit imagery. Content depicting real people in explicit scenarios without consent, removable on request.
- Certain personal information. Sensitive personal data categories can be removed on request under Google’s personal information policies.
- Legally required removals. Court orders, DMCA takedowns, and jurisdiction-specific legal requirements.
- Spam policy violations. Cloaking, hidden text, link schemes, auto-generated content with no user value.
Most sites will never encounter restrictions under any of these categories. They apply to a narrow set of harmful content and to sites that knowingly violate spam policies.
How to check if your site is affected
Page Indexing report in Google Search Console. Groups pages by indexing status. Policy-related exclusions appear here with specific labels, alongside common benign statuses such as pages crawled but not yet indexed, or pages excluded by a noindex tag.
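A benign noindex exclusion is easy to confirm by hand: fetch the page and look for a robots meta directive. A minimal stdlib sketch (the `has_noindex` and `RobotsMetaParser` names are illustrative; note this checks only the meta tag, not the `X-Robots-Tag` HTTP header, which can also carry `noindex`):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                # content is a comma-separated directive list,
                # e.g. "noindex, nofollow"
                self.directives += [
                    d.strip().lower()
                    for d in a.get("content", "").split(",")
                ]

def has_noindex(html: str) -> bool:
    """True if the page's robots meta tag includes a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

If `has_noindex` returns True for a page listed as excluded, the exclusion is self-inflicted and has nothing to do with content policy.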
SafeSearch test. Search for your brand or representative pages with SafeSearch enabled and then disabled. Pages that appear only with SafeSearch off have likely been classified as explicit content.
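The comparison can be forced per-query with Google's `safe` URL parameter rather than by toggling account settings. A small sketch that builds both URLs for manual inspection in a browser (this is for eyeballing results, not for automated scraping; the function name is illustrative, and `safe=off` may still be overridden by a locked account or network setting):

```python
from urllib.parse import urlencode

def safesearch_test_urls(query: str) -> dict:
    """Build the same Google query with SafeSearch forced on and off."""
    base = "https://www.google.com/search?"
    return {
        # safe=active forces the filter on for this query
        "filtered": base + urlencode({"q": query, "safe": "active"}),
        # safe=off requests unfiltered results
        "unfiltered": base + urlencode({"q": query, "safe": "off"}),
    }
```

Opening both URLs side by side for a `site:` query on your own domain makes it easy to spot which pages the classifier is filtering.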
Site: operator check. site:yourdomain.com in Google returns a sample of indexed pages. The count is approximate, but a large mismatch between known page count and indexed pages warrants investigation.
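A useful baseline for "known page count" is your own sitemap. A minimal sketch that counts URL entries in a standard sitemap document, assuming the sitemaps.org 0.9 schema (the `count_sitemap_urls` name is illustrative, and sitemap index files that point at child sitemaps would need an extra level of handling):

```python
import xml.etree.ElementTree as ET

# Namespace used by the standard sitemap protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Count <loc> entries in a sitemaps.org-format sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))
```

Comparing this count against what the site: query surfaces gives a rough signal; a gap of a few pages is normal, while a gap of most of the site points at an indexing problem worth diagnosing in the Page Indexing report.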
Manual Actions report in GSC. Under Security and Manual Actions. If Google has issued a manual action, it is documented here with specific guidance on what triggered it.
Reconsideration requests
Manual actions can be appealed after the underlying issue is fixed:
- Identify the specific violation from the Manual Actions report in GSC.
- Fix the issue: remove the violating content, clean up spam signals, or address the policy breach.
- Submit a reconsideration request via the Manual Actions report. Describe what was wrong and what changed.
- Google reviews the request manually. Response times range from days to weeks.
SafeSearch classification is algorithmic and has no reconsideration process. If you believe a page is misclassified, review what on the page may be triggering the classifier (images, surrounding copy, page context) and modify accordingly.
Frequently asked questions
Will SafeSearch filtering hurt my rankings? No. SafeSearch filtering is not a ranking signal. It does not affect how a page ranks for users with SafeSearch off. It is a display filter applied per user, not a quality judgement.
Can I opt pages out of SafeSearch filtering? No. Classification is determined algorithmically based on page content. The only way to avoid filtering is to ensure your content does not contain material the classifier targets.
Is legal adult content allowed in Google’s index? Yes. Adult content is indexable and may appear in results for users with SafeSearch disabled. Google does not prohibit indexing legal adult content. What it restricts are illegal categories (CSAM) and policy violations (spam, cloaking), which are separate issues.
What is the difference between a manual action and SafeSearch filtering? A manual action is a penalty issued by a Google reviewer for a specific policy violation. It suppresses your rankings globally and is documented in GSC. SafeSearch filtering is algorithmic, content-based, and affects only users with SafeSearch enabled. The two are unrelated.