Why is a page penalized?
Why is a page not indexed, or why does it appear in the results far behind sites with less rich content?
A page can trigger a negative signal without the webmaster even suspecting it.
Some examples:
- Too much blank space.
This sometimes triggers a false negative signal, suggesting that the page is almost empty or automatically generated.
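One rough way to spot a near-empty page is to measure the share of visible text in the source. This is a minimal sketch, assuming a crude regex-based tag strip rather than a real HTML parser; the `text_ratio` function and its threshold interpretation are illustrative only.

```python
import re

def text_ratio(html: str) -> float:
    """Rough share of visible text in the page source.
    A very low ratio suggests a near-empty or template-only page."""
    text = re.sub(r"<[^>]+>", "", html)       # strip tags (crude)
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / max(len(html), 1)

print(text_ratio("<html><body><div></div><div></div></body></html>"))  # 0.0
print(text_ratio("<p>A page with a real paragraph of content.</p>"))
```

A page whose ratio is close to zero is mostly markup and blank space.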
- Hidden text.
You wanted to create dynamic content, and a portion of the text appears on the page only after a user action. The crawler may interpret this as text intended only for search engines, a practice known as cloaking.
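You can scan a page for elements hidden with inline styles as a first check. A hedged sketch using Python's standard `html.parser`; the `HiddenTextFinder` class is hypothetical, and a real audit would also have to resolve external CSS and JavaScript, which this does not.

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Flags elements hidden via inline styles -- a crude check only;
    styles applied through external CSS or scripts are not detected."""
    HIDDEN_HINTS = ("display:none", "visibility:hidden")

    def __init__(self):
        super().__init__()
        self.hidden_tags = []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "").replace(" ", "").lower()
        if any(hint in style for hint in self.HIDDEN_HINTS):
            self.hidden_tags.append(tag)

finder = HiddenTextFinder()
finder.feed('<p>Visible text</p><div style="display: none">Stuffed keywords</div>')
print(finder.hidden_tags)  # ['div']
```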
- Duplicate content.
The same text repeated on every page triggers a duplicate-content signal.
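To estimate how much text two pages share, a simple similarity ratio can help. This is a sketch using the standard library's `difflib`; the sample page texts are invented for illustration, and real duplicate detection by search engines is far more sophisticated.

```python
from difflib import SequenceMatcher

def duplication_ratio(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity ratio between two page texts."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page1 = "Welcome to our shop. We sell handmade candles and soaps."
page2 = "Welcome to our shop. We sell handmade candles and soaps."
page3 = "Contact us by email or phone for any question."

print(duplication_ratio(page1, page2))  # 1.0 -- identical boilerplate
print(duplication_ratio(page1, page3))  # much lower
```

Pairs of pages with a ratio near 1.0 are candidates for rewriting or canonicalization.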
- Accidental redirection.
The page contains a link to another page that was later redirected to yet another page, perhaps back to the original page itself.
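Redirect chains and loops of this kind can be traced offline. A minimal sketch, assuming a hypothetical `{source: target}` redirect map built from your server's rules or a crawl; it walks the chain and flags a loop back to an already-visited URL.

```python
def follow_redirects(start: str, redirects: dict[str, str], limit: int = 10):
    """Walk a redirect map from `start` and return (chain, loop_detected).
    `redirects` is a hypothetical {source: target} map."""
    chain = [start]
    url = start
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        if url in chain:
            return chain + [url], True  # redirected back, e.g. to the same page
        chain.append(url)
    return chain, False

# /old links to /moved, which was later redirected back to /old
chain, loop = follow_redirects("/old", {"/old": "/moved", "/moved": "/old"})
print(chain, loop)  # ['/old', '/moved', '/old'] True
```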
- A filtered keyword.
The site essex.edu was for a time filtered out of Google searches and restricted to mature audiences because its domain contains the word "sex".
Words are not always split this way; the issue seems limited to domain names, but verify nevertheless that words on the page do not trigger filtering.
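A quick substring check can reveal such accidental matches in a domain name. A sketch only: the `FILTERED_WORDS` list is an invented assumption, not an actual filter list from any search engine.

```python
# Hypothetical list of words assumed to trigger adult-content filters.
FILTERED_WORDS = ("sex", "casino", "porn")

def filtered_substrings(domain: str):
    """Return filtered words hidden inside a domain name,
    as in 'essex.edu' containing 'sex'."""
    name = domain.lower()
    return [w for w in FILTERED_WORDS if w in name]

print(filtered_substrings("essex.edu"))    # ['sex']
print(filtered_substrings("example.com"))  # []
```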
Three audits are essential:
- The robots.txt file.
Check that it does not block the page.
- The "robots" meta tag.
It must not block indexing with a noindex or none directive.
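This audit can be automated with Python's standard `html.parser`. A minimal sketch: the `RobotsMetaChecker` class and `blocks_indexing` helper are illustrative names, and the check covers only the in-page meta tag, not the X-Robots-Tag HTTP header.

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in a.get("content", "").split(",")]

def blocks_indexing(html: str) -> bool:
    checker = RobotsMetaChecker()
    checker.feed(html)
    return "noindex" in checker.directives or "none" in checker.directives

print(blocks_indexing('<meta name="robots" content="noindex, follow">'))  # True
print(blocks_indexing('<meta name="robots" content="index, follow">'))    # False
```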
- The canonical tag.
It must match the page's own URL. Check by viewing the source code in a web browser.
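The canonical check can likewise be scripted. A hedged sketch, again with the standard `html.parser`; the `canonical_matches` helper is a hypothetical name, and it treats an absent canonical tag as acceptable while flagging one that points elsewhere.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of the <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def canonical_matches(html: str, page_url: str) -> bool:
    finder = CanonicalFinder()
    finder.feed(html)
    # No canonical tag is acceptable; a mismatched one is not.
    return finder.canonical is None or finder.canonical == page_url

html = '<link rel="canonical" href="https://example.com/article">'
print(canonical_matches(html, "https://example.com/article"))  # True
print(canonical_matches(html, "https://example.com/other"))    # False
```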