Why Is My Page Not Indexed? Technical and Quality Reasons
In the latest episode of Search Off the Record, the Google Search Relations team revealed some reasons why pages are not indexed. Here are the answers!
Key Takeaways
- In the latest episode of Search Off The Record, the Google team discusses why some websites are not indexed.
- Both technical issues and content-quality issues can affect whether a page gets indexed.
- John Mueller suggests requesting indexing for individual pages through Search Console.
On June 22, 2023, Google's podcast Search Off The Record released its latest episode discussing the reasons why some pages are not indexed by Google. The podcast is hosted by the Google Search Relations team, consisting of Martin Splitt, John Mueller, and Gary Illyes. Here are some key points from this podcast.
Why Are Pages Not Indexed?
According to the three experts, several factors can affect whether a page gets indexed. However, Mueller emphasizes that it is normal for a few pages on a website to remain unindexed; it only signals a problem when almost all pages are unindexed.
If almost no pages are indexed by Google, it points to a larger, more technical problem. Typically, a website's homepage is the easiest page to get indexed. If even the homepage is not indexed, an audit should be conducted to find the issue.
Here are some reasons why pages may not be indexed properly:
1. New Website
The first reason a page may be hard to index is that the website is still relatively new. This is common, as Google's systems are still learning about the site.
However, this does not mean that new pages cannot be indexed at all. Google usually needs some time to index them after crawling; how long depends on factors such as the urgency of the page, the demand for its topic, and more.
2. Server Unreachable from Googlebot Hosts
The second reason relates to the website's server, which may be unreachable from Googlebot's hosts. If so, none of the site's pages can be indexed by Google. Here is what John Mueller said about this:
"If my server is not reachable from Googlebot hosts, then sure, that is a technical problem, I guess. Right? That could be a reason why nothing is indexed. Sure, yeah."
Since this is a technical issue, webmasters should verify that their server can actually be reached by Googlebot.
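As a first rough check, you can test whether the server answers HTTP requests sent with Googlebot's user-agent string. The sketch below (the helper name `is_reachable` is our own) only verifies reachability from your own network, not from Google's crawl infrastructure, so Search Console remains the authoritative test:

```python
import urllib.request
import urllib.error

# Googlebot's documented desktop user-agent string.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def is_reachable(url: str, timeout: float = 5.0) -> bool:
    """Return True if the server answers an HTTP request with a non-error status.

    Note: this only checks reachability from *your* network, not from
    Google's crawl infrastructure.
    """
    request = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, OSError):
        # DNS failure, connection refused, timeout, or HTTP error status.
        return False
```

If this check fails while the site loads fine in a browser, look at firewall rules, DNS configuration, or bot-blocking middleware that may treat Googlebot's user-agent differently.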
3. Misuse of robots.txt
The robots.txt file is a special text file placed in the root directory of a website. This file provides instructions to search engine robots, such as Googlebot, about which pages should be indexed and which pages should be ignored.
Rules in the robots.txt file let site owners control how search engines crawl their content. Used incorrectly, for example with a Disallow rule that blocks important pages, it can prevent pages from being crawled and therefore indexed by Google.
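One way to sanity-check robots.txt rules locally is Python's built-in `urllib.robotparser`; the rules below are illustrative, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules -- a common mistake is a Disallow rule
# that accidentally blocks pages you want indexed.
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot may crawl the homepage, but not anything under /private/.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Running a check like this against your own robots.txt quickly reveals whether an important URL is being blocked from crawling.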
4. Noindex Setting
In the podcast, John Mueller also explained that many webmasters place the noindex tag on pages that should not have it. This tag prevents search engines from indexing a page, even if other pages link to it.
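For reference, the noindex directive is usually set as a robots meta tag in the page's head:

```html
<!-- Blocks indexing of this page, even if other pages link to it -->
<meta name="robots" content="noindex">
```

The same effect can be achieved with the `X-Robots-Tag: noindex` HTTP response header, which also works for non-HTML resources such as PDFs. Any page you actually want indexed must not carry either directive.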
5. Pages with No Content
When a domain is newly purchased, its main page may have no content yet, so search engines ignore it. To fix this, make sure every page has quality content worth indexing.
6. Pages with No Links
Besides lacking content, pages with no inbound or outbound links are difficult to index, because Google discovers pages by following links. Internal linking can therefore speed up the indexing process.
Moreover, Gary Illyes stated that:
"Depending on how well a site is linked or a page is linked, it can get indexed within seconds."
Typically, news or trending content is indexed quickly, and Google also weighs which content needs to be indexed immediately. Quoting Gary:
"But yeah, we can index stuff very fast if we need to, like if we see that there is a spike in interest for something, and then we can also take our damn time with it because sometimes it's just not obvious that something should be indexed, right?"
So, content that is currently in high demand may be indexed faster. The key is to create quality content that users are looking for.
What to Do When Pages Are Not Indexed?
So, how can we get pages indexed? Gary Illyes and John Mueller suggest using the URL Inspection tool in Search Console to check whether a page has been indexed. If it has not, users can request indexing through the same tool, and Googlebot will prioritize the page for indexing.
In addition, searching Google with the site: operator (for example, site:example.com) can show whether pages have been indexed. However, this method only gives a broad overview of the results, so a page may be missing from them even though it has been indexed.
Regarding outdated and irrelevant content, should it be deleted?
Mueller states that this type of content does not need to be deleted from the website, but it needs to be optimized to improve its quality. After that, Googlebot can re-index it to reflect the changes.
These are the key points covered in the Google podcast Search Off The Record regarding the reasons why some pages are not indexed.
Article Source
As a dedicated news provider, we are committed to accuracy and reliability, and we attach credible sources to support the data and information we present.
- Search Off The Record, “Why is my site not indexed?”: https://search-off-the-record.libsyn.com/why-is-my-site-not-indexed
- Google Search Central Help Community: https://support.google.com/webmasters/thread/123697715/my-website-pages-are-not-getting-index?hl=en
Tati Khumairoh
An experienced content writer eager to create engaging and impactful written pieces across various industries, using an SEO approach to deliver high-quality content that captivates readers.