So, here is a new problem I am facing with Writer Sanctuary. It seems Google has found the pages but hasn’t indexed them yet. Considering how many of them were published in the last three months, this is disheartening.
Mostly because they were pretty good articles that no one is finding. No wonder I haven’t been seeing growth in traffic. Not to mention that traffic has been in a steady decline over the past few months.
OK, let’s fix this problem.
Discovered – Currently Not Indexed
According to Google, this means the page was found but hasn’t been indexed yet because crawling it was expected to overload the site. Considering how little traffic I have as it is, this is a complete false flag.
Checking the URL Is Actually Live
First, I opened the pages in question…and there are more than 100! They are, indeed, live on the site and functioning as they should, complete with Auto Ads from AdSense.
No redirects, no missing elements – they all appear as the other pages would.
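Rather than eyeballing 100+ pages one at a time, a quick script can confirm each URL returns a clean 200 with no redirects. Here’s a minimal sketch, assuming a plain-text urls.txt with one URL per line (the file name is just a placeholder):

```python
# Minimal sketch: bulk-check that every URL returns 200 with no redirects.
# Assumes a hypothetical "urls.txt" with one URL per line.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # allow_redirects=False so a 301/302 is reported instead of silently followed
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code != 200:
        print(f"{resp.status_code} -> {url}")
```

Anything this prints is a page worth a closer look; silence means every URL came back clean.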
Throttling Google’s Crawl
I remember a few months ago, my CPU and memory usage were maxed out on my web host. Tech support told me to throttle Google’s crawling because it might be using up resources.
Come to find out, it was a faulty plugin that was burning up the resources. And I know I haven’t changed the settings back since. Perhaps that’s where I’ll start.
The problem is that I can’t remember what I had to change. So, that’s going to be fun.
The only thing I can remember is using the original “Webmasters” tools. I went in and saved the option to let Google determine the best crawl rate. But I thought that setting would only last for 90 days, according to Google’s own documentation.
As far as I can tell, everything is set correctly.
Verify the Page Is in the Sitemap
Some people say to check the sitemap to make sure the non-indexed page is listed. My thought is that if Google detected the page, wouldn’t that mean it’s already in the sitemap? The problem is that Google isn’t indexing it.
Though, I suppose the process could be different if you link to a page that isn’t in a sitemap.
In any case, I just went through my sitemap and verified the articles are there. Obviously. Sometimes I think “experts” just pull random ideas out of their asses to add as filler to an article. Then again, I could be wrong, and Google can discover articles through other means, such as internal links, external links, or RSS feeds.
I mean, it would make sense. But if your sitemap is auto-generated by a WordPress plugin, I bet the articles will be listed.
Still, it’s always better to verify and ensure that something like this isn’t the case. In my case, the articles are, indeed, listed in the sitemap.
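If you’d rather not scan the XML by hand, a short script can diff the URL list against the sitemap. A rough sketch, assuming a single sitemap file (many WordPress plugins actually generate a sitemap index, in which case each child sitemap needs the same pass); the domain and file names are placeholders:

```python
# Rough sketch: flag any article URL that's missing from the sitemap.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://example.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
listed = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

with open("urls.txt") as f:  # same hypothetical URL list as before
    for url in (line.strip() for line in f if line.strip()):
        if url not in listed:
            print(f"Missing from sitemap: {url}")
```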
Check Robots.txt File
I was pretty sure the robots.txt file allowed all crawlers, and sure enough, it’s open to Google just fine.
I rarely change anything in the robots.txt file. In fact, I haven’t touched it since I set the website up. However, I do use a lot of “optimization” plugins here and there that might make some undesirable changes.
Never underestimate automated tools that are supposed to help but wind up hindering performance. In any case, the robots.txt file is fine and accessible, according to Google’s robots.txt Tester.
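You can sanity-check this locally, too. Python’s standard library parses robots.txt the way a well-behaved crawler would, though it’s simpler than Googlebot’s actual parser. A minimal sketch with placeholder URLs:

```python
# Minimal sketch: check whether Googlebot is allowed to fetch a URL.
# Note: robotparser's wildcard handling is more limited than Googlebot's
# real parser, so treat this as a first-pass check only.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder domain
rp.read()

url = "https://example.com/some-article/"
print(rp.can_fetch("Googlebot", url))  # True if robots.txt permits the crawl
```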
Verifying Quality of Work
Some suggest the problem could be related to the quality of the content itself. But I already have several pages ranking number one for their keyphrases, so I doubt it has anything to do with my writing quality.
Not to mention I use the same process for clients, some of whom get 100x the daily traffic I see on a good day.
Now, the quality of the site itself might be in question. There are a few things in the theme’s layout that I don’t like, and perhaps it’s time to change things up a bit. I’ve been thinking about doing this for a long time, as a matter of fact.
I just haven’t had time to look for a new theme or design one with something like Page Builder by SiteOrigin or Elementor.
Waiting – It May Solve Itself
It’s common for Google to queue URLs to be crawled at a later date. And I’m perfectly fine with that. Unfortunately, Google is sitting on what are perhaps some of my most prominent pieces of work.
I’ve written some awesome posts lately, and I need to get them into search as soon as possible. Some have been sitting there for three months.
I suppose I can give this a few days to see if anything changes. I did manually submit three of the most important articles for crawling, so let’s see what happens.
Manually Submitting the Page to Crawl
As a test run, I manually submitted the “Skyscraper Technique” article to be crawled by Google. It might take some time before this actually happens, so I’ll have to keep a close eye on the progress.
Otherwise, submitting all 100+ URLs one by one would be a slight pain in the ass. It’s not overly difficult, just time-consuming.
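As far as I know, there’s no public API behind the “Request Indexing” button itself. But the Search Console URL Inspection API can at least report each page’s coverage state in bulk, which beats clicking through one URL at a time. A rough sketch, assuming a service account that’s been granted access to the property; the file names and property URL are placeholders:

```python
# Rough sketch: bulk-check index status via the URL Inspection API.
# Requires google-api-python-client and a service account added to the
# Search Console property. "service-account.json" and the property URL
# are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://example.com/"  # the verified Search Console property

with open("urls.txt") as f:  # same hypothetical URL list
    for url in (line.strip() for line in f if line.strip()):
        result = service.urlInspection().index().inspect(
            body={"inspectionUrl": url, "siteUrl": SITE}
        ).execute()
        state = result["inspectionResult"]["indexStatusResult"]["coverageState"]
        print(f"{state}: {url}")
```

I believe the API is capped at around 2,000 inspections per property per day, which is plenty for 100+ URLs.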
Using the Validate Fix Button?
I’m half tempted to click the “Validate Fix” button in Search Console for the “Discovered – currently not indexed” list of posts. I’m not 100% sure what that would do, and I haven’t read any articles that cover the function at all.
For now, let’s give it a few days to see what happens with the manual submissions. If nothing changes, or if things get worse, I guess we’ll try the Validate button.