I have about 60K SEO pages, "My content
Is this a sign of a problem? Are they just not getting traffic? Maybe less popular locations? Ideally I'd still like them to exist.
We have seen lots of no-traffic pages get removed from the index. Google's ultimate reason for choosing to remove one page vs. another? Probably not something you're going to know exactly.
Sounds like a quality issue.
Google seems to think they are either highly duplicative or you haven’t provided enough signals (internal links, external backlinks, traffic) to justify retaining them in the index.
No sign of a problem. We have millions of pages. 300k got initially indexed. In the past months, 75% of them have also been de-indexed.
Traffic increased, and Google also started slowly indexing more pages again in the past week. There were discussions in the SEO subs in recent weeks because this happened to others too.
The current assumption is that the crawl logic or crawl-budget logic was adjusted before the current core update.
This doesn't point to quality, because pSEO pages have been de-indexed while equivalent pages with the same duplicate structure remained indexed.
Given these facts, the logic seems to be that Google measured no (or no substantial) search queries for these pages.
To save the computational costs of indexing and re-crawling, pages could simply be de-indexed and might be checked again later for index suitability.
Moreover, as your authority increases over time, the crawl budget gets adjusted and more pages return to the index (according to the current crawl algo). Which makes sense, as we were recently featured by Forbes and other high-authority sites.
To me it all seems logical, especially at a time when the Internet is getting spammed with AI slop all over. Google needs to get its computational costs under control.
If you have other logical reasoning, let me know, but that's the closest we got in our discussions over the last few weeks without any official notification from Google itself.
You’re spamming Google like there’s no tomorrow. Google caught you, yet you’re being rewarded with even more traffic. This is a first, so I have no idea what could happen next. Usually, if Google catches you spamming with thousands of pages, they take them all down at once, not partially. Partial deletion is a thing, but I have only seen this on small websites, say 20-30 pages, not 50k. In short, I’ve honestly never seen this before, so I have no idea.
>Originally Google indexed about 55k of them and traffic grew steadily over 6 months. A month ago, they de-indexed all but 15k of them, but my Google traffic is still rapidly rising. I’m getting the little trophy milestones every week or so.
This demonstrates perfectly that SEO is not some holistic game where, if you do well at everything, you'll get rewarded. Pages can perform while non-performing pages are de-indexed.
De-indexing is absolutely an authority/topical-authority issue. Topical authority is a shaped authority; in other words, not just a number, but associated with topics. In the Dec 2024 update, Google narrowed topical authority a lot; that's what hit HubSpot, I believe.
I really doubt it's a technical issue, as many web devs automatically assume; it's hard to imagine you had that many pages linked via tenuous links.
What's probably happened is:
1) those pages didn't perform
2) pages that linked to them, which weren't strong, got dropped, and you have a knock-on effect
What you need to do is triage and get the pages that had clicks back up; normally that means re-targeting (i.e. re-publishing) and deciding which pages are highly related.
For example, you might have had pages about "Tree pruning in Colorado" and various pages cookie-cut to fit each state, city and service. But those keywords had "tree" or "pruning and maintenance" in common, and pruning and maintenance are no longer related, so Google "pruned" back your site… (sorry about the pun).
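The triage step above is easy to script. A minimal sketch, assuming a Search Console "Pages" performance export (columns: Page, Clicks, Impressions, CTR, Position) and a list of your de-indexed URLs; the file layout and URLs here are made-up examples, adjust to your own export:

```python
# Triage: find de-indexed pages that were earning clicks, best performers first.
# The CSV columns and example URLs below are assumptions, not OP's real data.
import csv
import io

def triage(performance_csv: str, deindexed_urls: set, min_clicks: int = 1):
    """Return (url, clicks) pairs for de-indexed pages that still had clicks."""
    reader = csv.DictReader(io.StringIO(performance_csv))
    hits = [
        (row["Page"], int(row["Clicks"]))
        for row in reader
        if row["Page"] in deindexed_urls and int(row["Clicks"]) >= min_clicks
    ]
    # Sort by clicks, highest first, so the most valuable pages get fixed first.
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# Toy example:
export = """Page,Clicks,Impressions,CTR,Position
https://example.com/tree-pruning-colorado,42,900,4.7%,8.1
https://example.com/tree-pruning-wyoming,0,50,0%,45.2
https://example.com/tree-pruning-utah,7,300,2.3%,12.4
"""
dropped = {
    "https://example.com/tree-pruning-colorado",
    "https://example.com/tree-pruning-wyoming",
    "https://example.com/tree-pruning-utah",
}
print(triage(export, dropped))
```

Colorado (42 clicks) and Utah (7 clicks) come back as re-publish candidates; Wyoming never earned a click, so it stays in the "consolidate or drop" pile.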
If it happened today, just wait, as the Google core update for June just started.
Any patterns to what was de-indexed? Or do you mean the "my content in location" pages were 100% de-indexed and the remaining 15K are of a different pattern or subject?
Remove or edit the 75%. Test different variations before full implementation, and keep testing and editing. Recent studies show fresh content sometimes hurts, but you have nothing to lose. Slow it down. You can tinker with site nav and other internal-linking strategies and strengthen topic clusters/silos. You are trying to guide the search engines so they will index the updated pages without wiping out the remaining 25%. That said, less is more, and consolidating keywords from multiple pages down to one will be just as effective. Local SEO isn't that competitive, depending on the niche. Remove de-indexed pages and build links to the successful ones.
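When you consolidate several thin pages into one, the usual move is a 301 redirect from each retired URL to the surviving page, so any equity and bookmarks follow. A minimal Apache `.htaccess` sketch with hypothetical paths:

```apache
# Fold thin per-state pages into one consolidated page (example paths only).
Redirect 301 /tree-pruning-wyoming /tree-pruning
Redirect 301 /tree-pruning-utah    /tree-pruning
```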
How can you tell if Google de-indexed your content?
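One programmatic way to check is Google's Search Console URL Inspection API (the property must be verified and you need an OAuth token with the Search Console scope). A sketch; the endpoint and response field names below match the public API docs as I recall them, so verify them before relying on this:

```python
# Sketch: check a URL's index status via the Search Console URL Inspection API.
# Endpoint and field names are assumptions based on the public API docs.
import json
import urllib.request

INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def is_indexed(inspection_response: dict) -> bool:
    """Interpret an inspection response: a PASS verdict means the URL is on Google."""
    result = inspection_response.get("inspectionResult", {})
    verdict = result.get("indexStatusResult", {}).get("verdict", "")
    return verdict == "PASS"

def inspect_url(page_url: str, site_url: str, token: str) -> dict:
    """Ask the API about one page of a verified Search Console property."""
    body = json.dumps({"inspectionUrl": page_url, "siteUrl": site_url}).encode()
    req = urllib.request.Request(
        INSPECT_ENDPOINT,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (needs real credentials for a verified property):
# result = inspect_url("https://example.com/page", "https://example.com/", token)
# print(is_indexed(result))
```

For a quick manual spot check, pasting the URL into Search Console's URL Inspection tool does the same thing without any code; a raw `site:` search is a rougher signal and not reliable at 60K-page scale.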