Is there a way to bulk submit URLs to Fetch as Google?
Solution 1
Indirectly, yes:
Create a page with links to all the URLs you'd like re-crawled (like a sitemap) and add that to your website.*
Submit the URL of that page to Fetch as Google, selecting Desktop as the fetch strategy, as detailed here: Use Fetch as Google.
Once the fetch is complete, and within four hours, select Submit to Index from the Fetches Table next to the status of the fetch, then select Crawl this URL and its direct links, followed by Submit.
As indicated in Ask Google to re-crawl your URLs, the above will:
Submit the URL as well as all the other pages that URL links to for re-crawling. You can submit up to 10 requests of this kind per month.
*As commented by John Mueller, you can also submit a sitemap file or a text file containing a list of URLs.
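As a rough sketch of the text-file option: a text sitemap is simply a list of absolute URLs, one per line, so it can be generated with a few lines of Python. The filename and URLs below are placeholders for illustration.

```python
# Hypothetical list of pages you want re-crawled (placeholders).
urls = [
    "https://example.com/page-1",
    "https://example.com/page-2",
    "https://example.com/page-3",
]

# A .txt sitemap is just one absolute URL per line, nothing else.
with open("url-list.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(urls) + "\n")
```

Upload the resulting file to your site and submit its URL, as described above.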
Solution 2
Submitting a sitemap (XML, .txt, HTML, or RSS) allows bots to discover pages. If you need even faster submission, do it manually or use some form of automation (e.g. with Selenium).
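If you go the XML route, an advantage over a plain .txt list is that each entry can carry a last-modified date, which the comments below suggest matters for crawl scheduling. A minimal sketch using Python's standard library, with placeholder URLs and dates:

```python
import xml.etree.ElementTree as ET

# Hypothetical pages and their last-modified dates (placeholders).
pages = [
    ("https://example.com/page-1", "2022-09-01"),
    ("https://example.com/page-2", "2022-09-15"),
]

# Default namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write(
    "sitemap.xml", encoding="utf-8", xml_declaration=True
)
```

The resulting sitemap.xml can be submitted through your Google account just like any other sitemap.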
Newmania
Updated on September 18, 2022

Comments
-
Newmania over 1 year
Google is taking a while to re-index changes I have made to thousands of pages, so I want to use Fetch as Google to give it a prompt. However, you can only enter 1 URL at a time as standard.
Is there a way to bulk submit URLs to it?
-
the over 9 years
Not what you're asking, but you should consider submitting XML sitemaps.
-
Newmania over 9 years
I do already, but thanks for the suggestion.
-
closetnoc over 9 years
@dan That is far superior to my idea! I will keep this in my rolodex of tricks.
-
John Mueller over 9 years
You could also just create a sitemap file (or text URL-list file) and submit that.
-
dan over 9 years
@JohnMueller Thanks for your input here, added that to the answer.
-
Newmania almost 8 years
@JohnMueller Thanks. I'm now trying to speed up de-indexation of pages; could I submit a sitemap of pages to de-index? (I'm not sure if it's counterintuitive to point a re-crawl at pages I want removed from the index.)
-
John Mueller almost 8 years
@Newmania sure, that works too.
-
Newmania almost 8 years
@JohnMueller This is still taking a really long time. 21k pages have been removed from the index in 5 weeks, but 45k more are still to be removed. I submitted rscpp.co.uk/please-crawl.txt on 14th June and see "URL and linked pages submitted to index". Is there anything further I can do to remove those?
-
John Mueller almost 8 years
Without a last-modification date, those URLs aren't going to get crawled any faster.
-
Geremia about 6 years
Nice. I didn't know you could use .txt for sitemaps. I thought they had to follow the sitemap protocol only.
-
Newmania almost 6 years
@JohnMueller Thanks. How would the "text URL-list file" specify the last-modification date? Or is it better to create a sitemap? Also, we are trying to get our AMP pages re-indexed and have submitted a similar .txt file of URLs, but we have the same issue: only a small number have been indexed so far.
-
Nemo about 2 years
Those tools seem to have vanished. Nowadays there's only a search bar in Google Webmaster to look for data about URLs in your own registered website's profile.