SEO: Efficiently indexing 200 pages using the Google Indexing API

KingOfWolfStreet

Greetings!

A user contacted me asking how to index websites using the Google API. Lately, Google has been indexing sites more slowly, so I believe this information could be helpful. The tool below can submit 200 links for indexing within a minute.

The post is divided into four steps:

Step 1: Configuring the Indexing API by creating a service account and JSON key. To get started, head over to the Indexing API console on the Google Cloud Platform and create a service account.

1.png

You have the option to input your preferred project name or stick with the suggested one. The location can remain as is, but it can be edited if needed.

2.png
Let's proceed with the service account creation. The window in front of you should resemble the following (note that your project name may differ):

3.png

  1. Enter any name for the service account in the given field.
  2. Assign the "Owner" role to this account.
4.png

Afterwards, generate a new key and download it to your computer.

5.png

6.png
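If you prefer the command line, the whole of Step 1 can also be done with the gcloud CLI. This is a rough sketch, assuming gcloud is installed and you are logged in; "my-indexing-project" and "indexer" are example names, so use your own:

    # create a project and a service account (names are examples)
    gcloud projects create my-indexing-project
    gcloud iam service-accounts create indexer --project=my-indexing-project
    # grant the service account the Owner role
    gcloud projects add-iam-policy-binding my-indexing-project \
        --member="serviceAccount:indexer@my-indexing-project.iam.gserviceaccount.com" \
        --role="roles/owner"
    # generate and download the JSON key
    gcloud iam service-accounts keys create service_account.json \
        --iam-account=indexer@my-indexing-project.iam.gserviceaccount.com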

STEP 2: To run the indexing script, you will need the downloaded key. You can get the script from GitHub; it comes as a folder containing a file named "service_account".

7.jpg


Next, replace the content of the "service_account" file with the content of your downloaded JSON key. The "service_account" file in the script folder should end up looking similar to this:


8.jpg
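For reference, a downloaded service account key is a JSON file with roughly this structure (values redacted here; the project and account names are the examples from above, and the real file holds live credentials, so keep it private):

    {
      "type": "service_account",
      "project_id": "my-indexing-project",
      "private_key_id": "...",
      "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "client_email": "indexer@my-indexing-project.iam.gserviceaccount.com",
      "client_id": "...",
      "auth_uri": "https://accounts.google.com/o/oauth2/auth",
      "token_uri": "https://oauth2.googleapis.com/token"
    }

The client_email value is the one you will need in the next step.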

The script is ready; now it needs to be linked to Google Search Console.

Step 3: Link the script to Google Search Console by adding the client_email from the JSON key as a full owner of your property. In Google Search Console, it will look like this:

9.jpg

The only step left is to enable the Indexing API for our project: follow the provided link, select the service account, and activate the API.

10.png
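If you went the gcloud route earlier, the API can also be activated from the command line:

    # enable the Indexing API for the example project
    gcloud services enable indexing.googleapis.com --project=my-indexing-project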

Our script is fully operational.

STEP 4

To use it, you will need Node.js, which can be downloaded from here. Once installed, open PowerShell and run the command "npm install requests".

To run the script, open the script folder (the one you downloaded from GitHub and added the JSON key to) and locate the URLs file. You can enter up to 100 addresses to be submitted. Note that the maximum number of addresses per day is 200, so you will need two batches of 100 each.
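The exact file name depends on the script version, but a batch of URLs is just plain text with one address per line, for example:

    https://example.com/page-1
    https://example.com/page-2
    https://example.com/page-3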

Next, launch PowerShell and run the command "node index.js". Wait a few seconds and you should receive the response "200 OK".

11.jpg
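If you are curious what index.js does under the hood, here is a minimal sketch of the same idea. It is not the exact script from GitHub; it assumes the google-auth-library package (npm install google-auth-library), the key saved as service_account.json, and a urls.txt file with one URL per line:

    // minimal sketch: push URLs to the Google Indexing API
    const fs = require('fs');
    const { JWT } = require('google-auth-library');
    const key = require('./service_account.json');

    // authorize as the service account with the indexing scope
    const client = new JWT({
      email: key.client_email,
      key: key.private_key,
      scopes: ['https://www.googleapis.com/auth/indexing'],
    });

    async function publish(url) {
      // tell Google this URL was added or updated
      const res = await client.request({
        url: 'https://indexing.googleapis.com/v3/urlNotifications:publish',
        method: 'POST',
        data: { url, type: 'URL_UPDATED' },
      });
      console.log(res.status, url); // 200 means the notification was accepted
    }

    (async () => {
      const urls = fs.readFileSync('urls.txt', 'utf8').split('\n').filter(Boolean);
      for (const u of urls) await publish(u.trim());
    })();

Each successful request counts against the 200-per-day quota mentioned above.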

DONE!!!
Feel free to ask about anything!
 
Great work, buddy! Please move this to VIP.
 
Can that trick work for all domain names in Search Console, or do I need to set it up separately for each domain?
 
The Indexing API lets you tell Google to update or remove pages from its index by providing the page's location. You can also check the status of your notifications. However, right now, the Indexing API is only for pages with job postings or live broadcast events.

So, can I use this for a regular eCommerce site?
 
Can that trick work for all domain names in Search Console, or do I need to set it up separately for each domain?

You can do this for all the sites in your console. And if you want to game the system, you can create multiple service accounts and index even more.
 
The Indexing API lets you tell Google to update or remove pages from its index by providing the page's location. You can also check the status of your notifications. However, right now, the Indexing API is only for pages with job postings or live broadcast events.

So, can I use this for a regular eCommerce site?

You can do this for any site that belongs to you.
 
If my pages were deindexed a few months ago, will your method still work?
 
Thanks! I want to know if the Instant Indexing for Google plugin by Rank Math does the same thing.
 
I actually did this 30-40 days ago, but I didn't do the script part. This is amazing—4,000 pages have already been indexed. Great share!
 
Totally helpful! I'll use it later when I get home. Thanks a lot!
 
Can I use this to index links that aren’t from my website, like tier 2 links? I'm asking because the plugin works for WordPress, so I’m wondering if this method is for indexing different kinds of links.
 
Can you translate that, buddy? It seems like a Russian method to get the links indexed.
 
Can we index Tier links, or does this only work with the domain we have in Search Console?
 