Use at your own risk.
> Currently, the Indexing API can only be used to crawl pages with either JobPosting or BroadcastEvent embedded in a VideoObject.
I wanted to highlight (in addition to your statement) that JobPosting is a specific type of structured data.
If the target site doesn't have these elements, it may or may not work... or it may work for now, but not once they realize it's being used incorrectly.
JobPosting structured data: https://developers.google.com/search/docs/appearance/structu...
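For anyone unfamiliar: that markup is just JSON-LD in a script tag on the page. A rough sketch of what the JobPosting case looks like, with made-up placeholder values (check the docs above for the actual required fields):

  // Minimal JobPosting structured data as JSON-LD (placeholder values).
  const jobPostingLd = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    title: "Example Job Title",
    description: "<p>Example description.</p>",
    datePosted: "2024-01-01",
    validThrough: "2024-02-01T00:00", // short-lived, which is the point of the API
    hiringOrganization: { "@type": "Organization", name: "Example Co" },
    jobLocation: {
      "@type": "Place",
      address: { "@type": "PostalAddress", addressCountry: "US" },
    },
  };

  // Embedded in the page as <script type="application/ld+json">...</script>
  const scriptTag =
    `<script type="application/ld+json">${JSON.stringify(jobPostingLd)}</script>`;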
https://www.youtube.com/watch?v=kvYb2bdtT7A&t=422s
IME it will silently drop and ignore anything that is not relevant.
The consistent problem with SEO is that most SEOs don't understand Google's business model. They don't understand that Google is going to best serve its customers (i.e., those doing the search). SEOs (and their clients) need to understand that getting Google to index a turd isn't going to change the fact that the content, and the experience it's wrapped in, is still a turd. Google is not interested in pointing its customers to turds.
We must have been using a different Google over the past 3 years. It does this almost exclusively now.
If a paying customer gives Google money to point eyeballs to turds, it points eyeballs to turds (this is how Google makes money today; it is the business model for search). The problem with SEO isn't that it degrades search; it's that SEO users aren't paying customers and don't make Google any money (and they compromise Google's ability to direct eyeballs to paying customers).
This is classic "enshittification": offer a service for free to capture eyeball share, then offer a paid service to companies that capitalizes on that eyeball share but compromises the "eyeball experience" (and then, in the endgame, squeeze the companies that have become dependent on the eyeball platform to serve shareholders).
You'll probably find an npm package with lots of dependencies that'll generate that sitemap for you if that's what you need...
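Though honestly, a sitemap is simple enough to generate without any dependencies at all. A rough sketch in Node/TypeScript (the URL list and priorities are made up):

  // Zero-dependency sitemap generation (Node 18+, TypeScript).
  import { writeFileSync } from "node:fs";

  // Stand-in for however you actually enumerate your pages.
  const pages = [
    { loc: "https://example.com/", priority: 1.0 },
    { loc: "https://example.com/jobs/123", priority: 0.5 },
  ];

  const xml =
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    pages
      .map(
        (p) =>
          `  <url><loc>${p.loc}</loc>` +
          `<lastmod>${new Date().toISOString().slice(0, 10)}</lastmod>` +
          `<priority>${p.priority.toFixed(1)}</priority></url>\n`
      )
      .join("") +
    `</urlset>\n`;

  writeFileSync("sitemap.xml", xml);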
> For websites with many short-lived pages like job postings or livestream videos, we recommend using the Indexing API instead of sitemaps because the Indexing API prompts Googlebot to crawl your pages sooner than updating the sitemap
I wish I had been pickier with my sitemap, but I thought including all URLs was the goal. I at least weighted them properly, but that doesn't seem to do much.
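For context, the Indexing API call the quoted docs are pushing you toward is a single authenticated POST per URL. A minimal sketch, assuming you've already obtained an OAuth access token for a service account with the indexing scope (the auth flow is omitted):

  // Notify Google that a URL was added or updated (Node 18+ fetch).
  // The token must belong to a service account authorized for the
  // https://www.googleapis.com/auth/indexing scope; obtaining it is omitted here.
  const ACCESS_TOKEN = process.env.INDEXING_API_TOKEN;

  async function publishUrl(url: string): Promise<void> {
    const res = await fetch(
      "https://indexing.googleapis.com/v3/urlNotifications:publish",
      {
        method: "POST",
        headers: {
          Authorization: `Bearer ${ACCESS_TOKEN}`,
          "Content-Type": "application/json",
        },
        // "URL_UPDATED" for new/changed pages, "URL_DELETED" for removals.
        body: JSON.stringify({ url, type: "URL_UPDATED" }),
      }
    );
    if (!res.ok) throw new Error(`Indexing API returned ${res.status}`);
  }

  publishUrl("https://example.com/jobs/123").catch(console.error);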
# Drop all inbound HTTP/HTTPS (IPv4 and IPv6) so nothing, Googlebot included, can crawl the site
sudo iptables -A INPUT -p tcp --dport 80 -j DROP
sudo iptables -A INPUT -p tcp --dport 443 -j DROP
sudo ip6tables -A INPUT -p tcp --dport 80 -j DROP
sudo ip6tables -A INPUT -p tcp --dport 443 -j DROP
That should do.

> Currently, the Indexing API can only be used to crawl pages with either `JobPosting` or `BroadcastEvent` embedded in a `VideoObject`.
So this might come with the risk of getting the site you want to boost penalized by Google instead.
The only outcome I can see from this is a) contributing to the rise of spam and b) harming people managing sites and apps for which this API is vital.
It's almost as if Google is actively trying -not- to index anything as a way to reduce spam, by forcing the people who really care to jump through 100 hoops.
A great way for the dark web to remain dark.
Plus, this technique might make engines aware of your content, but it offers no guarantee of indexation whatsoever.
I remember running a few websites back in the day, and with zero interaction with Google, all of the pages showed up in the search index a day or two after publishing at most.