TLDR: Probably not.
Bing and Yandex (based in Russia) announced a new protocol called IndexNow, which allows website owners to proactively let search engines (for now just Bing & Yandex) know whenever site content is updated.
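For context, the protocol itself is very simple: you host a key file on your site, then ping a participating engine whenever a URL changes, either with a single GET request or a bulk JSON POST. Here's a minimal sketch based on the public IndexNow docs (the key and URLs are placeholders, and the 10,000-URL bulk limit is per the protocol spec):

```python
import json
from urllib.parse import urlencode

# Any participating engine works; Yandex uses https://yandex.com/indexnow
ENDPOINT = "https://www.bing.com/indexnow"

def single_url_ping(url: str, key: str) -> str:
    """Build the GET request URL for notifying one changed page."""
    return f"{ENDPOINT}?{urlencode({'url': url, 'key': key})}"

def bulk_payload(host: str, key: str, urls: list[str]) -> str:
    """Build the JSON body for a bulk POST (up to 10,000 URLs per request)."""
    return json.dumps({"host": host, "key": key, "urlList": urls})

# Example with a placeholder key (you'd host abc123.txt at your site root):
ping = single_url_ping("https://example.com/new-post", "abc123")
body = bulk_payload("example.com", "abc123",
                    ["https://example.com/a", "https://example.com/b"])
```

That's the entire integration surface: no auth flow, no registration, just a shared key file and an HTTP request.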
On the surface, this seems like a good thing: webmasters can more quickly & easily get new content indexed, and search engines don’t have to spend as many resources crawling the entire web just to discover new content.
However, this approach has a number of flaws.
For one, the largest search engine in the world (i.e. Google) has not yet adopted this protocol. Unless you live in Russia, this new protocol is probably not going to have a significant impact on your SEO efforts.
Secondly, a protocol like this won’t make a significant impact on a search engine’s crawling bandwidth unless it receives mass adoption. Otherwise, a service like this will only supplement a search engine’s crawl efforts, because the majority of the web that doesn’t use IndexNow will still need to be discovered by crawling anyway.
So my take on IndexNow is that while it is not completely useless, it won’t make a difference for most websites and won’t significantly alter how search engines crawl the web.
If Bing or Yandex drive a significant amount of traffic to your site, then IndexNow is definitely worth implementing. Personally, Bing drives only 2-5% of search traffic for most of my sites, but that share could still be a significant amount of traffic for larger sites like Amazon or eBay.
This could also reduce search engine crawling somewhat if many major sites adopt it.
But I don’t foresee the majority of the web ever adopting this protocol. The majority of websites don’t have dedicated SEOs or know anything about search engine optimization or web crawling.
This is why changes like Google rewriting title tags have a net positive impact on search relevance. While it might frustrate SEOs who carefully craft their title tags for rankings & conversions, the majority of website owners don’t even know what a title tag is.
It will be interesting to watch whether Google adopts this protocol, but I doubt it. If anything, they have been moving in the opposite direction, making it more difficult for webmasters to submit their content for indexing.
Google has massive resources and a massive amount of data on web activity. They know what sites they want to index content from, and they know how often each site on the web is typically updated.
They don’t need a bunch of humans or APIs telling them what content to index; they know what they want.