Links
-
Standardising AI crawler consent
The IETF is working on building blocks that let websites declare whether crawlers may take their content for AI training:
Right now, AI vendors use a confusing array of non-standard signals in the robots.txt file (defined by RFC 9309) and elsewhere to guide their crawling and training decisions. As a result, authors and publishers lose confidence that their preferences will be adhered to, and resort to measures like blocking the crawlers' IP addresses.
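To see why this is fragile, here is a sketch of the kind of per-vendor opt-out sites currently maintain. RFC 9309 defines the robots.txt format, but not these AI-specific user agents; GPTBot and Google-Extended are real vendor tokens, yet the full set is non-standard and ever-changing, which is exactly the gap the IETF work aims to close:

```
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Block Google's AI-training signal (distinct from Googlebot search crawling)
User-agent: Google-Extended
Disallow: /

# ...and so on, one stanza per vendor, with no guarantee new crawlers comply
```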
-
Standards in 2024
In a blog post, W3C's new CEO Seth Dobbs outlines a focus on putting people first. He explains that global standards like the W3C's are essential to:
[ensure technologies] are accessible by all, secure, maintain privacy, respect the planet, and work anywhere in the world