Links
Posts about what I read elsewhere.
-
MEPs adopt first AI law
On Wednesday, Parliament approved the Artificial Intelligence Act that ensures safety and compliance with fundamental rights, while boosting innovation.
(…)
It aims to protect fundamental rights, democracy, the rule of law and environmental sustainability from high-risk AI, while boosting innovation and establishing Europe as a leader in the field. The regulation establishes obligations for AI based on its potential risks and level of impact.
(From: Artificial Intelligence Act: MEPs adopt landmark law | News | European Parliament)
-
W3C and AI
The W3C established that artificial intelligence is having a “systemic impact on the web”, and looked at how standardisation, guidelines and interoperability can help manage that:
Machine Learning models support a new generation of AI systems. These models are often trained on a large amount of Web content, deployed at scale through web interfaces, and can be used to generate plausible content at unprecedented speed and cost.
Given the scope and scale of these intersections, this wave of AI systems is having potential systemic impact on the Web and some of the equilibriums on which its ecosystem had grown.
This document reviews these intersections through their ethical, societal and technical impacts and highlights a number of areas where standardization, guidelines and interoperability could help manage these changes
(From: AI & the Web: Understanding and managing the impact of Machine Learning models on the Web)
-
Simpler businesses
All of this leaves me wanting simpler businesses with simpler motives — I'll pay, you provide a product or service commensurate with the value. No opaque policies, no concerns about data. I'd love for you to be profitable and sustainable, without being obsessed with scale. I'd love you to build products for the customers, not the speculators, that have invested in you.
(From: Of course AI is extractive, everything is lately • Cory Dransfeldt)
-
Broaden your frame of reference
Sean Voisen recommends not sticking to a particular technology:
Lose the label and become T-shaped. Stay curious. Keep learning. Go deep in a specific technology or framework or programming language, but develop breadth in adjacent technologies that will help inform your work and develop new perspectives.
(From: On being a ‹insert favorite technology here› “guy” | Sean Voisen)
Coincidentally, Jonathan Snook posted similar advice this week, in Shifting identities.
-
Jakob Nielsen's problematic claims about accessibility
Jakob Nielsen wrote a post in which he states “the accessibility movement has been a miserable failure” (his words) and claims that generative “AI” can somehow magically remove the need for accessibility research and testing.
Note, there's currently no evidence that what he proposes is desirable (by users) or possible (with the tech). It is, however, clear that testing with users and meeting WCAG is desirable and possible.
Léonie explains Nielsen needs to think again:
Nielsen thinks accessibility has failed.
Nielsen thinks that generative AI will make my experience better. Nielsen apparently doesn't realise that generative AI barely understands accessibility, never mind how to make accessible experiences for humans.
I think Nielsen needs to think again.
Matt May said we need to talk about Jakob:
This part of the post isn’t so much an argument on the merits of disabled access as it is a projection of himself in the shoes of a blind user, and how utterly miserable he thinks it would be. At no point in any of this—again, classic Jakob Nielsen style—does he cite an actual blind user, much less any blind assistive technology researchers or developers
Per Axbom wrote:
the published post is misleading, self-contradictory and underhanded. I'll walk you through the whole of it and provide my commentary and reasoning.
-
Hallucination is inevitable
Researchers show that hallucination is inevitable:
LLMs cannot learn all of the computable functions and will therefore always hallucinate. Since the formal world is a part of the real world which is much more complicated, hallucinations are also inevitable for real world LLMs.
(From: [2401.11817] Hallucination is Inevitable: An Innate Limitation of Large Language Models)
-
Switch in HTML
Apple is experimenting with a new HTML form control: a switch (see WHATWG/HTML issue #9546). It is designed as an attribute for `<input type="checkbox">`: you'd turn a checkbox into a switch by adding the `switch` attribute: `<input type=checkbox switch checked>`

In terms of pseudos, they're experimenting with `::thumb` and `::track` pseudo-elements for styling the parts of the switch. Unlike the checkbox, it has no `:indeterminate` pseudo-class, because it has no indeterminate state. The colour can be set with `accent-color`. For browsers that don't support this new `switch` attribute, the element simply falls back to a checkbox.

There is some accessibility support: a switch gets a `switch` role under the hood, and the element respects the “differentiate without color” and “on/off labels” settings on iOS.

Their blog post on when to use it:
Generally, we recommend using a switch when the end user understands the user interface element as a setting that is either “on” or “off”. A checkbox is well suited for when the end user would understand the element as something to be selected.
(From: An HTML Switch Control | WebKit)
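Pulling those pieces together, here's a minimal sketch. Note that the `switch` attribute and the `::thumb` and `::track` pseudo-elements are experimental and, at the time of writing, Safari-only; in other browsers this renders as a regular styled checkbox:

```html
<label>
  Dark mode
  <!-- The switch attribute upgrades the checkbox to a switch
       in supporting browsers; elsewhere it stays a checkbox -->
  <input type="checkbox" switch checked>
</label>

<style>
  /* Tints the switch; also respected by the checkbox fallback */
  input[switch] {
    accent-color: rebeccapurple;
  }
  /* Experimental pseudo-elements for the switch's parts */
  input[switch]::track {
    border-radius: 1em;
  }
  input[switch]::thumb {
    border-radius: 50%;
  }
</style>
```

Because the fallback is a plain checkbox, the control stays functional either way; only the presentation differs.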
-
Invisible systems
On the work the GOV.UK Design System team do:
it’s the invisible systems work that has a bigger impact. Reviewing. Advising. Organising. Co-ordinating. Triaging. Educating. Supporting. Allowing the innovation happening at the edges of the ecosystem to feed back into the centre, to be consolidated and standardised for the benefit of everyone.
(From: How far we’ve come: What it would mean to lose the GOV.UK Design System)
-
Design that encourages deletion
In Design Patterns that Encourage Junk Data, Michelle talks about the environmental cost of creating and storing so much of our data in the ‘cloud’:
the need for limitless digital storage bumps up against the very real physical limits of our planet.
(From: CSS { In Real Life } | Design Patterns that Encourage Junk Data)
She explains that it's not only a huge amount of data; a lot of it is probably unnecessary:
It’s estimated that up to 88% of the data stored in the cloud is ROT (Redundant, Obsolete or Trivial) data, or “dark data”: data collected by companies in the course of their regular business activities, but which is not used for any other purpose. It all amounts to a lot of junk data that has no purpose, that will never be needed or looked at again.
Yup, I definitely store a lot of photos and emails that I will never need to look at again. I should set aside some time for cleanup.
I agree with Michelle. Design could help consumers decrease their storage. I want my software to encourage deletion, not (or not just) addition. Bring it on!
-
Stitching together
Brian Merchant explains in Let's not do this again, please that OpenAI's new video generating thingy is mostly a “promotional narrative” to try and attract more investment money (OpenAI's server spend, the article says, is over 1 million USD per day).
The tech stitches together existing imagery rather than creating new imagery, Brian says:
It’s not that Sora is generating new and amazing scenes based on the words you’re typing — it’s automating the act of stitching together renderings of extant video and images.