Re: AI for content creation

Morten described a possible future, maybe even a present reality, in which AI-generated content is rampant. But when we start to employ machine learning for content creation, we start to regard content as a means more than an end. In that process, won't we lose what's worth caring about?

In his post, Morten explains he sees three types of AI-generated content emerging. The first two, AI-curated content (AI assembles content and serves you what it thinks is most relevant) and AI-assisted content creation (AI contributes to the creation process), are a thing now. The third, AI-synthesised content, will likely become a thing in the future. Morten's post gives a great overview of what to expect.

It reminded me of a project I did in university about automating the arts. My conclusion: we can write code to generate creative works, but code or models can't capture intentions, experiences or beliefs. Those require human input, so creating art (or content) requires human input, was my reasoning. There are nuanced differences between AI, machine learning, big data and bots, but I won't go into them in this post.

When I want to find a recipe for pizza dough on the web, I would consider myself lucky if I could get ahold of a blog post from someone who cares passionately about the right kind of dough, who maybe ran an artisan pizza kitchen in Naples for the past 30 years or has a background in baking. ‘Dream on’, you think. Well, these people exist on the web, and the web is awesome for being an open platform that anyone with a passion can write on. I don't want to find text produced just because someone noticed “pizza dough” is a common search phrase with top-result ad money to be extracted. The passion that drives such writers isn't the pizza dough; that's fine, but it makes their content less relevant to me. Similarly, I don't want to find text generated by a machine learning model. It can't possibly bring the knowledge and experience I'm hoping for.

When I write an email or a reply, I try to put what I want to convey into words that I choose. I might choose to include an inside joke that the recipient and I share, fit in an appropriate cultural reference, be extremely polite, or be terribly rude. I mean, my intentions and attitude are in that interaction. I don't want Google or LinkedIn or others to suggest what to reply, serving up echoes of the historical content they trained their machine learning models on. It dehumanises my conversation. Their suggestion may or may not align with my intentions.

When I listen to music, I can be touched by the experiences and world views that inspired the artist. Whether you're into the Eagles, Eels or Ella Fitzgerald, their songs capture things that machine learning systems can't because the artists have attitudes. Robots don't love and they don't have opinions. Maybe they can come up with interesting rhythms and melodies, or utter sentences like “I love you”, but the association of their work with intentions and memories needs humans.

When I read a newspaper, the order of the pages, the focus a layout provides and the choice of photography… these are decided by human beings who have a lot of experience. People who work as journalists after being professional sports players for decades. People who have followed politics for decades and therefore understand which scandal is worth extra attention. People who can make bold choices based on their world views. Bots don't have world views. Algorithmic prioritisation of content isn't as good as prioritisation by humans, even if it gets close. See also algorithmic timelines on social media versus human-curated lists and contextualisation.

When I have a consumer issue, I want to talk to a human representative of the company. Someone who has the authority to make decisions. Who can take us on the shortest path to a mutually satisfactory solution. Did you ever see a chat bot provide more than a repeat of the data it has been fed? Did you see a chat bot make enquiries with empathy? Lack of empathy isn't a bug in bots that we just haven't fixed yet; arguably, it isn't empathy if it isn't human-to-human (ok, maybe animals can be part of this equation).

All these examples lead me to think: the human side of data isn't measurable or computable. The human side of art, content or communication is not just a different side of the same coin, it's a bigger coin. There is more to reality than data can capture, like the lived experiences, intentions and beliefs of actual people. Propositional attitudes that robots can only pretend to have.

Basically, I'm worried about overestimating how many human capacities machine learning can take over. At the same time, I don't think machine learning is useless. Quite the opposite, it is fantastic. I love that computers are getting better at automated captions, translation or even generating images based on prompts. The latter may create a whole new level of art where artists use it as a material for their creations (see also AI is your new design material by Josh Clark). Medical applications where machine learning notices abnormalities that a human might miss. Audio recognition engines that will tell you what song is playing. Email spam filters that save us all a lot of time. It's all super valuable and genuinely impressive. And a lot of these use cases don't need lived experiences, intentions or beliefs to be incredibly useful.

Comments, likes & shares (27)

@hdv @baldur The minor issue I have with "in principle never" here is that we just don't have a good understanding of how "real" intelligence works. What makes the human passionate about the best pizza crust? If the answer is, "the human has all this lived experience", then I don't think we can automatically say that the "lived experience" of a hypothetical AI is necessarily qualitatively different. If the answer is metaphysics and a soul or whatever, then sure.
This week, many Dutch families write each other poems, which can be tongue in cheek. While you can generate rhyming words, you can't generate the banter potential between people who've known each other since they were born.
@hdv Why do they write each other poems? It sounds beautiful
@alenanik11 it can be brutal, but it's definitely fun; it's a Sinterklaas tradition https://en.m.wikipedia.org/wiki/Sinterklaas
@hdv @alenanik11 People write each other poems for Sinterklaas in the Netherlands? I can't recall us doing that in Belgium 🧐