
Is there a future for Artificial Journalism?

As part of Mx3’s Collectif, in which we feature the work of our partners (see more here), Joris van Lierop of The Content Exchange outlines how the fundamental tenets of great journalism are as strong as ever. AI will not lead to the creation of great original stories; it will only better amplify the stories that have already been told. And original creators need to be fairly remunerated.

There is an emerging view that AI will replace all kinds of human creation. At The Content Exchange (TCE), we have more confidence in ‘human journalism’ than in ‘artificial journalism’. Nevertheless, we believe there is a great deal that AI can contribute to journalism, not just take from it.

TCE firmly believes that ‘great stories are made for sharing’. In our view, the creation of a great story requires ingredients such as a sense of urgency, authenticity, originality and a genuine effort to be truthful. Let’s look at some examples. A close look at recently purchased items on the TCE platform turns up articles like:

  • Why Frankfurt is clamping down on data center sprawl
  • The Art of Saying No
  • Why Robotaxis Are Dividing San Francisco
  • Kylie Minogue: “I’m not a connoisseur of wines by any means. I just like a good wine.”
  • Gustave Eiffel: 100 Years Later, Still Defining ‘French Entrepreneur’
  • The Mine Of The Dead: Inside Egypt’s Desert Gold War

The sense of urgency

The first question to ask is why an editor would think of creating stories like these. Noticing, out of the zillion possible stories, a sense of urgency to have these particular stories told is itself already an important act of journalism. Of course, AI may copy these articles, rephrase them, summarise them, blend them, make videos out of them, or translate them into a hundred languages, but the question is whether it will ever become the original source for them, and identify by itself that these stories are worth telling.

There is a well-known concept in news called ‘personalised’ news: users create their own specific blends of news and decide for themselves what is urgent and what is not. AI might be an answer to this, as it can help create the zillion possible stories and customise them completely to match every individual’s preferences. However, the consumer desire for such a service is doubtful, as is the probable quality of the zillion stories.

Another source of ‘urgency’ may be Google Trends or trending stories on Twitter/X. From the perspective of news publishing, these can be an inspiration, but they will not lead to the creation of great original stories, only to the amplification of stories already told.


The examples above are background stories, commentaries, opinion pieces or interviews. Of course, it is perfectly possible to ask ChatGPT to write an article about Gustave Eiffel in a certain style with a certain number of words, or to fake an interview with Kylie Minogue. However, both examples have serious issues.

In order to ask ChatGPT to write about Gustave Eiffel, there must first be an awareness that ‘French entrepreneurship’ is an original angle for this topic (not just a factual biography), and that the piece needs to carry the opinion that Eiffel is a ‘still defining’ entrepreneur. This is not about ‘truth’ but about the authority of the article’s creator to make such a claim. If the reader trusts that authority, the claim has value (or even ‘truth’); if the reader does not, the claim is meaningless.

Authority is not randomly gained; it is earned when an author (or a brand) wins the hearts and minds of readers through numerous publications over a long period of time. You may ask AI tools to write a text ‘as if’ it were written by Anna Wintour about London Fashion Week, but it is Anna Wintour’s own view that people are looking for.


That interviews can be faked has already been proven by the infamous Michael Schumacher interview. The interview is the most ‘human’ format imaginable, based on a conversation between two people with a genuine interest in it. Though you might be able to ‘fake’ such a talk, it has no value (let alone the questionable ethics of such an act). The reader is interested in the thoughts of Kylie Minogue, not in the thoughts of ChatGPT. It is true that many experiments have shown that people can in fact relate to bots and experience a real connection, but when a bot secretly poses as an authentic person, the reader is being deceived.


More worrying examples of AI automation are requests like ‘rewrite this article for me’, handed over to AI explicitly or implicitly. A great popular-science story can easily be ‘rewritten’ in a way that circumvents copyright. This in fact has nothing to do with AI; the practice has been around since computers first offered ‘copy and paste’. As long as these rewrites (which could also mean creating video or audio from text) are applied to proprietary content, they are a very interesting way of making content available to new audiences. But when rewriting is done to other people’s work, someone is deprived of value, which is a big loss. What AI shows us is that rights and copyrights should be taken more seriously.

In essence, there is nothing wrong with reusing someone’s work (it is our mission) and adding customisations (such as translations, local examples or summaries) to make it fit the needs of readers. What is wrong is when no payment is made and no credit is given to the original creator.

This failing in publishing has been around longer than AI. AI only brings it to a massive scale, which makes it even clearer that it should be repaired.


Analyses and predictions about AI and journalism often treat ‘the news’ as a uniform genre, most resembling factual publications like a news wire. In reality, ‘the news’ is a diversified field of genres, including opinion, interviews, reviews, human interest and background stories, each operating within certain well-defined restrictions (the format). Being unlimited in output or super-efficient in creation is not necessarily a benefit across all these genres. Originality, authenticity and authority are.

In our view, managing rights and fair payment for human journalism is the starting point for great collaboration with artificial intelligence. AI will make it possible to amplify human creativity, originality, sense of urgency and authenticity. A great background story made accessible to new audiences using AI, for example by turning it into a podcast, is a big win for all parties involved. These kinds of human/artificial relationships are what TCE is steering toward in these exciting times.

Joris van Lierop
CEO, The Content Exchange

The Content Exchange, founded by Joris van Lierop and Tom Schenkenberg, is developing a global marketplace for content. By bringing creators and publishers together, it forms a partnership that creates a better earnings model for good stories that deserve to be shared.

The business model is straightforward: Publishers retain full control of what content they make available to third parties and their content is listed in a secure, ring-fenced marketplace by category. Third-party publishers from other geographical territories can then select the content they wish to re-publish under license, often using a minimum bundle package.