At the University of New South Wales in Australia, researchers have programmed a bot that writes fables based on six concepts drawn from the morals of Aesop's fables: retribution, greed, pride, expectations, recklessness and reward.
More than 2,000 years after Aesop warned his listeners in ancient Greece about the dangers of greed and pride via the medium of geese, foxes and crows, researchers in Australia have developed a computer program which writes its own fables, complete with moral.
Margaret Sarlej, at the University of New South Wales, has devised the Moral Storytelling System, which generates simple stories with one of six morals identified in Aesop's fables: retribution, greed, pride, realistic expectations, recklessness and reward. The stories are structured around characters who are able to experience up to 22 emotions, from joy to pity, remorse and gratitude, in three different story worlds.
"The 'user' simply chooses a moral, and the system automatically determines a sequence of events (ie a story) which make characters feel the emotions required to convey that moral," said Sarlej via email.
Looking at the results, though, storytellers don't need to worry too much (yet). Here is the algo-Aesop story about a unicorn's revenge:
Retribution (ie the fairy is punished for stealing the knight's sword):
Once upon a time there lived a unicorn, a knight and a fairy. The unicorn loved the knight.
One summer's morning the fairy stole the sword from the knight. As a result, the knight didn't have the sword anymore. The knight felt distress that he didn't have the sword anymore. The knight felt anger towards the fairy about stealing the sword because he didn't have the sword anymore. The unicorn and the knight started to hate the fairy.
The next day the unicorn kidnapped the fairy. As a result, the fairy was not free. The fairy felt distress that she was not free.