A few weeks ago, a now unsubscribed Forum member started a thread with a rather strange post {EDIT I originally had a link to a post here but I had linked to the wrong one. Will continue to search for the correct post.} that I had to read twice trying to make sense of it. I finally concluded that it made no sense at all. I looked at some of their other posts and concluded that there was something very strange in a few of them as well. I thought that the person was either (1) employing some type of automatic writing technique or (2) the texts had been generated by software, that is, they were products of Artificial Intelligence.
Today, this article pops up in RT: AI proves ‘too good’ at writing fake news, held back by researchers.
Just want to call this to everyone's attention because:

> OpenAI ... created a machine learning algorithm, GPT-2, that can produce natural-looking language largely indistinguishable from that of a human writer while largely “unsupervised” – it needs only a small prompt text to provide the subject and context for the task.

> We've trained an unsupervised language model that can generate coherent paragraphs and perform rudimentary reading comprehension, machine translation, question answering, and summarization — all without task-specific training.
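The "small prompt text" workflow described above can be illustrated in miniature. GPT-2 itself is a large neural network, nothing like the toy below, but the basic idea of prompt-conditioned text generation — feed in a few words, sample likely continuations learned from training text — can be sketched with a simple Markov-chain language model. The corpus and all names here are invented for the example.

```python
import random

def train_bigram_model(corpus):
    """Count word -> next-word transitions observed in a training text."""
    words = corpus.split()
    model = {}
    for current, nxt in zip(words, words[1:]):
        model.setdefault(current, []).append(nxt)
    return model

def generate(model, prompt, length=10, seed=0):
    """Continue a prompt by repeatedly sampling an observed next word."""
    rng = random.Random(seed)
    output = prompt.split()
    for _ in range(length):
        candidates = model.get(output[-1])
        if not candidates:  # dead end: this word was never followed by anything
            break
        output.append(rng.choice(candidates))
    return " ".join(output)

# Tiny made-up corpus; a real system trains on billions of words.
corpus = ("the agency released a report today the report says the agency "
          "will release more reports today and the report was fake")
model = train_bigram_model(corpus)
print(generate(model, "the report", length=8))
```

The result is locally plausible but globally meaningless text, which is exactly why such output can "read almost like English" while never actually making a point.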
- You may encounter this type of writing online more frequently, and it is important not to spend too much energy trying to make sense of something that was never meant to make sense and is there only to waste your time and rob you of your mental energy.
- Bring this to the attention of the mods, who are often trying to figure out who (or now, what) is behind these posts.
As we know, there are many strange forces out there trying to distract those of us who are insightful and creative and who attempt to see clearly what is going on in this world. Perhaps they are part of the NSA or GCHQ, or perhaps somewhere more nebulous; we may never find out, but we will learn to recognize their calling card when they leave it.
Russia Today: AI proves ‘too good’ at writing fake news, held back by researchers