[City, State] – [Date] – The Northrop Grumman Cygnus spacecraft, carrying vital supplies and scientific experiments to the International Space Station (ISS), has encountered an issue during its journey to orbit. While the exact nature of the problem remains undisclosed, the incident has triggered a series of precautionary measures. According to NASA officials, the Cygnus spacecraft, named “S.S. Michael Freilich” in honor of the late NASA Earth Science Division director, launched successfully from Wallops Flight Facility in Virginia on [Date]. The issue arose sometime after launch, prompting the mission control team to initiate a thorough investigation. “We are […]
I understood [reference], and I’m continually amazed that news sources don’t have some kind of automated review process to stop stupid errors like that.
For posterity, the relevant sentences that were left incomplete in the article are quoted above.
Newspapers used to employ teams of sub-editors to fix up the articles. I used to do that job for a major newspaper, and it was surprising to see how bad some of the stuff coming in from journalists was. Sometimes you’d basically have to rewrite the whole article from scratch. Given the decline in quality of what gets published, I can only assume that when paper sales collapsed and revenues dropped, they all decided to cut costs by firing the sub-editors.
But this is just some website that probably never had any quality control to start with.
I’d love to have human editors to fix up stories, but we also have the technology now. There are FOSS tools like redpen that will help with spelling and grammar. AI tools ought to do a somewhat reasonable job of appraising a piece of text, and yeah, a second human ought to sign off before publishing. I’d have thought content management systems would have review stages, the way software development does. Authors could accept or override suggestions but be required to acknowledge them. Like, why isn’t journops a thing?
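Just to sketch the idea (this is a toy, not any real CMS hook; the `check_article` helper and its regex are made up for illustration): a pre-publish lint that blocks an article while it still contains unfilled placeholders could be as simple as:

```python
import re

# Hypothetical pre-publish lint: flag anything that looks like an
# unfilled editorial placeholder, e.g. "[City, State]" or "[Date]",
# plus template-style slots such as "$date" or "${date}".
PLACEHOLDER = re.compile(r"\[[^\[\]]{1,40}\]|\$\{?\w+\}?")

def check_article(text: str) -> list[str]:
    """Return every suspected placeholder left in the article body."""
    return [m.group(0) for m in PLACEHOLDER.finditer(text)]

draft = "[City, State] – [Date] – The Cygnus spacecraft launched on [Date]."
leftovers = check_article(draft)
if leftovers:
    print("Blocked: unfilled placeholders found:", sorted(set(leftovers)))
```

Of course this would also flag legitimate bracketed text like “[sic]” or citation markers, so a real check would want an allowlist and a human in the loop to acknowledge each hit, which is kind of the point.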
With this article I wonder whether we’re seeing a content-management screwup. It looks almost like it’s rendering the metadata markup associated with the text instead of the text itself.
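For what it’s worth, here’s a toy demonstration of that failure mode using Python’s stdlib string.Template (no idea what templating their CMS actually runs, this is purely illustrative): when a metadata field is missing, the placeholder ships to readers as literal markup.

```python
from string import Template

# A story lede with metadata slots the CMS is supposed to fill in.
lede = Template("$city – $date – The Cygnus spacecraft launched successfully.")

# safe_substitute() silently leaves any slot it has no value for,
# so a missing field ends up in the published text as-is.
print(lede.safe_substitute(city="Wallops Island, VA"))
# -> Wallops Island, VA – $date – The Cygnus spacecraft launched successfully.
```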
What the hell is this nonsense? The article is so generically written it’s as if someone prompted an AI to produce a template article to speculate from.
An AI wouldn’t make mistakes like this. This sort of screwup requires a human touch.
It was just the first article on the topic I found that seemed somewhat coherent. But yeah, quality journalism is really hard to come by these days.
I think you might be right. Another article by the same author seems like it could be entirely made up, only citing Wikipedia for things like the definition of the word ‘confidence’. I don’t know what would prompt it to leave these fill-in-the-blank sections, though.
Maybe they just wanted to leave it. You know, a sort of Mad Libs make-your-own-article thing, for fun. Can’t be any worse than existing internet misinformation sources.