Might AI be responsible for these?

AG31

Literotica Guru
Joined: Feb 19, 2021
Posts: 1,504
I see that the AI take threads seem to have dwindled. Perhaps Laurel and co have given up hunting for them. I'd pretty much decided that it was impossible to distinguish not-very-good, but technically correct, human writing from AI. But now I think I might have bumped into it in the real world. What do you think?

The first example

A short while ago I bumped into at least 3, maybe half a dozen, instances of misusing lay or lie and their various tenses. It really stood out because the author was a very popular one and someone must have paid a copy editor. A bad one. But what if it was AI built on an LLM trained on tons of similar misuses? Then either it didn't catch what might be the real author's habitual mistake, or, worse, it "fixed" usages that were correct to begin with.

The second example
(Note: I don't know the details of the contract resulting from the writers' strike, but maybe it hadn't kicked in by the time this was created?)

This may be hard to explain to people who have never watched the Prime series Mr & Mrs Smith. Like the movie from some years ago, it's the adventures of two people hired by a CIA-ish secret organization and given the cover names Mr & Mrs Smith. It's very light on thrills and heavy on comedy. The first three episodes had plots complex enough to hold my attention. The pacing was reasonable. While it wasn't believable, it wasn't blatantly ridiculous.

Episode 4 was so bizarrely stupid that I'm wondering if it was written by AI. There's been a lot of talk here about how one can recognize AI writing. Here are some candidates from episode 4.

1 - On the basis of a shared (common!) name, two people meeting by chance in a park immediately start sharing top-secret details of their lives, including giving their new acquaintances a tour of their safe room.

2 - The first 25 minutes of an hour-long show are taken up with "witty" banter among 4 people. This had been a funny action show up until this episode.

3 - The MCs take off with their new acquaintances on a "very, very high risk" assignment on a lark. They don't even know what it's about.

4 - They fly from NYC to South America in a helicopter.

If it weren't for the charming acting, I would have given up on the episode in about 10 minutes.

But my question is serious. Is this an example of what might happen if producers turn to AI for their scripts?
 
This sounds like normal Hollywood stupidity rather than AI stupidity. Remember, the target audience of most of these shows is people who probably think South America is a place in Mexico.
 
This sounds like normal Hollywood stupidity rather than AI stupidity. Remember, the target audience of most of these shows is people who probably think South America is a place in Mexico.
Hollywood is pretty bad, but the writers can distinguish Mexico from South America. Yeah, based on various videos I've seen, some audience members won't know any geography. I don't watch many movies now, so I can't offhand think of a mistake like that in a film. Helicopters? A private jet may be able to cover the distance. And Colombia is a long way from Argentina.

There are endless videos like this.

 
Ah, yes, Americans and geography. My own personal all-time favourite example of this stereotype.

 
This sounds like normal Hollywood stupidity rather than AI stupidity. Remember, the target audience of most of these shows is people who probably think South America is a place in Mexico.
(references wall-sized, future world-domination campaign map to verify)

:unsure:

Info checks out. (y)
 
I'm normally quite liberal in indulging other people's choices in these regards. I thought "Lars & The Real Girl" was brilliant.

But this is how you know that I have absolutely zero sway on this site:

I'd be chasing suspected AI users out of the submission queue like a crazy old man in his bathrobe telling those pesky neighborhood kids to get off his lawn!
 
Threads have cooled down, because people have given up, not because the policy got better.
This is me. After several attempts I just gave up and posted the work elsewhere. It has made wanting to write for Lit really hard, since I put a lot of care into my stories. It is a shame to have them rejected and to be accused of using AI.
 
I see that the AI take threads seem to have dwindled. Perhaps Laurel and co have given up hunting for them. I'd pretty much decided that it was impossible to distinguish not-very-good, but technically correct, human writing from AI. But now I think I might have bumped into it in the real world. What do you think?
Those samples don't feel AI-ish to me. They read more like someone who just can't write very well. If they were AI, there'd be far more unnecessary verbiage, a lot more repetition, and in an extract that long, there'd be failures in logic and interconnectivity between beginning and end.
 
I got into writing because of things like Example #2. I said to myself that I could sit down and write a better story than anything Hallmark puts on the screen. I was right on the first try, and I'd never written before. So much crap is accepted and filmed for TV that you want it to be AI so you could point and say, "See? Humans are superior!" But sadly, no. That's a human writing it.

TV shows are written by committee, edited by committee, and rewritten by committee. They're written; they're not researched, nor are they held up to see whether they reflect real life. When watching TV, keep that in mind, keep the dozens of committees in mind, then recite this mantra 10 times:

Never underestimate the stupidity of people in large groups
 
@Duleigh I think some (of us) writers tend to reverse-engineer a premise to get to a visual/emotional/situational payoff, too.

So we're so fixated on building the path that gets us where we're trying to land that we don't always do a good job of spotting how implausible the path looks from the other direction.

To wit, I had a scene in a Christmas story with two girls topless in a tropical setting, which I had to retrace back to a starting point of three siblings in quarantine on something like Wisconsin farmland in December (whew):

[Image: Blame It on Rio]

The solution was a temporary greenhouse, some industrial incubator light fixtures, and a diesel generator. Along the way, I had the generator run out of fuel at an inopportune moment, in order to delay some nudity and keep tension alive for a minute, until we got to the next scene.

In the end, I put a bow on the whole thing and thought, "Oh, the cleverness of me." Until someone, who has evidently spent a lot more time around diesel engines than I have, got worked up enough to leave a comment alluding to all the terrible things that happen if you run a diesel motor out of fuel.

So yeah, it's a fine line between stupid and clever. And I didn't even need a committee. Just trying to stick the landing exactly where I wanted it to be, even if diesel mechanics would be offended.

TBH, I woulda thought the topless girls were enough of a distraction, but I guess that rules is rules when it comes to diesel engines. :rolleyes:
 
It's cold! You could have gelled the diesel fuel, which would cause the engine to sputter. It would have to be shut down while the fuel was heated and reliquefied, and then you're back on the road, topless on schedule!

I'm currently writing a "Retro Future Scifi" story and I love it! I don't have to research anything! Those old '50s stories went wherever they wanted. The only thing I'm researching is astronaut names; everyone is being named after actual astronauts except for the MCs, Captain Scarlett and Colonel Vermillion.
 
Those samples don't feel AI-ish to me. They read more like someone who just can't write very well. If they were AI, there'd be far more unnecessary verbiage, a lot more repetition, and in an extract that long, there'd be failures in logic and interconnectivity between beginning and end.
There was no extract. Are you referring to a reply rather than the OP?
There were definitely failures in logic. Re-read the OP.
 
@Duleigh I think some (of us) writers tend to reverse-engineer a premise to get to a visual/emotional/situational payoff, too.

So we're so fixated on building the path that gets us where we're trying to land that we don't always do a good job of spotting how implausible the path looks from the other direction.

There's a very successful novel where the Big Reveal is that the target was the detective all along. The killer had a grudge against the detective and started killing people so that he'd investigate and... get lured into a trap, or something.

Following the story from the detective's POV, it seems to make sense. Somebody (not the killer) brings an intriguing case to him, of course he investigates. But trying to think it through from the killer's POV, there's no explanation for how the killer knew this particular detective would be on the case, out of all the people in NYC who might end up investigating a murder.
 
There's a very successful novel where the Big Reveal is that the target was the detective all along. The killer had a grudge against the detective and started killing people so that he'd investigate and... get lured into a trap, or something.

Following the story from the detective's POV, it seems to make sense. Somebody (not the killer) brings an intriguing case to him, of course he investigates. But trying to think it through from the killer's POV, there's no explanation for how the killer knew this particular detective would be on the case, out of all the people in NYC who might end up investigating a murder.
Sounds like a bad version of "Angel Heart", which btw STILL freaks me out after all the times I've seen it.

It's right up there with "Se7en" and "Skeleton Key" in a small group of movies that have just kind of stuck with me ever since the first time I saw them and make me feel like whoever wrote them needs to be on a watch list somewhere.

I'm not into horror, and most movies in the genre seem transparently badly executed, but each of those feels like a dorsal fin hinting at something really, really bad lurking just out of sight.

First season of "True Detective" is like that too.
 
There was no extract. Are you referring to a reply rather than the OP?
There were definitely failures in logic. Re-read the OP.
I'm referring to the first and second examples in your OP. "Examples" = "extract". The second example talks about Mr and Mrs Smith, and what might have been nonsense in those scripts, but that's not "failures in logic" in the commentary which you suspect is AI.
 
I'm referring to the first and second examples in your OP. "Examples" = "extract". The second example talks about Mr and Mrs Smith, and what might have been nonsense in those scripts, but that's not "failures in logic" in the commentary which you suspect is AI.
Wait, I've misunderstood. I thought you were referring to those chunks of text as AI writing you'd found somewhere. But I've just cast aspersions on your writing. Oops. Sorry.

It explains why they didn't sound AI though, so I at least passed that test.

As for your suspicions, probably.
 
This is me. After several attempts I just gave up and posted the work elsewhere. It has made wanting to write for Lit really hard, since I put a lot of care into my stories. It is a shame to have them rejected and to be accused of using AI.

Where are you posting them?
 