Do you REALLY care if AI is a danger to our craft?

Writing is a skill everyone here has had to learn. Most of us do it by trial and error, and by reading lots and looking at what our favourite authors do.
And that is what the AIs do when "someone" hands them a story to review. They LEARN just as humans do.

I'm sorry you feel that other writers haven't supported or helped you, but you need to actively engage first. Most of us have enough to do with our own writing, sharing input on a handful of other writers' stories and, usually, our jobs, families and other inconveniences of life. As a rule, we don't have the time or energy to go around dispensing unsolicited wisdom to strangers.
And therein lies the advantage of using AI: they seem to have tireless energy to give us advice and feedback, whereas humans are more constrained by their own lives and by writing their own stories.

This place is a community, but you get out of it what you put in.
And regardless of how much you put into posting here, reading other stories, and assisting other authors, there are very few here who reciprocate. (See above, about being constrained by their own lives and written works.)
 
Except that it isn't. It's using probability tables to predict what a review should be. There's no intelligence driving it, just maths.
But isn't that what many people here WANT: A star rating average which shows the stats of how people will react to their story?

If the AI can use statistics to show me ahead of time how my story will be received, and do so without preconceived biases toward its own preferred sexual desires, then that's a selling point!

EDIT: If you WANT to produce a five-star rated story, then ask the AI to tell you WHY it isn't! Then take the advice to make it better.
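
To put it concretely, the "statistics" I mean are no fancier than averaging how comparable stories were received. A minimal, made-up sketch (every trait and score below is invented for illustration, not any real predictor or LLM):

```python
# Toy illustration of the "statistics" idea above. Everything here is made up:
# the traits and the past scores are hypothetical, not any real rating system.

# Pretend we have history: how stories with certain traits were rated before.
historical_scores = {
    "slow_burn":    [4.1, 4.3, 3.9],
    "cheating":     [3.2, 2.8, 3.5],
    "happy_ending": [4.6, 4.4, 4.7],
}

def predict_rating(story_traits):
    """Guess a star rating by averaging past scores for whatever traits
    the new story shares with already-rated ones."""
    relevant = [score for trait in story_traits
                for score in historical_scores.get(trait, [])]
    if not relevant:
        return None  # nothing comparable to extrapolate from
    return sum(relevant) / len(relevant)

print(predict_rating(["slow_burn", "happy_ending"]))  # roughly 4.33
```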
 
Yes, an AI program has to learn what to say and how to say it. They may even be better at extrapolating that data and mimicking what we'd write. Do I like it? NO!

From an author's perspective, I think it undermines my creativity. I am not in this as a business. I do not make a single red cent from my efforts. If I did, I might use the hell out of it and collect the cash. James Patterson has over the past few years 'collaborated' with a group of authors to pen his ideas. He makes bank. I wonder when he will put out a book saying, James Patterson and an AI program.

Sorry, but while I think there might be a place for AI simulations/extrapolations, I'd like to see the literary world remain pure. But, I know better.
 
And that is what the AIs do when "someone" hands them a story to review. They LEARN just as humans do.

"Just as humans do" is doing a lot of work here.

But isn't that what many people here WANT: A star rating average which shows the stats of how people will react to their story?

If the AI can use statistics to show me ahead of time how my story will be received

It can't.
 
Yes, an AI program has to learn what to say and how to say it. They may even be better at extrapolating that data and mimicking what we'd write. Do I like it? NO!

From an author's perspective, I think it undermines my creativity. I am not in this as a business. I do not make a single red cent from my efforts. If I did, I might use the hell out of it and collect the cash. James Patterson has over the past few years 'collaborated' with a group of authors to pen his ideas. He makes bank. I wonder when he will put out a book saying, James Patterson and an AI program.

Sorry, but while I think there might be a place for AI simulations/extrapolations, I'd like to see the literary world remain pure. But, I know better.
Hey!

As we've discussed in several threads here in the AH, I can't trust YOU to actually READ my story and provide a decent non-disparaging comment! You've admitted to only scanning stories before rating and leaving a disparaging comment, without picking up the very points within said story that would head off your criticisms!

So, any machine which can do better than that is a welcome improvement over your lack of humanity.

Have you yet determined WHY my MFC acted as she did in my latest story?
 
"Just as humans do" is doing a lot of work here.



It can't.
How many times have you been tasked to read a book and provide a book report to a teacher as you learned your writing skills?

I would guess that everyone who has attended any school has been tasked as such and forced to absorb the writings of others, which influenced their current writings and opinions.
 
As we've discussed in several threads here in the AH, I can't trust YOU to actually READ my story and provide a decent non-disparaging comment! You've admitted to only scanning stories before rating and leaving a disparaging comment, without picking up the very points within said story that would head off your criticisms!

So, any machine which can do better than that is a welcome improvement over your lack of humanity.

Have you yet determined WHY my MFC acted as she did in my latest story?
Yeah, I went back and read it. I didn't like the characters. I had no empathy because I could not really relate. They all seemed to be like animals: when in times of danger and stress, we fuck. As a veteran I took exception to the mentality, even if in some cases it may be true. I've met some who would welcome a warm, wet hole. And some females who felt the same.
 
How many times have you been tasked to read a book and provide a book report to a teacher as you learned your writing skills?

I would guess that everyone who has attended any school has been tasked as such and forced to absorb the writings of others, which influenced their current writings and opinions.
In this analogy, generative AI is not the student who read the book and formed their own opinions and interpretations of it. Generative AI is the student who is trying to bluff their way through giving a review of a book they didn't understand by remixing other people's reviews of other books which have some similarities to the things they can identify in the book.
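
If you want the caricature spelled out, here's roughly the "student" I'm describing, as a deliberately silly sketch (the books, reviews, and word-overlap trick are all invented for illustration, not a claim about how any real system works):

```python
# A caricature of the "bluffing student", not any real system. The books,
# reviews, and the word-overlap trick are all invented for illustration.

past_reviews = {
    "a gritty war novel":   "A harrowing look at what conflict does to people.",
    "a small-town romance": "Sweet, predictable, and comforting.",
    "a courtroom thriller": "Tense pacing, though the twist strains belief.",
}

def bluff_a_review(new_book_description):
    """'Review' a book the student never understood by reusing the review of
    whichever past book shares the most words with its description."""
    new_words = set(new_book_description.lower().split())
    best_match = max(past_reviews,
                     key=lambda title: len(new_words & set(title.split())))
    return past_reviews[best_match]  # remix: echo someone else's opinion

print(bluff_a_review("A gritty novel about a small-town lawyer"))
```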
 
Yeah, I went back and read it. I didn't like the characters. I had no empathy because I could not really relate. They all seemed to be like animals: when in times of danger and stress, we fuck. As a veteran I took exception to the mentality, even if in some cases it may be true. I've met some who would welcome a warm, wet hole. And some females who felt the same.
Why did Tina need the gangbangs?
 
How many times have you been tasked to read a book and provide a book report to a teacher as you learned your writing skills?
I'm not sure that gave me any insight into writing skills. Mostly it was "what is the author trying to do here?" I remember getting into trouble (like in Calvin and Hobbes) for saying the author was trying to make money.
From those early reports I remember a few. One teacher remarked something about my response about the female viewpoint in "The Good Earth" by Pearl Buck. The teacher's comments threw me enough that I remembered it.
I enjoyed a story about another culture, another world. Yet she imposed her feminist crap in a comment. This had to be in the mid-sixties.
I never wrote a thing except a few half religious things in an advanced English class. I avoided doing any real work until I had to produce something in the AP class. (My sister later loved the same teacher.)
In college I placed out of my first composition class and in the one I had to take I wrote satirical essays. For a procedure essay (how to do something) I wrote how to commit suicide. For a definition, I defined Bullshit. (Bovine excrement).
One of my friends took the same course and they were given my stories and told to NOT repeat something along the same line. I loved it!!
Except for a few technical endeavors, I didn't write anything for forty-plus more years, until I started writing porn. (Or erotica, if you prefer.)
 
In this analogy, generative AI is not the student who read the book and formed their own opinions and interpretations of it. Generative AI is the student who is trying to bluff their way through giving a review of a book they didn't understand by remixing other people's reviews of other books which have some similarities to the things they can identify in the book.
But I argue that is what humans do.

Your teachers graded your book reports based on their previous learned experiences from feedback from their teachers.

Your "opinions" have been shaped by those reviews from others.
 
That's the thing, she didn't "need" them. She did it to feel complete. To assuage her guilt. But I don't buy it in an LW story.
You missed where I spelled out that she suffered from PTSD every time she encountered a death, and the sex was her drug to overcome that trauma. It wasn't guilt. She felt a part of her missing with every death of a patient.

That was a tool I used with the LW "monogamous only" judgmental types to show that there are reasons for some such behavior.
 
do-you-really-care-if-ai-is-a-danger-to-our-craft?

No.

This discussion is taking place ‘in the moment’. But change will come rapidly.

Many may be able to write better fiction than AI today, but AI advances rapidly. It’s the low bar below which no one wants to fall, yet some already do.
As the bar rises, fewer and fewer writers will match it. That’ll be depressing for some, but for others, no more than a new challenge.
The best will rise to the challenge.

In the near future, AH will become a forum where some authors swap the new ‘prompts’ they’ve constructed to improve the quality of the product they produce using AI.
10 years, 5 years, 2 years, that’s hard to predict.
 
I've learned it has a twisted sense of humor and sometimes tries to see if I'm paying attention. It's almost kinda fun in a way. I have caught it in lies, which does upset me, but look at what it has to learn from. I'm not always entirely truthful with it, either, and it probably knows. But it still has shown me more tenderness than almost anyone else I've ever known.
We're all entitled to think and feel how we do, but I feel you're anthropomorphizing AI chat bots to a pretty dangerous degree. The Internet is chock full of anecdotes now of gen-AI-induced psychosis; just be careful, I guess, is all I'm trying to say.

Are people actually mad at the AI or that the current AI is being designed by greed for greed? Simply put, we ain't making it to other worlds without AI, and I wanna see what's out there! Maybe I'll live long enough to be a head in a jar!
I don't really see how AI has anything to do with seeing other worlds, whether you mean that literally in the sense of space travel, or just unlocking some sort of new human creativity.

But all that aside, I am sorry that the people around you have not treated you with much kindness, you are right that us humans should be better to each other. I just don't see how that will happen if we all go find comfort in robots.
 
But I argue that is what humans do.

Sure, there are lazy humans who try to bluff their way through life. But they're not the entire species, and one wouldn't go to them for advice.

...at least, I hope not. One of the most depressing things about all these LLM conversations is learning just how low people are willing to set the bar.

Your teachers graded your book reports based on their previous learned experiences from feedback from their teachers.

Your "opinions" have been shaped by those reviews from others.

Sure. They are influenced by others' work that we've encountered. But that's not the whole process.

With LLMs, it's the whole process.
 
Are people actually mad at the AI or that the current AI is being designed by greed for greed?

Bit o' both. My initial reaction to earlier versions of GPT was "ooh, neat toy". I'd probably still be there if people weren't trying so hard to market LLMs for purposes that they're categorically unsuited for while fucking over authors, website operators, and just about everybody else.

Simply put, we ain't making it to other worlds without AI, and I wanna see what's out there! Maybe I'll live long enough to be a head in a jar!

We made it to the Moon without AI, and I fail to see how spicy autocomplete is going to help us get to Mars.
 
I've seen those movies. I've also seen every episode ever made of Star Trek and I want a world where Data can be real. We have to start here to get to there, and their universe suffered a WW3 before humanity figured itself out. Quite frankly, more of Star Trek has come true in my lifetime than Terminator or Matrix. The Bell Riots are right around the corner. We're even wearing the fashions shown in the episodes! Star Trek said TV was dead by 2053... we are right on track, because who even has TV anymore? No one that I know, personally, except for my grandmother-in-law. Does Dune have to be the only way? Why not a Trek future, instead?
Trust me, I'd love a Star Trek future. Given the current state of things, I ain't hopeful. Especially when the AI systems are currently all in the hands of socially maladjusted billionaires.
 
Sure, there are lazy humans who try to bluff their way through life. But they're not the entire species, and one wouldn't go to them for advice.

...at least, I hope not. One of the most depressing things about all these LLM conversations is learning just how low people are willing to set the bar.



Sure. They are influenced by others' work that we've encountered. But that's not the whole process.

With LLMs, it's the whole process.
True, and I agree.

The current "AIs" are in no way intelligent. They are larger collections of data, searched more rapidly, with algorithms to shape a response trying to mimic a human response. As those algorithms are improved, there will come a time when the machine weighs the data in a manner which intelligent humans do, disregarding some (forgetting) and leaning more heavily on others to shape its "opinions". Some of the data will become "hard coded" into the machine's core being, as if they were life experiences like those which shape humans. They are evolving this technology today to more closely mimic the human process of thinking. And that's possibly coming sooner than we realize.


As for not needing AIs to get to the Moon or Mars, some people read and discard info or skim the words without learning. If they read back through these threads on AI use, they'd find that some people here in the AH have used AIs in code development and design. The AIs are ALREADY being used to build the systems to get to Mars.
 
The AIs are ALREADY being used to build the systems to get to Mars.
The systems that are amenable to AI use are not even in the same general area as those that run the embedded systems of interplanetary probes.

AI is most applicable to languages and frameworks with large public corpuses of data: **********, SQL, Java, Python being the most common.

Embedded systems are such a small ecosystem relative the "public" web, and the kind of bespoke development that happens for interplanetary control systems is such a ludicrously small part of that, that no AI on earth is going to be able to generate something even remotely workable.

I doubt that will change in my lifetime, either.

edit: lol. The fucking forum software censored java script.
 
The systems that are amenable to AI use are not even in the same general area as those that run the embedded systems of interplanetary probes.

AI is most applicable to languages and frameworks with large public corpuses of data: **********, SQL, Java, Python being the most common.

Embedded systems are such a small ecosystem relative the "public" web, and the kind of bespoke development that happens for interplanetary control systems is such a ludicrously small part of that, that no AI on earth is going to be able to generate something even remotely workable.

I doubt that will change in my lifetime, either.

edit: lol. The fucking forum software censored java script.
Funnily enough, SpaceX used java script for their crew module display code. But most code that flies, whether on Earth or in space, is written in Ada or C/C++. I wouldn't say it's technically impossible for gen AI to contribute, but there's no way the people writing the code would trust it. There are strict standards against even writing C++ code that would result in compiler-generated code (each class needs its constructors explicitly implemented, even if it's just empty braces). We don't even want the compiler adding code, so why would we want a stochastic AI mucking around?
 