Fake is Fake And I Hate Fake In All Forms

Generative AI's the big hot-button now. In some ways it feels like a genuinely new thing, one that brings with it the capacity for a new and pernicious kind of fakery. But is there anything really new here? People have been deliberately generating bad text and bad images for as long as those things have existed. Is the capacity to generate them on demand and with little effort really that different?
The crucial difference between generative AI and everything else you mentioned is that AI is built around stealing from existing works, most often without consent. Every AI-generated story, every shitty AI image... those are all hodgepodged together (very badly) from something an actual human took the time and effort to create.

Conflating AI use with CGI, practical effects, new editing techniques, etc., is a disservice to the people who develop those new tools. Yes, they're tools that help people create the next best thing, but they're not doing that by stealing.
 
Well, the recent MIT study is extremely disturbing. According to the study, ChatGPT may be eroding our critical thinking skills (LINK). AI is not being used to assist, but to take the place of actual work being done. People are too lazy to put in the work, so they have AI do it for them.
 
Point is, is generative AI 'fake'? I guess. But it's not new fake. It's just easier fake. The problems it creates will be shaped by the medium in which it operates.
I fundamentally disagree with your point here. I teach a very popular course on the history of technology. I am very aware of the impact of new technologies, of the people who fawn over any new technology, and of those who have an equally knee-jerk negative reaction to them. People who want to draw a line and say nothing after X. Paper. The Hindu-Arabic number system. Cameras. Recorded music. They were all going to be the end of civilization in one way or another.

Maybe I am just getting too old, but I really do think generative AI is different in a fundamental way. It combines the worst aspects of cameras and recorded music into one piece. It is not creating anything. It is stealing actual creative artists' work and mashing it together in ways never possible before. In some sense, cameras do not create anything new. But they can capture a new vantage point, a new experience. Photography is most definitely a creative art, which many artists denied in the mid-nineteenth century. Recorded music does not create something new; the performance that is being recorded is what is new. It simply enables sharing that performance with a larger audience, larger in both place and time. But it does not obscure the artist, the creator. Even a hundred years after the invention of the phonograph, in the golden age of vinyl records, when the recording engineer did have creative powers, knowing who that was on an album was an obscure bit of trivia. Listeners want to know who the performer is, maybe the songwriter, maybe even the producer.

Generative AI just steals actual creative work, with no credit, to produce a mediocre mishmash. I guess it could be considered a democratization of the arts, but that is the worst kind of democratization: forcing everyone together into mass mediocrity. Vive la différence.

Enough of a rant.
 
I see the argument likening anxieties about generative AI to traditional resistance to new technologies. It's hard to deny the similarities, and I can admit that I likely have blind spots, being inside my own cultural context. At every point along that chain of innovation and resistance, I'm sure people confidently said, "But this is different!"

But... this does feel different.

I'm sure generative AI can be used as a tool like any other to make enhancements or smooth the edges on a human creation. I've seen short stories published that were written in "collaboration" with ChatGPT or others, where part of the point is to play with the technology and see how it can be tweaked and steered to make something interesting.

And as a tool, I'm okay with it. Use what's available, make yourself better. I'm not drawn to it personally -- I happen to like the process itself, choosing specific words, replacing them with better ones -- but I'm sure used in moderation it can have value.

When CGI is overused the result is ugly and a little soulless. CGI characters don't move quite naturally, and they look a little flattened to me, even with the latest and greatest technology. The best effects use a modest amount of CGI to polish up practical effects wherever possible. Jurassic Park still looks great 30+ years later, in my opinion, because of the judicious mixing of the two.

When AI is overused, it bypasses the creative process entirely. It produces, by definition, derivative schlock, but it's derivative schlock that doesn't even have the added benefit of being written. It's ugly and soulless on a whole different level.

I'm not myself a fan, generally, of derivative schlock. It can be good in movie form on a hungover Sunday or something, a mindless diversion. But as derivative and as schlocky as it gets, at least it feels human. I think that gets lost when you start to lean on AI.
 
When AI is overused, it bypasses the creative process entirely. It produces, by definition, derivative schlock, but it's derivative schlock that doesn't even have the added benefit of being written. It's ugly and soulless on a whole different level.

I'm not myself a fan, generally, of derivative schlock. It can be good in movie form on a hungover Sunday or something, a mindless diversion. But as derivative and as schlocky as it gets, at least it feels human. I think that gets lost when you start to lean on AI.

I'm not much of a tech person, but it seems pretty likely to me that AI is going to keep cranking out errors which later generations of AI will take in as data and regurgitate, in an endless loop of less and less coherent information, until the whole system breaks down into chaos, the way photocopies of photocopies decline in quality with each iteration.
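Just to make that photocopy analogy concrete, here's a toy sketch of the feedback loop I mean (a made-up simulation, nothing to do with how any real system is actually trained): each "generation" fits a simple model to samples produced by the previous generation's model, so the small estimation errors compound.

```python
# Toy sketch of the "photocopy of a photocopy" loop (illustrative only, not a
# real training pipeline): each generation re-fits a simple model -- here just a
# normal distribution -- to a finite sample drawn from the previous generation's
# model, so estimation errors compound from one generation to the next.
import random
import statistics

random.seed(0)

mean, stdev = 0.0, 1.0       # generation 0: stands in for the original human-made data
sample_size = 20             # each new "model" only sees its predecessor's output

for generation in range(1, 51):
    data = [random.gauss(mean, stdev) for _ in range(sample_size)]
    mean = statistics.fmean(data)    # re-estimate the model from generated data
    stdev = statistics.stdev(data)
    if generation % 10 == 0:
        print(f"generation {generation:2d}: mean={mean:+.3f}  stdev={stdev:.3f}")

# Typically the spread (stdev) shrinks and the mean drifts away from the original
# over many generations: the copies keep less and less of the original variety.
```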
 
I'm not much of a tech person, but it seems pretty likely to me that AI is going to keep cranking out errors which later generations of AI will take in as data and regurgitate, in an endless loop of less and less coherent information, until the whole system breaks down into chaos, the way photocopies of photocopies decline in quality with each iteration.
This is already happening in real time. Every garbage AI article gets added to the pile that gets scraped the next time someone runs a query. Every shitty AI image gets used the next time someone feels like casually creating a profile image or cover art. And more and more companies continue to double down on using AI while it's unregulated. It's death by a thousand papercuts in one of the most ridiculous ways imaginable.
 
I'm not much of a tech person, but it seems pretty likely to me that AI is going to keep cranking out errors which later generations of AI will take in as data and regurgitate, in an endless loop of less and less coherent information, until the whole system breaks down into chaos, the way photocopies of photocopies decline in quality with each iteration.
That's kinda my hope. I don't really know what to expect. I see those reports and hear the prognostications of just that result from people who seem to know what they're talking about, and I believe them. But I also hear and read things about people within the industry talking about the inevitability of AGI, how AI will become a self-perpetuating loop of improvement until it surpasses human cognition. And I don't exactly believe them, but I guess I'm too cautious to dismiss the possibility out of hand. I'm sure it's exaggerated, but it wouldn't take quite the huge leaps they're predicting to really turn shit sideways.

Sometimes it's hard not to assume the worst possible outcome. I blame the daily news.
 
I'm not much of a tech person, but it seems pretty likely to me that AI is going to keep cranking out errors which later generations of AI will take in as data and regurgitate, in an endless loop of less and less coherent information, until the whole system breaks down into chaos, the way photocopies of photocopies decline in quality with each iteration.
GIGO, according to one of my AI sources:

"Garbage in, garbage out" (GIGO) is a data principle stating that the quality of the output is directly dependent on the quality of the input. In other words, if you feed a system flawed, inaccurate, or irrelevant data (the "garbage in"), the results you get (the "garbage out") will be equally flawed or unreliable. This principle applies to various fields, including computer science, data science, and artificial intelligence.
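For what it's worth, here's a minimal sketch of that principle with made-up numbers (hypothetical sensor readings, chosen only for illustration): the exact same calculation turns out garbage the moment garbage values are left in the input.

```python
# Minimal GIGO illustration with hypothetical data: identical, bug-free code,
# but the output quality tracks the input quality.
from statistics import fmean

clean_readings = [21.3, 21.1, 20.9, 21.4, 21.2]        # plausible room temperatures (C)
dirty_readings = clean_readings + [-999.0, -999.0]     # "-999" sensor error codes left in

print(f"average of clean input: {fmean(clean_readings):8.2f}")  # about 21.2 -- sensible
print(f"average of dirty input: {fmean(dirty_readings):8.2f}")  # about -270.3 -- garbage out
```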
 
But I also hear and read things about people within the industry talking about the inevitability of AGI, how AI will become a self-perpetuating loop of improvement until it surpasses human cognition.
They are either true believers or they are management trying to sell you on AI.

AGI is decades or more away. Current AI is in no way intelligent, let alone conscious/sentient.
 
They are either true believers or they are management trying to sell you on AI.

AGI is decades or more away. Current AI is in no way intelligent, let alone conscious/sentient.
I absolutely agree re: current AI. And I'm not predicting imminent AGI. I hope you're right.
 