Advice from veteran authors

I've read an interview/discussion on the subject of AI recently, and one of the participants pointed out this fallacy. I thought it was an interesting aspect of the whole AI conversation. Intelligence and education can take us far, far enough to understand a Highly Complicated System, such as a computer. As incredibly complicated as they are, and almost appearing magical to people who don't know much about them, computers are still predictable. We know exactly what their capabilities are.
Academically they are seen as a dead end with respect to AGI. The fact that they are unpredictable and may have other capabilities isn't normally seen as changing this perspective. You can maybe use your Cybertruck to keep pigs in, but that doesn't mean it's inherently a cool product.
 
I think this is mixing terminology from two fields, which may be great for SciFi, less so for science.
I've seen at least two unrelated experts in the field call LLMs complex systems. Again, I am not sure if this is a universally accepted classification.

I agree about the comparison with AGI, but AGI isn't here yet and won't be for years, maybe decades. So LLMs, with all their issues and challenges, will be the basis of our AI experience for the next five to ten years at least, and so much can happen in that period of time. A whole paradigm shift maybe.
 
LLMs are a step up from that, moving into Complex Systems. They might appear the same as a highly complicated system, but that isn't so. We can't predict their output or their evolution with 100% certainty, nor can we try to understand the way they work by seeing them as a sum of their parts - the way we can analyze highly complicated systems. I think that's an important point.
There's a subtle flaw in reasoning going on here, which is to place highly complicated systems and complex systems on some kind of spectrum with each other. They're just different kinds of systems. One is not more advanced than the other, nor are they exactly mutually exclusive. LLMs, after all, run on computers.

The climate is a complex system. The human body is a complex system. Neither is more 'advanced' than a computer. The computer has the advantage of being deterministic, but that's a limitation through another lens.

So the flaw in reasoning is to assume that LLMs being a complex system signals some kind of advancement. It doesn't. We've been able to build complex systems within computers for about as long as there have been computers. This has absolutely nothing whatsoever to do with LLMs' capacity to become AGI, which, as far as I can tell, is decidedly zero.
 
There's a subtle flaw in reasoning going on here, which is to place highly complicated systems and complex systems on some kind of spectrum with each other. They're just different kinds of systems. One is not more advanced than the other, nor are they exactly mutually exclusive. LLMs, after all, run on computers.

The climate is a complex system. The human body is a complex system. Neither is more 'advanced' than a computer. The computer has the advantage of being deterministic, but that's a limitation through another lens.

So the flaw in reasoning is to assume that LLMs being a complex system signals some kind of advancement. It doesn't. We've been able to build complex systems within computers for about as long as there have been computers. This has absolutely nothing whatsoever to do with LLMs' capacity to become AGI, which, as far as I can tell, is decidedly zero.
I am not sure how you inferred all that from my post. I never said LLMs were an advancement of anything. I merely pointed out that many people assume experts can wholly predict the behavior of their LLMs, which would be possible if they were complicated systems, but they aren't. I also never mentioned LLMs evolving into AGI?
 
I am not sure how you inferred all that from my post. I never said LLMs were an advancement of anything. I merely pointed out that many people assume experts can wholly predict the behavior of their LLMs, which would be possible if they were complicated systems, but they aren't. I also never mentioned LLMs evolving into AGI?
Because of the language you used:

Intelligence and education can take us far, so far as to understand a Highly Complicated System, such as a computer. As incredibly complicated as they are, and almost appearing magical in their nature to the people who don't know much about them, computers are still predictable. We know exactly what their capabilities are.
LLMs are a step up from that
Step up in what sense if you didn't mean advancement?

Complex systems aren't exactly difficult to 'understand', they just have chaotic outputs and so are near-impossible to predict. Being predictable isn't exactly the same thing as being understandable.

I'm not really trying to argue with you, it's just that the hype men have a habit of using language like this to muddy the waters. As you say, most of this is so far over most people's heads it might as well be magic. The fact that LLMs have unpredictable outputs is not a signal of anything other than the fact that they are coming out of a complex system, which is a tautological observation based on the definition of a complex system.
 
Step up in what sense if you didn't mean advancement?
A step up in complexity, not an evolutionary step. I am certain no one here thinks an LLM is an evolution of a computer.

Step up in what sense if you didn't mean advancement?

Complex systems aren't exactly difficult to 'understand', they just have chaotic outputs and so are near-impossible to predict. Being predictable isn't exactly the same thing as being understandable.
Can you tell me exactly which complex system we completely understand?
This is an error in logic, as predictability comes directly from the understanding of how a system works. The apparent randomness we see in complex systems comes as a result of our understanding not being at 100%.
You mentioned the human body as an example of a complex system. I am certain you are aware that after all the research and time invested in understanding the human body, and all our great advancements, we still don't understand how some things in our bodies happen.

The fact that LLMs have unpredictable outputs is not a signal of anything other than the fact they are coming out of a complex system, which is a tautological observation based on the definition of a complex system.
A less-than-predictable output isn't the only characteristic of a complex system, nor the only difference in comparison to a complicated system.
 
I've seen at least two unrelated experts in the field call LLM a complex system. Again, I am not sure if this is universally accepted classification.
LLMs are complex systems, but that's not the same as Complex Systems, which is a precise mathematical term. From what you cited, they seemed to be conflating the two.
 
A step up in complexity
Yeah. That's the error in reasoning. It isn't true. The language is confusing because it is jargon, and it has specific use cases for specific domains.

There are such things as simple complex systems and complex simple systems. Simple and complex have very specific meanings in this context that are different from the colloquial usage.

In any case, the climate is an example of a complex system that is well-understood. It's really not even that hard to understand 99% of how it works. But it's impossible to predict weather with certainty, regardless of one's level of knowledge of it. You are incorrect that our level of understanding of something is always correlated with our ability to predict it. Sometimes understanding of a thing only leads to the certainty that it is impossible to predict. Usually in fields like this, predictions are issued with probabilities, as in the case of weather predictions. The determinism or chaos of a system is unrelated to how difficult that system is to understand.
 
This is an error in logic, as predictability comes directly from the understanding of how a system works. The apparent randomness we see in complex systems comes as a result of our understanding not being at 100%.

No, understanding the rules of a system does not necessarily give us the ability to predict its behaviour perfectly. Examples: the halting problem, Conway's Game of Life, the three-body problem. In these cases, we have a complete and perfect understanding of the rules governing these systems, but in general it's not possible to perfectly predict their behaviour.
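Since the Game of Life came up, here's a minimal sketch (the implementation is mine, not anything from the thread) that makes the point concrete: the complete rule set fits in a few lines, yet because Life can emulate a Turing machine, the long-run fate of an arbitrary pattern is provably impossible to predict in general.

```python
# A minimal sketch of Conway's Game of Life. The two rules in `step`
# are the *complete* specification of the system.
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Count the live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Rule 1: a dead cell with exactly 3 live neighbours is born.
    # Rule 2: a live cell with 2 or 3 live neighbours survives.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" is trivially predictable: it oscillates with period 2.
blinker = {(0, 0), (1, 0), (2, 0)}
assert step(step(blinker)) == blinker
```

Simple patterns like the blinker are still predictable, of course; the undecidability only bites for arbitrary starting patterns.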
 
No, understanding the rules of a system does not necessarily give us the ability to predict its behaviour perfectly. Examples: the halting problem, Conway's Game of Life, the three-body problem. In these cases, we have a complete and perfect understanding of the rules governing these systems, but in general it's not possible to perfectly predict their behaviour.
Which is why two things are clearly being conflated here.
 
Which is why two things are clearly being conflated here.
Indeed. To the hype men's benefit, unsurprisingly.

The output of LLMs is difficult to predict because of their properties as a Complex System (several components interacting), not because they are more advanced or represent an evolution of technology. In the same way that the result of a simulation of the Three Body Problem is difficult to predict.

This is increasingly a drawback of LLMs, in any case. It makes it very difficult to solve the problems they create, because it is not always obvious why the problems happened to begin with, and so it is not always obvious what ought to be done to solve them. A tool being a complex system is generally not a good thing. Imagine if your car had to be worked on by the weather man.

"Well, you see, there's a 60% chance that your cam shaft is going to go out in the next 3 months. Of course, there's a 23% chance it could last another 10 years. Who can say? So, what do you want to do?"
 
A really dumbed down definition of a complex system in math is one whose governing equations are easy to state, but which leads to outcomes that are not just hard to predict, but may be rigorously proven to be impossible to predict.

The canonical example is the logistic map, x_{n+1} = r x_n (1 − x_n), where r > 3:

[image: behaviour of the logistic map for r > 3]
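For anyone who hasn't met it, a quick numerical sketch of what that chaos looks like (the code and the choice of r = 3.9 are mine, purely for illustration):

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n)
def orbit(x0, r=3.9, steps=50):
    """Iterate the map `steps` times from x0 and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = orbit(0.2)
b = orbit(0.2 + 1e-10)  # same start, perturbed in the 10th decimal place
# Early on the orbits agree almost exactly; by the end of the run they
# are completely decorrelated, despite perfect knowledge of the rule.
early_gap = abs(a[1] - b[1])
late_gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
```

The equation is fully understood; the unpredictability is a property of the dynamics, not of our ignorance.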
This is not the same as the black box element of LLMs.
 
To test some of the comments in posts here, I fed one of my stories to ChatGPT (complete with grammatical mistakes, little idea of how people talk in the West, etc.) and asked it to proofread and refine it so that it reads as if set in the USA.
To my surprise it did really well - shockingly well. The premise remained the same, but the language changed as if I 'knew' English - at least to my eyes.
It also ruined certain sections by removing all emotion and making them mechanical - if you read those portions, you know it is a machine talking.
 
I'm going to weigh in, gingerly, on this ChatGPT/AI issue, because I have nothing to say about it from a technical standpoint, being ignorant. My perspective is that of a lay user.

I personally don't think it matters what tools you use. It depends on how you use them. You can, if you want, use a whole array of AI-type tools, and you can still write with your own voice. It just requires some conscientiousness as you write. AI is the future, and there's no avoiding it, so the issue shouldn't be how to avoid AI but how to continue to be an original, creative artist in an AI world.
 
Amen. Jolee Bindo's path always seemed the most logical and realistic path between two opposite peaks of Zealotry.

*Steps up on soapbox*

I've got to credit Noah Caldwell-Gervais for exposing me to this specific analytical lens, but I've found that by far the most interesting way to look at Star Wars in general and KOTOR 1&2 in particular is via Joseph Campbell and The Hero With a Thousand Faces. This is because that particular work and line of thinking became so commonly applied in American media up until maybe 5-10 years ago that the entire thing can be read as a meta-meta commentary on the state of American storytelling (and culture, by extension) in that time. This requires laying a bit of pipe, so if you don't care, please feel free to ignore me.

Star Wars (the original trilogy) was consciously written to be the ur-hero's journey after Lucas was exposed to Campbell's work in college. Lucas isn't shy about this. And, funny enough, KOTOR was consciously written to be an even more exact 1:1 translation of each and every step of the hero's journey than Lucas's originals were.

And that's where it gets real interesting, because KOTOR 2 was consciously written to be an incisive repudiation of the very concept of the hero's journey, picking it apart piece by piece, even to the detriment of the game itself.

The Force is kinda the quintessence of Campbell's and by extension Lucas' worldview, and the Jedi is Lucas' attempt to square that worldview with an ethical framework. I find this absolutely fascinating, because it seems to me to have accidentally encapsulated the glaring limitations of this view of the world, which has a lot of implications for the real world, given how influential both Star Wars and THWaTF became. In short, it is inherently authoritarian to attempt to flatten all culture into a monoculture, and that original sin will cause many problems regardless of how benevolent its practitioners appear. It's an inherent critique of hegemony by way of simply portraying a straightforward hegemony.

Ok, that pipe laid, Jolee Bindo is a pretty great representation of the best that a regular-type person could do under the thumb of the Jedi/Sith forced dichotomy. But fundamentally, he feels he is forced to accept this status quo, and to a large extent that is true, in that he is not willing to pay what it would cost to buck that system even more than he already does. And this is where you can bring in ideas like Capitalist Realism and Fukuyama's vaunted The End of History to encapsulate both the incredible hubris of the Jedi and also the incredible power of a hegemonic status quo that most people are absolutely convinced is completely impossible to change.

And that's why you have Kreia in KOTOR 2. What does it take to actually disrupt an intractable status quo? Well, a Force anarchist, of course. Borrowing a character concept from Planescape: Torment to come in and wreck the Jedi and Sith's bullshit (and Lucas and Campbell's by extension) (and the entire media-industrial complex built on top of The Hero With a Thousand Faces by extension extension). Kreia is not a regular-type person. She's willing to fuck some shit up, and she has, and she does. Doesn't matter what it takes, doesn't matter what it costs, she's going to burn that shit down or die trying.

Jolee and Kreia are both really interesting and imperfect approaches to being countercultural in the face of a calcified, decaying hegemonic power. I've got a sense that some kind of synthesis of the two would be a fascinating thing to explore, and maybe some day I'll feel masochistic enough to try.

And if that was at all interesting to anyone, check this out. Noah covers this in a lot more detail than I could.
 
*Steps up on soapbox*

I've got to credit Noah Caldwell-Gervais for exposing me to this specific analytical lens, but I've found that by far the most interesting way to look at Star Wars in general and KOTOR 1&2 in particular is via Joseph Campbell and The Hero With a Thousand Faces. This is because that particular work and line of thinking became so commonly applied in American media up until maybe 5-10 years ago that the entire thing can be read as a meta-meta commentary on the state of American storytelling (and culture, by extension) in that time. This requires laying a bit of pipe, so if you don't care, please feel free to ignore me.

Star Wars (the original trilogy) was consciously written to be the ur-hero's journey after Lucas was exposed to Campbell's work in college. Lucas isn't shy about this. And, funny enough, KOTOR was consciously written to be an even more exact 1:1 translation of each and every step of the hero's journey than Lucas's originals were.

And that's where it gets real interesting, because KOTOR 2 was consciously written to be an incisive repudiation of the very concept of the hero's journey, picking it apart piece by piece, even to the detriment of the game itself.

The Force is kinda the quintessence of Campbell's and by extension Lucas' worldview, and the Jedi is Lucas' attempt to square that worldview with an ethical framework. I find this absolutely fascinating, because it seems to me to have accidentally encapsulated the glaring limitations of this view of the world, which has a lot of implications for the real world, given how influential both Star Wars and THWaTF became. In short, it is inherently authoritarian to attempt to flatten all culture into a monoculture, and that original sin will cause many problems regardless of how benevolent its practitioners appear. It's an inherent critique of hegemony by way of simply portraying a straightforward hegemony.

Ok, that pipe laid, Jolee Bindo is a pretty great representation of the best that a regular-type person could do under the thumb of the Jedi/Sith forced dichotomy. But fundamentally, he feels he is forced to accept this status quo, and to a large extent that is true, in that he is not willing to pay what it would cost to buck that system even more than he already does. And this is where you can bring in ideas like Capitalist Realism and Fukuyama's vaunted The End of History to encapsulate both the incredible hubris of the Jedi and also the incredible power of a hegemonic status quo that most people are absolutely convinced is completely impossible to change.

And that's why you have Kreia in KOTOR 2. What does it take to actually disrupt an intractable status quo? Well, a Force anarchist, of course. Borrowing a character concept from Planescape: Torment to come in and wreck the Jedi and Sith's bullshit (and Lucas and Campbell's by extension) (and the entire media-industrial complex built on top of The Hero With a Thousand Faces by extension extension). Kreia is not a regular-type person. She's willing to fuck some shit up, and she has, and she does. Doesn't matter what it takes, doesn't matter what it costs, she's going to burn that shit down or die trying.

Jolee and Kreia are both really interesting and imperfect approaches to being countercultural in the face of a calcified, decaying hegemonic power. I've got a sense that some kind of synthesis of the two would be a fascinating thing to explore, and maybe some day I'll feel masochistic enough to try.

And if that was at all interesting to anyone, check this out. Noah covers this in a lot more detail than I could.
I know that AH has true champions of derailment, but I'd say that the topic of KOTOR, the only great piece of Star Wars after episode three, deserves more than to be another derailment of the original topic.
Maybe it is god damn time to have a dedicated SW thread? ;)
And it's probably fair if @Kumquatqueen gets the honor of creating it :p
 
You shut your whore mouth. Rogue One stands with Empire and RotJ (yes, even with those fucking Ewoks) as the three best Star Wars movies ever made. KOTOR can join them in the shrine, though, for reals.
I never got very far with KOTOR (I've been to Kotor, though), but the hierarchy of SW movies is:

Empire
Rogue One
A New Hope
Jedi (might rank higher if they'd used Wookiee instead of Ewoks)
The space battle from the start of Revenge of the Sith
A big pile of disappointment that includes pretty much everything else
 
You shut your whore mouth. Rogue One stands with Empire and RotJ (yes, even with those fucking Ewoks) as the three best Star Wars movies ever made. KOTOR can join them in the shrine, though, for reals.
Haha, I liked Rogue One, and I think it's a good movie to be honest, but I've always seen it as... tangential Star Wars, sort of. Andor is pretty decent as well, but it has the same level of importance to the big story.
 
Hands down, the Vader scene in Rogue One is my favorite scene in the entire franchise. First time on screen that Vader lived up to his name.
 