"So long, my friends, and thanks for all the fish!" - Replaced by AI - Not your usual AI post.

Yeah.

Sometimes, after reading threads on the internet, I'm not sure it's such a bad thing that we're getting sidelined.

But I think you're underestimating humanity a bit.

Again, nope. We can even take a thread designed to let us know about a colleague's misfortune and turn it into a dick-measuring contest. It's like we can't even control ourselves sometimes.

We have all the ability in the world, which is why it's so frustrating that we just piss it away.

Anyway, best of luck, OP. Best wishes.
 
Sorry to burst your bubble, but I HAVE quit jobs when dealing with complete morons!

I walked out of one job after ...
I've walked out of more than one after some clashes.

One lady threw a fit because I didn't say hello to her in an elevator. I had stuff on my mind and didn't even know anyone was there. She complained to my boss who complained to me. I quit almost on the spot. Just had to wait a day or two to get all my personal belongings out of the building before I did it.
 
FWIW, I quit a job on the occasion of a co-worker being "laid off" (yeah, right) two weeks before his pension vested. He was screwed, and it simply underscored the evil I was dealing with.

In the end, I jumped from the frying pan into the fire. My new employer badly misrepresented (i.e., lied about) the job and the company's financial condition. It took three months to fully audit the books, and I quit on a day's notice once the reality and bottom line came into focus.

I ended up taking a job at 2/3 my salary at the original place, and it was yet another job I hated. To say my industry at the time was in flux is an understatement.
 
Again, nope. We can even take a thread designed to let us know about a colleague's misfortune and turn it into a dick-measuring contest. It's like we can't even control ourselves sometimes.

We have all the ability in the world, which is why it's so frustrating that we just piss it away.
Ever the pessimist 🙃

I could be wrong, and I'm willing to be wrong. And if I was, I apologize. But my intent was call-out, not cock-off. And I couldn't agree more that the garbage I was attempting to put an end to should not be what this thread is about, and I said as much.

I've spent enough time in my life staying quiet when shit's going sideways. Learning how to speak up and what method to use when means mucking that up a bunch of times till I get it right. I'd rather fuck that up a thousand times than stop learning.
 
I never waited to be obsolete!

I always looked to improve my skills, and I started looking for a new job within months of starting my then-current one. And it didn't take student loans or a college degree to make myself more valuable. A book was all I needed, plus time to read and take practice exams to earn tech certs. I became an MCSE through self-study and paying to take the tests.

And I didn't have a rich family to back me. When I graduated high school, I left home with twenty dollars and a few clothes, and never again depended on my parents to support me. In fact, years later, I was giving my father money when his steel mill went on an extended strike! The union made demands and drove the company into bankruptcy! So, I lose no sleep over those who feel entitled to a job and a paycheck. I learned the hard way that I had to look out for myself and my family, regardless of the shitty work I had to endure! And after 42 years of relentless WORK, I don't owe anyone anything, and I can relax with what I have.
So, this isn't really doable anymore.

If you don't have the required degree, your resume never even lands on the desk of a potential employer because it has been weeded out by a machine scanning it for keywords.

There was a time, long ago, when you could get in front of the person hiring for a job and convince them you knew your shit even without the degree the job "requires". That time is long gone, and now, unless you know someone who knows someone who can get you past the initial screening process, your resume is never even seen by a human and you have almost zero chance of proving yourself on self-taught skill alone.

Even if you have the appropriate certifications, if the job lists a degree in the requirements and you don't have one, no human ever sees your resume. That holds even when the posting claims you should apply without meeting all of the requirements.

Very few companies skip that initial bot scan of resumes to weed out any that don't meet the minimum requirements. Some will pass through a handful that meet most but not all of the requirements, and then those get rejected by a human who doesn't want to take the risk of the person not lasting in the role and costing the company the expense of replacing them.

Basically: without a degree listed on it, there is currently no guarantee your resume is ever seen by a human.
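To put a finer point on it: that first-pass screen is usually nothing more sophisticated than keyword matching. Here is a minimal sketch of how such a filter might work; the keyword lists, threshold, and sample text below are made up for illustration, since real applicant-tracking systems are proprietary and considerably more elaborate.

```python
# Rough sketch of the kind of first-pass resume screen described above.
# The keyword lists and threshold are hypothetical; real applicant-tracking
# systems are proprietary and more elaborate, but the principle is the same:
# no degree keyword, no human ever reads the resume.

DEGREE_KEYWORDS = {"bachelor", "b.s.", "bs in", "master", "m.s."}
SKILL_KEYWORDS = {"python", "sql", "aws", "mcse", "networking"}
MIN_SKILL_HITS = 2

def passes_initial_screen(resume_text: str) -> bool:
    text = resume_text.lower()

    # Hard gate: no mention of a degree means the resume is dropped here.
    if not any(kw in text for kw in DEGREE_KEYWORDS):
        return False

    # Soft gate: enough skill keywords to look like a plausible match.
    skill_hits = sum(1 for kw in SKILL_KEYWORDS if kw in text)
    return skill_hits >= MIN_SKILL_HITS

if __name__ == "__main__":
    self_taught = "MCSE certified, 15 years of networking and SQL, no college"
    print(passes_initial_screen(self_taught))  # False: filtered before any human sees it
```

Swap in whatever buzzwords you like; the point is that the gate is mechanical, which is exactly why "convincing the hiring person you know your shit" never gets the chance to happen.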

And for what it's worth, no, the customer isn't always right. There are many jobs and businesses where, if they abided by what the customer wanted, they would be in violation of codes or their own ethics. Many businesses will turn down your money on principle if you ask them to do something they cannot legally or ethically do. And some will turn you down just because they don't like you. Many contractors who come out to do a quote on a job will give the owner a "fuck off" price because they don't want the job once they meet the customer and see how difficult they are.

These people are not hurting for customers, largely because they show an ethical bent towards quality over appeasing someone's personal whims.
 
I never said: "Don't get a degree".

I said, plan ahead, improve your skills, and always look for another job!!!

The OP formed a clique/union with the co-workers and demanded the company do things their way! I pointed out how a similar union strike bankrupted the factory where my own father worked! That union wanted things done their way! Those who pay you to do a job are entitled to get the job done the way they want it done! Otherwise, as happened, they took the money they had previously been paying the OP elsewhere. In my father's case, the bankrupted company was bought out and many lost their jobs! The collective DEMANDS failed the individuals who lost their jobs.

But some others have shown themselves incapable of learning from the mistakes of others, closing their eyes and ears, and are thus doomed to make the same mistakes.

When humans learned to write, they achieved the ability to impart knowledge and experience to others. But it's only through sharing those experiences, reading, and attempting to understand the lessons that anyone LEARNS!
 
This may come as a shock, but people with money can be wrong about things. And when you work for them, you're not exactly doing yourself any favors by indulging their ignorance and shortsightedness for fear of your job. They're just as likely to endanger your job via hubris because you were too afraid to say something within your domain of expertise as they are to get rid of you for standing up to them. It's a no-win scenario for labor.

If you got by on being a yes-man, good for you. That's more luck than skill, I'm sorry to tell you. And wagging your finger at somebody else who had worse luck than you with that problem is not a good look.

You are making this out to be a binary choice.
Refuse to do what your boss asks or be a "Yes man".
If I have an issue with something my boss wants me to do, I raise my objections in a polite, professional manner and state my case.
"I think we should reconsider this because..."
If my boss insists, then as long as it isn't unsafe, illegal, or unethical, we're doing it the boss's way.
That isn't being a "Yes man"; it's recognizing your role in an organization.
 
The number of job losses to AI is going to explode as AI-driven androids become readily available.

This isn't science fiction. This is going to happen for the convenience and economics of those who already have money. It's going to make it so they don't need to pay or rely on employees for their wants and needs.

Robots can and will continue to get more cost effective for:

Care for elderly
Care for children
Walking dogs
Yard work
Driving
Shelf stocking
Piloting
Delivery
Construction
Warehouse work
Farming
Etc.

Shit tons of jobs are going to disappear unless... what?


Also, I still see many people saying that AI only relies on the training data that is spoon-fed to it. Wake up. Get your head out of the sand and pay attention. AI also self-learns from experience. Android robots designed to interact with people will have their own novel experiences and will not rely solely on programming.


"Freedom"
 
Many of those things have already been automated over the past 30-40 years, and jobs HAVE already disappeared, putting people out of work!

That shit has always happened, from the time people making carriages and selling horses had to endure layoffs when the Model T came out!

Things change! And those who insist on telling their boss "Don't tell ME how you want the carriage built, because I know better!" are DARING their boss to build Model T's! Then they won't need you making carriages anymore or have to listen to you talking back to them.
 
I have a very strong suspicion that there is a watershed moment coming in the very near future, where companies discover that an AI is great up until it fails - at which point they've outsourced or retrenched anyone with the tribal knowledge to save the company.

I've played with AI image generation, text generation (never for stories, but just to see how it worked), and code generation. Of those three disciplines, I'm a competent writer, a terrible artist, and a professional developer, and I'll tell you this: there's going to be a fucking apocalypse in a few years because of vibe coding. Maybe several of them.

As part of my job, I've recently started using AI to throw out boilerplate code here and there, the same sort of stuff I'd normally have had a junior dev do in the past, or, in more recent years, copied off of Stack Overflow or Experts' Exchange. I'd resisted it for a while, but I wanted to see what all the fuss was about, and I have to admit that it was pretty eye opening.

However, not necessarily for the reasons you're thinking.

Because I'd played with other text-based LLMs in the past, I knew about some of the foibles of the technology: hallucinations, forgetting the thread when the conversation went too long, etc. But what I hadn't realized was how fucking sociopathic it is.

I know that sounds like anthropomorphizing, and it is, but that's the best way I can describe it. Previously, I'd been told to treat it like a very enthusiastic, fairly dim junior dev. That's somewhat accurate but misses one caveat: it lies constantly. Or, not lies, because there's no malice attached, but it's 100% sure it's correct, presents "fixes" that will break things, and then immediately does a 180 and tells you "oh, you're right" as soon as you challenge it.

It can't know that it's wrong, because that's not how the tech works, so it instead presents everything with absolute confidence. It feels like talking to some combination of the most obsequious asshole and the most backstabbing colleague I've ever worked with. If you don't know your code and the language you're working with backwards and forwards, AI will 100% fuck you over.

A few highlights:

  • It told a client of mine to store API keys in plaintext in a mobile app; for the non-technical, that's like publishing your bank account info in the yellow pages.
  • It gave me code that literally could not and would never run, then gave me a different set of code that also couldn't run while assuring me this would definitely fix it.
  • It wrote code that would have destroyed data if I'd run it as-is without testing first, because of a difference in how two similar languages handled cutting a few characters out of the middle of a string (like a word, for the non-technical); there's a sketch of that kind of slip right after this list.
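To make that third one concrete: one common version of that kind of slip is confusing "second argument is an end index" with "second argument is a length" when slicing strings. The actual incident involved a different pair of languages, so treat this Python sketch as an illustration of the pattern rather than the real code.

```python
# Hypothetical illustration of the "cut a few characters out of the middle
# of a string" bug. Python stands in for the actual pair of languages; the
# slip is treating a slice's second bound as a length instead of an end index.

record = "ID-00042-ACTIVE-2024"
start, length = 3, 6            # the six characters "00042-" in the middle

# Python slices take an end index, so the chunk to remove is:
chunk = record[start:start + length]                  # "00042-"

# Code mechanically carried over from a language whose substring call takes
# (start, length) comes out like this instead; it runs fine and is wrong:
chunk_wrong = record[start:length]                    # "000"

# Removing the chunk, done correctly:
cleaned = record[:start] + record[start + length:]    # "ID-ACTIVE-2024"

print(chunk, chunk_wrong, cleaned)
```

Run something like that against live data without testing and it quietly mangles every record it touches, which is exactly what the AI handed me with complete confidence.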

That's just the tip of the iceberg. Because of the nature of the apps I'm working on and my general distrust of the technology, I'm only using it in places where I can easily verify what it's doing and why. But that's only possible because I have a few decades of experience as a developer. For the folks vibe coding? Whew. I will almost guarantee you that sometime in the next few years there's going to be a massive breach involving SSNs, passwords, etc. because Bob from middle management said "I'll have ClaudeAI make a new user management site!"

The other thing it's doing is exposing how very much we're all affected by Dunning-Kruger. I'm seeing writers I respect use AI covers that have extra fingers and weird lighting, game developers whose work I enjoy using ChatGPT to generate terrible translations and bland lorem ipsum placeholder text, and businessmen who I know from personal experience are smart people embracing AI coding because it seems like it's almost magical... except it's not. Each of these people recognizes, in their own sphere, how flawed the tech is, yet they insist on using it elsewhere, because they don't know how much they don't know, and they refuse to admit that, hey, maybe all of these things are a lot harder than they seem.

To sinfantasy: I'm sorry you're going through this. I agree that every one of us working in some kind of knowledge field is going to go through it as well, sooner or later. I wish I had something more optimistic to say, but all I can do is wish you well and hope you make your way back here one day.
 
there's going to be a fucking apocalypse in a few years because of vibe coding. Maybe several of them.
Yup. It's sort of like everyone outsourcing their stuff to the cheapest overseas contractor and then unquestioningly merging the results into their codebases. The amount of work necessary to clean that up when the hype cycle passes the Peak of Inflated Expectations is going to overshadow Big Data, cloud and crypto combined. Experienced developers will certainly need to judiciously use whatever LLMs remain after the inevitable market correction just to deal with this unholy mess, because there just won't be enough of those developers to rewrite it back into sanity.

Not to mention that infosec guys will have a field day with all the 0-days (or even -1-days, given there are already reports of LLM training data being maliciously seeded with vulnerabilities). Fun times ahead!
 
Many of those things have already been automated over the past 30-40 years, and jobs HAVE already disappeared, putting people out of work!

That shit has always happened, from the time people making carriages and selling horses had to endure layoffs when the Model T came out!

Things change! And those who insist on telling their boss "Don't tell ME how you want the carriage built, because I know better!" are DARING their boss to build Model T's! Then they won't need you making carriages anymore or have to listen to you talking back to them.
That's not how it's going to work this time.

First off, you've misidentified the problem. It's not a question of people doing their job poorly/wrong; it's a question of the AI doing the job significantly cheaper with no obvious downside. I've already gone into the actual, invisible "obvious" downside above, so refer to that.

Second, and more importantly, there's nothing to retrain to, at least not in the way you're thinking. At this point, the time it takes a person to retrain into any kind of knowledge-work field is longer than the time it will take to train AI to do the same thing minimally competently. When buggy whips stopped being necessary, there were factory jobs to go to for building Model Ts. When factory jobs went overseas, people were encouraged to train in white collar skills or, failing that, go into nursing, truck driving, or some similar trade.

There is no similar trade to go into now, at least not in the numbers necessary to absorb the jobs being lost, and certainly not at the same payscales.

Put it this way: I'm a professional developer at a senior level. I can and have run teams before (still do occasionally), but I prefer to write code. Right now, the AI is too dumb to write more than rudimentary code, but it can do it fast. If it never gets better, my job is safe, but a bunch of junior level devs are going to get pushed out before they can become senior level devs. If it does get good enough to replace me? I have no idea where I would go.

Some senior devs and team leads might move up into "AI architect" roles, where they keep a bunch of LLMs humming, but there are only so many of those that are going to open up, and only for so long; people are going to keep pushing to make the AI smart enough so that all devs go away except for maybe some of the ones at FAANG companies that are creating new firmware and OSes.

Let's say I can't hang there; I'm realistic enough to admit I likely would lose out to some guy 10 years younger who's good enough for the gig that needs doing. Where do I retrain to? Pretty much every related field (IT, security, devops, etc.) is facing the same AI crunch, so no dice there.

Going further afield to management and HR, same thing. Accounting, same thing. Procurement, sales, planning, all impacted to a greater or lesser extent, and all of them accelerating faster than people can retrain to be as good as an entry level worker, which is the skill level that AI can consistently fake in each new sphere it enters. Writing? Hah! We all know how that's looking; just swing a dead cat here on the forums.

What about blue collar work? I could retrain to nursing (and take a big paycut), but even medical fields are being hit by AI (along with general greed among the healthcare companies that are buying up medical centers left and right), and that's just going to accelerate. Self-driving trucks are already running long haul jobs, even if the trucking industry hadn't already eaten its own workers and put them at barely above minimum wage levels. What little factory work exists in the US is being rapidly automated, along with warehouse work and similar fields. I know some white collar folks are switching to plumbing, electrical, etc. work, but there are only so many of those jobs to go around.

For the first time in... basically ever, we have a new technology that is creating fewer jobs than it's replacing over even the long term, and the jobs it's leaving behind are the ones most people don't want to do in the first place, the ones our parents and grandparents and great-grandparents left behind on farms and later factory floors so their kids would have a better future. Now the future is here, and... yeah.
 
Experienced developers will certainly need to judiciously use whatever LLMs remain after the inevitable market correction just to deal with this unholy mess, because there just won't be enough of those developers to rewrite it back into sanity.

I'm not intending to challenge your idea here, but I am trying to figure out what will define the "need" to fix it.

What will define the need for a market correction?

What determines that the "mess" must be dealt with?


These are not problems that people with vast wealth will be faced with. Are you suggesting the vastly wealthy will feel a need to act for the benefit of humanity? Why?

I'm not saying they shouldn't, I'm just wondering what you think the mechanism and motivation for "correction" will be.
 
That's not how it's going to work this time.

First off, you've misidentified the problem. It's not a question of people doing their job poorly/wrong; it's a question of the AI doing the job significantly cheaper with no obvious downside. I've already gone into the actual, invisible "obvious" downside above, so refer to that.
Read the OP's first post.

That was the "Carriage builder telling his boss how he knows best what to do, and dares them to fire him!"

EDIT: And second, regardless of what tech comes out or how many jobs vanish, it all comes down to the economics of "Why bother if there's no market to sell their products?"

People need the income to buy whatever crap the AI and robotics are producing. So, someone will need to come up with the human jobs that earn the value-added and give the producers their market.

Look upward at the holistic picture of "WHY?"
 

No. That was employees pushing back on a carriage builder telling his employees to set up a new factory assembly line so he could lay off workers.

Either way, they're out of a job. They only got to decide if they would help the boss replace them.
 
Read the OP's first post.

That was the "Carriage builder telling his boss how he knows best what to do, and dares them to fire him!"
Sometimes the carriage builder does know best; see the Dunning-Kruger thing again. And sometimes people refuse to be cruel for a paycheck, which I think we should laud rather than lambaste. But that's neither here nor there.

EDIT: And second, regardless of what tech comes out or how many jobs vanish, it all comes down to the economics of "Why bother if there's no market to sell their products?"

People need the income to buy whatever crap the AI and robotics are producing. So, someone will need to come up with the human jobs that earn the value-added and give the producers their market.

Look upward at the holistic picture of "WHY?"
Because there's been a shift in the last 20 years, and especially the last 10, in the attitude towards that type of thing.

I remember when I was a kid hearing a story about the head of one of the big automakers bringing the head of the UAW, then on strike, to show off the new robots that they planned to use to replace the striking workers, expecting the union to back down. Instead, the UAW head said, "Robots don't buy cars."

We're now in an era where the corps sort of, kind of, expect robots to buy cars. Not literally, of course; that would be silly. But in a more roundabout way.

None of the major AI companies has ever generated a profit. Ever. Some of the hangers-on have, like Nvidia with its hardware, but the companies that make the algorithms and license them? They're all in perpetual "investment" mode, all betting on the come. That is, in a nutshell, how a ton of business works these days. Everyone is so leveraged against their own future value that few of them ever call each other on it; no one wants to be the one without a chair when the music stops playing, so they'll all do what they must to make sure the tune never ends.

That thinking has filtered down to their workforces, too. I know, 100%, that there's an attitude that if we reach the point where Americans (and Europeans, Canadians, etc.) can't afford the products because they don't have jobs anymore, there's still China and Africa to sell to. It would be inconvenient for them, but that's all it would be: inconvenient.

They already took a run at offloading all the knowledge work they could to India, Eastern Europe, and Southeast Asia with varying levels of success, so they're already over the fear of what happens if their wealthiest potential customers suddenly aren't; after all, that's somebody else's problem. The market will adjust, and failing that the government will step in. But the problem is that nowadays... like, neither of those is going to happen. The same people who're banking on AI eventually becoming profitable (and effective instead of just marketable) have spent huge amounts of money making both of those outcomes antithetical to decisionmakers in the current political and social climate.

TL;DR: it's the tragedy of the commons writ large.

"Everyone being out of a job isn't my fault; I didn't fire all of them, and I was just doing what was in my best interest and the best interest of my shareholders. So what if everyone else does that, too? That's what we're supposed to do." It's the same kind of magical thinking that keeps people pouring money into AI that only about 80% works and 1% of the time completely shits the bed.
 
No. That was employees pushing back on a carriage builder telling his employees to set up a new factory assembly line so he could lay off workers.

Either way, they're out of a job. They only got to decide if they would help the boss replace them.
Nope.

RE-read the OP.

They collectively pushed back on doing the job in a "brutal manner", which was what the employer wanted.

The employer just took the work they WERE doing and fed it to the AI, saying, "Now do this same thing, but in a brutal manner!" They didn't even have the choice of helping their boss. The boss appears to have simply fed in the work they had already done and been paid for, and asked for it in a different tone.

And let's face it: the free AIs online can take any of our stories and quickly re-write them in a different tone.
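To show how little effort that takes, here is roughly what a "same work, different tone" request looks like against one of the hosted chat APIs. This uses the OpenAI Python client purely as an example; the model name and prompt wording are placeholders, and the free web chatbots do the equivalent through a plain text box.

```python
# Minimal sketch of a "same work, different tone" rewrite.
# The model name and prompt are illustrative placeholders; any hosted chat
# model (or a free web chatbot) can do the equivalent.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def rewrite_in_tone(original_text: str, tone: str = "brutal") -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": f"Rewrite the user's text in a {tone} tone. "
                        "Keep the facts and structure; change only the voice."},
            {"role": "user", "content": original_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(rewrite_in_tone("We regret to inform you that your request was declined."))
```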
 
Sometimes the carriage builder does know best; see the Dunning-Kruger thing again. And sometimes people refuse to be cruel for a paycheck, which I think we should laud rather than lambaste. But that's neither here nor there.


Because there's been a shift in the last 20 years, and especially the last 10, in the attitude towards that type of thing.

I remember when I was a kid hearing a story about the head of one of the big automakers bringing the head of the UAW, then on strike, to show off the new robots that they planned to use to replace the striking workers, expecting the union to back down. Instead, the UAW head said, "Robots don't buy cars."

We're now in an era where the corps sort of, kind of, expect robots to buy cars. Not literally, of course; that would be silly. But in a more roundabout way.

None of the major AI companies has ever generated a profit. Ever. Some of the hangers-on have, like Nvidia with its hardware, but the companies that make the algorithms and license them? They're all in perpetual "investment" mode, all betting on the come. That is, in a nutshell, how a ton of business works these days. Everyone is so leveraged against their own future value that few of them ever call each other on it; no one wants to be the one without a chair when the music stops playing, so they'll all do what they must to make sure the tune never ends.

That thinking has filtered down to their workforces, too. I know, 100%, that there's an attitude that if we reach the point where Americans (and Europeans, Canadians, etc.) can't afford the products because they don't have jobs anymore, there's still China and Africa to sell to. It would be inconvenient for them, but that's all it would be: inconvenient.

They already took a run at offloading all the knowledge work they could to India, Eastern Europe, and Southeast Asia with varying levels of success, so they're already over the fear of what happens if their wealthiest potential customers suddenly aren't; after all, that's somebody else's problem. The market will adjust, and failing that the government will step in. But the problem is that nowadays... like, neither of those is going to happen. The same people who're banking on AI eventually becoming profitable (and effective instead of just marketable) have spent huge amounts of money making both of those outcomes antithetical to decisionmakers in the current political and social climate.

TL;DR: it's the tragedy of the commons writ large.

"Everyone being out of a job isn't my fault; I didn't fire all of them, and I was just doing what was in my best interest and the best interest of my shareholders. So what if everyone else does that, too? That's what we're supposed to do." It's the same kind of magical thinking that keeps people pouring money into AI that only about 80% works and 1% of the time completely shits the bed.
You're describing the "Internet Bubble". (Over-investing in unprofitable companies. BTW: I worked for PSINet and saw the writing on the wall!)

But we (the economy, not PSINet) survived and grew even after that meltdown and reshuffling.
 
I was here for that, too, but the big difference this time is this: the internet changed jobs, but it very rarely got rid of them entirely. It slimmed down workforces at individual companies and even killed some companies, but it very rarely killed the jobs themselves; the classes of jobs, I mean. A given company might need fewer salespeople, but the economy as a whole still needed salespeople, for example.

That's not what's happening this time around. Entire categories of careers are just... going. Not entirely, but enough; for an analog, think of the few people who were still able to keep handcarving furniture once factories spun up. The difference now is that the handcarving jobs are going away and nothing is replacing them. At all. Even the careers that were supposed to replace them (think of all the folks who were told to shift from programming to "prompt engineering" for the last couple of years, until the AIs got good enough at prompt engineering that it made sense to let them do that, too) are starting to evaporate. It's not even like how "webmaster" transitioned/split into "web developer," "UI developer," and "system admin." The "new" jobs that AI has theoretically opened up are just going *poof* too.
 
So give up on humanity? Let the market sort it out?
Have you taken a look at "humanity" lately?

You should read my latest story of the Band, where the disgruntled soldier asks:
"Who risks their lives to defend a country full of lazy leeches and worthless politicians?" I asked. "Am I fighting evil or protecting it?"
 