Is AI actually helpful?

Y'all are reminding me I have two paper boxes in the garage full of language manuals accumulated since the late '60s. Some were relative rarities even in their day, such as a three-volume PostScript set I can't quite let go of, on the theory that its rarity might still make it worth something in 2025. I'm going to have to get over myself and haul them all to the recycler. This isn't Antiques Roadshow.

Gawd. There's even a FORTRAN manual (with WATFOR and WATFIV!) in there. OMG.
 
Y'all are reminding me I have two paper boxes in the garage full of language manuals accumulated since the late '60s. Some were relative rarities even in their day, such as a three-volume PostScript set I can't quite let go of, on the theory that its rarity might still make it worth something in 2025. I'm going to have to get over myself and haul them all to the recycler. This isn't Antiques Roadshow.

Gawd. There's even a FORTRAN manual (with WATFOR and WATFIV!) in there. OMG.
You only have two boxes? One of the dangers of being a professor is that they just keep giving me more bookshelves. I need to get myself emeritus standing next year just so they don't make me pack up. Do you want a PDP-10 operator's manual from 1972 (or thereabouts)? I think it still has the paper tape in it to reboot the machine. I have assembly language manuals for at least six processors that are no longer made, not counting the BAL (IBM mainframe assembler) manual from sometime in the '70s. And at least three versions of Fortran.

I'm way too much of a packrat.
 
If asked to cite sources, ChatGPT has a history of fabricating them. Don't rely on the prompt to avoid hallucinations.
I thought ChatGPT merely took what you offered it and fabricated more. I had a person on a cruise tell me how great it was: it would do my writing for me, and then I'd just have to go over it and change a few things.
If I weren't a little afraid of engaging it and letting it into my computer, I'd consider trying it: make up a basic sample prompt, ask it to write something, and then compare its product to a version I'd already written myself.
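For what it's worth, that experiment can be run without giving the thing the run of your computer - it's just text in, text out through an API. Here's a rough sketch in Python, assuming the openai package and an API key; the model name and prompts are only placeholders:

```python
# Rough sketch: ask a model to draft a passage, then compare it to one you
# already wrote yourself. Assumes the `openai` Python package and an
# OPENAI_API_KEY environment variable; the model name is a placeholder.
import difflib
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

my_draft = """The quarterly numbers were flat, but churn dropped sharply
after we simplified the signup flow."""

prompt = ("Write two sentences summarizing a quarter with flat revenue "
          "but sharply lower churn.")

response = client.chat.completions.create(
    model="gpt-4o-mini",          # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
ai_draft = response.choices[0].message.content

# Crude similarity score plus a word-level diff, so you can see exactly
# how much you'd have to "go over and change."
ratio = difflib.SequenceMatcher(None, my_draft, ai_draft).ratio()
print(f"Similarity to my draft: {ratio:.0%}\n")
for line in difflib.unified_diff(my_draft.split(), ai_draft.split(),
                                 fromfile="mine", tofile="ai", lineterm=""):
    print(line)
```

Nothing in that touches your files; it only compares the two pieces of text.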
 
Do you want a PDP-10 operator's manual from 1972 (or thereabouts)?

I'll make room for it next to my DEC DOS/BATCH book. That thing is 3-1/2" thick! I'm surprised the newsprint it's on hasn't disintegrated in the 50 years since.

Gawd, we're old. 😞
 
I think it still has the paper tape in it to reboot the machine.

Yeah, that too. I have the paper tape bootloader for a PDP-11/70 somewhere in my stash. Thank goodness I have forgotten the panel toggle sequence for using it. What I haven't forgotten is the boot toggle for the hard disk startup - 17777650. Octal, of course. Stored in both mental and muscle memory. Oh, lordy.
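If memory serves, the top 8 KB of the 22-bit space (17760000-17777777 octal) was the I/O page, which is why those device addresses are all leading 7s. The arithmetic is easy to sanity-check in Python - this is just the number conversion, not a claim about the exact 11/70 boot address:

```python
# Where 17777650 (octal) lands in a 22-bit PDP-11 physical address space.
addr = 0o17777650          # the address written as a Python octal literal
top = 1 << 22              # 2**22 bytes = 4 MB, the 22-bit ceiling

print(f"octal 17777650 = {addr} decimal = {addr:#x} hex")
print(f"bytes below the 22-bit ceiling: {top - addr}")   # 88
```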
 
Yeah, that too. I have the paper tape bootloader for a PDP-11/70 somewhere in my stash. Thank goodness I have forgotten the panel toggle sequence for using it. What I haven't forgotten is the boot toggle for the hard disk startup - 17777650. Octal, of course. Stored in both mental and muscle memory. Oh, lordy.
It's DEC; of course it was octal.
 
It's DEC; of course it was octal.

A real shocker when I migrated to the IBM world! EBCDIC? Hexadecimal? Hollerith cards? These kids around here have no idea of the glories of real computing that they missed. 🤣
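For the kids who missed it, EBCDIC isn't just ASCII with a funny name - the letters land in completely different places, with gaps in the middle of the alphabet. Python still ships an EBCDIC codec (cp037 is one common variant), so the culture shock is easy to demo:

```python
# ASCII vs. EBCDIC (code page 037, one common EBCDIC variant) for the
# same characters. Note the EBCDIC alphabet isn't even contiguous:
# 'I' is 0xC9 but 'J' jumps to 0xD1.
for ch in "AIJRSZ09":
    ascii_byte = ch.encode("ascii")[0]
    ebcdic_byte = ch.encode("cp037")[0]
    print(f"{ch}: ASCII {ascii_byte:#04x}  EBCDIC {ebcdic_byte:#04x}")
```

The same codec works in reverse for reading old mainframe dumps: bytes.decode("cp037").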
 
I went DEC/TOPS-10 (Fortran IV, mostly) -> IBM (BAL) -> DEC (BSD Unix with C, and then languages I created).
 
A real shocker when I migrated to the IBM world! EBCDIC? Hexadecimal? Hollerith cards? These kids around here have no idea of the glories of real computing that they missed. 🤣
My freshman year of college, I knew the girl who worked the student help desk (a family friend). The junior weed-out project was a four-box set of cards, and everybody pretty much wrote exactly the same code. Card processing was a three-day turnaround. I was sitting at her desk visiting when a girl came in literally in tears: her project didn't do anything. A cursory review of the code printed at the top of the cards showed nothing wrong. Again, all the students wrote pretty much the same code.

Conversation went something like this...

"Which teletype machine did you use to type these?"

"Um, I did them at home on my Selectric.(IBM typewriter)"



Silence. IYKYK
 
All of our [interminable] debates about AI seem to assume some sort of benefit to the technology. Arguments instead talk about ethics and loss of human jobs. But what if AI doesn’t actually boost productivity?

This is an excerpt from an article on The Atlantic. It’s behind a paywall (I subscribe), but someone normally finds a free version.



If there is any field in which the rise of AI is already said to be rendering humans obsolete—in which the dawn of superintelligence is already upon us—it is coding. This makes the results of a recent study genuinely astonishing.

In the study, published in July, the think tank Model Evaluation & Threat Research randomly assigned a group of experienced software developers to perform coding tasks with or without AI tools. It was the most rigorous test to date of how AI would perform in the real world. Because coding is one of the skills that existing models have largely mastered, just about everyone involved expected AI to generate huge productivity gains. In a pre-experiment survey of experts, the mean prediction was that AI would speed developers’ work by nearly 40 percent. Afterward, the study participants estimated that AI had made them 20 percent faster.

But when the METR team looked at the employees’ actual work output, they found that the developers had completed tasks 20 percent slower when using AI than when working without it. The researchers were stunned. “No one expected that outcome,” Nate Rush, one of the authors of the study, told me. “We didn’t even really consider a slowdown as a possibility.”

No individual experiment should be treated as the final word. But the METR study is, according to many AI experts, the best we have—and it helps make sense of an otherwise paradoxical moment for AI. On the one hand, the United States is undergoing an extraordinary, AI-fueled economic boom: The stock market is soaring thanks to the frothy valuations of AI-associated tech giants, and the real economy is being propelled by hundreds of billions of dollars of spending on data centers and other AI infrastructure. Undergirding all of the investment is the belief that AI will make workers dramatically more productive, which will in turn boost corporate profits to unimaginable levels.

On the other hand, evidence is piling up that AI is failing to deliver in the real world. The tech giants pouring the most money into AI are nowhere close to recouping their investments. Research suggests that the companies trying to incorporate AI have seen virtually no impact on their bottom line. And economists looking for evidence of AI-replaced job displacement have mostly come up empty.
Here's a parallel tech development: when database systems advanced in power, lots of folks figured doctors and medicine would really benefit from electronic medical records. No dusty folders, and doctors could dictate their findings into computers with voice recognition as they went along. Long story short: a nightmare for doctors. Voice recognition software was fine at recognizing diseases, but not people's names and complaints. And where a doc had once been able to say, "Jerry, schedule Mrs. Jones for three more visits and give her erythromycin twice a day for two weeks," now the doc had to sit down between visits or after work and weave through form after form to get the info in. The voice dictations were gobbledygook and often dangerously wrong. Docs hate EMR, and their productivity is in the toilet.
 
Y'all are reminding me I have two paper boxes in the garage full of language manuals accumulated since the late '60s. Some were relative rarities even in their day, such as a three-volume PostScript set I can't quite let go of, on the theory that its rarity might still make it worth something in 2025. I'm going to have to get over myself and haul them all to the recycler. This isn't Antiques Roadshow.

Gawd. There's even a FORTRAN manual (with WATFOR and WATFIV!) in there. OMG.
Mixed in with a pile of excellent but now ancient C++ books, I have one for Cocoa - Apple's NeXTSTEP descendant - and an old PostScript manual that was tremendously useful to me 25-30 years ago.

ETA: The blue circles in my banner were created in an EPS generated by... C++, I think.
 
All of our [interminable] debates about AI seem to assume some sort of benefit to the technology. Arguments instead talk about ethics and loss of human jobs. But what if AI doesn’t actually boost productivity?

This is an excerpt from an article on The Atlantic. It’s behind a paywall (I subscribe), but someone normally finds a free version.



If there is any field in which the rise of AI is already said to be rendering humans obsolete—in which the dawn of superintelligence is already upon us—it is coding. This makes the results of a recent study genuinely astonishing.

In the study, published in July, the think tank Model Evaluation & Threat Research randomly assigned a group of experienced software developers to perform coding tasks with or without AI tools. It was the most rigorous test to date of how AI would perform in the real world. Because coding is one of the skills that existing models have largely mastered, just about everyone involved expected AI to generate huge productivity gains. In a pre-experiment survey of experts, the mean prediction was that AI would speed developers’ work by nearly 40 percent. Afterward, the study participants estimated that AI had made them 20 percent faster.

But when the METR team looked at the employees’ actual work output, they found that the developers had completed tasks 20 percent slower when using AI than when working without it. The researchers were stunned. “No one expected that outcome,” Nate Rush, one of the authors of the study, told me. “We didn’t even really consider a slowdown as a possibility.”

No individual experiment should be treated as the final word. But the METR study is, according to many AI experts, the best we have—and it helps make sense of an otherwise paradoxical moment for AI. On the one hand, the United States is undergoing an extraordinary, AI-fueled economic boom: The stock market is soaring thanks to the frothy valuations of AI-associated tech giants, and the real economy is being propelled by hundreds of billions of dollars of spending on data centers and other AI infrastructure. Undergirding all of the investment is the belief that AI will make workers dramatically more productive, which will in turn boost corporate profits to unimaginable levels.

On the other hand, evidence is piling up that AI is failing to deliver in the real world. The tech giants pouring the most money into AI are nowhere close to recouping their investments. Research suggests that the companies trying to incorporate AI have seen virtually no impact on their bottom line. And economists looking for evidence of AI-replaced job displacement have mostly come up empty.
There's a heavy question, too: what is 'productivity'? Logically, it means you can make more stuff for 'less cost'. And what does 'less cost' mean? If A.I. makes a product 'better', that's one thing (though who decides what 'better' means? When I worked in QA at Avid, developers were always popping new 'features' into the Media Composer 'because they could.' Okay, evolution. But if it isn't asked for and gets in the way of doing the existing job, it's not a 'cool new thing', it's a bug). I really don't want 'Paperclip' 'helping' me when I use Zoom by accessing anything I might use in a Zoom meeting. Like she said, "Stay out of my drawers unless I specifically tell you how and when I want you in them."
 
Mixed in with a pile of excellent but now ancient C++ books
Nothing about C++ can be ancient, can it? That can't be 40 years, can it? I still remember having a long-running, semi-public argument with Bjarne. It certainly wasn't that long ago... Sigh.
 
Nothing about C++ can be ancient, can it? That can't be 40 years, can it? I still remember having a long-running, semi-public argument with Bjarne. It certainly wasn't that long ago... Sigh.
I liked making pretty evolving pictures with BASIC. Essentially A.I. Big screens with roiling waves of color in museums? Pretty much the same thing.
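Those roiling-wave installations really are mostly a few sine functions stirred together, same trick as the old BASIC screen toys. Here's a tiny sketch of the idea in Python - it writes a single frame of a "plasma" pattern to a PPM file using nothing but the standard library; step the t variable to make it evolve:

```python
# One frame of a classic "plasma" pattern: layered sine waves mapped to
# color, written out as a plain-text PPM image. Pure standard library.
import math

W, H, t = 320, 240, 0.0   # t is the "time" knob; step it to animate

with open("plasma.ppm", "w") as f:
    f.write(f"P3\n{W} {H}\n255\n")
    for y in range(H):
        for x in range(W):
            # Layered sine waves -- the whole trick behind plasma effects.
            v = (math.sin(x / 16.0 + t)
                 + math.sin(y / 9.0 + t)
                 + math.sin((x + y) / 25.0 + t)
                 + math.sin(math.hypot(x - W / 2, y - H / 2) / 8.0))
            # v is roughly in [-4, 4]; fold it into three color channels.
            r = int(127.5 * (1 + math.sin(v * math.pi / 4)))
            g = int(127.5 * (1 + math.sin(v * math.pi / 4 + 2)))
            b = int(127.5 * (1 + math.sin(v * math.pi / 4 + 4)))
            f.write(f"{r} {g} {b}\n")
```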
 
A real shocker when I migrated to the IBM world! EBCDIC? Hexadecimal? Hollerith cards? These kids around here have no idea of the glories of real computing that they missed. 🤣
I remember an I've Been Moved (IBM) salesman telling me that when customers asked, "Yes, but how does it work?" his response was, "Just fine."
 