I find it so hard to reconcile my experience with the hype. I use ChatGPT to write some Terraform code; it hallucinates the way to use the timer provider, tells me it's right because it "simulated" the rest on an AWS staging environment it supposedly has access to (can I see it? No), and then, when confronted in a GitHub issue with evidence that it doesn't work, tells me I've found the "smoking gun." Meanwhile, the news articles and my LinkedIn feed tell me programmers will be out of a job in six months.
I feel like I'm taking crazy pills here, is anyone else getting this?
pablitokun 27 days ago [-]
As an SRE, I completely agree.
I've had somewhat more success with Claude Code recently. I use it as a fancy find-and-replace: if I explain to it in detail what I did in one file, it can repeat it in a dozen other files. That's super handy, like once every two days, when I have to work in a legacy monorepo.
Whenever I try to use it to generate code from scratch, I end up losing more time than I save.
Our_Benefactors 27 days ago [-]
Have you tried using Cursor?
ChatGPT is seriously outdated nowadays, and not the preferred choice for writing code.
atonse 28 days ago [-]
One thing AI won’t be able to do is take the blame for big decisions.
I’ve realized that companies like McKinsey also provide a scapegoat for a manager’s decision gone wrong. They can just blame McKinsey for advising them to do it. But they can’t as easily blame an AI. And McKinsey is happy to be the bad guy.
I think this is an area where we will still keep humans: so there's someone to blame when something goes wrong.
Unfortunately for McKinsey, that’s not going to be enough to prop up their revenues.
nwmcsween 27 days ago [-]
This is more than enough to prop them up. More often than not, the client already has a plan; even if that plan is horrendous, the consulting company is there to insulate management from any fallout (read: incompetence).