Earlier today, I used ChatGPT to help me bang out a Ruby script to clone a repository, extract its documentation, and add those docs to another site that will serve as a centralized documentation source for a collection of projects.
I know Ruby and have been using it since 2007, but I still have to look things up all the time. By giving ChatGPT a bunch of poorly worded, lazily proofread commands, I was able to cut development time roughly in half.
It wouldn’t be nearly as good with a language I didn’t know, but saying it’s a waste of time and money feels like it’s really missing the sea change that’s happening.
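For anyone curious what a script like the one described above looks like, here's a minimal sketch. The function name, directory layout (`docs/` of Markdown files, namespaced by project in the central site), and clone step are all my assumptions, not the commenter's actual code:

```ruby
require "fileutils"

# Hypothetical sketch: copy a project's docs/ tree into a central docs
# site, namespaced by project name. Layout and names are assumptions.
def import_docs(project_dir, central_dir, project_name)
  src  = File.join(project_dir, "docs")
  dest = File.join(central_dir, project_name)
  FileUtils.mkdir_p(dest)
  Dir.glob(File.join(src, "**", "*.md")).each do |file|
    # Preserve the relative path under docs/ in the destination
    rel    = file.sub(/\A#{Regexp.escape(src)}\/?/, "")
    target = File.join(dest, rel)
    FileUtils.mkdir_p(File.dirname(target))
    FileUtils.cp(file, target)
  end
end

# In the full script the repo would be cloned first, e.g.:
#   system("git", "clone", "--depth", "1", repo_url, project_dir)
```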
I’m still productive without ever having adopted any code completion features, and I think one aspect of productivity you are leaving out is: is your bump in productivity equal to, or preferably greater than, its costs?
What Rube Goldberg mechanism of misery has to operate just to help you complete your code?
Such as the blatant disregard for creatives’ licensing stipulations, tech companies building nuclear reactors, the political elevation of the scumbags who run these companies, and so on?
~“I saved 10 keystrokes, and it only cost my grandchildren their clean drinking water.”
I hope they build lots of them. The alternative is that they use fossil gas, which continues to heat up our atmosphere. Nuclear fission is certainly not problem-free (Hanford, Three Mile Island, Chernobyl, etc.), but it’s a whole lot better than pushing us past 2°C.
Human thinking is also guessing to a large extent. It is guessing with many feedback loops until a resolution that is good enough is found. AI indeed mimics human behavior and is quite good at it. What counts is how well it can guess. Since I use AI quite frequently for things where my output is part of the feedback loop, I must say AI's guesses are very often spot on.
About parroting: I'm very sure AI does not simply parrot. I'm still often amazed by how well my questions are understood; it gets what I'm actually asking for. A parrot will never even process my question.
Opinion columns of newspapers are largely platforms to publish nonsense with no review or editorial accountability. This seems to fit the pattern.
Please no. I agree with the rest (improved workflow too), but leave social tasks to humans.
AI is going to quickly surpass the quality of medical diagnosis by doctors, at which point hopefully people can see the right kind of specialist faster and get treatment sooner.
Maybe that says more about how low-entropy code really is than it does about AI's intelligence, but in any case it works.
I'm not sure what else I'd ever use it for though. I have no interest in Replika or anything similar, and I want it to stay out of creative writing and personal communication.
- "generative AI is a mimic of human action, parroting back our words and images"
- "[AI] take[s] computing capacity away from other, potentially more useful activities"
- "[AI] requires an enormous amount of energy", "environmental costs are well-known"
- "AI is underpinned by significant capital investment in computing infrastructure" "This investment could go somewhere else, more useful"
- "AI is also sucking up innovation funding, especially venture capital."
- "it is threatening to overwhelm us with AI spam"
- "AI will necessarily lead to significant social change and associated costs"
TLDR: AI is a stochastic parrot, is not environmentally friendly, is expensive and generates spam
> Kean Birch is director of the Institute for Technoscience & Society at York University.
Academic sociologist argues that AI should be controlled by academic sociologists. Color me surprised.
The article makes fairly clear arguments about the costs of generative AI and raises the author's opinion that it isn't worth it. Simply claiming it is in fact worth it, without anything to support that, isn't super helpful.
Similar AI endeavors have been underway for medicine and human health.
The author is making extremely shallow, flawed arguments that hinge on an ignorant (or possibly, deliberately narrow-minded) understanding of what generative AI is, how it is already being used, and the magnitude of what is already being achieved with it.
With EVs the focus becomes how wonderful they are because they do not burn fossil fuels in their engines. Great, but what about all the other issues (including non-renewable problems in the rest of the supply chain involved in building the EV)? They're greenwashed away: no need to discuss public transit and densification. EVs will fix everything, and we don't need to change our lives in any significant way.
They are a way to continue going down the wrong path and feel good about it.
Source: I'm a teacher.