Just realized Russian troll farms and Nigerian scammers are going to fukking eat off of this shyt. Fake news sites too. They'll be able to just straight make shyt up at record pace and tell ChatGPT to tailor it for whatever audience they're targeting.
Fake-ass motivational speakers, pyramid schemes, get-rich-quick schemers, self-help book writers too. This is pretty much exactly at their writing level, but they'll be able to produce content much more quickly and tailor it more effectively than before.
Pretty much anything where the style is more important than the truth or depth of the substance, and where volume matters, is going to be on easy mode now.
fiverr
I mean, that does seem like its best use so far - it produces filler.
I can see it being used a lot by non-writers who need to produce words but don't want to waste their time writing. It can basically boost almost anyone up to the "competent" stage, or it can take a competent writer and give them what they would have produced anyway but in a lot less time. However, it doesn't seem to be producing any good writing.
Does that mean people who use ChatGPT to produce competent writing are now never going to put in the work to become legitimately good writers?
That leads into the second potentially negative side effect I worry about. Right now, it's being used as a time-saving mechanism by competent professionals who have already trained themselves to a certain point. I assume you're using it to produce the sort of things you've already had to write numerous times in the past - so you know how to write them, you know what you're looking for, and after you apply ChatGPT to them, you can evaluate the result and tweak it if it's off a bit.
That's great if all you ever need is the sort of boilerplate competency to fill out a report. But it feels like once you start using ChatGPT, that's all you'll top out at. The process of using ChatGPT means both that you'll put substantially less thought into the work and that you'll give yourself an automatic ceiling on how good it can be. Maybe that's all you'll ever want - but in a world without ChatGPT, certain people would have kept advancing, kept producing higher and higher quality work as a result of the hard work they put in every day. Seems like these shortcuts would cut off that mental work early, so the actual progress in ability never manifests itself.
And even worse, what happens in the future when people raise themselves up on this shyt before they've actually gained the skills? What if you'd started using it back in middle school, to complete answers for you at the time when you were supposed to be building the neural connections and gaining the mental skills to do work like this on your own? Would you ever even have gotten good enough at producing those reports to be able to evaluate whether ChatGPT had done it right? Or would you be forced to assume it was right, just because ChatGPT had done it? Perhaps you would still get feedback from time to time, telling you whether or not your ChatGPT-produced answer had hit the mark. But my experience in educational psychology suggests that we are far less able to evaluate and utilize feedback for shyt we didn't build ourselves.
There's a chance that this will curtail a lot of the development that makes us into strong thinkers. There's a reason human beings don't mature mentally until around 20-25 years old. We can't build wet neural networks with shortcuts. Perhaps for a few folk this will be a time-saving device that will allow them to spend more time in deeply engaging endeavors, but I'm guessing for a lot more young people it's going to lead to less time spent in intellectually demanding activity, and that's not good in the long run.
good analysis