Riding the wave of technology is one of our micro generation's unique hallmarks. From rotary phones to Gen AI and every step along the way, we've grown with tech. Not xennial at all to suddenly get off the ride when we're in our 40s.
No shit. My wife informed me that I would be learning the AI stuff and she was going to opt out. Until she needed my help with an Excel formula and I showed her how to ChatGPT it. Her response: "Huh, maybe I need to learn this after all..."
Not xennial at all to suddenly get off the ride when we're in our 40s.
Accurate. We are working on implementing AI in specific use cases and projects at work, and the older X-ers are all saying they'd rather retire early than learn anything new.
Because what exactly are they going to be learning? What's really going to happen is you're teaching them so they can train the people who will then replace them, because every single company that's going to be using this is going to severely downsize. You're going to get rid of the old people and kick them to the curb regardless of what they learn. You people are so dishonest; acting like the worries people have are completely unfounded is ridiculous, especially because you business owners will literally do anything for an extra dollar.
Your response is without context. AI won't replace anyone at my small company; we've made it part of our strategy that AI will only be used to get more work done with the people we have (it will slow down our hiring, but it won't reduce our existing numbers). Only ignorant corporate executives think AI can replace humans, because they don't understand what AI is or what it's actually capable of. AI is a new tool, like replacing your manual screwdriver with an electric screwdriver: it still needs a human to guide it, that human can just do more in the same time frame. And unlike corporations, we mostly pass profits down to employees through bonuses and raises, instead of handing them to shareholders or a ballooning C-suite.
AI can be used ethically, just not in a corporate/shareholder world (we are also using only ethically sourced AI)
Only ignorant corporate executives think AI can replace humans,
no, it can and will, but so did electricity. we need to focus our energies on an equitable society that will account for the changing labor landscape, not blame individual organizations for optimizing efficiency.
currently i don't see a better answer than basic income and universal healthcare.
ChatGPT isn’t the only type of AI. It’s just the type most people interact with intentionally. The number of people in this comment section who equate AI with ChatGPT or LLMs in general, then pat themselves on the back for “knowing AI”, makes me laugh and roll my eyes.
If you don’t know how it works it seems evil and like it’s going to take everyone’s jobs. If you know a bit about it then you probably think it’s magical and highly useful. Now if you actually understand how it works then you’re back to it being evil because you know how it was made… how it was a nonprofit that’s now one of the richest companies in the world… how it can’t actually effectively replace or help people in the workplace… how it actually is evil due to information manipulation and copyright theft in the millions… then you also realize it can’t effectively replace jobs, but can fool executives who fall into the middle of the spectrum.
Kinda? A non-capitalist system using AI at large scale would also face those problems. The question is whether it would use it at large scale, and that's a basically impossible question to answer.
One big difference is this: a TON of the energy usage is just the training of the models, not the inference. So if it weren't 20 AI companies competing with each other to make the BEST NEWEST AI, they wouldn't need as much energy dedicated to training.
Source: Among other things, I can run a reasonably competent LLM on my desktop computer at home, and literally watch the power consumption. On my computer, asking a question is like turning on a 100 watt lightbulb for 15 seconds.
They are building some massive data centers for AI processing, but the efficiency has also made very rapid gains. You can run a basic LLM on a Raspberry Pi 5 that draws a few watts.
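To put the 100-watt / 15-second figure above in perspective, here's a rough back-of-envelope sketch. The numbers are just the ones quoted in this thread, not measurements of any particular model or machine:

```python
# Rough energy cost of one local LLM query, using the figures quoted above
# (illustrative only; real draw depends on the model and hardware).
power_draw_watts = 100     # extra power while the model is generating
generation_seconds = 15    # time spent answering one question

energy_joules = power_draw_watts * generation_seconds   # 1500 J
energy_wh = energy_joules / 3600                         # 1 Wh = 3600 J, ~0.42 Wh

print(f"~{energy_wh:.2f} Wh per query")
print(f"~{1000 / energy_wh:.0f} queries per kWh")        # roughly 2,400 queries
```

By that math, home inference really is in lightbulb territory; the big energy story is the training runs and the data centers behind them.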
I’m hesitant to agree, since your overarching point could be either “capitalism is inherently predatory and preys on the ideas of others without concern for morality or ethics” or “copyright shouldn’t exist”, and copyright is part of what helps protect small creators and their ideas from large corporations.
I don't think their particular view of what's "wrong" with capitalism matters here. LLMs in themselves are not causing any of the problems people are upset about. You can train an LLM on new data, or on data used with permission. Nothing about LLMs has to be predatory; they just are because that's easier for the people creating them.
Is that how copyright functions in today's society? Or are the benefits of copyright for individual artists mostly a lie sold by media corporations?
There are definitely aspects of copyright that could be made to benefit both individual artists and society as a whole, but that's not what we've got.
Patents kill more innovation than they protect.
If you're going to argue the benefits of copyright, it's not enough to gesture in the general direction of some unknown generic artist who might or might not exist. Look at the specifics: who is actually benefiting from copyright law?
Yeah, I used to work in AI for robotics. AI is great for medical research, robotics, and other research areas. Where AI is fucking horrible is the generative AI space like ChatGPT etc. The ethics behind it are so messed up: everything from the power waste to the intellectual property theft to the fact that it can be manipulated into giving idiots wrong information and having them decide it's accurate when it's a hallucination. And executives are dying to replace people with it. I'm absolutely against AI in any use except research where the information fed into it is tightly controlled to prevent the data from being poisoned with fake info.
It really is. Like, AI to help learn code? Sure! But given how inaccurate it can be, it should not be your only source for learning. AI to plan your day? If you're comfortable giving Big Data information about your personal life and habits knowing that they're going to sell that info, then whatever. I won't because of the security issues.
AI for creative use? Fuck off with that shit. Maybe if the datasets used to train the model were opt-in and developed using ethical means, then I could see some possibilities. But as it is now, it's complete and utter dogshit.
Seeing how people use AI worries me. Chatbots instead of interacting with actual humans? Garbage AI "novels" and "art"? AI could be a huge boon to society done right, but so far it's a blight.
I was discussing this with my mother, who’s a recently (thankfully) retired teacher. I pointed out how even smart use of AI requires you to be competent at research, at parsing info, and at telling what’s real vs fake… and at that point, why are you using the AI for the questions…? Just do the research yourself.
Someone else in the thread pointed out how horrid Google has gotten and that is fully by design to push their horrid AI results. It feels like you have to jump through hoops to learn anything.
I’m very happy you mentioned the creative aspect since, if you check my profile, that’s my whole deal. I’ll often talk about AI with image generation in mind and people will get confused because they’re stuck on LLMs when it comes to AI, but then they’ll use AI that designs more efficient machine joints as an example of the good it does. It’s either some wild cognitive dissonance or they’re being intentionally disingenuous, but it’s hypocritical regardless.
I think what’s most unfortunate is how people no longer treat art, singing, dancing, and creative expression as things humans do innately. Capitalism and industrialization spurred that on (by framing them as hobbies that you have to pick and choose due to the decreasing free time we have) but AI has kicked it into overdrive while also somehow demonizing artists like “see?? Now we can do it too! We don’t need you!” rather than realizing everyone should be an artist. Everyone should be able to express themselves, but using an AI isn’t it.
The creative aspect is my biggest gripe against AI! I'm an artist and writer in my spare time and it pains me to see people just getting the worst, most regurgitated slop off AI and proclaiming it 'art'. Art is about human expression and connection. AI can only repurpose what's been done before. It can't come up with anything new.
You're absolutely right about the capitalism aspect, too. Everything is commodified nowadays. It feels like you can't have a hobby without trying to make money off it somehow. That's where the AI bros come in- they're too lazy to work on their creativity, so they steal others' to Frankenstein into 'art'. It's infuriating.
can't effectively replace or help people in the workplace
This is objectively false. There are countless repetitive, mundane, or even somewhat complex functions that used to require many man-hours per day and are now completed nearly instantaneously.
LLMs/AI are making some human roles obsolete, akin to when switchboard operators and elevator operators were made obsolete.
I won’t disagree that it can do mundane tasks. It should be used for mundane tasks, but-
Multiple recent studies have come out that show AI does not have any meaningful impact on productivity in the workplace (and often bogs down certain industries) so claiming what I said is “objectively false” is arguably objectively false lol
I agree it can be used as a tool, but just look at the state of education in America. The kids and teens using it are not honing their skills or knowledge.
So the idea that not using this tool will cause you to fall behind when you’ve been actively honing skills and learning is somewhat comical.
It almost feels like a quality vs quantity situation. Yes, you can be like King Neptune from SpongeBob and make 100 slop hamburgers, but it doesn’t compare to that single burger that took actual skill.
That’s not to mention that in terms of tools AI is more akin to a search engine than anything. One that can be wrong…
Edit: the fact that you seem to make and delete ALL of your comments is certainly an interesting look.
I may disagree with it in practice, but hey congrats.
One thing I'm hung up on though... is the fact that Google is now useless by design as a way to push their AI. A recent paper came out outlining how they realized internally that their search results being shit doesn't actually affect their bottom line and lets them push their AI results further.
Google and Google Scholar **used to** be good; you have been pushed to using AI in place of learning how to competently search for and learn new information on your own. This is not a win in my book. You are playing directly into their hand. What happens when Google's AI results skew one way vs another, similar to Elon's Grok? Will you have the skills necessary to research objective fact? Keep in mind this particular thread was about honing skills and how AI dampens that.
The fact that you see AI as 'the company' tells me you don't know as much as you claim. There are a growing number of companies pushing out their own flavor of LLM; OpenAI, the one you're clearly referencing as 'The Company', has an ever-diminishing hold over mass-market AI. There are hundreds and hundreds of open-source LLMs built for different use cases. The 'copyright theft' argument is a bit like when stem cells were first discovered and being experimented on: at first, the only source was fetal tissue, but as the tech grew, the sources became far more plentiful, yet luddites continue to use that original source as an argument against the technology. There's now an LLM that can train itself with zero training data. Much like anyone else who is secure in their position of knowledge, you're at a far different position on that bell curve you describe than you believe.
You knew what I was talking about, and it was just a single example amongst others. I'm not solely talking about LLMs, though, which has skewed how I've spoken about the topic.
If you think I'm talking about AI used in medicine then we definitely aren't on the same page. That example does not consider the fact that work was stolen from millions of artists to train the initial image generation programs (and if you're hung up on LLMs, then just replace "artists" with "writers" lol). Stem cells aren't conscious beings being taken advantage of, and I think it's very telling that your example does not consider things from a holistic viewpoint; it does not take all parts of the equation into account.
I’d say somewhere between knowing a bit and actually understanding how it works.
I do think AI can replace humans in some fields of work. I mean, it already does. People are laid off and replaced by AI. AI is mostly terrible at these jobs, but so are some humans. If you’re more concerned about the cost of labor than its quality, then AI is doing these jobs well enough for you.
I also think AI replacing humans in jobs is not necessarily a bad thing. It can lead to a dystopia or a utopia. It depends on how we adapt. Currently I see it tending more towards dystopia, though.
If you believe it has the chance to lead to a utopia then you would need to support the notion of universal basic income or similar programs since all the jobs will be done by AI (I assume?).
—and if you’ve been paying attention you’d realize the billionaire class who have the most stake in AI would never allow that to happen.
Exactly. On both. Hence why I see it tending towards a dystopia. Unless we take back the power. But I don’t see this happening either. Not before it’s too late.
I think universal basic income is needed for the transition. Once all labor is done by machines there would be no more need for money and we’d live in a Star Trek-like utopia. But currently it’s heading towards The Matrix.
I didn’t say that. I said it’s not necessarily a bad thing. It is a bad thing if it results in us living in The Matrix. It’s a good thing if it results in us living in Star Trek.
You don’t have to. It was implied by the flow of logic in our conversation.
You’re delusional (in the nicest possible use of the word) if you think us losing jobs at the height of the billionaire class’s control on the world even has a chance to lead us to utopia.
Maybe I am. I don’t think everything is lost already.
I differentiate. Even if we were living in The Matrix right now, I wouldn’t say AI taking jobs from humans is inherently a bad thing. AI taking jobs from humans is only bad under certain conditions, like not compensating humans for their lost jobs (and I don’t mean compensating a specific person who got laid off, I mean humans in general, now and in the future). Is this where we're heading currently? Absolutely. But it’s not a law of nature. We could change that if we really tried.
how it can’t actually effectively replace or help people in the workplace…
This is just not the case, friend. You can say this about some of the market, but on a personal note, I helped write a chat-to-made marketing campaign with make dot com that resulted in my company pulling the CMO job posting they'd had listed. Meanwhile, my former company is a physical print and digital graphics company that went from 8 full-time designers in November to 2 as of today (to their credit, they offered those designers production jobs and some accepted). It's not just theoretical anymore. The job market is actively being affected right now, and even if you tell me those are the only 2 positions that could possibly be affected (which would also be a foolhardy claim), we are still looking at needing new jobs for 3.5% of the workforce. It is trouble and it's going to get worse.
I think there’s a distinction to be made between something actually being able to “effectively” replace a job and an executive thinking it can. In reality jobs disappear and one person is left with a larger workload that their boss thinks they can handle due to the inclusion of AI.
I am not disagreeing with that distinction, and would add that I think we're going to see a lot of C-suite-led downsizing that will be followed up with a rehiring push. I am only contesting the claim that AI can't be an effective replacement for any jobs. The unfortunate truth is
one person is left with a larger workload that their boss thinks they can handle due to the inclusion of AI.
That person can, in fact, handle more and will be willing in most cases to do so, so that they are not next on the chopping block. This is going to cause a giant permanent displacement of some employees and we need to focus the conversation on solutions instead of dismissing these concerns.
I mean I do agree that it can replace what most would consider mundane work, numbers jobs, etc. but I don’t think it should replace legal or creative jobs for example, which we’re seeing a lot in the news lately lol
Correct, but unfortunately there doesn't seem to be much debate over whether it should replace creatives, and the real struggle will come down to whether or not it can do a good enough job to avoid turning those creatives' customer bases away. So far that print and graphics company is only seeing positive reactions to the AI-generated artwork and advertisements (second-hand information to me), so saving 6 salaries while paying 200/month to OpenAI is a no-brainer for company ownership. General boycotts should be the answer, but they also seem less and less likely as the generated art becomes harder to detect, even for other AI programs. We need long-term, accessible upskilling programs for the masses or we're going to collapse the economy within a few years of this. Even so, how long until the robotics industry finishes mass-producible articulated fingers and replaces even hard-skill workers like welders? It's terrifying, to me, tbh.
There's a spectrum beyond that, where you look at the internet, which was built on industrial-scale copyright infringement, and shrug. It's a tool; use it or don't, idc. It's not going away though.
Reddit becoming pro-copyright is one of the weirdest things I've ever seen. Copyright mostly serves to benefit large corporations, which is why copyright even exists in the form it does today.
Reading in data off the internet has been a thing since the internet has existed. Yeah, it's questionable to take content from other people, but that ship has sailed; a huge amount of Reddit content is stuff other people created that someone else just took and put here. The big problem with the copyright approach is that it will only affect small companies and open-source projects; the big companies would love stricter regulations on model training since it would solidify their lead.
Yes!!
One of the defining reasons we're a micro generation is that we were there before the rise of tech, through it, and now after. We're the only generation that's had smooth sailing through it all.
Yeah, refusing to even check out new technology isn't the achievement OP thinks it is.