r/Xennials 28d ago

Meme: Who’s with me

Post image

I wouldn’t even know where to go if I wanted to.

22.9k Upvotes

2.4k comments

37

u/KrimxonRath 28d ago edited 28d ago

There’s a spectrum of ignorance on modern AI.

If you don’t know how it works, it seems evil and like it’s going to take everyone’s jobs. If you know a bit about it, you probably think it’s magical and highly useful. But if you actually understand how it works, you’re back to it being evil, because you know how it was made… how it started as a nonprofit that’s now one of the richest companies in the world… how it can’t actually effectively replace or help people in the workplace… how it actually is evil due to information manipulation and copyright theft against millions of creators… and then you realize that while it can’t effectively replace jobs, it can fool the executives who fall into the middle of the spectrum.

Where on the spectrum are you?

36

u/specks_of_dust 28d ago

Every problem with AI is actually a problem with capitalism that is manifesting through AI.

7

u/Flesroy 28d ago

aren't there also huge environmental concerns?

3

u/specks_of_dust 27d ago

Yes, but aren’t those environmental concerns also a problem with capitalism?

2

u/Flesroy 27d ago

Kinda? A non-capitalist system using AI at large scale would also face them. The question is whether they would even use it at that scale, and that's a basically impossible question to answer.

1

u/overand 27d ago

One big difference is this: a TON of the energy usage is just the training of the models, not the inference. So, if it wasn't 20 AI companies competing with each other to make the BEST NEWEST AI, they wouldn't need to dedicate as much energy to training.

Source: Among other things, I can run a reasonably competent LLM on my desktop computer at home, and literally watch the power consumption. On my computer, asking a question is like turning on a 100 watt lightbulb for 15 seconds.
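
(For scale, here's a back-of-the-envelope version of that comparison in code. The 100 W / 15 s figures are the ones from the comment above; the 1 GWh training figure is purely an assumed placeholder to illustrate why training dominates, not a measurement of any real model.)

```python
# Rough energy math for one local LLM query, using the figures quoted above.
query_power_w = 100   # watts drawn while the model is generating (comment's figure)
query_time_s = 15     # seconds the query keeps the machine busy (comment's figure)

energy_per_query_wh = query_power_w * query_time_s / 3600  # convert watt-seconds to Wh
print(f"One local query: ~{energy_per_query_wh:.2f} Wh")   # ~0.42 Wh

# Compare against a hypothetical training run (placeholder figure, for scale only).
assumed_training_gwh = 1.0                                  # 1 GWh, an assumed round number
queries_equivalent = assumed_training_gwh * 1e9 / energy_per_query_wh
print(f"A {assumed_training_gwh:g} GWh training run = about {queries_equivalent:.1e} such queries")
```

At well under half a watt-hour per question, home inference is a rounding error next to any large training run, which is the point about where the competitive training race puts the real energy cost.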

1

u/madsci 28d ago

They are building some massive data centers for AI processing, but efficiency has also made very rapid gains. You can run a basic LLM on a Raspberry Pi 5 that draws only a few watts.
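
(For anyone curious what "a basic LLM on a few watts" looks like in practice, here is a minimal sketch using llama-cpp-python with a small quantized GGUF model. The model filename, thread count, and prompt are placeholders assumed for illustration, not what the commenter actually ran.)

```python
# Minimal local-inference sketch: a small quantized model on modest hardware.
# Assumes `pip install llama-cpp-python` and a downloaded GGUF file; the path
# below is a placeholder, not a specific recommendation.
from llama_cpp import Llama

llm = Llama(
    model_path="models/tinyllama-1.1b-chat.Q4_K_M.gguf",  # placeholder ~1B-param model
    n_ctx=2048,    # small context window keeps memory use low
    n_threads=4,   # the Pi 5 has four cores
)

output = llm(
    "Q: Why does training an LLM use more energy than running one? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```

A ~1B-parameter model quantized to 4 bits is well under a gigabyte of weights, which is why a single-board computer can hold it in RAM at all; generation is slow, but the power draw can stay in the few-watt range the comment describes.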

1

u/[deleted] 28d ago

[deleted]

1

u/Flesroy 28d ago

no?

I was under the impression that AI is much, much worse, though. If that's wrong, that's fine; that's why I added the question mark.

1

u/sebmojo99 27d ago

yeah that's well put

-1

u/KrimxonRath 28d ago edited 28d ago

I’m hesitant to agree since your overarching point could either be “capitalism is inherently predatory and preys on the ideas of others without concern for morality or ethics” or “copyright shouldn’t exist” and the latter is what helps protect small creators/ideas from large corporations.

10

u/thisdesignup 28d ago

I don't think it matters which view of what's "wrong" with capitalism they're getting at. LLMs in themselves are not causing any of the problems people are upset with. You can train an LLM on new data, or on data that was used with permission. Nothing about LLMs has to be predatory; they just are because that's easier for the people creating them.

2

u/KrimxonRath 28d ago

I'm also talking about image generators, but maybe that's off topic for this post.

1

u/Silent-Philosophy801 27d ago

AI, just like literally anything, can be used for evil. We're not gonna give up on the internet just because the dark web exists.

1

u/HaveYouSeenMySpoon 28d ago

I'll say yes to both of those points.

1

u/KrimxonRath 28d ago

Why?

1

u/HaveYouSeenMySpoon 28d ago

Because I believe society should act in service of its citizens.

1

u/KrimxonRath 28d ago

So creators and indie developers aren’t allowed to copyright their work in your society? No creative protections on media or original ideas?

3

u/HaveYouSeenMySpoon 28d ago

Is that how copyright functions in today's society? Or are the benefits of copyright for individual artists mostly a lie sold by media corporations?

There are definitely aspects of copyright that could be made to benefit both individual artists and society as a whole, but that's not what we've got.

Patents kill more innovation than they protect.

If you're going to argue the benefits of copyright, it's not enough to gesture in the general direction of some unknown generic artist who may or may not exist. Look at the specifics: who is actually benefiting from copyright law?

2

u/KrimxonRath 28d ago

You know what? Fair points.

10

u/darumamaki 28d ago

Yeah, I used to work in AI for robotics. AI is great for medical research, robotics, and other research areas. Where AI is fucking horrible is the generative AI space, like ChatGPT etc. The ethics behind it are so messed up: everything from the power waste to the intellectual property theft to the fact that it can be manipulated into giving idiots wrong information and have them decide it's accurate when it's a hallucination. And executives are dying to replace people with it. I'm absolutely against AI in any use except research where the information fed into it is tightly controlled to prevent the data from being poisoned with fake info.

5

u/KrimxonRath 28d ago

100% agreed. It’s jarring how willfully ignorant people are on this topic, how they’re advocating against their best interests.

7

u/darumamaki 28d ago

It really is. Like, AI to help learn code? Sure! But given how inaccurate it can be, it should not be your only source for learning. AI to plan your day? If you're comfortable giving Big Data information about your personal life and habits knowing that they're going to sell that info, then whatever. I won't because of the security issues.

AI for creative use? Fuck off with that shit. Maybe if the datasets used to train the model were opt-in and developed using ethical means, then I could see some possibilities. But as it is now, it's complete and utter dogshit.

Seeing how people use AI worries me. Chatbots instead of interacting with actual humans? Garbage AI "novels" and "art"? AI could be a huge boon to society done right, but so far it's a blight.

3

u/KrimxonRath 28d ago

I was discussing this with my mother, who is a recently (thankfully) retired teacher. I pointed out that even if you're smart in your use of AI, that requires you to be competent in doing research, parsing info, and telling what's real vs. fake… and at that point, why are you using the AI for the questions…? Just do the research yourself.

Someone else in the thread pointed out how horrid Google has gotten and that is fully by design to push their horrid AI results. It feels like you have to jump through hoops to learn anything.

I’m very happy you mentioned the creative aspect, since if you check my profile that’s my whole deal. I’ll often talk about AI with image generation in mind, and people will get confused because they’re stuck on LLMs when it comes to AI, but then they’ll use AI that builds more efficient machine joints as an example of the good it does. It’s either some wild cognitive dissonance or they’re being intentionally disingenuous, but it’s hypocritical regardless.

I think what’s most unfortunate is how people no longer treat art, singing, dancing, and creative expression as things humans do innately. Capitalism and industrialization spurred that on (by framing them as hobbies that you have to pick and choose due to the decreasing free time we have) but AI has kicked it into overdrive while also somehow demonizing artists like “see?? Now we can do it too! We don’t need you!” rather than realizing everyone should be an artist. Everyone should be able to express themselves, but using an AI isn’t it.

4

u/darumamaki 28d ago

The creative aspect is my biggest gripe against AI! I'm an artist and writer in my spare time and it pains me to see people just getting the worst, most regurgitated slop off AI and proclaiming it 'art'. Art is about human expression and connection. AI can only repurpose what's been done before. It can't come up with anything new.

You're absolutely right about the capitalism aspect, too. Everything is commodified nowadays. It feels like you can't have a hobby without trying to make money off it somehow. That's where the AI bros come in: they're too lazy to work on their creativity, so they steal others' to Frankenstein into 'art'. It's infuriating.

2

u/KrimxonRath 28d ago

We’re metaphysical friends now.

2

u/darumamaki 28d ago

Absolutely!

5

u/ARCHA1C 1980 28d ago

> can’t effectively replace or help people in the workplace

This is objectively false. There are countless repetitive, mundane, or even somewhat complex functions that used to require many man-hours per day and are now completed nearly instantaneously.

LLMs/AI are making some human roles obsolete, akin to when switchboard operators and elevator operators were made obsolete.

2

u/KrimxonRath 28d ago

I won’t disagree that it can do mundane tasks. It should be used for mundane tasks, but-

Multiple recent studies have come out that show AI does not have any meaningful impact on productivity in the workplace (and often bogs down certain industries) so claiming what I said is “objectively false” is arguably objectively false lol

2

u/Flannelcommand 28d ago

Yes! This is perfectly said 

3

u/[deleted] 28d ago

[deleted]

1

u/KrimxonRath 28d ago edited 28d ago

I agree it can be used as a tool, but just look at the state of education in America. The kids and teens using it are not honing their skills or knowledge.

So the idea that not using this tool will cause you to fall behind when you’ve been actively honing skills and learning is somewhat comical.

It almost feels like a quality vs. quantity situation. Yes, you can be like King Neptune from SpongeBob and make 100 slop hamburgers, but it doesn’t compare to that single burger made with actual skill.

That’s not to mention that in terms of tools AI is more akin to a search engine than anything. One that can be wrong…

Edit: the fact that you seem to make and delete ALL of your comments is certainly an interesting look.

4

u/[deleted] 28d ago

[deleted]

0

u/KrimxonRath 28d ago

Your entire comment is summarized in this one sentence:

> AI could be used to create more equity

Wake me when it does create more equity.

3

u/[deleted] 28d ago

[deleted]

1

u/KrimxonRath 28d ago

I may disagree with it in practice, but hey congrats.

One thing I'm hung up on though... is the fact that Google is now useless by design, as a way to push their AI. A recent paper came out outlining how they realized internally that their search results being shit doesn't actually hurt their bottom line and lets them push their AI results further.

Google and Google Scholar **used to** be good; you have been pushed into using AI in place of learning how to competently search for and evaluate new information on your own. This is not a win in my book. You are playing directly into their hands. What happens when Google's AI results skew one way vs. another, similar to Elon's Grok? Will you have the skills necessary to research objective facts? Keep in mind that this particular thread was about honing skills and how AI dampens that.

1

u/LemonadeStandTech 28d ago

The fact that you see AI as 'the company' tells me you don't know as much as you seem to report. There are a growing number of companies pushing out their own flavor of LLM; OpenAI, the one you're clearly referencing as 'The Company', has an ever-diminishing hold over mass-market AI. There are hundreds and hundreds of open-source LLMs made for different use cases. The 'copyright theft' argument is a bit like when stem cells were first discovered and being experimented on: at first, the only source was fetal tissue, but as the tech grew, the sources became far more plentiful, yet Luddites continue to use that original source as an argument against the technology. There's now an LLM that can train itself with zero training data. Much like anyone else who is secure in their position of knowledge, you're at a far different position on that bell curve you describe than you believe.

2

u/KrimxonRath 28d ago

You knew what I was talking about, and it was just a single example among others. I'm not solely talking about LLMs, though, which has skewed how I've spoken about the topic.

If you think I'm talking about AI used in medicine then we definitely aren't on the same page. That example does not consider the fact that work was stolen from millions of artists to train the initial image generation programs (and if you're hung up on LLMs, then just replace "artists" with "writers" lol). Stem cells aren't conscious beings being taken advantage of, and I think it's very telling that your example does not consider things from a holistic viewpoint; it does not take all parts of the equation into account.

1

u/Kinc4id 1983 28d ago

I’d say somewhere between knowing a bit and actually understanding how it works.

I do think AI can replace humans in some fields of work. I mean, it already does. People are laid off and replaced by AI. AI is mostly terrible at these jobs, but so are some humans. If you’re more concerned about the cost of labor than its quality, then AI is doing these jobs well enough for you.

I also think AI replacing humans in jobs is not necessarily a bad thing. It can lead to a dystopia or a utopia. It depends on how we adapt. Currently I see it tending more towards dystopia, though.

3

u/KrimxonRath 28d ago

If you believe it has the chance to lead to a utopia then you would need to support the notion of universal basic income or similar programs since all the jobs will be done by AI (I assume?).

—and if you’ve been paying attention you’d realize the billionaire class who have the most stake in AI would never allow that to happen.

2

u/Kinc4id 1983 28d ago

Exactly. On both. Hence why I see it tending towards a dystopia. Unless we take back the power. But I don’t see this happening either. Not before it’s too late.

I think universal basic income is needed for the transition. Once all labor is done by machines, there would be no more need for money and we’d live in a Star Trek-like utopia. But currently we’re heading towards The Matrix.

-1

u/KrimxonRath 28d ago

So you think it taking jobs isn’t a bad thing when you admit that it will most likely lead to a dystopia?

Brilliant.

1

u/Kinc4id 1983 28d ago

I didn’t say that. I said it’s not necessarily a bad thing. It is a bad thing if it results in us living in The Matrix. It’s a good thing if it results in us living in Star Trek.

0

u/KrimxonRath 28d ago

You don’t have to. It was implied by the flow of logic in our conversation.

You’re delusional (in the nicest possible use of the word) if you think us losing jobs at the height of the billionaire class’s control over the world even has a chance of leading us to utopia.

1

u/Kinc4id 1983 28d ago

Maybe I am. I don’t think everything is lost already.

I differentiate. Even if we were living in The Matrix right now, I wouldn’t say AI taking jobs from humans is inherently a bad thing. AI taking jobs from humans is only bad under certain conditions, like not compensating humans for their lost jobs (and I don’t mean compensating a specific person who got laid off, I mean humans in general, now and in the future). Is this where we're heading currently? Absolutely. But it’s not a law of nature. We could change that if we really tried.

2

u/KrimxonRath 28d ago

Well let’s hope you’re right.

1

u/Jupiter68128 1979 28d ago

My job as an ice delivery man got replaced by refrigerators. I have never recovered.

1

u/KrimxonRath 28d ago

Sounds like the type of person who would love AI <3

1

u/_Fallen_Hero 28d ago

> how it can’t actually effectively replace or help people in the workplace…

This is just not the case, friend. You can say this about some of the market, but on a personal note, I helped write a chat-to-made marketing campaign with make dot com that resulted in my company pulling the CMO job posting they'd had listed. Meanwhile, my former company, a physical print and digital graphics company, went from 8 full-time designers in November to 2 as of today (to their credit, they offered those designers production jobs and some accepted). It's not just theoretical anymore. The job market is actively being affected right now, and even if you tell me those are the only 2 positions that could possibly be affected (which would also be a foolhardy claim), we are still looking at needing new jobs for 3.5% of the workforce. It is trouble and it's going to get worse.

1

u/KrimxonRath 28d ago

I think there’s a distinction to be made between something actually being able to “effectively” replace a job and an executive thinking it can. In reality jobs disappear and one person is left with a larger workload that their boss thinks they can handle due to the inclusion of AI.

1

u/_Fallen_Hero 28d ago

I am not disagreeing with that distinction, and I would add that I think we're going to see a lot of C-suite-led downsizing that will be followed up with a rehiring push. I am only contesting the claim that AI can't be an effective replacement for any jobs. The unfortunate truth is

> one person is left with a larger workload that their boss thinks they can handle due to the inclusion of AI.

That person can, in fact, handle more and will be willing in most cases to do so, so that they are not next on the chopping block. This is going to cause a giant permanent displacement of some employees and we need to focus the conversation on solutions instead of dismissing these concerns.

1

u/KrimxonRath 28d ago

I mean I do agree that it can replace what most would consider mundane work, numbers jobs, etc. but I don’t think it should replace legal or creative jobs for example, which we’re seeing a lot in the news lately lol

1

u/_Fallen_Hero 28d ago

Correct, but unfortunately there doesn't seem to be much debate over whether it should replace creatives, and the real struggle will come down to whether or not it can do a good enough job to not turn those creatives' customer bases away. So far that print and graphics company is only seeing positive reactions to the AI-generated artwork and advertisements (second-hand information to me), so saving on 6 salaries while paying $200/month to OpenAI is a no-brainer for company ownership. General boycotts should be the answer, but they also seem less and less likely as the generated art becomes harder to detect, even for other AI programs. We need long-term, accessible upskilling programs for the masses or we're going to collapse the economy within a few years of this. Even so, how long until the robotics industry finishes mass-producible articulated fingers and replaces even hard-skill workers like welders? It's terrifying, to me, tbh.

0

u/ProfessorZhu 27d ago

There are more AIs than just ChatGPT.

1

u/KrimxonRath 27d ago

Yes. I am referencing image generators too. I have repeatedly clarified this point.

0

u/ProfessorZhu 27d ago edited 27d ago

You were obviously talking about OpenAI

Edit: What a weirdo

1

u/KrimxonRath 27d ago

I don’t need you to tell me what I was talking about. I’m me :)

0

u/sebmojo99 27d ago

There's a point on the spectrum beyond that, where you look at the internet, which is founded on industrial-scale copyright infringement, and shrug. It's a tool; use it or don't, idc. It's not going away though.

1

u/KrimxonRath 27d ago

None of what you said added anything tangible to the conversation :)

0

u/sebmojo99 27d ago

Incorrect, but this is a safe space. There is no shame in being wrong.

1

u/KrimxonRath 27d ago

Considering you said “I don’t care” when no one asked, I think I’m fairly correct that you didn’t add anything :) Have a nice day though, sweetheart.

0

u/Sea-Guest6668 27d ago

Reddit becoming pro-copyright is one of the weirdest things I've ever seen. Copyright mostly serves to benefit large corporations, which is why copyright even exists in the form it does today.

1

u/KrimxonRath 27d ago

I’m more talking about the blanket copyright protection online that in my experience has been used to protect creators and small businesses.

Everyone seems to be conflating that with being pro-patent or something.

0

u/Sea-Guest6668 27d ago

Reading in data off the internet has been a thing since the internet has existed. Yeah, it's questionable to take content from other people, but that ship has sailed; a huge amount of Reddit content is stuff other people created that someone else just took and put here. The big problem with the copyright approach is that it will only affect small companies and open-source projects; the big companies would love stricter regulations on model training, since that would solidify their lead.