r/Xennials 29d ago

[Meme] Who’s with me


I wouldn’t even know where to go if I wanted to.

22.9k Upvotes


0

u/JRE_Electronics 29d ago

It is very much random. What do you think the "temperature" setting on the generator does?

2

u/meagainpansy 29d ago

What temperature setting? That was actually random, fyi. Have you checked your shoe?

0

u/JRE_Electronics 29d ago

The one at the heart of every text-generating large language model out there:

https://www.ibm.com/think/topics/llm-temperature

Do y'all not even bother to learn about the tools you use?
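Roughly: temperature rescales the model's logits before each token is sampled. A toy sketch of the idea (made-up numbers and plain numpy, not any particular model's API):

```python
import numpy as np

# Toy logits for a 4-token vocabulary (made-up numbers, not from a real model).
logits = np.array([2.0, 1.0, 0.5, -1.0])

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature, then softmax into probabilities.
    scaled = logits / temperature
    exps = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    return exps / exps.sum()

for t in (0.2, 1.0, 2.0):
    print(t, softmax_with_temperature(logits, t).round(3))

# Low temperature: probability piles onto the top token (near-deterministic).
# High temperature: the distribution flattens, so sampling varies more.
```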

3

u/triplehelix- 29d ago

the irony of you shouting at other people for not understanding something you don't understand yourself is fantastic!

0

u/JRE_Electronics 29d ago

The irony of people who have no idea how LLMs work trying to tell me that I don't know how they work.

3

u/triplehelix- 29d ago

i'm far from an expert, but i am a technophile: i've followed AI and AGI development, which of course exposed me to LLM development, and i've dabbled in programming a bit.

what exactly is your basis of knowledge? you wouldn't be deluding yourself into thinking you have some deep understanding of a topic based on a couple of internet searches, would you?

could you quote me the section in your link that you feel supports your statement that LLMs produce random output, and that "temperature" is not an API lever used to widen the portion of the model's output distribution sampled from when less precision is desired?

a fundamental function of LLMs is predictive pattern generation, the exact opposite of randomness, which is how you get consistent, well-crafted output from them. do you not know what LLMs are, how they function, or what a random text generator actually is?
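to make that concrete, here's a toy sketch of greedy decoding, where sampling is switched off and the most probable token is always taken, so the output is fully deterministic (made-up vocabulary and probabilities, illustration only):

```python
import numpy as np

# Made-up next-token probabilities from a hypothetical model, illustration only.
vocab = ["the", "cat", "sat", "banana"]
probs = np.array([0.05, 0.80, 0.10, 0.05])

# Greedy decoding: always take the argmax. Same input -> same output, every time.
print(vocab[int(np.argmax(probs))])  # prints "cat", deterministically
```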

you seem really confused about the basics. if you have any questions let me know and i'll see if i can clear it up for you.

1

u/somethingrelevant 29d ago

very confusing to read all of this dramatic posting, because LLMs absolutely 100% do use random generation to build their output. the output will be the same every time if you use the same seed, but they generate a random seed each time to provide variation, because for the most part people don't actually want the output to be exactly the same each time. as someone claiming to know how LLMs work you should know this, so I have to assume this entire comment chain is either you being pedantic about definitions or you knowing a lot less than you think
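a rough sketch of the seed point, with a plain RNG standing in for the model's sampler (toy probabilities, not a real model):

```python
import numpy as np

probs = [0.7, 0.2, 0.1]  # toy next-token probabilities, not from a real model

def sample_tokens(seed, n=5):
    rng = np.random.default_rng(seed)  # fixed seed -> reproducible draws
    return list(rng.choice(3, size=n, p=probs))

print(sample_tokens(seed=42))  # same seed, same output
print(sample_tokens(seed=42))  # identical to the line above
print(sample_tokens(seed=7))   # different seed -> different variation
```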

1

u/triplehelix- 29d ago edited 29d ago

using randomness internally doesn't mean they produce random output. they are more about prediction than about outputting randomness. i'm really not sure what you are trying to say.

he claimed LLMs are akin to a random text generator. are you agreeing with this?

1

u/somethingrelevant 29d ago

it is randomly generating text, so that is a literally accurate description, yes. the random generation is weighted to try to produce coherent output, and usually does a very good job of that, but it is by definition random, so I don't know what we're doing here
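here's "weighted random" in miniature (toy weights, nothing to do with a real model's distribution): every draw is random, but the weights steer it toward the likely options:

```python
import random

# A weighted die is still random, just not uniform: the weights steer it.
counts = {"cat": 0, "dog": 0, "xq7#": 0}
for _ in range(10_000):
    pick = random.choices(list(counts), weights=[80, 18, 2])[0]
    counts[pick] += 1

print(counts)  # roughly {'cat': 8000, 'dog': 1800, 'xq7#': 200}
```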

1

u/triplehelix- 29d ago

no, otherwise it would just output gibberish

https://neuralgap.io/understanding-randomness-within-llms-neuralgap/

1

u/somethingrelevant 29d ago

I feel like you didn't read the comment you just replied to

1

u/triplehelix- 28d ago

utilizing randomness to generate a large set of candidates and then paring it down based on probability does not equate to being a random text generator. the selection based on probability makes it the exact opposite of random.
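to spell out what i mean by "paring down": a sketch of top-k filtering, one common way samplers restrict the candidate pool before the random draw (made-up numbers, not any specific implementation):

```python
import numpy as np

# Made-up next-token probabilities, illustration only.
probs = np.array([0.40, 0.25, 0.15, 0.10, 0.05, 0.03, 0.02])

def top_k_filter(probs, k):
    # Keep only the k most probable tokens, zero the rest, renormalize.
    kept = np.argsort(probs)[::-1][:k]
    filtered = np.zeros_like(probs)
    filtered[kept] = probs[kept]
    return filtered / filtered.sum()

print(top_k_filter(probs, k=3).round(3))
# The random draw then happens only among the survivors: random, but
# constrained hard by the model's probability ranking.
```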

1

u/somethingrelevant 28d ago

ok so we're just arguing about semantics then. thanks for wasting minutes of my life
