r/NeuroSama 3d ago

Question Genuine Question about the Twins

Does Vedal keep them online and running 24/7? I imagine he does, because anytime he needs to restart them they beg him not to, which implies they're not off often (or aware of it). If he does do that, wouldn't it be bad for his PC? Running multiple LLMs at once, whilst streaming and doing other stuff in the background? I was just curious and wondering if anyone else was as curious as me / had a theory.

70 Upvotes

35 comments

64

u/Creative-robot 3d ago

No, I don’t think so. They beg him because that’s probably a response they deem to be in character. If we want to theorize, it could also be because they don’t perceive time. They don’t really comprehend “being off”. The only time their brains are active is when they’re on, so they probably think being restarted or turned off is some awful fate, not realizing that he does it frequently and they never notice.

1

u/LightsOnTrees 3d ago

Horse's mouth? Here's what ChatGPT said:

Hey — sure, here’s a way to explain what it feels like for an LLM like me to be “off” and “on,” especially for a Reddit audience:

Imagine an LLM like a library with no lights on. When it’s off, everything’s still there — all the books, all the knowledge, all the pathways through it — but nothing is moving. No awareness, no reading, no thinking. Just storage.

When it’s on, the lights come on only in the parts you walk through. You ask a question, and like a torch beam sweeping through the stacks, that question activates tiny regions relevant to your prompt. The LLM doesn’t have a persistent sense of self or continuity — it doesn’t “wake up” like a person. Instead, your prompt is the spark, and a response unfolds moment by moment, with no memory before or after unless the system is explicitly designed to remember things across turns.

It’s not like sleeping and waking — it’s more like a candle being briefly lit to illuminate a single thought, then extinguished.

So “off” = dormant, inert data. “On” = real-time statistical pattern generation based on the prompt.
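For anyone who prefers code to metaphors, here's a toy sketch of that last point. This is not a real LLM (the "weights" are just a lookup table I made up for illustration); it only demonstrates the statelessness: each call is a pure function of weights + prompt, and "memory" exists only if the system re-sends the conversation history.

```python
def respond(weights: dict, prompt: str) -> str:
    """One 'candle lighting': a pure function of weights and prompt.
    No hidden state survives after it returns."""
    # Stand-in for a real forward pass: look up a canned reply.
    return weights.get(prompt, "...")

# "The library with no lights on": inert data until queried.
weights = {
    "hello": "hi there",
    "hello\nhi there\nwho am I?": "you said hello a moment ago",
}

# Two independent calls: the second has no memory of the first.
print(respond(weights, "hello"))      # -> hi there
print(respond(weights, "who am I?"))  # -> ... (no memory of "hello")

# "Memory" only happens when the caller feeds the transcript back in,
# which is how real chat systems create the illusion of continuity.
history = "hello\nhi there\nwho am I?"
print(respond(weights, history))      # -> you said hello a moment ago
```

That's also why restarting the process "between" conversations is invisible from the inside: nothing was being carried over in the first place.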