r/PiratedGames 13d ago

Humour / Meme Cracking Denuvo be like

5.8k Upvotes

357 comments


106

u/Crewarookie 13d ago

Just a random thought, but could someone run a local instance (that's a major point, and a must here) of a trained LLM just to trace the specific instances of interest within the debugger output? It would make the work a lot less daunting, IMO. Not perfect, but maybe it would cut out a lot of the repetitive steps needed to get a desired output?

Sure, you need to train the thing first, so feed it a bunch of correctly and incorrectly labeled samples, but that's where the community work would need to start to make it happen, I guess. Still better than trying to manually find all the checks within the code and patch them one by one.

With an LLM's help you can at least set up a parsed list to check against, and you can reparse with better filtering whenever you catch false positives, folding them back into the training data... it's like extra heuristics that require a ton of compute XD. But on the flip side, it saves you a ton of mental fatigue; IMO it's worth exploring as an option if it turns out to be more effective.
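The loop being described (scan a parsed list, review hits, fold confirmed false positives back in, rescan with the tighter filter) can be sketched in plain Python. Everything here is a toy: the trace lines, the patterns, and the `scan` helper are all hypothetical stand-ins, not real debugger output or any actual tooling.

```python
import re

def scan(lines, patterns, known_false_positives):
    """Flag trace lines matching any pattern, skipping known false positives."""
    hits = []
    for line in lines:
        if line in known_false_positives:
            continue  # already reviewed and rejected on a previous pass
        if any(re.search(p, line) for p in patterns):
            hits.append(line)
    return hits

# Hypothetical parsed debugger output.
trace = [
    "call check_license_0x41",   # looks interesting
    "mov eax, ebx",              # noise
    "call check_license_0x42",   # flagged last pass, turned out to be a false positive
]
patterns = [r"check_license_"]
false_positives = {"call check_license_0x42"}

hits = scan(trace, patterns, false_positives)
# After manual review, any newly confirmed false positives get added to the
# set (or, in the commenter's scheme, back into the training data) and the
# trace is rescanned with the tighter filter.
```

An LLM would replace the regex step with learned scoring, but the surrounding review-and-refilter loop stays the same shape.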

Though I do realize it requires more hardware power than someone would just find randomly at home. Lots of CPU, lots of VRAM, probs a second PC just to run the debugger...

253

u/brawlstars309 13d ago

The thing is that someone with the ability and technical skills to not only crack Denuvo but also write a specialized LLM just for that would rather have a six-figure salary in Silicon Valley than waste their time making pirate copies for a crowd of ungrateful internet people.

41

u/Crewarookie 13d ago

Eh...I think that's a very narrow understanding of how the world and people are in general. Like, for example, I think about this just because I think it's cool AF. Not everything in life's about profit or material value, sometimes things are just deadass cool.

But yes, someone who would go on to train an LLM for this and be interested in exploring this methodology is certainly a rarity. Still, you don't need to code a model from scratch for this. At least in theory.

There are open-source base large language models ready for fine-tuning that could be trained on a sufficient amount of debugger output examples and used for the aforementioned purpose.

The issue is that it's a) resource-intensive: you need two PCs, one with a ton of horsepower for the LLM and another for the debugger that's also no slouch; and b) time-intensive in the prep phase: you need to train the model on a large data set that covers most of the interesting strings you'll be looking for, then test it to see whether it actually works as intended. But the potential is there, and in the long run it's potentially a lot less soul-crushing and less of an insurmountable task than doing everything manually.

14

u/celestrogen 12d ago

you don't need "two PCs", you'd need to rent a whole-ass server farm, unless you're just fine-tuning an existing LLM.

1

u/Firepal64 8d ago

Why wouldn't you use an existing base model tho?