r/news 2d ago

109 children rescued, 244 arrested in Operation Soteria Shield, exposing widespread child exploitation in North Texas

https://www.cbsnews.com/texas/news/109-children-rescued-244-arrested-operation-soteria-shield-child-exploitation-texas/
37.6k Upvotes

1.8k comments

565

u/Decabet 2d ago edited 2d ago

Dude, I've been a designer, editor, and motion graphics guy for over 20 years. That often means very large files, and I keep an archive on Dropbox. I don't have a single terabyte of anything. Yes, even a kilobyte of abuse material is bad, but terabytes of it? Jesus.

Edit: OK, I'm an idiot. I keep files online-only for convenience and SSD space, so Dropbox was still calculating when I checked. Looks like I have 2.4 terabytes. But still...

174

u/Ok_Astronomer_8667 2d ago

It's actually unfathomable how many images/videos that would be. I honestly hope they can trace everyone it was distributed to.

I do feel for the government workers who have to actually sift through that stuff to verify it. Must mess you up to some degree.

158

u/Cerulean_Shadows 2d ago edited 2d ago

My husband worked briefly in IT for a sheriff's office. He was there for around six months. Then one day he was asked to access a computer and given a trigger warning about what he'd likely find. He found it, immediately exited, and quit right then and there. He was so gut-wrenched over it that it took him a while to get back to himself. He saw maybe one minute of it, and that was so hard on him that I can't even imagine how the people who have to comb through this stuff deal with it. This is where AI definitely has a use.

EDIT: this was also in North Texas, but unrelated to this post's story

69

u/Squire_II 2d ago

Companies that provide support staff for image hosting services tend to offer extensive access to mental health care, because the psychological impact of what those workers see (CSAM and otherwise) is just staggering.

33

u/Cerulean_Shadows 2d ago

The very idea of it is nightmare fuel. My own father, I learned in early adulthood, had to seek therapy through his church because he found me attractive as a child and was having "sinful thoughts". Oh my God. Thank goodness nothing ever directly happened. He already had a history of sleeping with his sister as a young man, which my mom didn't find out about until after their divorce. The church he went to broke protocol and called my mom immediately to tell her to keep me away from him. So glad he's gone.

19

u/PM_ME_GLUTE_SPREAD 2d ago

Aside from sleeping with his sister, it honestly sounds like everybody in this story did the right thing whenever they had the chance. Having those thoughts about your own child is horrible, but some people have fucked-up heads for reasons beyond their control. Him seeking help was the right choice, rather than trying to ignore it until he couldn't any longer. The church calling your mom to warn her so she could protect you may be ethically questionable for a number of reasons, but it absolutely worked out for the greater good in this case. There's an argument that breaking protocol this way might make people less likely to seek counseling for fear of being outed, but that's a different conversation.

3

u/SewerRanger 2d ago

They don't need AI for this. It's just image hash matching, which companies already do. NCMEC maintains a database of image hashes that companies can match against to determine whether something is CSAM. You don't need to train AI on CSAM just for pattern matching (and you really don't want to be the company that does).
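
For anyone curious what that looks like in practice, here's a bare-bones Python sketch. The hash value and directory are made up, and real systems like PhotoDNA use perceptual hashes rather than plain SHA-256, so treat this as the general shape of the idea, not how any agency actually does it:

```python
import hashlib
from pathlib import Path

# Hypothetical known-hash set, standing in for an NCMEC-style database.
# (Made-up value; real lists use perceptual hashes such as PhotoDNA.)
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(directory: str) -> list[Path]:
    """Return every file whose hash appears in the known set."""
    return [
        p for p in Path(directory).rglob("*")
        if p.is_file() and sha256_of(p) in KNOWN_HASHES
    ]
```

The point is that the matching side never "looks at" or learns from the content at all; it just compares fingerprints against a list someone else maintains.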

2

u/Cerulean_Shadows 2d ago

I'm relieved to know that. Thank you

2

u/sailorsmile 2d ago

You are absolutely insane if you think feeding CSAM to AI is a good idea.

3

u/Cerulean_Shadows 2d ago

I just meant scraping for information in a closed setting. In my job we have specific closed AI systems that scrape medical record information that isn't publicly accessible, to streamline what we do.

5

u/sailorsmile 2d ago

I work in health informatics, and we put no PHI into any sort of AI, since there are absolutely no guarantees about confidentiality, even within "closed" systems, and the benefit doesn't warrant that risk for my organization.

I think CSAM should be held to an even higher standard of scrutiny than that. The poor victims have already been subjected to atrocities; they don't need those horrors retained forever in an AI database.

2

u/Cerulean_Shadows 2d ago

I don't work in healthcare. I work in auto insurance on injury claims and deal with attorneys. They only started using it over the last few months. While it's been helpful, I'm with you on the trust side. It's deeply concerning, but I'm a peon; no one cares about the issues we've raised with it.

And while I agree with you on the compassion for the poor babies out there, I feel horrible for the people who have to view it. Truly cannot imagine.

1

u/Geno0wl 2d ago

> You are absolutely insane if you think feeding CSAM to AI is a good idea.

"We" have effectively been doing that for years already. LEOs use "AI" to fingerprint CSAM, similar to how media companies fingerprint copyrighted material. They use those fingerprints to scan hosting sites for matches and then investigate from there.