Stuck on decrappified Windows for the immediate future.
PirateJesus
Philip answered him, 2 books is not sufficient for them. And Jesus took the books; and when he had given thanks, he distributed to the disciples, and the disciples to them that were set down. Therefore they gathered them together, and filled twelve baskets with the new copies, which remained over.
- 30 Posts
- 24 Comments
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English) · 1 · 1 year ago
Everybody is American. They just don’t know it yet.
Gospel of the Jesus
It seems to me to be a lesser charge: a net that catches a larger population, which they can then fish through for bigger catches to make the prosecutor look good. Or, as I’ve heard from others, it is used to simplify prosecution. PedoAnon can’t argue “it’s a deepfake, not a real kid” to the SWAT team.
There is a massive disconnect between what we should be seeing and what we are seeing. I assume that’s because the authorities who moderate this shit almost exclusively go after real CSAM, on account of it actually being a literal offense, as opposed to drawn CSAM, which is a proxy offense. This can be attributed to the lack of proper funding for CSAM enforcement. Pedos get picked up only if they become an active embarrassment, like the guy in the article. Otherwise all the money is just spent on keeping the lights on and letting the database get bigger. Which works for Congress: a public pedo gets nailed to the wall because of the database, the spooky spectre of the pedo out for your kids remains, vote for me please…
And also, it’s an AI.
13k images before AI involved a human with Photoshop or a child doing fucked up shit.
13k images after AI is just forgetting to turn off the CSAM auto-generate button.
Stable Diffusion has been distancing itself from this. The model that allows for this was leaked from a different company.
It’s not OK to do this. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
Making the CSAM is illegal by itself: https://www.thefederalcriminalattorneys.com/possession-of-lolicon
Title is pretty accurate.
Creating the pics is a crime by itself. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
“so many people still think it should be illegal”
It is illegal. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
The generated stuff is as illegal as the real stuff. https://www.thefederalcriminalattorneys.com/possession-of-lolicon https://en.wikipedia.org/wiki/PROTECT_Act_of_2003
It would be illegal in the United States. Artistic depictions of CSAM are illegal under the PROTECT Act of 2003.
Asked whether more funding will be provided for the anti-paint enforcement divisions: “it’s such a big backlog, we’d rather just wait for somebody to piss off a politician to focus our resources.”
“Simulated crimes aren’t crimes.”
Artistic CSAM is definitely a crime in the United States. PROTECT Act of 2003.
“The major concern to me is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.”
The PROTECT Act of 2003 means that any artistic depiction of CSAM is illegal. The guidance is pretty clear: the FBI is gonna raid your house… eventually. We still haven’t properly funded the anti-CSAM departments.
OMG. Every other post is saying they’re disgusted by the images but that it’s a grey area, but he’s definitely in trouble for contacting a minor.
Cartoon CSAM is illegal in the United States, and AI images of CSAM fall into that category. It was illegal for him to make the images in the first place, BEFORE he started sending them to a minor.
https://www.thefederalcriminalattorneys.com/possession-of-lolicon
“Currently, we do not outlaw written depictions nor drawings of child sexual abuse”
Cartoon CSAM is illegal in the United States.
https://www.thefederalcriminalattorneys.com/possession-of-lolicon
Cartoon CSAM is illegal in the United States. Pretty sure the judges will put his images under the same ruling.
https://en.wikipedia.org/wiki/PROTECT_Act_of_2003
https://www.thefederalcriminalattorneys.com/possession-of-lolicon
PirateJesus@lemmy.today to Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ@lemmy.dbzer0.com • Tribler: Peer-to-Peer decentralized torrent client (English) · 1 · 1 year ago
Does nobody use the god-given Repository of all human knowledge?
There are privacy issues that still have not been addressed as of 2023:
A privacy review of Tribler, the onion-routed BitTorrent app
https://www.ctrl.blog/entry/tribler-onion-routed-bittorrent.html
Daniel Aleksandersen 2022-01-11 10:35Z
Hi Anth0rx, yes — I’ve looked into all of them. Here are some hot-takes:
Loginet is just a front for a cryptocurrency. It’s decentralized but not distributed. Its primary purpose is selling you hot air, though.
I2P can only talk to other I2P users. There are far from enough users on it to reliably use it for P2P. There’s nothing inherently wrong with it, it just never reached critical mass. The set-up process is probably too complicated for most potential users.
GNUnet has been “fixing the internet” for literally two decades, and they’ve yet to deliver anything. The software download page clearly warns that it’s still “not yet ready”. It’s an interesting project, but it doesn’t seem to be going anywhere.
Daniel Aleksandersen 2023-07-02 15:17Z
The project change log does not indicate any work on any of the things discussed in this article. I might revisit this after the next beta release.
TL;DR: Censorship-resistant doesn’t mean anything if they can find you and nail you to a cross.
PirateJesus@lemmy.today to Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ@lemmy.dbzer0.com • There should be a way to give directly to the developers (English) · 1 · 1 year ago
Code comes from Cappuccino, so sayeth the messiah.
It depends on how situationally aware the cops are about all the cameras and witnesses. In an ideal case, they’d realize there are a lot of voting community members around them who took the time to attend a ceremony, not a situation where people will just turn up the volume on their television sets.