This post was submitted on 14 Jan 2024
200 points (94.6% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ


For instance, say I search for "The Dark Knight" on my Usenet indexer. It returns a list of uploads and where to get them via my Usenet provider. I can then download the pieces, stitch them together, and verify that the result is, indeed, The Dark Knight. All of this costs me only a few dollars a month.

My question is: why can't copyright holders do this as well? They could follow the same process, then send takedown requests for each of the individual articles that make up the movie. We already know they try to catch people torrenting, so why don't they do this too?

I can think of a few reasons, but they all seem pretty shaky.

  1. The content is hosted in countries where they don't have to comply with takedown requests.

It seems unlikely to me that literally all of it is hosted in places like this. Plus, the providers wouldn't be able to operate at all in countries like the US without facing legal repercussions.

  2. The copyright holders feel the upfront cost of indexer and provider access is greater than the cost of people pirating their content.

This also seems fishy. It's cheap enough for me as an individual to do this, and if Usenet weren't an option, I'd have to pay for 3+ streaming services to watch everything I currently do. The scheme would pay for itself even if the only person it cut off were me.

  3. They do actually do this, but it's on a scale small enough for me not to care.

The whole point of doing this would be to make Usenet a non-viable option for piracy. If I don't care about it because it happens so rarely, then what's the point of doing it at all?

top 50 comments
[–] Darkassassin07@lemmy.ca 129 points 10 months ago (2 children)

They do receive takedown notices; however, files uploaded to Usenet are mirrored across many providers in many jurisdictions, while also being split into many parts, as you noted. Usenet's implementation of file sharing is quite robust: a file can be rebuilt even when a significant portion of its data is missing. To successfully take down a file, you need to remove many of these parts across almost all of the Usenet backbones, which requires cooperation across many nations/jurisdictions governed by varying laws. It's not an easy task.

Here's a somewhat limited map of usenet providers:

[–] can@sh.itjust.works 98 points 10 months ago (2 children)

So it's basically the fediverse, 30+ years early?

[–] HAL_9_TRILLION@lemmy.dbzer0.com 49 points 10 months ago* (last edited 10 months ago) (1 children)

Truthfully, in a technical sense: somewhat. But USENET is also largely unmoderated, so for the purposes of meaningful discussion there is little comparison.

[–] XTL@sopuli.xyz 19 points 10 months ago

Meaning the spam-to-signal ratio is about 50,000,000:1.

[–] Darkassassin07@lemmy.ca 47 points 10 months ago (1 children)
[–] can@sh.itjust.works 57 points 10 months ago

Nature is healing

[–] ShitpostCentral@lemmy.world 16 points 10 months ago

This makes perfect sense. Thank you!

[–] _number8_@lemmy.world 114 points 10 months ago (5 children)

why is the DMCA the one fucking law that actually gets enforced at a high rate, when there are literally billions of more important things we could spend money on?

[–] grue@lemmy.world 64 points 10 months ago* (last edited 10 months ago)

Because violating the DMCA is copyright infringement, and § 501 (b) of the Copyright Act gives copyright holders a private right of action to file a civil lawsuit to enforce it. Copyright holders tend to be motivated in a way that the State very often isn't.

[–] Diplomjodler@feddit.de 41 points 10 months ago

Because the corporations that benefit from this law can afford to buy lots of politicians.

[–] Supermariofan67@programming.dev 3 points 9 months ago

It's civil lawsuits by corporations, not state prosecution

[–] Pretzilla@lemmy.world 89 points 10 months ago (1 children)

That's why we don't talk about it

[–] can@sh.itjust.works 6 points 10 months ago (1 children)

Is Real-Debrid a similar situation?

[–] ninjan@lemmy.mildgrim.com 8 points 10 months ago (1 children)

No. Also, use at your own risk; read their privacy policy.

[–] can@sh.itjust.works 5 points 10 months ago* (last edited 10 months ago) (1 children)

Why aren't they taken down then? I've torrented raw for decades. I can't imagine debrid is more risky to me personally.

[–] Vub@lemmy.world 4 points 10 months ago* (last edited 10 months ago) (2 children)

It's not more risky than torrenting yourself, it's just not risk-free. If you live in a country where torrenting is safe: great for you! It's not like that everywhere.

Regarding the entire issue: Usenet and Debrid are extremely uncommon and "underground" compared to regular torrents or direct downloads. The lobby does not have infinite resources, so they target the big, easy-to-catch fish. Just harvesting torrent IPs and suing individuals with zero knowledge of how to defend themselves is super easy.

[–] nintendiator@feddit.cl 2 points 9 months ago

The lobby does not have infinite resources

[citation needed]

[–] can@sh.itjust.works 1 points 10 months ago* (last edited 10 months ago)

That makes sense. I was thinking it might even be safer in a way, since there's no seeding. But that's true, I am lucky enough to never really have to worry either way.

[–] DosDude@retrolemmy.com 39 points 10 months ago (1 children)

As far as I know, they do get DMCA'd, but the provider deletes only a single part, leaving the upload incomplete. If you have two different newsgroup providers, they usually haven't deleted the same part, so you can still download it.

But I could be totally wrong because I haven't really looked into this, and this is all from a very old memory.

[–] ShitpostCentral@lemmy.world 5 points 10 months ago* (last edited 10 months ago) (2 children)

That makes some amount of sense. I'm not sure exactly how each article is stitched together to create the full file. Do you happen to know if they're just put together sequentially, or if there's XORing or a more complex algorithm going on there? If it's only the former, they would still be hosting copyrighted content, just a bit less of it.

EDIT:
https://sabnzbd.org/wiki/extra/nzb-spec
This implies that they are just individually decoded and stitched together.
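
For illustration only: per that spec, an NZB is just XML where each `<file>` lists its `<segment>` elements, and each segment's text is the Message-ID of one article. A minimal sketch of walking one (the filename is hypothetical, and this is not SABnzbd's actual code):

```python
# Minimal sketch of reading an NZB file per the nzb-spec linked above.
# A real client would fetch each Message-ID from a provider, yEnc-decode it,
# and concatenate the decoded segments in numeric order.
import xml.etree.ElementTree as ET

def list_segments(nzb_path):
    root = ET.parse(nzb_path).getroot()
    for file_el in root.findall('{*}file'):                    # one <file> per posted file
        subject = file_el.get('subject', '')
        segments = file_el.findall('{*}segments/{*}segment')
        ordered = sorted(segments, key=lambda s: int(s.get('number', '0')))
        yield subject, [seg.text for seg in ordered]           # article Message-IDs, in order

for subject, message_ids in list_segments('example.nzb'):      # hypothetical filename
    print(f'{subject}: {len(message_ids)} articles to fetch and stitch')
```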

[–] Shadow@lemmy.ca 11 points 10 months ago

The release is compressed into a set of small archives, then each one is posted.

Usually par files are included so you can regenerate a few missing archives. https://en.m.wikipedia.org/wiki/Parchive
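
As a toy illustration of the idea behind those par files (real Parchive/PAR2 uses Reed-Solomon codes over many blocks; this hypothetical sketch uses a single XOR parity block, which can rebuild any one missing block):

```python
# Toy parity demo (NOT PAR2): one XOR "recovery" block rebuilds any one lost block.
def xor_blocks(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

blocks = [b'part-one', b'part-two', b'part-3!!']   # equal-sized data blocks
parity = b'\x00' * len(blocks[0])
for blk in blocks:
    parity = xor_blocks(parity, blk)               # parity = XOR of every block

lost = 1                                           # pretend a takedown removed block 1
rebuilt = parity
for i, blk in enumerate(blocks):
    if i != lost:
        rebuilt = xor_blocks(rebuilt, blk)         # XOR of parity + survivors = lost block

assert rebuilt == blocks[lost]
print('recovered:', rebuilt)
```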

[–] grue@lemmy.world 8 points 10 months ago* (last edited 10 months ago)

Do you happen to know if it’s just put together sequentially or if there’s XORing or more complex algorithm going on there? If it’s only the former, they would still be hosting copyrighted content, just a bit less of it.

Copyright is a legal construct, not a technological one. Shuffling the file contents around doesn't make the slightest bit of legal difference, as long as the intent is to reconstruct it back into the copyrighted work.

(Conversely, if the intent was to, say, print out the file in hexadecimal and wallpaper your house with it, that wouldn't be copyright infringement even if you didn't rearrange it at all because the use was transformative. Unless the file in question was a JPEG of hex-digit wallpaper, of course.)

[–] Nollij@sopuli.xyz 36 points 10 months ago (2 children)

First, a massive amount of content is removed. You won't find a lot of popular, unencrypted content on Usenet these days; it's all encrypted and obfuscated now to avoid the bots.

Speaking of bots, I don't think you realize how much of this process is automated, or how wide a net is being cast. The media corporations all have enormous collected libraries of material. It gets posted constantly to all sorts of places. This includes public torrents, public Usenet, YouTube, PornHub (yes, really, even for non-porn), Facebook, TikTok, Tumblr, Gnutella, DDL sites....

The list goes on and on. Each one gets scanned for millions of potentially infringing items, often daily. No actual people are doing those steps.

Now, throw in things like private torrents, encrypted Usenet posts, invite-only DDL, listings that use '3' instead of 'e' or other character substitutions..... These require actual humans to process. Humans who cost money, and a considerable amount of it. As a business, you have to show a return on investment. Fighting piracy, even at its theoretical best, doesn't increase revenues by a lot.

You mention revenue and breaking even, but you left out an important detail: your time is free. They wouldn't have to pay just $10/month; they'd have to pay $10/month plus $20/hour for someone to deal with it. And most pirates at that level will just find another method.

[–] Darkassassin07@lemmy.ca 7 points 10 months ago (1 children)

You won't find a lot of popular, unencrypted content these days on usenet. It's all encrypted and obfuscated now to avoid the bots

That's not been my experience at all. Pretty much everything I've looked for has been available and I rarely come across encrypted files. I do regularly have to try 2 or 3 nzbs before I find a complete one, but I almost always find one.

[–] Nollij@sopuli.xyz 7 points 10 months ago (1 children)

Are they obfuscated in any way? Depending on your client, you may not be able to see the names and subjects. But if you didn't have the NZB, is there any real chance you could find it otherwise?

[–] Darkassassin07@lemmy.ca 4 points 10 months ago

But if you didn't have the NZB, is there any real chance you could find it otherwise?

No, but that's just the nature of NZB file sharing. The individual articles aren't typically tagged/named with the actual file names; that info comes from the NZB and from the decompressed, stitched-together articles.

I'm not using any special indexers, just open public registration ones. The NZBs aren't hard to find, for me or for IP claimants.

[–] Evotech@lemmy.world 5 points 10 months ago (2 children)

I remember specifically Game of Thrones. If you didn't download the episode within a day or so of release, it was DMCA'd and gone from most popular Usenet providers.

Not seen that with anything else though.

[–] teamevil@lemmy.world 3 points 10 months ago

Only thing I've had a problem finding is Parks and Rec, along with ATHF season 12.

[–] jeeperv6@lemmy.dbzer0.com 3 points 9 months ago

HBO used to be bad. Going back to the Deadwood or True Blood days... the show would air, and within 20 minutes of the end credits it'd be up on Usenet. You had to start grabbing it ASAP; HBO's sniffers would be on the lookout within an hour, and they'd take down just enough parts to make the PARs useless. I don't miss the UUEncode days, or refreshing a newsgroup every few minutes to see if my show had been posted (or re-posted).

[–] rikudou@lemmings.world 20 points 10 months ago (7 children)

Off topic, but is there any tutorial on how to do this Usenet thing? Feel free to contact me on Matrix, it's on my profile.

[–] Darkassassin07@lemmy.ca 47 points 10 months ago

You'll need 3 things:

A Usenet client such as SABnzbd. This is equivalent to a torrent client like qBittorrent.

An NZB indexer such as NZBGeek; again, equivalent to torrent indexers, but for NZB files.

And finally, a Usenet provider such as FrugalUsenet. This is where you're actually downloading articles from. (There are other providers listed in the photo in my other comment here.)

Articles are individual posts on Usenet servers. NZB files contain lists of articles that together make up the desired files. Additional articles are included so that if some are lost (taken down due to DMCA/NTD), they can be rebuilt from the remaining data. Your NZB client handles reading the NZB file, downloading the articles from each of your configured Usenet providers, then decompressing, rebuilding lost data, and finally stitching it all together into the files you wanted.
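
As a rough sketch of what "downloading articles from your provider" means at the protocol level (the server name, credentials, and Message-ID below are placeholders, and this is not how SABnzbd is actually implemented), fetching one article over NNTP might look like:

```python
# Sketch only: fetch one article body from a Usenet provider over NNTP.
# Note: nntplib ships with Python up to 3.12 (it was removed in 3.13).
from nntplib import NNTP_SSL

with NNTP_SSL('news.example-provider.com', user='username', password='password') as server:
    # Articles are addressed by Message-ID; an NZB file supplies these IDs.
    response, info = server.body('<abc123@example-provider.com>')
    print(response, len(info.lines), 'encoded lines')
# A real client yEnc-decodes each article, concatenates them in segment order,
# and runs PAR2 repair if some articles are missing.
```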

[–] LazerDickMcCheese@sh.itjust.works 16 points 10 months ago (1 children)

I'd like to know too, but people are so cryptic about it every time this shit is brought up that I'm overwhelmed before I even begin. So I just stick to the tried and true methods I know

[–] Agathon@lemmy.dbzer0.com 14 points 10 months ago

Find a Usenet provider. A quick web search and some reading should get you to the right place. I’m not sure if any good free servers are available anymore, but there’s probably one that’s cheap enough.

Looks like https://sabnzbd.org/ is a free and open source Windows/MacOS/Linux client that can download files. I haven’t tried it, but it’s highly rated on alternativeto.net

[–] Z4rK@lemmy.world 14 points 9 months ago (1 children)

You forgot the first rule of Usenet: We do not talk about Usenet. That’s why it works. Keep talking and see what happens.

[–] thantik@lemmy.world 10 points 10 months ago (2 children)

I mean, how does usenet compare to just pulling torrents from public trackers?

Is there a good way of searching to find out if something is available before diving in?

I don't really know how Usenet compares, and private trackers seem like quite a pain; they've never really had anything I couldn't find on public trackers anyway.

[–] cm0002@lemmy.world 14 points 10 months ago

If what you want is recent (within the last 10-20 years) or older but popular, Usenet is far superior: if it's available, it will always download as fast as your connection allows. Torrents also need seeders in addition to simply being available, and then of course you're beholden to those seeders' upload bandwidth.

If what you want is obscure and/or old, torrenting is probably your best bet, especially since you can just leave the download "open" and download it byte by byte over months if you so choose. But even then, it's still worth checking Usenet, since you never really know what people will upload and when. If someone re-uploads something obscure/ancient and is then never seen again, it doesn't matter; you'll still be able to download it fast until it's removed or its retention expires (about 10 years for the good providers).

[–] Darkassassin07@lemmy.ca 4 points 10 months ago

2 main advantages:

  • no hosting liability. Unlike torrents, you're not seeding (i.e. hosting) the files yourself; you're purely downloading. This moves you out of the crosshairs of copyright holders, as they are only interested in the hosts (providers). It also means a VPN is not necessary for Usenet downloading. (Providers don't log who downloads what, either.)

  • speed. As long as the content is available (hasn't been removed due to DMCA/NTD), you are always downloading at the maximum connection speed between you and your provider. No waiting/hoping for seeds and whatever their connections can provide. I'm usually at around 70mb/s, whereas torrents very rarely broke 10mb/s for me, usually struggling to reach 1mb/s.

As far as availability goes, stats from my Usenet client: of the 17 million articles requested this month, 78% were available. I'm only using a single Usenet provider. That availability percentage can be improved by using more than one, in different jurisdictions (content is difficult to remove from multiple servers across different regions).
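
(Rough back-of-the-envelope only, and assuming an independence that real takedowns across providers don't have: if each provider independently offered that same 78%, a second one would raise per-article availability to about 1 − 0.22² ≈ 95%.)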

[–] stratosfear@lemmy.sdf.org 3 points 10 months ago

Tom Cruise loves DMCA
