Forget BitTorrent. Learn Usenet.

What is Usenet?

I may have misled with the title of this article. I think I made it sound like Usenet is the next-generation BitTorrent or something. Not true! Usenet is an archaic discussion system that has been with the Internet since, well, forever. It was originally created for people to discuss and flame one another about whatever topics they pleased.

One day, some enterprising dudes realized they could encode binary content into text and post it to groups to be downloaded rapidly by their friends around the world. Since most ISPs provided their own local copies of Usenet, this amounted to a very fast cache sitting right next to the end user.
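To get a feel for the trick, here's a minimal sketch in Python. Real Usenet posts use uuencode or yEnc rather than base64, and "photo.jpg" is just a placeholder, but the idea is the same: turn arbitrary bytes into printable text that can ride inside a plain-text post.

    import base64

    # Usenet carries plain text, so binary files must be encoded into
    # printable characters before posting. Real posts use uuencode or
    # yEnc; base64 (below) illustrates the same round trip.
    with open("photo.jpg", "rb") as f:          # any binary file
        encoded = base64.encodebytes(f.read())  # bytes -> ASCII lines

    print(encoded[:120].decode("ascii"))        # safe to put in a post

    # The downloader reverses the encoding to recover the original bytes:
    decoded = base64.decodebytes(encoded)
    with open("photo_copy.jpg", "wb") as f:
        f.write(decoded)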

What you'll find on Usenet

Usenet is a big place, filled with both discussion and binaries. Some of what gets distributed is legal, and some of it is not.

Legal: Old people flaming each other, your creepy neighbor posting horrible Alvin and the Chipmunks slashfic, Linux ISOs, some book collections, porn sites flooding XXX groups with watermarked pictures that they own
Illegal: Full ROM sets for every system imaginable, DVDs of every movie and TV show imaginable, HD captures of TV shows, full high-bitrate MP3 albums, DVDs of all the latest console games, cracked versions of operating systems and software, eBooks, and A LOT of porn (which, by the way, is still copyrighted). Basically, every form of media that is or can be digitized has been pirated on Usenet.

If you're wondering why Usenet hosting providers don't get in trouble for distributing this content, it comes down to the decentralized nature of the Internet, the difficulty of legislating it, and the DMCA. As long as providers respond to takedown requests from copyright holders within a reasonable amount of time, they operate legally: they're treated as a neutral conduit that can't be expected to know whether the data flowing through their service is being redistributed unlawfully. Takedown requests take resources to file, and lawyers tend to target the more public, obvious places to scare people into never downloading anything ever again. Usenet so far is relatively unscathed (and by so far, I mean since the '80s), but one day that could change.

Why Usenet is better than BitTorrent

Well, I shouldn't be so general. BitTorrent is still useful for things like Linux ISOs (though if they happen to be on Usenet at the time, it'd probably be faster there). It is obviously much more efficient in regard to bandwidth and at decentralizing the distribution of information. But this article focuses on the individual downloader who's looking for some kind of data online. He wants it fast, and thus he should learn how to use Usenet.

Sometimes, BitTorrent may be useful for obscure things that you're unable to find on Usenet, though the odds are much lower that you'll find what you're looking for without a private tracker.

If you're using Usenet to download various files for whatever reason, it's a much safer bet for these reasons:

Speed: you pull from a fast central server instead of from whoever happens to be seeding, so your own connection is usually the bottleneck.
Privacy: there's no swarm of peers logging your IP address; the only party that sees your traffic is your provider, and the good ones keep no logs.
Heat: copyright lawyers go after the big, public, obvious places first, and Usenet has flown relatively under the radar for decades.

Getting Started: Hosting

I suppose the biggest problem for most people wanting to get involved with Usenet is that it is going to cost money. ISPs have caught on in recent years, and it is now very rare to find one that still carries binary groups on its Usenet servers. Even if you do find such an ISP, its servers may only store 2 or 3 days of data, far less than what you'd get from a paid Usenet hosting account.

An unlimited bandwidth package could run you, say, $15 a month, and a good indexing site will run about $3 a month. If you do a LOT of downloading, especially of obscure content or of things that are time-critical, then it is very much worth this cost.

Concerned about spending money on this? That's valid. However, most Usenet hosting companies take your privacy very seriously. When shopping for a Usenet host, look for a provider that keeps no logs of what its users download.

My current hosting provider, and my favorite so far, is Newshosting.com. Their unlimited plan is about $15/month, they're lightning fast, and they keep 45 days of data in their spools. In my opinion, that's a really awesome deal.

Getting Started: Indexing Sites

When you hear people talking about "binaries" on Usenet, they're talking about files. That doesn't necessarily mean programs; it just means files: images, ZIP archives, RAR archives, whatever. All the stuff you're looking for is most likely stored in binary groups, which are conveniently indexed by several sites on the web (but not Google). These sites are a lot like The Pirate Bay or TorrentSpy in that they're where you search for your downloads, not where you actually pull the data.

These indexing sites make things a LOT easier for you, because you don't have to wade through a single group's millions of postings and piece them together yourself. The indexing sites create special instruction files called NZBs (sort of like a .torrent file) that you can then pop into a client of your choice so all the work is done for you.
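If you're curious what's inside one, an NZB is just XML. Here's a small Python sketch that dumps the structure; "example.nzb" is a placeholder for any NZB saved from an indexing site, and I'm ignoring the XML namespace so the sketch works no matter how the file declares it.

    import xml.etree.ElementTree as ET

    # For each file, an NZB lists the newsgroups it was posted to and the
    # Message-ID of every article segment, so a client can fetch the
    # segments directly and stitch the file back together.
    def localname(tag):
        return tag.rsplit("}", 1)[-1]   # drop the XML namespace prefix

    tree = ET.parse("example.nzb")
    for elem in tree.iter():
        if localname(elem.tag) == "file":
            print("File:", elem.get("subject"))
        elif localname(elem.tag) == "group":
            print("  group:", elem.text)
        elif localname(elem.tag) == "segment":
            print("  segment #%s, %s bytes: <%s>"
                  % (elem.get("number"), elem.get("bytes"), elem.text))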

My current favorite, and pretty much the Internet's best, is Newzbin.com. Newzbin has a community of editors who put together groups of posts, so you can be sure a post is complete and will work properly when you download it. They also have a commenting system for the posts, so you can pick up feedback from other people who downloaded the post. Newzbin has kind of a funky payment plan where you pay once every 8 weeks, and it's only 2 Euros (around $2.55 USD). Trust me, it's definitely worth it compared to any of the free NZB sites out there.

Downloading Files

So, you got yourself an account at a Usenet host and an indexing site; now how will you pull all that sweet nectarine data? It depends on your OS, but there are great solutions for each.

Windows: NZB-O-Matic Plus, or NOMP, is an open-source NZB client for Windows built on the .NET Framework. It's easy to configure and it's free.
OS X: Unison (by Panic) isn't free, but it is pretty damn awesome. It has a slick UI and great NZB support. If you would rather go the free, OSS route, read on for the Linux/OS X solution.
Linux / OS X: hellanzb is a free, open-source command-line NZB client written in Python that is hella cool (I'm so clever). Check out their site for a huge list of features. You could also use it on Windows, but I'd highly recommend NOMP instead.

I'll leave the details of downloading files with each of these clients to their documentation; in any case, it's really straightforward once you have an NZB in hand.

Just be sure to set the number of concurrent connections (or "threads") to the maximum your provider allows; Newshosting allows 8. Also, if you have issues connecting to your provider, your host will usually publish alternate port numbers you can connect on instead.
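For the curious, here's roughly what your client is doing on each of those connections, sketched with Python's standard nntplib. The hostname, credentials, group, and Message-ID are all placeholders; a real client runs several of these connections in parallel, one per "thread".

    from nntplib import NNTP_SSL

    # Connect to the provider, select a group, and pull an article body
    # by Message-ID -- a client repeats this for every segment in the
    # NZB, across however many connections you've configured.
    server = NNTP_SSL("news.example.com", 563,
                      user="username", password="password")

    resp, count, first, last, name = server.group("alt.binaries.sounds.mp3")
    print("%s carries roughly %d articles" % (name, count))

    resp, info = server.body("<message-id-from-nzb@example.com>")
    print("got %d encoded lines" % len(info.lines))

    server.quit()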

At this point, you either have the files you were looking for, or you have a bunch of files like "r00, r01, r02" or ".001, .002, .003". These are pieces of a split RAR archive, which you can open with any number of decompression utilities (on Windows, 7-Zip; on OS X, UnrarX; on Linux, unrar).
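If you'd rather script the extraction, it's a one-liner around unrar. "release.rar" is a placeholder name; unrar walks the split volumes on its own once you hand it the first piece.

    import subprocess

    # unrar's "x" command extracts with full paths and automatically
    # continues through the .r00, .r01, ... volumes of a split archive.
    subprocess.run(["unrar", "x", "release.rar"], check=True)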

Usually, when you pull data from Usenet with a verified NZB (like one from Newzbin), its integrity will be pretty solid, but sometimes you'll be missing a piece or some pieces will arrive incomplete. Luckily, we have PAR to help us out.

PAR Files

Most posts, especially large ones, bundle some special parity files, which are downloaded first. Depending on the size of the post, there may be only the initial PAR file (*.PAR), or there may be several (*.PAR, *.P01, *.P02, etc.). These files let you rebuild lost data, so long as you have enough of the other pieces. It's a lot like magic, and if you don't believe in magic, you should read the Wikipedia article on the subject.
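If you want the gist without the Wikipedia trip: PAR recovery is Reed-Solomon coding, and the simplest case (a single recovery block) behaves exactly like XOR parity. This toy Python demo rebuilds a lost piece from the survivors plus the parity block:

    # XOR the data blocks together to make a parity block (the "PAR
    # file"); if any single block later goes missing, XOR-ing the
    # survivors with the parity block rebuilds it.
    def xor_blocks(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    blocks = [b"META", b"LLIC", b"A!!!"]       # three equal-sized pieces
    parity = blocks[0]
    for block in blocks[1:]:
        parity = xor_blocks(parity, block)

    lost = blocks.pop(1)                       # a segment goes missing
    rebuilt = parity
    for block in blocks:
        rebuilt = xor_blocks(rebuilt, block)

    assert rebuilt == lost                     # b"LLIC" is back
    print("recovered:", rebuilt)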

There are several programs out there to help you fix broken posts using PAR files. Here are my favorites:

Windows: QuickPar is free, fully featured, and has plenty of helpful information on its website.
OS X/Linux: par2cmdline is a free command-line tool to fix that broken data (and, if you want, create PAR files of your own); there's a minimal verify-and-repair sketch after this list. However, hellanzb, mentioned above, already handles PAR files all by itself. Protip: just use that.
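Here's the promised sketch. Note that par2cmdline speaks the newer PAR2 format (*.par2 files, which you'll often see instead of the old *.PAR naming); "post.par2" is a placeholder for whatever parity file came with your download.

    import subprocess

    # par2cmdline: "verify" checks the download, "repair" rebuilds any
    # damaged or missing pieces from the recovery blocks.
    check = subprocess.run(["par2", "verify", "post.par2"])
    if check.returncode != 0:                  # nonzero = damage found
        subprocess.run(["par2", "repair", "post.par2"], check=True)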

Basic Walkthrough

Let's say I'm looking for a Metallica album, and I haven't signed up for anything yet. I would:
  1. Get a Newshosting unlimited account
  2. Sign up for Newzbin, give myself premium credit, and set my account preferences to show search results for 45 days (thus the results will match the retention of my hosting account)
  3. Download NOMP for Windows and configure it with my Newshosting account details
  4. Search Newzbin for "Metallica"
  5. Click on the album I wanted
  6. Review the post to make sure it looks cool, then click "Get Message-IDs" at the top of the list of files
  7. Open the resulting NZB in NOMP, set a save directory and wait (and watch in awe as my internet connection maxes out and downloads as fast as it can)
  8. When the download finishes, I'd open the .PAR file in QuickPar and watch the integrity check to make sure the files are all OK. If not, I'd click the repair button and repair whatever is broken.
  9. Unrar the files, if necessary, by opening the first .rar file I see, or by opening the first numbered piece (.001) in 7-Zip and extracting (it will automatically traverse the pieces).
  10. Delete the nzb, PAR files, and RAR files, leaving only the juicy nectarine data
  11. Delete the shitty music I just downloaded for a demonstration

Now I know what you want to say next. "With the $18 you just spent, you could have gone out and bought a brand new copy of that awful album!" True, but now I have an entire month of unlimited access to download whatever I please at very high speeds (by repeating the last few steps in that list). My quest for shitty music was just to get started.