Downloading


The Headers of Posts Explained

Some of the more common header abbreviations are defined below:

  a or an     anal
  ani         animation/cartoon
  arj or 0x   compressed archive (see Extracting Archives)
  bj          blow job
  cs          cum shot
  dil         dildo
  dp          double penetration
  f           female
  fac         facial cum shot
  fst         fisting
  gb          gang bang
  gs          golden shower
  lz or lez   lesbian
  m           male
  o           oral
  pw          password protected (see Extracting Archives)
  rar or r00  compressed archive (see Extracting Archives)
  reg         regular male/female sex
  rp          repost
  so          strap-on dildo
  tf          tit fuck
  ts          transsexual
  ws          water sports
  zip         compressed archive (see Extracting Archives)

And the codecs:
Some of the more common codec abbreviations used in ABME are defined in CODECS.



Decoding Off-line 

Even if your newsreader decodes both UUE and MIME base 64, there are times you may need to decode a post off-line.  For instance, it's possible to save encoded parts of the same file from two different servers and then use your decoder program to assemble them (concatenate).

1. If the file is UUencoded (every line of the body begins with "M"), save it as .UUE; if it's MIME base 64, save it as .MME.
2. If you have all parts of a post on one server, retrieve all message bodies first. Make sure the parts are sorted in the correct sequence, then select them all together. In (Free)Agent, click on File | Save Messages As... and choose "no dividers" and "bodies only, no headers."
3. If the parts are on more than one server, or if your newsreader doesn't have a command for saving the parts already joined, then download and save each one individually with temporary sequential filenames, such as file01.mme, file02.mme, etc. Choose "no headers" if you have that option. (A script for decoding such saves follows this list.)
In Agent and Free Agent, you can download the earliest parts joined as described above and save them joined, then change servers and APPEND the missing part to the file. Change back to your primary server and append the remaining parts. It's a tricky process, though. See Missing Parts.
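If you'd rather script that last step, here is a minimal Python sketch, assuming the parts were saved under the hypothetical names from step 3 (file01.mme, file02.mme, ...) as bodies only, no headers. It concatenates the saved parts and decodes the base64 payload:

    # decode_offline.py - a minimal sketch, not a robust decoder.
    import base64
    import glob
    import re

    # Join the saved part files in filename order.
    lines = []
    for name in sorted(glob.glob("file*.mme")):
        with open(name, "r", errors="ignore") as f:
            lines.extend(f.read().splitlines())

    # Keep only lines that look like base64 payload; header and MIME
    # boundary lines contain characters outside the base64 alphabet.
    payload = "".join(ln.strip() for ln in lines
                      if re.fullmatch(r"[A-Za-z0-9+/=]+", ln.strip()))

    with open("output.bin", "wb") as out:
        out.write(base64.b64decode(payload))

For a .UUE save, a decoder such as uudeview (or Python's uu module) does the same job.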

If your newsreader tells you there's no attachment, the post is probably MIME base 64 encoded (see Decoding Off-line).
If your downloaded clip plays for only a few seconds and then stops, odds are you decoded only part 1 of a multipart file. To fix the error, highlight and download all parts of the file. Count them and make sure every part is there. Then, after you have the message bodies downloaded, right-click and look for "combine and decode" or something similar.

If you do get the "no attachment" message, check whether the poster is using yEnc, the new encoding format entering play, which saves both space and transfer time - for more info check here.
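A yEnc post is easy to recognize in a raw message body by its marker lines. The keywords below come from the yEnc specification; the filename, sizes, and CRC are made-up values:

    =ybegin part=1 total=3 line=128 size=1457920 name=clip.mpg
    =ypart begin=1 end=512000
    (lines of 8-bit encoded data)
    =yend size=512000 part=1 pcrc32=ae052b48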


How to stop and resume a long download 
You can trick a newsreader that supports off-line reading:
1. Split the file into its parts.
2. Sort by subject.
3. Retrieve the bodies.
4. Stop and resume any time you want; the download only has to begin again at the current part.
5. When all bodies have been retrieved from the server, click on Save.
There are additional benefits to downloading piecemeal:
(1) You don't have to restart the download from the beginning if your connection cuts out.
(2) If an intermediate part has been corrupted, its actual line count will usually be noticeably smaller or larger than the other parts', and often this is only apparent after the message body has been retrieved. If this happens, ask for a repost of that part and fill it in using the methods described in Missing Parts. (A quick way to compare line counts by script is sketched below.)
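Here is that line-count check as a rough Python sketch, again assuming the hypothetical file01.mme-style names. The final part of a post is normally shorter, so a flag on the last file is usually harmless:

    # check_parts.py - flag saved parts whose line counts stray from the median.
    import glob
    import statistics

    counts = {}
    for name in sorted(glob.glob("file*.mme")):
        with open(name, "r", errors="ignore") as f:
            counts[name] = sum(1 for _ in f)

    median = statistics.median(counts.values())
    for name, n in counts.items():
        mark = "  <-- check this part" if abs(n - median) > 0.1 * median else ""
        print(name + ": " + str(n) + " lines" + mark)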


Missing Parts 

With the advent of PAR recovery files, and especially the newer Par2, missing segments no longer have to be the problem they once were. If you're looking for a fix rather than an in-depth explanation, or need how-to information, see our usenet help section:

For how to use Par1

For how to use Par2

Missing parts in UUE posts using Agent

Missing parts in yEnc posts using Agent

Missing parts using Xnews

Missing Parts discussed
The more spam or garbage messages in circulation, the farther back the binaries are in the queue.  The number of posts actually matters as much as the size.  Each message must go through the same channels as binary parts do, and only so many will go through at once.  Consider the handling required to process 1,000 2 line posts as opposed to a single 2,000 line binary. Reduce the spam and the chat, and you will see more complete posts.  This is why we're always hammering people to use the discussion groups.

Multipart posts in smaller pieces generally mean greater distribution with fewer problems. Once in a while you'll see an aborted post, where the upload timed out and the originator didn't know how to resume it or cancel the unusable parts - a sure sign of a newbie. Most likely, they're just using a lousy news server. Since message routing on the Internet is dynamic, pieces 4 and 5 of an 8-part file may have been routed through many more machines than the rest of the pieces. Articles sometimes land on a particularly bogged-down news server between you and the originator, where they are spooled (saved on disk) until the server has enough resources to do its part of the broadcasting. Sometimes, by the time those parts get moving again, your server has already expired posts from that date, and you miss out.
When faced with fragmented multipart posts, it's always a good idea to wait a couple of days for all the pieces to show up. Wait until all the parts that did get through have fallen off your server before asking for a repost.
While smaller files are obviously more easily digested by many more people (and are therefore the better way to post), good servers get 100-part posts just fine.

When posts from certain people are always incomplete, there are several factors involved:

1) Take a look at the path in the headers of those posts. Chances are, you'll see a long list of servers between yours at the beginning and the poster's at the end. Every hop is a chance for a part to be dropped. The question, then, is which one of you is too far away from the Usenet backbone. A post is always complete on the originator's server.

2) Some people post in chunks that are too big for many news servers to handle, and some servers have dropped packet size limits way down in an attempt to cope with increased Usenet volume. In 1996, 15,000 lines per part was common. In 1999, anything over 7,500 lines (or even smaller) is often stopped in its tracks. If you only see part 0/n of the post and the last part, which is often much smaller than the intermediates, then the post has obviously run into packet size limits someplace. Check the largest individual part size you see in ABME. If it's fairly small, and if you see a lot of posts missing their entire middles, you can safely assume your server is filtering out larger parts.

If you've given the post a couple of days to propagate and have all but a few pieces, most contributors will cheerfully re-up the missing ones for you. Post your request in ABMED, Attention: <poster's nym>.

Saving Parts from More than One Server
(1) If you use only ONE copy of Agent or Free Agent
a. Retrieve the bodies individually for the parts you have.
b. Click on Edit | Select All, and hit K (keep command, padlock icon).
c. Now all the ABME headers from your primary server are marked kept and you'll be able to tell them apart from any you get via the secondary server.
d. Change the NNTP server ID and get ABME headers again.
e. If the missing parts fill in, retrieve their message bodies, then save your completed file.
To clean up the mixed-source headers:
a. Click on Edit | Select All, and hit Delete.
b. Any extraneous headers (not locked) from the second server will be deleted.
c. Then remove the locks from the rest,
d. restore your server ID, and you'll be back to normal again.
(2) With TWO copies of Agent / Free Agent, thanks to bj the dj. 

This definitely works with Forte's Agent. I haven't used Free Agent for some years but it should be possible using the inbox instead of a folder. To combine multipart messages with the parts on different servers:

1. In your main instance of Agent, make a new Folder to hold the parts of your multi-part file.
2. Download the parts available on your main news server, and then drag'n'drop them to the Folder that you've created.
3. Download the other parts from the alternative server(s).
4. Select all of those parts (i.e., click on the first and shift-click on the last).
5. Select File>Save Messages As,
a. then select the destination and choose a suitable filename - something ending in .TXT helps.
b. Select File Format>UNIX message file
c. Select Save raw (Unformatted Message)
d. Select Header fields to include>All fields
e. Save the file.
6. Go back to your main instance of Agent and select File>Import Messages. Navigate to the file that you just saved in the other instance of Agent.
a. Select Destination Folders>Put all messages in folder, and then
b. select the folder that you made in step 1 above.
c. Open the file; all the parts will be imported into your selected folder.
7. Go to your 're-combine' folder and sort on Subject.
8. Select all parts of the article that you want to recombine.
a. Select Message>Join and then
b. choose whether to Join, Launch or save the recombined article; or
c. alternatively use the options available on the right mouse button.

How to Join the Parts When You Get Them All

How to join and save parts if the missing ones don't come through immediately, or if the missing parts are reposted for you:

1. With an online newsreader, you'll have to save each part individually to your hard disk and then decode the file off-line when it's complete.

2. With an off-line reader, the methods below for either Agent or Free Agent should work for you. The difference between the two, in this instance, is Agent's ability to display a single, joined header for multiparts, plus more sorting options.

Agent users:
1.  Split the incomplete message into individual parts.
2.  Select all the parts and hit K (keep command, padlock icon).
3.  Retrieve the bodies for those parts before they disappear from your server.
4.  Leave them right where they are, no saving, no decoding.
5. When the missing parts are reposted, split them (if there's more than one, they might be displayed joined at first) and retrieve the bodies.
6.  When you have the bodies for all the parts, sort the message list by Subject so the parts fall into correct numerical order.  Then:

(a) If the headers are identical, click on part 1 and hit "A" (save command).
(b) If the reposted parts have a different subject header from the original post (for example, "Attn: Sparky"), create a folder, move all the parts to it, and then select all of them by clicking and dragging. Otherwise, select them while they're still in the main ABME browser window by clicking on part 1, scrolling down to the last part, and shift-clicking.
(c) Find the new parts and add them to the selection with control clicks.
(d) Click on Message | Join Sections.
(e) Use the Up and Down buttons on the screen to put the new parts into correct number sequence with the rest.
(f) Click on Save.
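If you prefer to do the final join outside the newsreader, a script can order saved parts by the (nn/mm) marker before concatenating them. A rough Python sketch, assuming each part was saved to a file whose name still contains that marker:

    # join_parts.py - sort saved parts by their (nn/mm) marker, then join.
    import glob
    import re

    def part_number(name):
        m = re.search(r"\((\d+)\s*/\s*\d+\)", name)  # e.g. "clip (03/12).uue"
        return int(m.group(1)) if m else 0

    parts = sorted(glob.glob("*.uue"), key=part_number)  # hypothetical names
    with open("joined.uue", "w") as out:
        for p in parts:
            with open(p, "r", errors="ignore") as f:
                out.write(f.read())

The joined file can then be decoded off-line as described above.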

Sfv & Par Files 

Often when downloading you'll see either Par files (*.par and *.p01, *.p02, etc.) or Sfv files (*.sfv). Both are means of confirming the condition of the files they accompany. When these files are posted, make sure you download them as well. For how they're used, check the Viewing section.
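An SFV file is just a plain-text list of "filename CRC-32" pairs (lines starting with ";" are comments), so the check it performs is easy to illustrate. A minimal Python sketch; post.sfv is a made-up name:

    # check_sfv.py - verify each file's CRC-32 against its .sfv entry.
    import zlib

    def crc32_of(path):
        crc = 0
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                crc = zlib.crc32(chunk, crc)
        return crc & 0xFFFFFFFF

    with open("post.sfv", "r", errors="ignore") as sfv:
        for line in sfv:
            line = line.strip()
            if not line or line.startswith(";"):
                continue
            name, expected = line.rsplit(None, 1)
            ok = crc32_of(name) == int(expected, 16)
            print(name + ": " + ("OK" if ok else "BAD"))

Par files go further: besides detecting damage, their parity volumes can rebuild missing parts (see the usenet help section above).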


What NOT to Download 

If you can't play some of the files downloaded from ABME and you're on a Mac, hope for a friend with a Wintel computer to help you out. Some of the codecs favored by posters are proprietary formats (I263, for example) that appear to have no Mac versions. Have the Wintel friend see Converting Video Clips. Otherwise, check the 00/xx section of the post and confirm that you have the codec installed on your machine before downloading.

For Wintel users: check whether the codec is on your machine; if not, head to the links page to see if one's available. If it's not there either, check the d group for updated information and discussion. I promise any new codec will be a subject of conversation there.


Kill Filters 

The Agent help file is your best resource for learning how to use kill and watch filters, but here are some quick kill filter settings you might want to try. Most of these came from Bob Slack, with the rest filled in as they were posted in ABME. For the full, complete scoop on Agent and the most you can get from kill filters, check out jlbradley's site, found by hld3.
---------------------------------------------
Priority: 1000

Subject:(jpg|jpeg|gif|bmp|anime|{real media}|{real audio}|{test}|http|www|".exe"|gay|playboy|$|$$|$$$*) &! {.preview.*}

This filter kills any subject containing the listed strings (jpg, jpeg, gif, etc.) unless the header also includes the text string "preview". It's directed primarily at SPAM, as well as content I don't prefer (i.e., anime, playboy, etc.). If you wish to allow any of this content to pass, just remove it, along with the adjoining "or" symbol ( | ), from the syntax string. "Real Media", "Real Audio", "Anime", "Playboy", and "Gay" will be filtered unless "preview" appears in the header. Remove any of the ones you want to keep.
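For readers who think in regular expressions, here is a rough Python approximation of that first filter's logic. It is a sketch only - Agent's filter syntax and matching rules are its own:

    # killfilter_sketch.py - approximate the kill-unless-"preview" logic.
    import re

    KILL = re.compile(r"jpe?g|gif|bmp|anime|real media|real audio|test|"
                      r"http|www|\.exe|gay|playboy|\$+", re.IGNORECASE)
    PROTECT = re.compile(r"preview", re.IGNORECASE)

    def killed(subject):
        return bool(KILL.search(subject)) and not PROTECT.search(subject)

    print(killed("Great clip - preview.jpg (1/3)"))   # False: protected
    print(killed("MAKE $$$ FAST www.spam.example"))   # True: spam patterns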
---------------------------------------------
Priority: 900

Subject [*,2500] &! subject: (mpg|mpeg|avi|mov|asf|jpg|jpeg|zip|rar|{r[0-9]+}|preview)

Filter to kill messages less than 2500 lines that do not meet the listed criteria.  This filter is directed primarily at non-binary posts and some SPAM that slips past the first filter.
---------------------------------------------
Priority: 800

subject: ("re:"|"req:"|request) &! [50,*]

Filter to kill most of the incessant Re: and request posts. If a binary is being posted with "re:" or "req:" in the subject header and the 00/XX part or last part is less than 50 lines, they will get filtered.
---------------------------------------------
Priority: 700

Subject: (".rm"|"re:"|{^[^a-z"]}) &! (mpg|mpeg|avi|mov|asf|zip|rar|{r[0-9]+}|preview|%FAQ)

Filter to kill *.rm files (I'm not a fan of Real Media vid files), certain "re:" messages (most of those not killed by the last filter), and posts that begin with non-alpha characters (usually SPAM). The FAQs are protected.
---------------------------------------------
Priority: 800

subject: [*,100] and not author: izzy@pop.com

The easiest filter is by line count: kill every article with 100 lines or fewer. You can add other parameters to this as needed, too.
---------------------------------------------
Priority: 800

subject: {\.mpg\.[a-z0-9]+}

This kills master-split files (names like clip.mpg.001, which the pattern above matches).
---------------------------------------------
Priority: 800

subject: [*,100] and not (virus* or alert* or trojan* or ABME* or FAQ*)

If you want to keep the 0/n files, add the following before "virus":
*: "0\/" or "00\/" or "000\/" or

---------------------------------------------
Priority: 800

Subject: {[^ ]\.r[0-9][0-9]}
Subject: {[^ ]\.s[0-9][0-9]}
Subject: {[^ ]\.rar}

This kills RAR archives and their split volumes (.rar, .r00, .s00, etc.).

Unix 
If you have just a basic knowledge of Unix shell commands, Usenet access is very straightforward. From the shell prompt, just launch your newsreader of choice (more about this later), subscribe to the desired groups, and start reading. It's that simple. But the more you know about Unix operations, the more sophisticated your news reading can become (more on this later, too).

Newsreaders:
So which newsreader? A very good question. Unix newsreaders come in several varieties. Most Unix neophytes cut their teeth on Tin, a very straightforward, menu-driven newsreader. But you'll find Tin to be a tremendous resource hog on your Linux box, and its binary decoding capability is limited (articles must be individually tagged in order). It does have support for multiple news servers and authentication, though (these are a *must* these days for any newsreader).

Pan - A new addition to the Linux open source scene. flunkboy calls this one a kick-ass graphical threaded newsreader that supports multiple servers and authentication.

Trn (threaded read news) - Offers sophisticated newsgroup navigation and filtering options (the filter capability alone is worth the price of admission), multiple news server and authentication support, robust and clever binary decoding (both UU encoding and MIME; by the way, the UU means Unix-to-Unix), article scoring, mouse-in-xterm support (if you really have a strong desire for rodent-clutching), and a host of other features too numerous to mention here.

ubh - Koos pointed us to the Usenet Binary Harvester, a GPL'ed Perl console application which automatically discovers, downloads, and decodes single-part and multi-part Usenet binaries. It takes a bit of Unix knowledge, but since it uses the same .newsrc format as Unix newsreaders, it is quite easy to add newsgroups and start harvesting them. It automatically assembles multi-part binaries, provides searching via Perl regular expression syntax, and offers a pre-selection capability whereby you can interactively choose which binaries to download. It uses the .newsrc file both to control which groups to process and to keep track of articles already processed, and it handles uuencoded binaries and MIME attachments. It runs as a set of Perl scripts.
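The shared .newsrc format is simply one group per line: a subscribed group's name ends in ":" (an unsubscribed one in "!"), followed by the ranges of article numbers already seen. A made-up example:

    alt.binaries.multimedia.erotica: 1-52341
    alt.binaries.multimedia.erotica.d: 1-10022,10025
    alt.test! 1-99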

Other noteworthy readers are nn, slrn, gnus, and xnews. All have differing levels of user friendliness and capability. My advice would be to spend some time trying them out and make your own decision. I almost forgot to mention that *all* of the above newsreaders come with complete source code; you can tweak and customize as you see fit. For Linux users, all the major distributions (Slackware, Red Hat, Debian, SuSE) will let you install one or more of these right out of the box!

Offline newsreading and decoding? Not a problem. One popular program is suck. Just specify the newsgroup, and it makes the NNRP connection and downloads all available articles (or any subset thereof) for the group. Once you have the articles, just run them through unpost (another fine utility, complete with source code) to get them decoded. Be advised, however, that many ISPs and news services have restrictions on running suck or pulling suck-type news feeds (yes, they'll be able to tell).
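If you're curious what a suck-style fetch amounts to, Python's standard nntplib module can pull article bodies for off-line decoding. A minimal sketch; the server name and article range are placeholders:

    # fetch_group.py - grab the last few article bodies from one group.
    import nntplib

    with nntplib.NNTP("news.example.com") as nn:
        resp, count, first, last, name = nn.group("alt.binaries.multimedia.erotica")
        for num in range(max(first, last - 4), last + 1):
            try:
                with open(str(num) + ".msg", "wb") as f:
                    nn.body(num, file=f)   # save raw body for later decoding
            except nntplib.NNTPError:
                pass   # article expired or missing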



This Web Site and its contents - ©2000-2004 abmefaq consortium