Summary:
This project is a fully functional NNTP client, written in a single C# assembly.
Important notice:
Since I released my little program on the innerweb, I have seen that many sites offering programs for download have
copied it. Most of them effectively hide where the official location of the program is.
So, if you did NOT download my program from here:
http://sourceforge.net/projects/nntpgolverd/files/
You should destroy your copy and download it from my official location. This ensures no one has been messing with my
program, and it ensures you get the latest copy. I have seen many humorous errors on those sites. CNET, for example, seems
to advertise that my program has an install/uninstall, which for sure is NOT the case (it's copy/paste deploy-ware).
Even worse, many sites offer download mirrors that keep older versions and, worse yet, archives that have been
changed by someone (I found a really cool Chinese mod, for example). Some sites claim it'll run on Windows 98 & NT; that was really
funny to read! The best part was finding an 'NNTPGolverd-Crack.exe', which I think I should not be using.
I really don't mind anyone copying my stuff, nor do I mind anyone making money advertising it, but please: IF you want to advertise
the program to make some money on your website, link ONLY to the official source and NOT to any locally stored copies. That
way the intended audience gets the best experience with the program.
If you like/dislike the software, feel free to leave me a review on the above site; there is no other way of contacting me,
other than plainly hacking the SourceForge databases.
Why is this program here:
There are many NNTP clients available, but none of them (at least in native C#) seem to have matured enough for my day-to-day
use (which basically is looking for and downloading binaries on Usenet). I needed a client that understands the yEnc protocol
(which I have to admit is a crappy spec - more on that later), that can use multiple connections to speed up
downloads, and - what I missed most - many downloads consist of multiple files (where one file consists of multiple posts).
None of the freeware/open-source clients I found could combine these multiple files into a single download item so the complete
download can be retrieved with a single click. Inconvenient when you need to select 100+ files manually to get a single download.
In addition, it would be nice if the program could use nzb files rather than header downloads.
Features:
- Simple and fast to use
- Supports multiple connections on multiple usenet servers
- Intended for binary yEnc posts (both download and upload), but also works for 'classic' discussion groups.
- Can filter based on posters, as well as on the human-readability of the post
- Combines multiple file posts into a single download item
- Possible to download individual files too
- Auto par/rar built in
- No install/uninstall needed; can run as (portable) software from any location
- Can detect & ignore passworded posts
- Full NZB support built in; can be set as a shell extension for nzb's
- Global config settings can be overridden at newsgroup level
- Runs on any Windows system (32/64-bit) supporting .NET 2.0
- Single binary built in C#/.NET 2.0; can run on Mono as well
I wanted the program to be of use to anyone, so the rar file I'm offering contains not only the source code, but also the compiled
release-mode binary. The program is a single assembly, so one can just copy it anywhere and use it. No need to run any installer
etc. It is compiled 'Any CPU', so it also works natively on 64-bit systems.
A few notes on the application:
- It will use the logged-on Windows user's 'Local Application Data' and 'My Documents' locations to store its config, index data and resulting downloads.
- It is written using VS2005 and the .NET 2.0 runtime, so be sure to have the runtime installed if you wish to use the program.
- It doesn't contain any unmanaged code, so it should upgrade easily to VS2008, 2010 or whatever.
- It demonstrates a number of techniques, such as thread pooling, TCP/IP communication, event generation/handling, and it even has a custom user control (a small sketch follows these notes). I also like binary file usage (which can be a lot quicker than using databases in specific cases). So, it should be of interest to programmers eager to learn those things.
- It is provided as is. I am not looking for feedback, error reports etc. I made it for myself, and just hope others can do something with it.
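For instance, here is a minimal, self-contained sketch of the thread-pooling pattern mentioned above. It is illustrative only and not taken from the program's source; the job strings are made up:

    using System;
    using System.Threading;

    class PoolDemo
    {
        static int pending;

        static void Main()
        {
            // Hypothetical work items; a real worker would fetch headers/bodies.
            string[] jobs = { "alt.test:1-10000", "alt.test:10001-20000" };
            pending = jobs.Length;
            ManualResetEvent done = new ManualResetEvent(false);
            foreach (string job in jobs)
            {
                ThreadPool.QueueUserWorkItem(delegate(object state)
                {
                    Console.WriteLine("working on " + (string)state);
                    if (Interlocked.Decrement(ref pending) == 0)
                        done.Set();              // last job signals completion
                }, job);
            }
            done.WaitOne();                      // block until all queued jobs finished
        }
    }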
Getting started:
- ensure you have .NET 2.0 runtime installed
- copy the bin/release/nntpgolverd.exe anywhere you like.
- create a shortcut to NNTPGolverd.exe on your desktop if you want (the built-in icon can be used)
- optionally you may wish to associate NNTPGolverd with the .nzb extension. How this is done depends on
your exact Windows version; using Google it should be simple to find out how. Once set up,
you can start NNTPGolverd with any nzb on your disk by double-clicking the .nzb file.
- start it up.
- It will now ask you to provide primary usenet server details; click the 'Primary' server already in place, and then the 'Edit' button.
Be sure to use the 'Validate Connection' button to ensure the connection data is correct. Now you are almost ready for use!
- Note that it IS important to check the maximum connections value; if you experience 'Connection limit reached' errors in
the logging while downloading/updating, the value is probably higher than what your provider allows. With a bit of
luck the message will also contain what maximum is allowed in your case.
- Next it will offer to refresh the usenet-group listing. Allow it to do so, and now you can browse through the groups to select one or more binary groups you like.
- 'Add' the groups you want to download from (right-clicks on the various grids will bring up the options)
- select one or more added groups, and select 'Update' to retrieve the headers from your usenet server.
- optionally: drag/drop a nice .jpg picture into the program's screen; this will display on the main screen!
- now you can select a group's download files. Right-click a download, and you can download the selection, find out more about its contents (Analyze),
and/or use the innerweb to see more about it (Google: generic info, IMDB: movie data and Binsearch: nzb's and other related posts).
Auto Par2/Rar processing:
As of 19 Feb, the program will automatically attempt to repair any PAR2 archive and to unpack any RAR archive downloaded. The main
screen has a new 'led' which indicates when this is happening. The processing is triggered when a download in the queue has been
processed.
To make it work, you'll need to copy some executables into the same folder you run NNTPGolverd from. I have included the items in
the bin/release folder in my download.
It will NOT use more than 1 thread. The reason is mainly that these disk-intensive programs will fight each other when they
run concurrently. Try to unpack 2 large rars on a single drive at the same time, and time that against unpacking them one by one. You'll
notice the difference (unless you have a really fast drive to write to).
Par2:
In the past I used an old command-line tool which would repair the files fine, but took ages to complete. So I checked again, and to
my happiness someone has decided to build and release a very fast and reliable PAR2 program. It can be found at:
http://paulhoule.com/phpar2/index.php
As said, it's quick, although not as clever as the QuickPar GUI, but hey: I need a command-line tool and am totally happy with the hard work
this guy must have been doing. It is almost crazy to create a native PAR2 class in C#, due to the vast amount of calculations needed, so
for now I'll stick to the external program.
Note: if the program isn't there, Par2 processing will simply be skipped. If you place PHPAR2.EXE in the program folder, NNTPGolverd will
use it.
Rar:
As with par2, I used an old (freeware) rar command-line tool in the past. It was slow, but did the job. So, I was very surprised to find
a fantastic command-line replacement in the 7-Zip project on SourceForge. It works really fast and is open source. Find it here:
http://www.7-zip.org/
I would have liked my own C# class for this, but hey: way too much work, and it would probably never be any better than this fast
program.
Note: if the program isn't there, Rar unpacking will be skipped. If you install 7-Zip (you can remove it later) and copy the 7z.exe and
7z.dll files to my program folder, NNTPGolverd will use them.
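For the curious, here is a minimal sketch of how such external tools can be chained from C#. The flags shown follow the usual par2cmdline and 7-Zip command-line conventions; none of this is the program's actual code:

    using System.Diagnostics;
    using System.IO;

    class PostProcess
    {
        // Run repair and unpack one by one; skip a step silently when its
        // helper executable is missing, as described above.
        public static void RepairAndUnpack(string folder, string par2File, string rarFile)
        {
            if (par2File != null && File.Exists("phpar2.exe"))
                RunTool("phpar2.exe", "r \"" + par2File + "\"");
            if (rarFile != null && File.Exists("7z.exe"))
                RunTool("7z.exe", "x -y -o\"" + folder + "\" \"" + rarFile + "\"");
        }

        static void RunTool(string exe, string args)
        {
            ProcessStartInfo psi = new ProcessStartInfo(exe, args);
            psi.UseShellExecute = false;
            psi.CreateNoWindow = true;
            using (Process p = Process.Start(psi))
                p.WaitForExit();   // deliberately serialized: disk-bound tools fight each other
        }
    }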
Logging:
Basic logging is now included. It will report a bit about activities, and how the rar/par processing works out. I haven't yet included
all possible required logging (I have to keep some tasks for the future, rite?).
There are a lot of old-school binary-file operations in the program; why not use a DBMS?
I get this question once in a while. The answer: tried it, been there, and then discarded it again. The plain story is that the overhead
involved in using a DBMS (SQLite, SQL Server and the likes) makes most operations way too slow. In addition, using a DBMS would cripple
the highly portable nature of the program. Using a DBMS would be very sensible when there is a rock-solid server on your (fast) network,
with lots of fast disks/memory in it, running a nice SQL Server instance.
So, I decided to use the old-school approach to make the speed of operation acceptable. Sure, the code is less easy to understand, and it is more
difficult to change stuff in the indexing system, but..... it's way faster than anything else I tried. At some point I even considered
giving up C#'s native file-access classes and using unmanaged kernel calls instead (they are a lot faster than the native .NET ones), but at
the moment the program is as fast as or faster than the commercial alternatives, and using unmanaged code isn't a good plan for portability.
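To give a flavor of the old-school approach, here is a minimal sketch of fixed-size binary index records with seek-based random access. The record layout is invented for the example and is not the program's real index format:

    using System;
    using System.IO;

    // Hypothetical fixed-size record: article number plus offset into the raw file.
    struct IndexEntry
    {
        public long ArticleId;
        public long RawOffset;
        public const int Size = 16;   // 2 x 8 bytes
    }

    class BinaryIndex
    {
        // Append one entry to the end of the index file.
        public static void Append(string path, IndexEntry e)
        {
            using (FileStream fs = new FileStream(path, FileMode.Append, FileAccess.Write))
            using (BinaryWriter w = new BinaryWriter(fs))
            {
                w.Write(e.ArticleId);
                w.Write(e.RawOffset);
            }
        }

        // Random access by record number: one seek, one small read, no table scan.
        public static IndexEntry Read(string path, long recordNo)
        {
            using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
            using (BinaryReader r = new BinaryReader(fs))
            {
                fs.Seek(recordNo * IndexEntry.Size, SeekOrigin.Begin);
                IndexEntry e;
                e.ArticleId = r.ReadInt64();
                e.RawOffset = r.ReadInt64();
                return e;
            }
        }
    }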
Multiple Servers:
Since the 10 Mar release, the software can work with multiple servers. There must be 1 primary server, which provides the
newsgroups and headers. The primary server is also the only server used when posting to usenet. Additional servers can be set up in
3 categories:
- BackupDLAll: these servers will always help downloading files.
- BackupDLOnFail: these servers will help only when other servers are unable to retrieve files.
This is useful for server connections with limited bandwidth and/or data limits.
- Inactive: servers in config that are switched off (they can be changed to Backup servers when needed). This allows you to switch off
a server where you hit the download limits, and revive it once the next limit period starts.
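A minimal sketch of how these categories can drive server selection; the type and method names are made up for the example and do not come from the actual source:

    using System.Collections.Generic;

    enum ServerRole { Primary, BackupDLAll, BackupDLOnFail, Inactive }

    class ServerPicker
    {
        // Primary and BackupDLAll always help; BackupDLOnFail only joins
        // after a failed attempt; Inactive servers are never used.
        public static List<string> Pick(IDictionary<string, ServerRole> servers,
                                        bool previousAttemptFailed)
        {
            List<string> result = new List<string>();
            foreach (KeyValuePair<string, ServerRole> s in servers)
            {
                if (s.Value == ServerRole.Primary || s.Value == ServerRole.BackupDLAll)
                    result.Add(s.Key);
                else if (s.Value == ServerRole.BackupDLOnFail && previousAttemptFailed)
                    result.Add(s.Key);
            }
            return result;
        }
    }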
Fully functional Client Software!
As of 14 March 2013, this program supports basically everything a usenet user requires. It can now read and write (aka post)
messages in both text mode and binary mode.
Posting binaries: a short notice
When posting binaries, it is ALWAYS a good plan to give the posting a meaningful description of what is being
uploaded. It is ALSO a very good plan to *NOT* post passworded data, but if you have to, at least be so polite as to mention it
in the description and/or the post name. It is also a good plan to always compress your posts prior to sending them off, and
adding some redundancy files (par2) is encouraged. Also be aware that when you post WITHOUT compressing, the files must be
collected in a single folder prior to sending them away, so the file(s) you put in it MUST have unique names! If you let the
program compress, this won't matter. Be aware that the program will use about 2 times the disk space of the input file(s) when
building/sending posts. You can simply send away your entire C: drive contents in a few clicks, but you do need the space....
yEnc encoding in NNTPGolverd
The yEnc spec has a line-size identifier, which is typically an old-school 128 or 254 bytes (thanks to crappy Mac users).
In the spec the encoder should cut each line at the line size, but the spec itself also says one can exceed the line-size if it
ends with an escape character, so the line-size is really a 'loose end'.
In NNTPGolverd posts, the line-size is actually the number of bytes one would get after decoding a line. The encoded lines
themselves may be larger than (but close to) the line-size in the header. I can imagine that really strict and/or old decoders
may fail on this. If so, you should switch to using NNTPGolverd, which doesn't really care how long a line is (well, as
long as it doesn't exceed 4 KB).
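To illustrate the decoding side, here is a minimal line decoder based on the public yEnc spec (each byte is shifted +42 modulo 256; critical bytes are written as '=' plus an extra +64). It is a sketch, not the program's own decoder:

    using System.Collections.Generic;

    class YencDecode
    {
        // Decode one yEnc-encoded line (without the trailing CRLF).
        public static byte[] DecodeLine(byte[] line)
        {
            List<byte> output = new List<byte>(line.Length);
            for (int i = 0; i < line.Length; i++)
            {
                int b = line[i];
                if (b == '=')                    // escape: next byte carries an extra +64
                {
                    i++;
                    b = (line[i] - 64) & 0xFF;
                }
                output.Add((byte)((b - 42) & 0xFF));
            }
            return output.ToArray();
        }
    }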
Enjoy coding/using this stuff!
Kind regards, GolverdCore
Update info:
[3 Apr 2013]
- Several improvements in EditPanels; fixed some unhandled exceptions.
Edit panels can now select text by mouse, and use right-click context menus to
insert/delete contents. Note: paste doesn't work (yet), but you can use Ctrl-V in
text blocks.
- Messages composed are now exactly as entered in the EditPanel; a picture/attachment now
has options (to set encoding & picture type). The program used to always take the text, and then
add each attachment. With the EditPanel, you decide!
- TextMode view now also allows a Hex view, so you can take a look at the raw data of any
message.
[1 Apr 2013]
- Decided to abandon .rar project distribution completely, and now use 7z.
- From the beginning I wasn't happy about the text-mode edit boxes; my new project, MsgEditor, has a readme in
which I explain why the standard controls are simply not good enough if you need to enable a program to
compose a message containing both text and picture elements. The new control I created, called 'EditPanel',
is more or less a complete word processor, and although at this point it isn't 100% perfect, it is good enough
for now. In any case, using it makes the screen display a message very fast, even when the message has many lines
of misplaced uu/yenc/mime-coded stuff in it. The whole plan is that when you compose a message, it should
look (almost) exactly like it will look when received (until now the pictures wouldn't show, etc.).
In addition to that, it is now possible to save any decoded item from any received message to disk by clicking on
the item.
[28 Mar 2013]
- Post identity, user agent & verbose headers can now be set in config for permanent use. In textmode, the identity
options can easily be changed; you can save them just for the runtime of the textmode view, or decide to save them permanently in config.
- With the new ability to create mime messages, there were some errors of course, and this helped make the
decoding of incoming messages a lot more robust.
[27 Mar 2013]
- Ensured message headers are properly encoded per rfc2047 if needed. However, it still won't wrap headers
which are longer than the usual 80 bytes.
- (almost) Completed a full mime-sending engine; messages you send will now be transformed into mime if
the data you send needs that. In addition to preserving (encoded) quoted reply text, this also adds
the ability to send attachments (any type) within a message.
- Added support for generic yenc encoding to the mime processor; note that yenc is NOT within the mime
rfc(s), so it isn't reliable when users with other software receive the data; however, in the worst case a
receiver should be able to save the yenc-coded data to file and process it later with yenc decoding tools.
In any case: NNTPGolverd will directly decode the data and present it as well as possible, and it will
save 33% of bandwidth.
- Textmode view now has a load-all button to retrieve all messages in a given group. It still uses a
single connection, so it can take a while, but once you have all messages, you can search/browse them
all offline.
[25 Mar 2013]
- Fix: found a well-hidden bug that could cause certain articles to be unretrievable (and also cause server reconnects).
- Fix: several issues with message header parsing/mime view (textmode only).
- Textmode view: left panels will now show the message in mime (when available in the message) and
headers are properly decoded as the rfc's specify. However: the articles on the left still
contain undecoded subject/poster data. When in mime-view, one can toggle back to plain text view as well.
[23 Mar 2013]
- Textmode view: messages can now be marked/unmarked as notify messages. When the group is loaded,
the screen will mark all new and unread messages related to the notify message. This way it's
easier to find out if someone replied to any message(s) of your interest.
- Fix for duplicate newsgroupids (could cause crash at program start, with the 21 Mar version).
- Mime-decoder now in complete form; only a few specialties left to do. Also added a proper message-header
parser to make detecting specific headers easier.
[21 Mar 2013]
- Download lists now keep track of DL status before, during and after download. In addition, it is no longer possible
to start a download that is already in the queue (or in progress). The data is committed to disk, and is also
valid after stopping/starting the program again. Now at least you can tell what you selected to download, and avoid
downloading items twice.
[20 Mar 2013]
- When downloading message bodies and the max.operation.retry limit is reached, the thread will now re-schedule to another
server if any is available (otherwise it will skip the operation as it used to do).
- Added a very basic mime-message viewer to TextMode. For now it only decodes text & jpeg images. Will be extended soon.
[18 Mar 2013]
- Added UU-decoding methods to the download process. However, it will only decode data packed in a single, complete
article; some programs just split these encoded chunks across several articles, and it is then (very) hard to tell
which part goes where. In any case: uuencoding isn't used a lot, so it's unlikely this is ever a real problem.
- Improved message-body checking when downloading, to find out quickly whether we are dealing with an encoded message or not.
[15 Mar 2013]
- Fix: Threads could fail when text-only messages are selected for binary download.
- Speed improvement! When you send downloads to the queue, the software has to retrieve all associated article IDs
and message-ids from the cached index. With large groups, this can take quite some time. The same process is used when
one analyses a download file's file parts, and is just as slow. On a very fast machine I have, I had 20.000.000 articles
in a newsgroup. Selecting a single download took 72 seconds. So, you would see a busy cursor for 72 seconds before any
of the connection leds would light up, starting the communication. I changed the function a bit here and there, and now the
same action takes about 4 seconds!
- TextMode view now has search options so you can be more exact about what to search for.
- TextMode: right-click on an article now has an option to download all articles at and below the item. First I had a Download All
button, but that takes ages. With this option you can download a complete discussion with one click and then read it
(or search in it) offline.
- Fix: when downloads are re-scheduled to the high-prio queue, the download% could reach above 100%.
[14 Mar 2013]
- Fixed: textmode posting subject would always return 'invalid characters' (the post would still happen, no panic).
- Added local post cache for text messages. So, you will now have access to any posted text message, even when
your server headers have not yet been updated (the post would otherwise disappear after closing the textmode view).
- Ensured the user cannot place invalid text in any post-field when posting.
- Config expanded: Max headers for textmode viewing can be set, and the Post-Identity.
- Config screen a bit smarter: when the primary server isn't set up yet, the screen will show a message and
place the primary server in edit mode; makes life a bit easier for end users.
- Textmode message-compose screen can now 'quote' a reply; it's very usual in usenet discussions to copy the
full conversation at the start of your reply, prefixed by '>'.
- Statistics screen created; shows a lot of the current queue activities as well as 'overall' (since program start). This
screen should be helpful if you wish to maximize throughput. Shows both download and upload statistics.
- Added wizard screen to build & post binary files; right-click on a selected group now shows a 'Post Binary' option.
Individual file(s) & complete folders (with recursion) can be added to a single post with a few clicks. The wizard will
optionally pack the complete post into (split) 7z (7-Zip format) archives, and can optionally add par2 redundancy files
to the post. The actual posting is then forwarded to the queue, so the user can continue doing other things.
- Queue-order priority is now: 1. newsgroup (retrieve/update) operations, 2. article retrieval, 3. article upload.
- Message subject parsing changed; when the poster sends subjects containing quote-enclosed filename(s), the filename
will be accepted as long as it has some extension. This way the limitation to rar/mp3/jpg-only downloads does
not apply to these posts; any file can be received.
- Improved download-group data; first of all, IF a valid date stamp was on the article, the group now gets that date rather than
the date when the group item was created. Secondly, it used to be a lot of work to recover the original subject of a
download-group: you needed to analyze, select a file and then list its file parts to get the subject. Now the original subject
is appended to the download-group grid.
- Auto rar support extended with zip & 7z archive files, so they will be unpacked automatically.
Unfortunately I cannot test passwording on 7z split archives: the full archive MUST be downloaded first. Shameful.
- Overall change: max line sizes now set to 4096 bytes (was 1024). Some headers are just way too big....
[13 Mar 2013]
- Added idle disconnect to TextMode view; connection times out after approx. 2 minutes; connection is re-established
automatically when needed. A blinking led was added to indicate the connection status.
- TextMode view now displays no more than the config-specified max headers. This way the group can be displayed even when
there is a large number of headers.
- Right click menu in 'All Groups' now has 'Search' option to limit the list to specific item(s).
[12 Mar 2013]
- New 'TextMode' view for newsgroups; shows traditional discussions in a treeview. Messages can be viewed in plain text mode (first I
used mickeysoft's RichTextBox, but abandoned it again when I found out it can be tricky/dangerous with carefully crafted text
messages). The view allows searches (even within complete articles when they have been read earlier).
- In textmode, a user can also reply to posts and/or start a new top-post conversation.
todo:
* show last 100000 entries from header cache in textmode.
* implement yenc encoding/posting for binaries, batch mode.
* review subject parsing (improve when files are properly quoted)
* unpack for 7z files would be cool.
[11 Mar 2013]
- Fix for 430 responses on the BODY command (430 = article not on server). These would typically cause a 'bad connection' crash,
which isn't really fair, and a full reconnect cycle. Now the handling is correct: the article is properly passed
on to any other servers when they are available, and only when all servers have tried the article is it discarded as unretrievable.
- Added statistics processing to the engine; when a thread ends, it will report its stats to the logging. On the main
screen a 'Stats' button will show you the current statistics for all threads.
- Another payserver problem found; after logging in, it may return 482 (although the user credentials are correct). 482
normally means 'Authentication failed', but the server I happened to use uses this code for other purposes as well.
In my case it was trying to say the connection limit was reached, so I changed the code to deal with that. I'm wondering how
other payservers do this..... I also think that once I have consumed my limit this month, it'll probably return another variant of
482. We'll see about that later.
[10 Mar 2013]
- Finally found a proper payserver to do some more testing.
- Fix: some payservers do not respond correctly when connecting; they fail to mention to the client
that authentication is required. So, after establishing the connection, the software now first issues a
MODE READER command (which it should do anyway) and tests its response for authentication (a sketch of this
probe follows this entry). In addition, the fail-to-connect message in the config screen will now also return the connect response to the user.
- Fix: downloads placed in the queue after one or more newsgroup updates could fail (argument error
in the logging) and crash one or more threads in the process. Now the downloads will always select the
correct group object prior to downloading data.
- Implemented basic color skinning, and changed the entire program into a 'Black Edition'.
- Finally, the software can work with multiple servers! There needs to be 1 primary server, which is used to get
newsgroup lists & headers from. Additional servers can be added to the config; there are 3 types:
* BackupDLAll: This server type will always help downloading any queued articles.
* BackupDLOnFail: This server type will only be invoked once other servers are unable to return an article.
* Inactive: This server type does nothing, but can be re-activated manually if needed.
Existing configs will automatically be converted (thus creating your primary server entry).
- Still to do:
* check payserver behaviour when my 25GB/month data limit is reached, and see if this can be used
to maximize connections until the limit is reached.
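A minimal sketch of the MODE READER probe described above, with response codes taken from the NNTP RFCs (3977/4643); the rest is illustrative, not the program's actual code:

    using System.IO;
    using System.Net.Sockets;

    class NntpProbe
    {
        // Connect, issue MODE READER, and authenticate when the server asks.
        public static void Connect(string host, string user, string pass)
        {
            TcpClient client = new TcpClient(host, 119);
            StreamReader reader = new StreamReader(client.GetStream());
            StreamWriter writer = new StreamWriter(client.GetStream());
            writer.AutoFlush = true;

            reader.ReadLine();                        // greeting (200/201)
            writer.WriteLine("MODE READER");
            string resp = reader.ReadLine();
            if (resp != null && resp.StartsWith("480"))      // 480: auth required
            {
                writer.WriteLine("AUTHINFO USER " + user);
                resp = reader.ReadLine();
                if (resp != null && resp.StartsWith("381"))  // 381: password needed
                {
                    writer.WriteLine("AUTHINFO PASS " + pass);
                    resp = reader.ReadLine();                // expect 281 on success
                }
            }
        }
    }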
[8 Mar 2013]
- Downloads now sort by dl, file & article. In this way, a download should process file by file rather than in whatever
order the posts were placed on the server. This creates a bit of extra overhead when sending download(s) to the queue, so it
can be switched off in config (default: switched on).
- Small speed-up for testing passworded rars; in a download containing a single file, the test is simply forwarded to
the normal rar/par process. This eliminates the need to copy, test & report the single file. It saves time when you typically
download lots of small single-file rars.
- Added a few new extensions: JPG, MP3, CBR and EPUB.
- Prepared for alternate color schemes.
- Implemented a busy-cursor here and there, so the user knows the software is doing something.
[7 Mar 2013]
- Today is usenetiquette day! I hate: passworded downloads, and downloads with subject lines that make it impossible
to know what you're downloading at all.
For the latter I created the human-readability test yesterday, so these can be filtered away; that leaves us with
the password problem. So, 2 new config switches exist to control this, and the screen has a new led indicator that
will flash on as soon as a passworded download is found.
In config, you can set the option to only test for passwords (which makes the led work), and you can set the option
to make the software skip all remaining file(s) of that download once it finds out a file is passworded (other downloads
will just continue). Note that in order to test for password protection, the software must first download at least 1
file with extension R00 or RAR. This file is then copied to another location and tested (a sketch of one way to do such
a test follows this entry). So, the testing does introduce
a bit of overhead. You can switch it off completely if you don't need this. I know for sure lots of people will be
happy with this; nothing is as disappointing as waiting out an entire download only to find out you cannot unpack the
data. NNTPGolverd will not only alert you now, but it will also save your bandwidth for useful things!
- Small downloads-grid change; added 2 columns, one containing the condensed file size (e.g. 2.7MB, not 2786763) and
one with the not-human-readable score the dl has (its crap value, basically).
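The readme does not say how the password test itself is implemented; here is one plausible sketch, assuming the bundled 7z.exe is used for the test (its '-slt' technical listing reports an 'Encrypted' flag for protected archives):

    using System.Diagnostics;

    class PasswordCheck
    {
        // Returns true when 7z's technical listing marks entries as encrypted.
        // Illustrative only; the real test may work differently.
        public static bool LooksPassworded(string rarFile)
        {
            ProcessStartInfo psi = new ProcessStartInfo("7z.exe", "l -slt \"" + rarFile + "\"");
            psi.UseShellExecute = false;
            psi.RedirectStandardOutput = true;
            psi.CreateNoWindow = true;
            using (Process p = Process.Start(psi))
            {
                string output = p.StandardOutput.ReadToEnd();
                p.WaitForExit();
                return output.Contains("Encrypted = +");
            }
        }
    }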
[6 Mar 2013]
- Oops fix: found out the refresh-newsgroup function was no longer working after I changed the queueing system.
Stupid mistake, solved now.
- Fix: kill list was accidentally wiped when the config was changed later; now it saves perfectly.
- Added 'analyze' to the selected-newsgroups menu. This produces a treeview by poster, with an analysis of the
human-readability of the posts by this poster. Many times I get a million posts with just some random text/numbers
as a 'description'. Not very helpful, these posts. So, I typically kill such posters so I do not have to go through
these useless posts. The analyze screen makes this a lot easier; it features an auto kill/unkill setting by percentage,
and individual items can also be changed. Closing the screen saves the changes to your config.
- Added per-group config settings. Here, one can set different max values for old & new article headers. In addition,
it is possible to specify behaviour when showing the downloads, both for black-listed (killed) & white-listed posters' downloads.
This makes it possible to blacklist a poster, but still show the download if its human-readability percentage is good enough. The reverse is also possible:
a white-listed poster's post can be removed from display if its human-readability percentage is too low.
[3 Mar 2013]
- Again reworked the article archive: changed from a single large raw file to little chunk files. Objective: this speeds up article-cache cleanups,
and makes it possible for the indexing process to skip old records, saving time when updating a newsgroup that has stored articles
from earlier updates.
The raw file used to be a single large text file. It can become huge (about 3GB for 10.000.000 article headers or so) and
operations on huge files (cleanup mostly) take a long time. With older filesystems (FAT-based, still found on a lot of USB keys
today), it would also have to deal with the 2 GB file-size limits.
Note: the new archiving system stores data in a different location than earlier versions. You will need to update all your groups after
installing this version, which will trigger conversion of your existing headers (if any).
- Change to nzb file reading. I found an nzb file that had 2 identical segments for a single file. I guess this means the software
can use either of these segments for the download; when one fails, the other could be tried. But it violates the segment-number key,
so I basically changed the code to ignore duplicate segments altogether.
- Found a question on the innerweb from someone wanting to run a newsreader directly from a USB key, without storing/installing anything
on the targeted computer at all. My program was designed to be portable, but the config/headers and downloaded data are normally stored
in Windows user folders. So, some mechanism must be in place that enables someone to tell the program to store the stuff
in its program folder instead. That way it'll be 100% portable.
If you place a file called 'portable.txt' in the application's folder (e.g. on the USB key you run the program from), the
program will keep all its stored data within the program folder (a sketch of the decision follows this entry). Be aware: wherever you place the program, you MUST have write access
to the location, or terrible things will happen......
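A minimal sketch of the portable-mode decision (the file name 'portable.txt' is from the readme; the rest is illustrative):

    using System;
    using System.IO;
    using System.Reflection;

    class StorageRoot
    {
        // If 'portable.txt' sits next to the executable, keep everything there;
        // otherwise fall back to the per-user Local Application Data folder.
        public static string Get()
        {
            string appDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
            if (File.Exists(Path.Combine(appDir, "portable.txt")))
                return appDir;
            return Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
        }
    }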
[2 Mar 2013]
- All newsreaders I've seen so far that can use multiple server connections seem to retrieve article headers (XOVER command)
over a single connection. That is also what my program USED to do until now. Now it's different: the retrieval is done in
chunks of max. 10000 articles (see the sketch after this entry). This lets the program benefit from multiple connections and thus speeds up the retrieval
(measured about 20% faster retrieval on bad, slow internet connections). Another benefit is that the progress indicator now
works a lot better (it was based on retrieval of single article bodies, which do not take ages, but a single XOVER can
easily take 30 minutes - the indicator would stay at 0%, and after 30 minutes it would pop to 100%).
The best part is that an abort or connection failure during header retrieval no longer results in a
non-indexed set of articles. So, bullet proof!
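A minimal sketch of the chunking idea (illustrative; only the 10000-article chunk size comes from the text above):

    using System;
    using System.Collections.Generic;

    class XoverChunks
    {
        // Split a header range into chunks of at most 'chunkSize' articles, so
        // several connections can fetch them in parallel and an abort only
        // loses the chunk that was in flight.
        public static List<string> Split(long first, long last, long chunkSize)
        {
            List<string> ranges = new List<string>();
            for (long lo = first; lo <= last; lo += chunkSize)
            {
                long hi = Math.Min(lo + chunkSize - 1, last);
                ranges.Add("XOVER " + lo + "-" + hi);
            }
            return ranges;
        }
    }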
[1 Mar 2013]
- All software I have built in the past was capable of showing a cat picture somewhere on the screen; so
NNTPGolverd can now do that too. Without any picture, it will show a message at the place where the 'Downloads' grid
used to be. With a picture, the picture will be shown. It can be any picture you want, as long as it's a valid .jpg
file. Just drag the .jpg to the main screen and it will be picked up at program start. For me, I have a local 'cat.jpg'
in my folder, so my program is compliant with all other programs I've built :-)
- Added a message to the main screen telling users where to get the latest & original program, since many sites have started
hosting older (and even modded) copies.
- Added feedback during header-transfers (which can take a long time). It will now report the completion% of the newsgroup(s) being
updated.
- Updated the general text above in this readme to bring it up to date.
- Resizing of the main screen at runtime now works as expected; the other screens are already resizable.
[27 Feb 2013]
- Added search option to search in the displayed download list.
- Added a little scrolling log-messages control to the main screen.
- Revised the connection threads so they can reliably report that a specific file from a specific download has completed.
With that, it can now also reliably report that a specific download has completed. File completion messages are listed on
the main screen 'messages' control.
- Auto-rar/par is now triggered as soon as any of the queued downloads has completed, rather than the software waiting until
ALL downloads complete. In addition, the results of rar/par are now listed on the 'messages' control.
- Fix: when deleting a group, the group files were not cleared from disk; now they are.
Todo's:
- Optimize after having received new article headers from the server (the entire raw article cache is now re-indexed, but it
would be a lot more efficient to only process the newly added items instead).
Investigated it: the addition to existing indices & the cleanup involved doesn't actually make the process much faster.
It does save some time in specific scenarios, but given that the process takes about 4 seconds for 300000 articles, and retrieval
of the XOVER takes much longer than that, I will put this on hold.
- Finally see if we can reliably determine when a newsgroup has to be blocked because it is being updated, in multi-client use. Newsgroups are
now blocked during the entire download action.
- Some way to reset the decoder-errors after a download completes.
[26 Feb 2013]
- Investigated certain downloads that result in many yenc decoder failures (# bytes returned not matching, and CRC32 incorrect).
Since I doubted the yenc implementation, I verified the failures against the official yenc spec, and found that there
are no errors in the implementation. In addition, I confirmed the download itself was flawless. So, these files must have
been badly encoded before being transmitted. This can explain why there is always exactly 1 byte missing after decoding.
It doesn't explain why other parts of the same post are correct. These posts came from 'easyusenetreader'. About 17% of the
messages are incorrect, so even a par2 repair will typically fail.
So I could do no more than add some stats to the main screen. It will now tell how many parts have been downloaded so far,
and it will show the decoder-error ratio against what was downloaded. This way, one can tell during a download whether it
makes sense to continue the operation or abort it altogether.
[25 Feb 2013]
- Added the usual 'Blue Screen' icon to the application. Icon is 50 kb, but adds 100 kb to the .exe!
- The files in a selected download item can now be inspected ('Analyze' in the context menu). So, the download can be inspected a
bit better (manually) before one starts it. From here, you can detail-inspect the file parts (aka usenet articles) the file
contains, and that will show all the usenet article details (such as the original subject line). Individual files can now also
be downloaded.
- Provided a status indication for both downloads and individual files. Note that the status is based on the trivial yEnc subject,
so it isn't always accurate, but at least it provides some indication of whether to expect a half-complete file or not. It has 3 states now:
1. Complete: the number of parts described by the subject matches the number of parts found in the group cache.
2. OverComplete: the cache contains more parts than expected from the subject.
3. InComplete: the cache contains fewer parts than expected.
If you are using this program, be sure to update all your group headers for this to function correctly; otherwise all part counts
will stay at 0, and all statuses will probably be 'OverComplete'.
- Usage of max old/new articles tuned a bit better; the code now attempts to preserve as much data as possible. Basically an NG at any point
in time should contain a max of (old+new) articles.
- Fix: (Oops!) after a timeout, a reconnect must run; however, the caller would continue using the failed connection object, and thus
any operation after the reconnect would fail.
- Next todo is to improve performance when initiating download(s). With large groups it can take quite a while.
- Next todo is to provide better on-screen feedback on how a queued download ended. You now need to check the logging completely
to determine how a given download has ended.
- Next todo is to revise the dl queue itself, so it can be stopped/resumed at will (hopefully even when the program is closed/restarted).
[24 Feb 2013]
- Again improved the logging, and introduced a logging level setting in config.
- Also improved failure handling; first of all, read/write timeouts can now be set in config (rather than the hardcoded 2 seconds
used before). The server reconnect schedule is now more adaptive: it will throttle the wait time between attempts (a backoff sketch follows this entry).
- An option to (re)start the download queue is now there. The option can only be used when at least 1 thread is still alive, and it will
revive all threads needed. The option is there to allow a resume when a download operation fails the configured number of times.
The failed operation won't be tried again; the thread will try the next one.
- Added a number of web-lookup options to the downloads grid. You can now go to Google, IMDB and Binsearch with a right-click and
retrieve any details that may be found for the download's group name.
- Code now orders the nzb-file list before launching a download. Some nzb's have the files in reverse order; I prefer to
download part1 first, and so on.....
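A minimal sketch of such a throttled reconnect wait; the numbers are invented for the example and are not the program's settings:

    using System;
    using System.Threading;

    class ReconnectBackoff
    {
        // Wait a bit longer after each failed attempt, up to a cap.
        public static void WaitBeforeAttempt(int attempt)
        {
            int delayMs = Math.Min(2000 * (1 << Math.Min(attempt, 5)), 60000);
            Thread.Sleep(delayMs);
        }
    }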
[22 Feb 2013]
- Launch of AutoRar/Par now only done if there is data to process.
- Disabled viewing downloads of a newsgroup while the newsgroup is being updated. It isn't perfect yet, since it will keep
saying the newsgroup is in use, even when it may already have been downloaded; currently the lock is only released when ALL
running downloads have completed. For now this prevents loading incomplete/in-use files for the group (and the associated
crashes one can expect).
- Article-header retrieval improved; already retrieved articles are now preserved (up to a max specified in config).
Also, the number of articles retrieved in one XOVER can now be limited (for commercial usenet servers that keep a large number
of headers).
- Added set of new config options to config screen.
- Until now, communication failures at different levels could create big problems. The worst case was endless looping of one or more
threads. The code is now changed to handle these failures a lot better. Article retrieval will use a disconnect/reconnect scheme
in this case, for example. Half-retrieved articles will be retried a number of times. The objective is that even with a broken
innerweb connection, the threads should not hang but eventually be ended. A new set of config parameters is in place to specify
the behaviour.
[21 Feb 2013]
- Added command-line support for .nzb files. The program can now be used to 'open with' nzb files. You must configure this file
association in Windows yourself. Note: double-clicking an nzb when the program is already loaded will load the nzb into
the already open program.
- Prevented running multiple instances of the NNTPGolverd program (which can create lots of trouble). Only a single (the first)
instance will run at any time (a sketch of the technique follows this entry).
- Fix: the raw-article file position index had wrong pointers when unicode data was received from the XOVER command. The result was that
it was no longer possible to start a download using received headers (nzb would work okay). The process is now purely binary,
ensuring the index is correct and making the reading/writing faster as well.
If you have this problem with any pre-21 Feb version, load the application and be sure to 'Update' all newsgroups you have added.
This will recreate the index and avoid the problem.
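A minimal sketch of the usual single-instance technique in .NET (a named mutex; the mutex name shown is made up for the example):

    using System;
    using System.Threading;

    class SingleInstance
    {
        static void Main()
        {
            bool createdNew;
            // A named mutex is visible across processes; only the first
            // instance gets to create it.
            using (Mutex mutex = new Mutex(true, "NNTPGolverdSingleInstance", out createdNew))
            {
                if (!createdNew)
                {
                    Console.WriteLine("Already running.");
                    return;
                }
                // ... run the application here ...
            }
        }
    }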
[20 Feb 2013]
- When the main form runs background jobs to process on-screen selections, the user was able to initiate other actions
at the same time (which would not always work out fine). These actions are now blocked to prevent crashes.
- Article-body retrieval by article ID can fall back to retrieval by message-id if needed (a sketch follows this entry). This however introduces
overhead during download. In addition, the queueing of article IDs/message-ids to download was slower than I hoped.
This queueing is now about 200% faster AND it adds the message-ids beforehand, avoiding the fallback overhead.
- Added 'KillPoster' function to permanently remove downloads from idiots (todo: create screen where these posters can be unKilled).
- Downloads list now in byte-size, descending order.
- Fix: the fallback to retrieve an article by message-id when the article ID failed was broken; fixed.
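A minimal sketch of the fallback, with response codes from the NNTP RFCs (423 = no such article number, 430 = no such article); the helper itself is invented for the example:

    using System.IO;

    class BodyFetch
    {
        // Try the cheap article number first, then fall back to the message-id.
        public static string RequestBody(StreamWriter writer, StreamReader reader,
                                         long articleNo, string messageId)
        {
            writer.WriteLine("BODY " + articleNo);
            string resp = reader.ReadLine();
            if (resp != null && (resp.StartsWith("423") || resp.StartsWith("430")))
            {
                writer.WriteLine("BODY " + messageId);   // e.g. "<part1@poster>"
                resp = reader.ReadLine();
            }
            return resp;   // a "222" response means the body lines follow
        }
    }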
[19 Feb 2013]
- Added automatic Par2 & rar processing. This will work when you place some extra (excellent!) executables in the same
location NNTPGolverd.exe runs from.
- Added initial logging module (logs to 'local app' folder)
- Fix: yenc metadata retrieval was 7-bit (so the filename could be incorrect), now 8-bit.
- Fix: also changed article-header retrieval to 8-bit.
- Made body reading more efficient by skipping string encoding during yenc body reading.
[18 Feb 2013]
- Revised yenc subject parser (more robust, easier to maintain and better results)
- Added full queueing support: files/newsgroups can now be added even when some other download is already running.
- Added button to abort any running operations gracefully. Use with care: it will result in incomplete data.
- Added drag/drop support for nzb file(s). Drag these to the form & hit the 'NZB Download' button to add the downloads to the queue.
[17 Feb 2013]
- Initial publication of the NNTPGolverd.exe client.