X-Git-Url: http://git.ozo.com/?a=blobdiff_plain;ds=sidebyside;f=README;h=fc974004632f701aebc32b82d3f184e72aaae798;hb=92b9dc13794b99cb12906dee06274fa75a29959a;hp=513ec0dead3063181803a3403aca05204a6b9044;hpb=45c8adfaf1b1938da0c04b4954628efff911546a;p=rawdog%2F.git

diff --git a/README b/README
index 513ec0d..fc97400 100644
--- a/README
+++ b/README
@@ -6,6 +6,11 @@ feed parser. It's just an aggregator; it's not a weblog authoring tool, nor
 is it an NNTP gateway, outliner, mailserver or anything else. rawdog
 probably only runs on Unix-like systems.
 
+rawdog requires Python 2.2 or later. rawdog itself doesn't need any
+additional modules to be installed, but it uses distutils for
+installation, so if you're on a Debian system you'll need to install the
+"python-dev" package first.
+
 rawdog reads articles from a number of feeds and writes out a single
 HTML file, based on a template either provided by the user or generated
 by rawdog, containing the latest articles it's seen. It uses the ETags
@@ -34,8 +39,6 @@ to perform -- for instance, "rawdog --update --write" tells it to do the
 "--update" action, then the "--write" action. The actions supported are
 as follows:
 
-"--help": Provide a brief summary of all the options rawdog supports.
-
 "--update" (or "-u"): Fetch data from the feeds and store it. This
 could take some time if you've got lots of feeds.
 
@@ -64,6 +67,17 @@ own template: do "rawdog -t >~/.rawdog/mytemplate" with "template default"
 in your config file, and you'll get a copy of the default template to
 edit.
 
+There are also the following options which may only be supplied once
+(they're read before any of the actions are performed):
+
+"--help": Provide a brief summary of all the options rawdog supports,
+and exit.
+
+"--dir DIR" (or "-d DIR"), where DIR is a directory: Use DIR instead of
+the $HOME/.rawdog directory. This is useful if you want to have two or
+more completely different rawdog setups with different sets of feeds;
+just create a directory for each.
+
 You will want to run "rawdog -uw" periodically to fetch data and write
 the output file. The easiest way to do this is to add a crontab entry
 that looks something like this:
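
As an illustrative aside (not part of the patch above): the options this
change documents might be used along the following lines. The directory
name is hypothetical, and "python setup.py install" is simply the usual
distutils invocation implied by the installation note.

    # install rawdog with distutils, from the unpacked source tree
    python setup.py install

    # fetch feeds and rewrite the output page, keeping state in an
    # alternate directory instead of $HOME/.rawdog
    rawdog --dir /home/you/planet-rawdog --update --write

    # short-option equivalent
    rawdog -d /home/you/planet-rawdog -uw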