diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn
index e2efcd83f..b1db828d1 100644
--- a/doc/plugins/aggregate.mdwn
+++ b/doc/plugins/aggregate.mdwn
@@ -1,13 +1,21 @@
 [[!template id=plugin name=aggregate author="[[Joey]]"]]
-[[!tag type/useful]]
+[[!tag type/special-purpose]]
 
 This plugin allows content from other feeds to be aggregated into the
 wiki. To specify feeds to aggregate, use the
 [[ikiwiki/directive/aggregate]] [[ikiwiki/directive]].
 
-The [[meta]] and [[tag]] plugins are also recommended. Either the
-[[htmltidy]] or [[htmlbalance]] plugin is suggested, since feeds can easily
-contain html problems, some of which these plugins can fix.
+## requirements
+
+The [[meta]] and [[tag]] plugins are also recommended for use with this
+one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
+feeds can easily contain html problems, some of which these plugins can fix.
+
+Installing the [[!cpan LWPx::ParanoidAgent]] Perl module is strongly
+recommended. The [[!cpan LWP]] module can also be used, but is susceptible
+to server-side request forgery.
+
+## triggering aggregation
 
 You will need to run ikiwiki periodically from a cron job, passing it the
 --aggregate parameter, to make it check for new posts. Here's an example
@@ -15,6 +23,11 @@ crontab entry:
 
 	*/15 * * * * ikiwiki --setup my.wiki --aggregate --refresh
 
+The plugin updates a file `.ikiwiki/aggregatetime` with the Unix time stamp
+at which the next aggregation run could occur. (The file may be empty if no
+aggregation is required.) This can be integrated into more complex cron
+jobs or systems to trigger aggregation only when needed.
+
 Alternatively, you can allow `ikiwiki.cgi` to trigger the aggregation. You
 should only need this if for some reason you cannot use cron, and instead
 want to use a service such as [WebCron](http://webcron.org). To enable
@@ -39,3 +52,10 @@ plugin as well as aggregate itself, since feed entries will be stored as HTML,
 and as first-class wiki pages -- each one generates a separate HTML page
 in the output, and they can even be edited. This option is provided only
 for backwards compatibility.
+
+## cookies
+
+The `cookiejar` option can be used to configure how [[!cpan LWP::UserAgent]]
+handles cookies. The default is to read them from a file
+`~/.ikiwiki/cookies`, which can be populated using standard Perl cookie
+tools like [[!cpan HTTP::Cookies]].
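
The `.ikiwiki/aggregatetime` file introduced above can be wired into a
smarter cron job. A minimal sketch, assuming the wiki's srcdir is `~/wiki`
and the setup file is the `my.wiki` from the crontab example (both are
placeholders to adjust for your own wiki):

    #!/usr/bin/perl
    # Sketch: only run aggregation once .ikiwiki/aggregatetime says a run
    # is due. $srcdir and the my.wiki setup file are assumptions.
    use strict;
    use warnings;

    my $srcdir   = "$ENV{HOME}/wiki";
    my $timefile = "$srcdir/.ikiwiki/aggregatetime";

    my $due;
    if (open my $fh, '<', $timefile) {
        my $stamp = <$fh>;
        close $fh;
        if (defined $stamp && $stamp =~ /(\d+)/) {
            # Run once the recorded Unix time stamp has been reached.
            $due = time() >= $1;
        }
        else {
            # An empty file means no aggregation is currently required.
            $due = 0;
        }
    }
    else {
        # No state file yet; just try an aggregation run.
        $due = 1;
    }

    exec 'ikiwiki', '--setup', 'my.wiki', '--aggregate', '--refresh' if $due;

Such a wrapper can be scheduled from cron more frequently than the
quarter-hourly example above, since it exits without doing anything until
the recorded time stamp has passed.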
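
For the new cookies section, here is a sketch of seeding `~/.ikiwiki/cookies`
with [[!cpan HTTP::Cookies]], the kind of standard Perl cookie tool the text
refers to. The cookie name, value and domain are placeholders, not anything
ikiwiki itself requires; the script only writes the jar file that the default
`cookiejar` setting points at:

    #!/usr/bin/perl
    # Sketch: add a cookie (e.g. a login session some feed needs) to the
    # jar that the aggregate plugin reads by default.
    use strict;
    use warnings;
    use HTTP::Cookies;

    my $jar = HTTP::Cookies->new(
        file     => "$ENV{HOME}/.ikiwiki/cookies",
        autosave => 1,   # written back when $jar goes out of scope
    );

    # set_cookie($version, $key, $val, $path, $domain, $port,
    #            $path_spec, $secure, $maxage, $discard)
    $jar->set_cookie(0, 'sessionid', 'PLACEHOLDER', '/', '.example.com',
        undef, 1, 0, 30 * 24 * 60 * 60, 0);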