X-Git-Url: http://git.vanrenterghem.biz/git.ikiwiki.info.git/blobdiff_plain/523650d9c2b876152aaa9bc6c5d6363377ea7008..e71622d233660b5ba305d68e586d7d14ff2124e6:/doc/plugins/aggregate.mdwn?ds=sidebyside

diff --git a/doc/plugins/aggregate.mdwn b/doc/plugins/aggregate.mdwn
index e2efcd83f..b1db828d1 100644
--- a/doc/plugins/aggregate.mdwn
+++ b/doc/plugins/aggregate.mdwn
@@ -1,13 +1,21 @@
 [[!template id=plugin name=aggregate author="[[Joey]]"]]
-[[!tag type/useful]]
+[[!tag type/special-purpose]]
 
 This plugin allows content from other feeds to be aggregated into the
 wiki. To specify feeds to aggregate, use the
 [[ikiwiki/directive/aggregate]] [[ikiwiki/directive]].
 
-The [[meta]] and [[tag]] plugins are also recommended. Either the
-[[htmltidy]] or [[htmlbalance]] plugin is suggested, since feeds can easily
-contain html problems, some of which these plugins can fix.
+## requirements
+
+The [[meta]] and [[tag]] plugins are also recommended to be used with this
+one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
+feeds can easily contain html problems, some of which these plugins can fix.
+
+Installing the [[!cpan LWPx::ParanoidAgent]] Perl module is strongly
+recommended. The [[!cpan LWP]] module can also be used, but is susceptible
+to server-side request forgery.
+
+## triggering aggregation
 
 You will need to run ikiwiki periodically from a cron job, passing it the
 --aggregate parameter, to make it check for new posts. Here's an example
@@ -15,6 +23,11 @@ crontab entry:
 
 	*/15 * * * * ikiwiki --setup my.wiki --aggregate --refresh
 
+The plugin updates a file `.ikiwiki/aggregatetime` with the unix time stamp
+when the next aggregation run could occur. (The file may be empty, if no
+aggregation is required.) This can be integrated into more complex cron
+jobs or systems to trigger aggregation only when needed.
+
 Alternatively, you can allow `ikiwiki.cgi` to trigger the aggregation. You
 should only need this if for some reason you cannot use cron, and instead
 want to use a service such as [WebCron](http://webcron.org). To enable
@@ -39,3 +52,10 @@ plugin as well as aggregate itself, since feed entries will be stored as
 HTML, and as first-class wiki pages -- each one generates a separate HTML
 page in the output, and they can even be edited. This option is provided
 only for backwards compatability.
+
+## cookies
+
+The `cookiejar` option can be used to configure how [[!cpan LWP::UserAgent]]
+handles cookies. The default is to read them from a file
+`~/.ikiwiki/cookies`, which can be populated using standard perl cookie
+tools like [[!cpan HTTP::Cookies]].
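
The `.ikiwiki/aggregatetime` file documented above lends itself to a small
wrapper that cron can call more often than a fixed 15-minute schedule,
running aggregation only when a run is actually due. The following Perl
sketch is purely illustrative and not part of ikiwiki or of this patch; the
source directory `~/wiki` and the setup file `my.wiki` are assumptions, and
only the file's documented semantics (a unix time stamp for the next
possible run, possibly an empty file) come from the text added above.

    #!/usr/bin/perl
    # Hypothetical cron wrapper: aggregate only when .ikiwiki/aggregatetime
    # says a run is due. Adjust $srcdir and the setup file for your wiki.
    use strict;
    use warnings;

    my $srcdir = "$ENV{HOME}/wiki";            # assumed source directory
    my $timefile = "$srcdir/.ikiwiki/aggregatetime";

    if (-e $timefile) {
        open my $fh, "<", $timefile or die "$timefile: $!";
        my $line = <$fh>;
        close $fh;
        # An empty file means no aggregation is currently required.
        exit 0 unless defined $line && $line =~ /^(\d+)/;
        my $due = $1;
        # A time stamp in the future means it is too early to poll again.
        exit 0 if $due > time;
    }
    # Otherwise run aggregation; this branch also covers the case where the
    # file does not exist yet, for example before the first aggregation run.
    exec "ikiwiki", "--setup", "$ENV{HOME}/my.wiki", "--aggregate", "--refresh";

Installed somewhere on the `PATH`, such a wrapper could replace the plain
`ikiwiki` command in the example crontab entry and be scheduled more
frequently without polling remote feeds any more often than needed.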
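
Similarly, the new `cookies` section only states that the default jar at
`~/.ikiwiki/cookies` can be populated with `HTTP::Cookies`. A minimal
sketch of doing so, assuming a hypothetical feed host `feeds.example.com`
and a placeholder session cookie, might look like this:

    #!/usr/bin/perl
    # Sketch: store a cookie in the jar that aggregation reads by default.
    # The cookie name, value and domain below are placeholders.
    use strict;
    use warnings;
    use HTTP::Cookies;

    my $jar = HTTP::Cookies->new(
        file     => "$ENV{HOME}/.ikiwiki/cookies",
        autosave => 1,          # write the file out when $jar is destroyed
    );

    # set_cookie(version, key, value, path, domain, port, path_spec,
    #            secure, maxage, discard)
    $jar->set_cookie(0, "session", "EXAMPLE-TOKEN", "/",
        "feeds.example.com", undef, 1, 0, 60*60*24*30, 0);

If a different jar is preferred, the `cookiejar` option in the setup file
can point at it; a hash of `LWP::UserAgent` `cookie_jar` arguments such as
`cookiejar => {file => "/path/to/cookies"}` is one plausible form, though
the exact setup syntax is an assumption here rather than something this
patch specifies.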