X-Git-Url: http://git.vanrenterghem.biz/git.ikiwiki.info.git/blobdiff_plain/ab75c0323bc584203a2b4a507c2a2012523354d0..4729ff0812c1f3d06d98524e2fec232d3bf90513:/doc/todo/aggregation.mdwn?ds=sidebyside

diff --git a/doc/todo/aggregation.mdwn b/doc/todo/aggregation.mdwn
index 7d765f9e9..371d20c12 100644
--- a/doc/todo/aggregation.mdwn
+++ b/doc/todo/aggregation.mdwn
@@ -1,24 +1,3 @@
-Here's a scary idea.. A plugin that can aggregate feeds from other
-locations. Presumably there would need to be a cron job to build the wiki
-periodically, and each time it's built any new items would be turned into
-pages etc. There might also need to be a way to expire old items, unless
-you wanted to keep them forever.
+* Still need to support feed expiry.
 
-This would allow ikiwiki to work as a kind of a planet, or at least a
-poor-man's news aggregator.
-
-* XML::Feed has a very nice interface, may require valid feeds though.
-* How to store GUIDs? Maybe as meta tags on pages, although that would need
-  caching of such metadata somewhere.
-* How to configure which feeds to pull, how often, and where to put the
-  pulled entries? One way would be command line/config file, but I think
-  better would be to use preprocessor directives in a wiki page, probably
-  the same page that inlines all the pages together.
-* Where to store when a feed was last pulled?
-
-So I need:
-
-* A way to store info from the preprocessor directives about what pages
-  to pull and expiry.
-* A way to store info on last pull time, guids, etc.
-* Switch for a mode that a) pulls b) expires old c) rebuilds wiki (for cron)
+[[todo/done]]
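
The XML::Feed interface mentioned in the removed notes maps fairly directly onto that design. As an illustration only (a minimal sketch, not the code ikiwiki's aggregate plugin actually uses), the polling loop could look roughly like this; `$feedurl`, `%seen_guids` and the `printf` output are placeholder names standing in for the directive-supplied configuration and the cached GUID/pull-time state discussed in the notes:

    #!/usr/bin/perl
    # Illustrative sketch only: poll one feed with XML::Feed and report
    # entries whose GUIDs have not been seen before.  %seen_guids stands in
    # for whatever cached state a plugin would keep between runs.
    use strict;
    use warnings;
    use URI;
    use XML::Feed;

    my $feedurl    = 'http://example.com/index.rss';  # placeholder feed location
    my %seen_guids = ();                              # would be loaded from a cache

    my $feed = XML::Feed->parse(URI->new($feedurl))
        or die "could not fetch/parse $feedurl: " . XML::Feed->errstr;

    for my $entry ($feed->entries) {
        my $guid = $entry->id || $entry->link;   # GUID to remember between pulls
        next if $seen_guids{$guid};              # already turned into a page
        $seen_guids{$guid} = time;               # record pull time, useful for expiry
        printf "new item: %s (%s)\n", $entry->title, $guid;
        # a real plugin would write $entry->content->body out as a wiki page here
    }

A real aggregator would also have to persist %seen_guids and the last-pull times somewhere (the notes suggest meta tags plus a metadata cache), so that feed expiry and a cron-driven pull/expire/rebuild mode have state to work from.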