> problem. You can see the feed validator complain about it here:
>
>
> It's sorta unfortunate that [[!cpan XML::Feed]] doesn't just assume the
> un-escaped html is part of the description field. Probably other feed
> parsers are more lenient. --[[Joey]]

> Even if you start it more frequently, aggregation will only occur every
> `updateinterval` minutes (default 15), maximum. Does this explain what
> you're seeing? --[[Joey]]

>> Crap, right, and my test update has since made it into the planet. His post still hasn't. So it must be something with David's feed again? A quick test with XML::Feed looks like it's parsing just fine: --[[schmonz]]

    $ perl
    use XML::Feed;
    my $feed = XML::Feed->parse(URI->new('http://www.davidj.org/rss.xml')) or die XML::Feed->errstr;
    print $feed->title, "\n";
    for my $entry ($feed->entries) {
        print $entry->title, ": ", $entry->issued, "\n";
    }
    ^D
    davidj.org
    Amway Stories - Refrigerator Pictures: 2008-09-19T00:12:27
    Amway Stories - Coffee: 2008-09-13T10:08:17
    Google Alphabet Update: 2008-09-11T22:55:37
    Writing for writing's sake: 2008-09-09T23:39:05
    Google Chrome: 2008-09-02T23:12:26
    Mister Casual: 2008-07-25T09:01:17
    Parental Conversations: 2008-07-24T10:44:44
    Place Of George Orwell: 2008-06-03T22:11:07
    The Raw Beauty Of A National Duolian: 2008-05-31T12:41:06

> I had no problem getting the "Refrigerator Pictures" post to aggregate
> here, though without a copy of the old feed I can't be 100% sure I've
> reproduced your ikiwiki's state. --[[Joey]]

>> Okay, I blew away the cached entries and aggregator state files and reran the aggregator and all appears well again. If the problem recurs I'll be sure to post here. :-) --[[schmonz]]

>>> On the off chance that you retained a copy of the old state, I'd not
>>> mind having a copy to investigate. --[[Joey]]

>>>> Didn't think of that, will keep a copy if there's a next time. --[[schmonz]]

-----

In a corporate environment where feeds are generally behind
authentication, I need to prime the aggregator's `LWP::UserAgent`
with some cookies. What I've done is write a custom plugin to populate
`$config{cookies}` with an `HTTP::Cookies` object, plus this diff:

    --- /var/tmp/pkg/lib/perl5/vendor_perl/5.10.0/IkiWiki/Plugin/aggregate.pm  2010-06-24 13:03:33.000000000 -0400
    +++ aggregate.pm  2010-06-24 13:04:09.000000000 -0400
    @@ -488,7 +488,11 @@
                }
                $feed->{feedurl}=pop @urls;
            }
    -       my $res=URI::Fetch->fetch($feed->{feedurl});
    +       my $res=URI::Fetch->fetch($feed->{feedurl},
    +           UserAgent => LWP::UserAgent->new(
    +               cookie_jar => $config{cookies},
    +           ),
    +       );
            if (! $res) {
                $feed->{message}=URI::Fetch->errstr;
                $feed->{error}=1;

It works, but I have to remember to apply the diff whenever I update
ikiwiki. Can you provide a more elegant means of allowing cookies and/or
the user agent to be programmatically manipulated? --[[schmonz]]
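
For reference, here is a minimal sketch of the kind of local plugin described above, assuming all it needs to do is fill `$config{cookies}` with an [[!cpan HTTP::Cookies]] jar for the patched `aggregate.pm` to hand to `LWP::UserAgent`. The plugin name and cookie-file path are made up for illustration; they are not part of ikiwiki.

    # IkiWiki/Plugin/cookies.pm -- illustrative local plugin, not shipped with ikiwiki
    package IkiWiki::Plugin::cookies;

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use HTTP::Cookies;

    sub import {
        hook(type => "checkconfig", id => "cookies", call => \&checkconfig);
    }

    sub checkconfig {
        # Load (or create) a cookie jar the aggregator's LWP::UserAgent
        # can reuse; the file path here is only an example.
        $config{cookies} = HTTP::Cookies->new(
            file => "$ENV{HOME}/.ikiwiki/cookies.txt",
            autosave => 1,
        );
    }

    1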

> Ping -- is the above patch perhaps acceptable (or near-acceptable)? --[[schmonz]]

>> Pong.. I'd be happier with a more 100% solution that lets cookies be used
>> w/o needing to write a custom plugin to do it. --[[Joey]]

>>> According to LWP::UserAgent, for the common case, a complete
>>> and valid configuration for `$config{cookies}` would be `{ file =>
>>> "$ENV{HOME}/.cookies.txt" }`. In the more common case of not needing
>>> to prime one's cookies, `cookie_jar` can be `undef` (that's the
>>> default). In my less common case, the cookies are generated by
>>> visiting a couple magic URLs, which would be trivial to turn into
>>> config options, except that these particular URLs rely on SPNEGO
>>> and so LWP::Authen::Negotiate has to be loaded. So I think adding
>>> `$config{cookies}` (and using it in the aggregate plugin) should
>>> be safe, might help people in typical cases, and won't prevent
>>> further enhancements for less typical cases. --[[schmonz]]

>>>> Ok, done. Called it cookiejar. --[[Joey]]
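
For anyone landing here later, a sketch of how that option might look in the wiki's setup file, assuming `cookiejar` is passed straight through to `LWP::UserAgent`'s `cookie_jar` argument (the path below is only an example):

    # in the wiki's .setup file; assumes cookiejar is handed to
    # LWP::UserAgent->new(cookie_jar => ...) by the aggregate plugin
    cookiejar => { file => "$ENV{HOME}/.ikiwiki/cookies" },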