> problem. You can see the feed validator complain about it here:
>
> It's sorta unfortunate that [[!cpan XML::Feed]] doesn't just assume the
> un-escaped html is part of the description field. Probably other feed
> parsers are more lenient. --[[Joey]]

>>>>> --[[Joey]]

>>>>>> I can confirm, they're fixed on my end. --[[schmonz]]

New bug: new posts aren't getting displayed (or cached for aggregation).
After fixing his feed, David posted a new item today, and the aggregator
is convinced there's nothing to do, whether by cronjob or webtrigger. I
verified that it wasn't another problem with his feed by adding another
of my ikiwiki's feeds to the planet, running the aggregator, posting a
new item, and running the aggregator again: no new item. --[[schmonz]]

> Even if you start it more frequently, aggregation will only occur every
> `updateinterval` minutes (default 15), maximum. Does this explain what
> you're seeing? --[[Joey]]
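
As an aside, `updateinterval` can be tuned per feed in the aggregate
directive, so a feed that is expected to update often can be polled more
often than the default. A minimal sketch, assuming the usual `name`,
`url`, `feedurl` and `updateinterval` parameters, with David's feed used
purely as an illustration:

    \[[!aggregate name="davidj.org"
    url="http://www.davidj.org/"
    feedurl="http://www.davidj.org/rss.xml"
    updateinterval="5"]]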
>> Crap, right, and my test update has since made it into the planet. His
>> post still hasn't. So it must be something with David's feed again? A
>> quick test with XML::Feed looks like it's parsing just fine: --[[schmonz]]

    $ perl
    use XML::Feed;
    my $feed = XML::Feed->parse(URI->new('http://www.davidj.org/rss.xml')) or die XML::Feed->errstr;
    print $feed->title, "\n";
    for my $entry ($feed->entries) {
      print $entry->title, ": ", $entry->issued, "\n";
    }
    ^D
    davidj.org
    Amway Stories - Refrigerator Pictures: 2008-09-19T00:12:27
    Amway Stories - Coffee: 2008-09-13T10:08:17
    Google Alphabet Update: 2008-09-11T22:55:37
    Writing for writing's sake: 2008-09-09T23:39:05
    Google Chrome: 2008-09-02T23:12:26
    Mister Casual: 2008-07-25T09:01:17
    Parental Conversations: 2008-07-24T10:44:44
    Place Of George Orwell: 2008-06-03T22:11:07
    The Raw Beauty Of A National Duolian: 2008-05-31T12:41:06

> I had no problem getting the "Refrigerator Pictures" post to aggregate
> here, though without a copy of the old feed I can't be 100% sure I've
> reproduced your ikiwiki's state. --[[Joey]]

>> Okay, I blew away the cached entries and aggregator state files, reran
>> the aggregator, and all appears well again. If the problem recurs I'll
>> be sure to post here. :-) --[[schmonz]]

>>> On the off chance that you retained a copy of the old state, I'd not
>>> mind having a copy to investigate. --[[Joey]]

>>>> Didn't think of that, will keep a copy if there's a next time. --[[schmonz]]

-----

In a corporate environment where feeds are generally behind
authentication, I need to prime the aggregator's `LWP::UserAgent` with
some cookies. What I've done is write a custom plugin to populate
`$config{cookies}` with an `HTTP::Cookies` object, plus this diff:

    --- /var/tmp/pkg/lib/perl5/vendor_perl/5.10.0/IkiWiki/Plugin/aggregate.pm  2010-06-24 13:03:33.000000000 -0400
    +++ aggregate.pm  2010-06-24 13:04:09.000000000 -0400
    @@ -488,7 +488,11 @@
                }
                $feed->{feedurl}=pop @urls;
        }
    -   my $res=URI::Fetch->fetch($feed->{feedurl});
    +   my $res=URI::Fetch->fetch($feed->{feedurl},
    +           UserAgent => LWP::UserAgent->new(
    +                   cookie_jar => $config{cookies},
    +           ),
    +   );
        if (! $res) {
                $feed->{message}=URI::Fetch->errstr;
                $feed->{error}=1;

It works, but I have to remember to apply the diff whenever I update
ikiwiki. Can you provide a more elegant means of allowing cookies and/or
the user agent to be programmatically manipulated? --[[schmonz]]

> Ping -- is the above patch perhaps acceptable (or near-acceptable)? --[[schmonz]]

>> Pong.. I'd be happier with a more 100% solution that lets cookies be used
>> w/o needing to write a custom plugin to do it. --[[Joey]]

>>> According to the LWP::UserAgent documentation, for the common case a
>>> complete and valid configuration for `$config{cookies}` would be
>>> `{ file => "$ENV{HOME}/.cookies.txt" }`. In the more common case of not
>>> needing to prime one's cookies, `cookie_jar` can be `undef` (that's the
>>> default). In my less common case, the cookies are generated by visiting
>>> a couple of magic URLs, which would be trivial to turn into config
>>> options, except that these particular URLs rely on SPNEGO and so
>>> LWP::Authen::Negotiate has to be loaded. So I think adding
>>> `$config{cookies}` (and using it in the aggregate plugin) should be
>>> safe, might help people in typical cases, and won't prevent further
>>> enhancements for less typical cases. --[[schmonz]]

>>>> Ok, done. Called it cookiejar. --[[Joey]]
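
For anyone stuck on an older ikiwiki, the kind of local plugin described
above is easy to approximate. This is only a minimal sketch, not the
plugin actually used here: the package name, cookie file path and login
URL are made up, and the SPNEGO case would additionally need
LWP::Authen::Negotiate loaded, as noted above.

    #!/usr/bin/perl
    # Hypothetical local plugin that primes a cookie jar for the aggregator.
    package IkiWiki::Plugin::primecookies;

    use warnings;
    use strict;
    use IkiWiki 3.00;
    use HTTP::Cookies;
    use LWP::UserAgent;

    sub import {
        hook(type => "checkconfig", id => "primecookies", call => \&checkconfig);
    }

    sub checkconfig {
        # Cookie file path and login URL are illustrative only.
        my $jar=HTTP::Cookies->new(file => "/var/ikiwiki/cookies.txt", autosave => 1);
        my $ua=LWP::UserAgent->new(cookie_jar => $jar);
        # Visit whatever "magic" URL hands out the session cookies the feeds need.
        $ua->get("https://intranet.example.com/login");
        # Leave the primed jar where the (patched) aggregate plugin looks for it.
        $config{cookies}=$jar;
    }

    1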
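
With the `cookiejar` option in place, the setup file presumably only needs
the same sort of hash that `LWP::UserAgent`'s `cookie_jar` accepts, along
the lines of the example given above; the path is illustrative:

    # in the wiki's .setup file
    cookiejar => { file => "$ENV{HOME}/.cookies.txt" },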