From: joey Date: Wed, 25 Jul 2007 03:36:53 +0000 (+0000) Subject: massive patchqueue reorg X-Git-Tag: 2.5~44 X-Git-Url: http://git.vanrenterghem.biz/git.ikiwiki.info.git/commitdiff_plain/c96d672810ae524aedf9ffff08dcca773cd9a876?hp=d63068b669f5b5450b01a3142c2efca448d3dfc3 massive patchqueue reorg patches can now be anywhere and tagged patch to show up on the patch list. Moved all the patchqueue stuff to todo items; some of it was merged into existing todo items. --- diff --git a/doc/about_rcs_backends.mdwn b/doc/about_rcs_backends.mdwn index c20fa5cc3..2f0fef0d3 100644 --- a/doc/about_rcs_backends.mdwn +++ b/doc/about_rcs_backends.mdwn @@ -36,7 +36,7 @@ W "belongs" to ikiwiki and should not be edited directly. Support for using darcs as a backend is being worked on by [Thomas Schwinge](mailto:tschwinge@gnu.org), although development is on hold curretly. -There is a patch in the [[patchqueue]]. +There is a patch in [[todo/darcs]]. ### How will it work internally? @@ -114,7 +114,7 @@ towards transmitting changes with standalone patch bundles (often by email) as d >> IMHO it comes down to whatever works well for a given RCS. Seems like >> the darcs approach _could_ be done with most any distributed system, but >> it might be overkill for some (or all?) While there is the incomplete darcs ->> plugin in the [[patchqueue]], if you submit one that's complete, I will +>> plugin in [[todo/darcs]], if you submit one that's complete, I will >> probably accept it into ikiwiki.. --[[Joey]] ## [[Git]] diff --git a/doc/bugs.mdwn b/doc/bugs.mdwn index 9c6e49d37..95eb82a8f 100644 --- a/doc/bugs.mdwn +++ b/doc/bugs.mdwn @@ -4,6 +4,6 @@ Also see the [Debian bugs](http://bugs.debian.org/ikiwiki), and the [[TODO]] page. 
[[inline pages="bugs/* and !bugs/done and !bugs/discussion and -!link(bugs/done) and !bugs/*/*" +!link(patch) and !link(bugs/done) and !bugs/*/*" feedpages="created_after(bugs/no_commit_mails_for_new_pages)" actions=yes rootpage="bugs" postformtext="Add a new bug titled:" show=0]] diff --git a/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn b/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn index b6090b4bd..9f1e89397 100644 --- a/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn +++ b/doc/bugs/edits_not_showing_up_in_compiled_pages.mdwn @@ -10,7 +10,7 @@ Looks like a build died halfway through, so it was stumbling over rendered html pages that it didn't have record of. I don't know what build failed exactly. --[[Joey]] ->> Has this just happened again? [[patchqueue/datearchives-plugin]] is now exhibiting the same symptoms -- it's in the repository and RecentChanges, but the actual page is 404. --Ben +>> Has this just happened again? [[todo/datearchives-plugin]] is now exhibiting the same symptoms -- it's in the repository and RecentChanges, but the actual page is 404. --Ben >>> Yes, it seems to have happened again. Added debugging to track it >>> down next time it occurs. It seems to be happening when you add things diff --git a/doc/bugs/hardcoded___34__Discussion__34___link.mdwn b/doc/bugs/hardcoded___34__Discussion__34___link.mdwn index 28b76cbd5..d5e5a1a68 100644 --- a/doc/bugs/hardcoded___34__Discussion__34___link.mdwn +++ b/doc/bugs/hardcoded___34__Discussion__34___link.mdwn @@ -24,7 +24,7 @@ or "History". --[[Paweł|ptecza]] >>> existing/nonexisting page in two places, one in code in ikiwiki and one >>> in the template. Not good design. 
--[[Joey]] -> As noted in [[patchqueue/l10n]], there are some other places in ikiwiki +> As noted in [[todo/l10n]], there are some other places in ikiwiki > that hard code English strings, and I feel that using standard gettext > and po files is the best approach for these, although Recai suggested an > approach of translating the strings using a template file. --[[Joey]] @@ -41,4 +41,4 @@ or "History". --[[Paweł|ptecza]] >>> translating it. [[bugs/done]]! There's a `po/debconf.pot` in the source >>> now for translating. See [[translation]]. --[[Joey]] ->>>> Joey, you're great! ;) Thanks a lot! I'll try ikiwiki l10n stuff soon. --[[Paweł|ptecza]] \ No newline at end of file +>>>> Joey, you're great! ;) Thanks a lot! I'll try ikiwiki l10n stuff soon. --[[Paweł|ptecza]] diff --git a/doc/index.mdwn b/doc/index.mdwn index e5fbd70d4..ae1777011 100644 --- a/doc/index.mdwn +++ b/doc/index.mdwn @@ -28,8 +28,7 @@ Thanks! --[[Joey]] developed, and is being written with security as a priority, so don't expect things to stay in this list for long. -* Developers, please document any ikiwiki patches you have in the - [[PatchQueue]]. +* Developers, please document any ikiwiki [[patches|patch]] you have. All wikis are supposed to have a [[SandBox]], so this one does too. diff --git a/doc/patch.mdwn b/doc/patch.mdwn new file mode 100644 index 000000000..d8ac3cd42 --- /dev/null +++ b/doc/patch.mdwn @@ -0,0 +1,9 @@ +Since we have enough people working on ikiwiki to be dangerous, or at least +to duplicate work without coordination, and since few people have direct +commit access to the tree, here's a queue of suggested patches. + +If you post a patch to the wiki, once it's ready to be applied, add a +'patch' tag so it will show up here. 
+ +[[inline pages="todo/* and link(patch) and !*/Discussion" rootpage="todo" +archive="yes"]] diff --git a/doc/patchqueue.mdwn b/doc/patchqueue.mdwn deleted file mode 100644 index ca9dbff79..000000000 --- a/doc/patchqueue.mdwn +++ /dev/null @@ -1,10 +0,0 @@ -Since we have enough people working on ikiwiki to be dangerous, or at least -to duplicate work without coordination, and since few people have direct -commit access to the tree, here's a queue of suggested patches. -Feel free to either copy the patch inline, or link to one elsewhere (or nag -[[Joey]] to open up anonymous svn access to this wiki so you can check in the -patches directly). - -[[inline pages="patchqueue/* and !*/Discussion" -feedpages="created_after(patchqueue/enable-htaccess-files)" -rootpage="patchqueue" archive="yes"]] diff --git a/doc/patchqueue/Gallery_Plugin_for_Ikiwiki.mdwn b/doc/patchqueue/Gallery_Plugin_for_Ikiwiki.mdwn deleted file mode 100644 index 6e6f560bf..000000000 --- a/doc/patchqueue/Gallery_Plugin_for_Ikiwiki.mdwn +++ /dev/null @@ -1,24 +0,0 @@ -I have implemented the first version of the Gallery Plugin for Ikiwiki as part of [[soc]]. This plugin would create a nice looking gallery of the images once the directory containing images is specified with some additional parameters. It has been build over the img plugin. - -Plugin can be downloaded from [here](http://myweb.unomaha.edu/~ajain/gallery.tar). - -It can be used as :
-\[[gallery imagedir="images" thumbnailsize="200x200" cols="3" alt="Can not be displayed" title="My Pictures"]] - -where-
-* imagedir => Directory containing images. It will scan all files with a jpg|png|gif extension in the directory and include them in the gallery.&lt;br&gt;
-* thumbnailsize(optional) => Size of the thumbnail that you want to generate for the gallery.
-* alt(optional) => If the image cannot be displayed, the text contained in the alt argument is displayed instead.&lt;br&gt;
-* cols(optional) => Number of columns of thumbnails that you want to generate.
-* title(optional) => Title of the gallery.
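
A minimal sketch of the simplest possible call, relying on the defaults described above (the directory name here is hypothetical; only imagedir is required):

    \[[gallery imagedir="photos"]]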
- -Features of the Gallery Plugin:
-* You can go to the next image by clicking on the right side of the image or by pressing 'n'.&lt;br&gt;
-* Similarly, you can go to the previous image by clicking on the left side of the image or by pressing 'p'.&lt;br&gt;
-* Press esc to close the gallery.
-* While viewing an image, nearby images are preloaded in the background to make browsing fast.&lt;br&gt;
- -Right now, it features only one template, namely [Lightbox](http://www.hudddletogether.com). Later on, I will add a few more templates.&lt;br&gt;
-For any feedback or query, feel free to mail me at arpitjain11 [AT] gmail.com - -Additional details are available [here](http://myweb.unomaha.edu/~ajain/ikiwikigallery.html). \ No newline at end of file diff --git a/doc/patchqueue/Wikiwyg_Plugin_for_IkiWiki.mdwn b/doc/patchqueue/Wikiwyg_Plugin_for_IkiWiki.mdwn deleted file mode 100644 index e4369d923..000000000 --- a/doc/patchqueue/Wikiwyg_Plugin_for_IkiWiki.mdwn +++ /dev/null @@ -1,14 +0,0 @@ -Project IkiWiki::WIKIWYG v0.8 - -=========================================================== - -[Wikiwyg][] is a "What you see is what you get" editor for wikis. It will allow you to double click on the text in a wiki and save it without reloading the page. The IkiWiki version will allow you to edit your wiki in Markdown or WYSIWYG. - -The plugin can be downloaded from - -### Current Issues - -* Code sections starting with 4 spaces do not work -* Adding links in the WYSIWYG editor is difficult -* Double lists don't work - -[Wikiwyg]: http://www.wikiwyg.net/ \ No newline at end of file diff --git a/doc/patchqueue/Wikiwyg_Plugin_for_IkiWiki/discussion.mdwn b/doc/patchqueue/Wikiwyg_Plugin_for_IkiWiki/discussion.mdwn deleted file mode 100644 index 93b9c8ce1..000000000 --- a/doc/patchqueue/Wikiwyg_Plugin_for_IkiWiki/discussion.mdwn +++ /dev/null @@ -1,33 +0,0 @@ -Very nice! There are some rough spots yes, but this looks exactly as I'd -hoped it would, and seems close to being ready for merging. - -A few observations, in approximate order of priority: - -* What's the copyright and license of showdown? Please include that from - the original zip file. -* What happens if there are concurrent edits? The CGI.pm modification to - save an edited wikiwyg part doesn't seem to check if the source file has - changed in the meantime, so if the part has moved around, it might - replace the wrong part on saving. I've not tested this. -* The stuff you have in destdir now really belongs in basewiki so it's - copied over to any destdir. 
-* Personally, I'm not sure if I need double-click to edit a section in my - wiki, but I'd love it if the edit form in the cgi could use wikiwyg. Seems - like both of these could be independent options. Doable, I'm sure? -* It would be good to move as much as possible of the inlined javascript in - wikiwyg.tmpl out to a separate .js file to save space in the rendered - pages. -* Both this plugin and the [[Gallery_Plugin_for_Ikiwiki]] are turning out - to need to add a bunch of pages to the basewiki. I wonder what would be a - good way to do this, without bloating the basewiki when the plugins arn't - used. Perhaps the underlaydir concept needs to be expanded so it's a set - of directories, which plugins can add to. Perhaps you should work with - arpitjain on this so both plugins can benefit. (The smiley plugin would - also benefit from this..) -* Is there any way of only loading enough of wikiwyg by default to catch - the section double-clicks, and have it load the rest on the fly? I'm - thinking about initial page load time when visiting a wikiwyg-using wiki - for the first time. I count 230k or so of data that a browser downloads - in that case.. - ---[[Joey]] diff --git a/doc/patchqueue/Wrapper_config_with_multiline_regexp.mdwn b/doc/patchqueue/Wrapper_config_with_multiline_regexp.mdwn deleted file mode 100644 index b3c6d8e51..000000000 --- a/doc/patchqueue/Wrapper_config_with_multiline_regexp.mdwn +++ /dev/null @@ -1,32 +0,0 @@ -Turning the wikilink regexp into an extended regexp on the svn trunk seems to have broken the setuid wrapper on my system, because of two reasons: First, the wrapper generator should turn each newline in $configstring into `\n` in the C code rather than `\` followed by a newline in the C code. Second, the untainting of $configstring should allow newlines. - -> Both of these problems were already dealt with in commit r3714, on June -> 3rd. Confused why you're posting patches for them now. 
--[[Joey]] - - Modified: wiki-meta/perl/IkiWiki.pm - ============================================================================== - --- wiki-meta/perl/IkiWiki.pm (original) - +++ wiki-meta/perl/IkiWiki.pm Mon Jun 11 10:52:07 2007 - @@ -205,7 +205,7 @@ - - sub possibly_foolish_untaint ($) { #{{{ - my $tainted=shift; - - my ($untainted)=$tainted=~/(.*)/; - + my ($untainted)=$tainted=~/(.*)/s; - return $untainted; - } #}}} - - - Modified: wiki-meta/perl/IkiWiki/Wrapper.pm - ============================================================================== - --- wiki-meta/perl/IkiWiki/Wrapper.pm (original) - +++ wiki-meta/perl/IkiWiki/Wrapper.pm Mon Jun 11 10:52:07 2007 - @@ -62,7 +62,7 @@ - } - $configstring=~s/\\/\\\\/g; - $configstring=~s/"/\\"/g; - - $configstring=~s/\n/\\\n/g; - + $configstring=~s/\n/\\n/g; - - #translators: The first parameter is a filename, and the second is - #translators: a (probably not translated) error message. diff --git a/doc/patchqueue/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn b/doc/patchqueue/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn deleted file mode 100644 index f3a8b2f78..000000000 --- a/doc/patchqueue/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn +++ /dev/null @@ -1,666 +0,0 @@ -I am serving notice that I am starting work on a calendar plugin inspired by Blosxom's calendar plugin. The current plan is to create a plugin that looks through all the source files matching a certain pagespec, and optionally spit out a month view for the specified month (default to current), or spit out a year view for a given year (defaulting to the current year), of a list of year with posts in them. The output would be a table, with the same CSS directives that the Blosxom plugin used to use (so that I can just reuse my css file). 
The links would be created to a $config{archivedir}/$year or $config{archivedir}/$year-$month file, which can just have - - \[[inline pages="blog/* and !*/Discussion and creation_year($year) and creation_month($month)" rss="no" atom="no" show="0"]] - -or some thing to generate a archive of postings. - -Roland Mas suggested a separate cron job to generate these archive indices automatically, but that is another thread. - -ManojSrivastava - -This plugin is inspired by the calendar plugin for Blosxom, but derivesno code from it. This plugin is essentially a fancy front end to archives of previous pages, usually used for blogs. It can produce a calendar for a given month, or a list of months for a given year. To invoke the calendar, just use the preprocessor directive: - - \[[calendar ]] - -or - - \[[calendar type="month" pages="blog/* and !*/Discussion"]] - -or - - \[[calendar type="year" year="2005" pages="blog/* and !*/Discussion"]] - - -The year and month entities in the out put have links to archive index pages, which are supposed to exist already. The idea is to create an archives hierarchy, rooted in the subdirectory specified in the site-wide customization variable, archivebase. archivebase defaults to "archives". Links are created to pages "$archivebase/$year" and "$archivebase/$year/$month". The idea is to create annual and monthly indices, for example, by using something like this sample from my archives/2006/01.mdwn - - \[[meta title="Archives for 2006/01"]] - \[[inline rootpage="blog" atom="no" rss="no" show="0" pages="blog/* and !*/Discussion and creation_year(2006) and creation_month(01)" ]] - -I'll send in the patch via email. - - -ManojSrivastava - ------- - -Since this is a little bit er, stalled, I'll post here the stuff Manoj -mailed me, and my response to it. --[[Joey]] - -
-#! /usr/bin/perl
-#                              -*- Mode: Cperl -*- 
-# calendar.pm --- 
-# Author           : Manoj Srivastava ( srivasta@glaurung.internal.golden-gryphon.com ) 
-# Created On       : Fri Dec  8 16:05:48 2006
-# Created On Node  : glaurung.internal.golden-gryphon.com
-# Last Modified By : Manoj Srivastava
-# Last Modified On : Sun Dec 10 01:53:22 2006
-# Last Machine Used: glaurung.internal.golden-gryphon.com
-# Update Count     : 139
-# Status           : Unknown, Use with caution!
-# HISTORY          : 
-# Description      : 
-# 
-# arch-tag: 2aa737c7-3d62-4918-aaeb-fd85b4b1384c
-#
-# Copyright (c) 2006 Manoj Srivastava 
-#
-# This program is free software; you can redistribute it and/or modify
-# it under the terms of the GNU General Public License as published by
-# the Free Software Foundation; either version 2 of the License, or
-# (at your option) any later version.
-#
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-# GNU General Public License for more details.
-#
-# You should have received a copy of the GNU General Public License
-# along with this program; if not, write to the Free Software
-# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
-#
-
-require 5.002;
-package IkiWiki::Plugin::calendar;
-
-use warnings;
-use strict;
-use IkiWiki '1.00';
-use Time::Local;
-
-our $VERSION = "0.1";
-my $file = __FILE__;
-
-my %calpages;
-my %cache;
-my %linkcache;
-
-my $index=1;
-my @now=localtime();
-
-=head1 NAME
-
-calendar - Add links for the current month's, current year's, and older archived postings
-
-=cut
-
-=head1 SYNOPSIS
-
-To invoke the calendar, just use the preprocessor directive (options
-and variations are detailed below):
-
-  [[calendar ]]
-
-or
-
-  [[calendar type="month" pages="blog/* and !*/Discussion"]]
-
-or
-
-  [[calendar type="year"  year="2005" pages="blog/* and !*/Discussion"]]
-
-=cut
-
-
-=head1 DESCRIPTION
-
-This plugin is inspired by the calendar plugin for Blosxom, but
-derives no code from it. This plugin is essentially a fancy front end
-to archives of previous pages, usually used for blogs. It can produce
-a calendar for a given month, or a list of months for a given year. 
-
-The year and month entities in the output have links to archive index
-pages, which are supposed to exist already. The idea is to create an
-archives hierarchy, rooted in the subdirectory specified in the
-site-wide customization variable, I<archivebase>. I<archivebase>
-defaults to C<archives>.  Links are created to pages
-C<$archivebase/$year> and C<$archivebase/$year/$month>. If one creates
-annual and monthly indices, for example, by using something like this
-sample from my I<archives/2006/01.mdwn> (warning: line split for
-readability):
-
-   \[[meta title="Archives for 2006/01"]]
-   \[[inline rootpage="blog" atom="no" rss="no" show="0"
-     pages="blog/* and !*/Discussion and creation_year(2006)
-            and creation_month(01)"
-   ]]
-
-=cut
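-# (Illustrative aside, not part of the original patch: the archive page
-# names described above can be computed like this; "archives" is the
-# default archivebase, and the year/month values are hypothetical.)
-#
-#   my $archivebase = defined $config{archivebase} ? $config{archivebase} : 'archives';
-#   my $monthpage = "$archivebase/2006/" . sprintf("%02d", 1);
-#   # $monthpage is now "archives/2006/01" -- note the zero-padded
-#   # month, matching the sample index page shown above.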
-
-=head1 OPTIONS
-
-=over
-
-=item B<type>
-
-Used to specify the type of calendar wanted. Can be one of C<month> or
-C<year>. The default is a month view calendar.
-
-=item B<pages>
-
-Specifies the C<pagespec> used to get pages to match for
-linking. Usually this should be something like C<blog/* and !*/Discussion>.
-Defaults to C<*>.
-
-=item B<year>
-
-The year for which the calendar is requested. Defaults to the current year.
-
-=item B<month>
-
-The numeric month for which the calendar is requested, in the range
-1..12. Used only for the month view calendar, and defaults to the
-current month.
-
-=item B<week_start_day>
-
-A number, in the range 0..6, which represents the day of the week that
-the month calendar starts with. 0 is Sunday, 1 is Monday, and so
-on. Defaults to 0, which is Sunday.
-
-=item B<months_per_row>
-
-In the annual calendar, number of months to place in each row. Defaults to 3.
-
-=back
-
-=cut
-
-=head1 Classes for CSS control
-
-The output is liberally sprinkled with classes, for fine-grained CSS
-customization.
-
-=over
-
-=item C<month-calendar>
-
-The month calendar as a whole
-
-=item C<month-calendar-head>
-
-The head of the month calendar (ie, "March"), localized to the environment.
-
-=item C<month-calendar-day-head>
-
-A column head in the month calendar (ie, a day-of-week abbreviation),
-localized.
-
-=item C<month-calendar-day-noday>, C<month-calendar-day-nolink>,
-  C<month-calendar-day-link>, C<month-calendar-day-future>,
-  C<month-calendar-day-this-day>
-
-The day squares on the month calendar, for days that don't exist
-(before or after the month itself), that don't have stories, that do
-have stories, that are in the future, or that are currently selected,
-respectively (today).
-
-=item Day-of-week-name
-
-Each day square is also given a class matching its day of week; this
-can be used to highlight weekends. This is also localized.
-
-=item C<year-calendar>
-
-The year calendar as a whole
-
-=item C<year-calendar-head>
-
-The head of the year calendar (ie, "2006")
-
-=item C
-
-For example, "Months"
-
-=item C<month_link>, C<month_nolink>,
-  C<month_future>, C<this_month_link>
-
-The month squares on the year calendar, for months with stories,
-without, in the future, and currently selected, respectively.
-
-=back
-
-=cut
-
-
-sub import {
-  hook(type => "preprocess", id => "calendar", call => \&preprocess);
-  hook(type => "format", id => "calendar", call => \&format);
-}
-
-sub preprocess (@) {
-  my %params=@_;
-  $params{pages} = "*"            unless defined $params{pages};
-  $params{type}  = "month"        unless defined $params{type};
-  $params{year}  = 1900 + $now[5] unless defined $params{year};
-  $params{month} = sprintf("%02d", $params{month}) if defined  $params{month};
-  $params{month} = 1    + $now[4] unless defined $params{month};
-  $params{week_start_day} = 0     unless defined $params{week_start_day};
-  $params{months_per_row} = 3     unless defined $params{months_per_row};
-
-  # Store parameters (could be multiple calls per page)
-  $calpages{$params{destpage}}{$index} = \%params;
-
-  return "\n<div class=\"calendar\">" . $index++ . "</div>
\n"; -} - -sub is_leap_year (@) { - my %params=@_; - return ($params{year} % 4 == 0 && (($params{year} % 100 != 0) || $params{year} % 400 ==0)) ; -} - - -sub month_days { - my %params=@_; - my $days_in_month = (31,28,31,30,31,30,31,31,30,31,30,31)[$params{month}-1]; - if ($params{month} == 2 && is_leap_year(%params)) { - $days_in_month++; - } - return $days_in_month; -} - - -sub format_month (@) { - my %params=@_; - my $pagespec = $params{pages}; - my $year = $params{year}; - my $month = $params{month}; - - my $calendar="\n"; - - # When did this month start? - my @monthstart = localtime(timelocal(0,0,0,1,$month-1,$year-1900)); - - my $future_dom = 0; - my $today = 0; - $future_dom = $now[3]+1 if ($year == $now[5]+1900 && $month == $now[4]+1); - $today = $now[3] if ($year == $now[5]+1900 && $month == $now[4]+1); - - # Calculate month names for next month, and previous months - my $pmonth = $month - 1; - my $nmonth = $month + 1; - my $pyear = $year; - my $nyear = $year; - - # Adjust for January and December - if ($month == 1) { $pmonth = 12; $pyear--; } - if ($month == 12) { $nmonth = 1; $nyear++; } - - # Find out month names for this, next, and previous months - my $monthname=POSIX::strftime("%B", @monthstart); - my $pmonthname= - POSIX::strftime("%B", localtime(timelocal(0,0,0,1,$pmonth-1,$pyear-1900))); - my $nmonthname= - POSIX::strftime("%B", localtime(timelocal(0,0,0,1,$nmonth-1,$nyear-1900))); - - # Calculate URL's for monthly archives, and article counts - my $archivebase = 'archives'; - $archivebase = $config{archivebase} if defined $config{archivebase}; - - my ($url, $purl, $nurl)=("$monthname",'',''); - my ($count, $pcount, $ncount) = (0,0,0); - - if (exists $cache{$pagespec}{"$year/$month"}) { - $url = htmllink($params{page}, $params{destpage}, - "$archivebase/$year/" . sprintf("%02d", $month), - 0,0," $monthname "); - } - - if (exists $cache{$pagespec}{"$pyear/$pmonth"}) { - $purl = htmllink($params{page}, $params{destpage}, - "$archivebase/$pyear/" . 
sprintf("%02d", $pmonth), - 0,0," $pmonthname "); - } - if (exists $cache{$pagespec}{"$nyear/$nmonth"}) { - $nurl = htmllink($params{page}, $params{destpage}, - "$archivebase/$nyear/" . sprintf("%02d", $nmonth), - 0,0," $nmonthname "); - } - - # Start producing the month calendar - $calendar=< - - $purl - $url - $nurl - - -EOF - # Suppose we want to start the week with day $week_start_day - # If $monthstart[6] == 1 - my $week_start_day = $params{week_start_day}; - - my $start_day = 1 + (7 - $monthstart[6] + $week_start_day) % 7; - my %downame; - my %dowabbr; - for my $dow ($week_start_day..$week_start_day+6) { - my @day=localtime(timelocal(0,0,0,$start_day++,$month-1,$year-1900)); - my $downame = POSIX::strftime("%A", @day); - my $dowabbr = POSIX::strftime("%a", @day); - $downame{$dow % 7}=$downame; - $dowabbr{$dow % 7}=$dowabbr; - $calendar.= - qq{ $dowabbr\n}; - } - - $calendar.=< -EOF - - my $wday; - # we start with a week_start_day, and skip until we get to the first - for ($wday=$week_start_day; $wday != $monthstart[6]; $wday++, $wday %= 7) { - $calendar.=qq{ \n} if $wday == $week_start_day; - $calendar.= - qq{  \n}; - } - - # At this point, either the first is a week_start_day, in which case nothing - # has been printed, or else we are in the middle of a row. - for (my $day = 1; $day <= month_days(year => $year, month => $month); - $day++, $wday++, $wday %= 7) { - # At tihs point, on a week_start_day, we close out a row, and start a new - # one -- unless it is week_start_day on the first, where we do not close a - # row -- since none was started. 
- if ($wday == $week_start_day) { - $calendar.=qq{ \n} unless $day == 1; - $calendar.=qq{ \n}; - } - my $tag; - my $mtag = sprintf("%02d", $month); - if (defined $cache{$pagespec}{"$year/$mtag/$day"}) { - if ($day == $today) { $tag='month-calendar-day-this-day'; } - else { $tag='month-calendar-day-link'; } - $calendar.=qq{ }; - $calendar.= - htmllink($params{page}, $params{destpage}, - pagename($linkcache{"$year/$mtag/$day"}), - 0,0,"$day"); - $calendar.=qq{\n}; - } - else { - if ($day == $today) { $tag='month-calendar-day-this-day'; } - elsif ($day == $future_dom) { $tag='month-calendar-day-future'; } - else { $tag='month-calendar-day-nolink'; } - $calendar.=qq{ $day\n}; - } - } - # finish off the week - for (; $wday != $week_start_day; $wday++, $wday %= 7) { - $calendar.=qq{  \n}; - } - $calendar.=< - -EOF - - return $calendar; -} - -sub format_year (@) { - my %params=@_; - my $pagespec = $params{pages}; - my $year = $params{year}; - my $month = $params{month}; - my $calendar="\n"; - my $pyear = $year - 1; - my $nyear = $year + 1; - my $future_month = 0; - $future_month = $now[4]+1 if ($year == $now[5]+1900); - - # calculate URL's for previous and next years - my $archivebase = 'archives'; - $archivebase = $config{archivebase} if defined $config{archivebase}; - my ($url, $purl, $nurl)=("$year",'',''); - if (exists $cache{$pagespec}{"$year"}) { - $url = htmllink($params{page}, $params{destpage}, - "$archivebase/$year", - 0,0,"$year"); - } - - if (exists $cache{$pagespec}{"$pyear"}) { - $purl = htmllink($params{page}, $params{destpage}, - "$archivebase/$pyear", - 0,0,"\←"); - } - if (exists $cache{$pagespec}{"$nyear"}) { - $nurl = htmllink($params{page}, $params{destpage}, - "$archivebase/$nyear", - 0,0,"\→"); - } - # Start producing the year calendar - $calendar=< - - $purl - $url - $nurl - - - Months - -EOF - - for ($month = 1; $month <= 12; $month++) { - my @day=localtime(timelocal(0,0,0,15,$month-1,$year-1900)); - my $murl; - my $monthname = 
POSIX::strftime("%B", @day); - my $monthabbr = POSIX::strftime("%b", @day); - $calendar.=qq{ \n} if ($month % $params{months_per_row} == 1); - my $tag; - my $mtag=sprintf("%02d", $month); - if ($month == $params{month}) { - if ($cache{$pagespec}{"$year/$mtag"}) {$tag = 'this_month_link'} - else {$tag = 'this_month_nolink'} - } - elsif ($cache{$pagespec}{"$year/$mtag"}) {$tag = 'month_link'} - elsif ($future_month && $month >=$future_month){$tag = 'month_future'} - else {$tag = 'month_nolink'} - if ($cache{$pagespec}{"$year/$mtag"}) { - $murl = htmllink($params{page}, $params{destpage}, - "$archivebase/$year/$mtag", - 0,0,"$monthabbr"); - $calendar.=qq{ }; - $calendar.=$murl; - $calendar.=qq{\n}; - } - else { - $calendar.=qq{ $monthabbr\n}; - } - $calendar.=qq{ \n} if ($month % $params{months_per_row} == 0); - } - $calendar.=< -EOF - - return $calendar; -} - - -sub format (@) { - my %params=@_; - my $content=$params{content}; - return $content unless exists $calpages{$params{page}}; - - # Restore parameters for each invocation - foreach my $index (keys %{$calpages{$params{page}}}) { - my $calendar="\n"; - my %saved = %{$calpages{$params{page}}{$index}}; - my $pagespec=$saved{pages}; - - if (! defined $cache{$pagespec}) { - for my $page (sort keys %pagesources) { - next unless pagespec_match($page,$pagespec); - my $mtime; - my $src = $pagesources{$page}; - if (! 
exists $IkiWiki::pagectime{$page}) { - $mtime=(stat(srcfile($src)))[9]; - } - else { - $mtime=$IkiWiki::pagectime{$page} - } - my @date = localtime($mtime); - my $mday = $date[3]; - my $month = $date[4] + 1; - my $year = $date[5] + 1900; - my $mtag = sprintf("%02d", $month); - $linkcache{"$year/$mtag/$mday"} = "$src"; - $cache{$pagespec}{"$year"}++; - $cache{$pagespec}{"$year/$mtag"}++; - $cache{$pagespec}{"$year/$mtag/$mday"}++; - } - } - # So, we have cached data for the current pagespec at this point - if ($saved{type} =~ /month/i) { - $calendar=format_month(%saved); - } - elsif ($saved{type} =~ /year/i) { - $calendar=format_year(%saved); - } - $content =~ s/(
<div class="calendar">\s*.?\s*$index\b<\/div>)/
$calendar/ms; - } - return $content; -} - - - -=head1 CAVEATS - -In the month calendar, for days in which there is more than one -posting, the link created randomly selects one of them. Since there is -no easy way in B to automatically generate index pages, and -pregenerating daily index pages seems too much of an overhead, we have -to live with this. All postings can still be viewed in the monthly or -annual indices, of course. This can be an issue for very prolific -scriveners. - -=cut - -=head1 BUGS - -None Known so far. - -=head1 BUGS - -Since B eval's the configuration file, the values have to all -on a single physical line. This is the reason we need to use strings -and eval, instead of just passing in real anonymous sub references, -since the eval pass converts the coderef into a string of the form -"(CODE 12de345657)" which can't be dereferenced. - -=cut - -=head1 AUTHOR - -Manoj Srivastava - -=head1 COPYRIGHT AND LICENSE - -This script is a part of the Devotee package, and is - -Copyright (c) 2002 Manoj Srivastava - -This program is free software; you can redistribute it and/or modify -it under the terms of the GNU General Public License as published by -the Free Software Foundation; either version 2 of the License, or -(at your option) any later version. - -This program is distributed in the hope that it will be useful, -but WITHOUT ANY WARRANTY; without even the implied warranty of -MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the -GNU General Public License for more details. - -You should have received a copy of the GNU General Public License -along with this program; if not, write to the Free Software -Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA - -=cut - -1; - -__END__ -
- ------- - -I've been looking over the calendar plugin. Some items: - -* Why did you need to use a two-stage generation with a format hook? - That approach should only be needed if adding something to a page that - would be removed by the htmlscrubber, and as far as I can tell, the - calendars don't involve anything that would be a problem. It seems - that emitting the whole calendar in the preprocess hook would simplify - things and you'd not need to save state about calendars. - -> I am scared of the html scrubber, and have never turned it on, -> and did not look too deeply into what would be scrubbed out --ManojSrivastava ->> Unless you're using javascript, a few annoyances link , or inline ->> css, it's unlikly to object to any html you might write. The list of ->> allowed tags and attributes is easy to find near the top of the plugin. - -> In case the option that gets the ctime of the pages from the -> SCM itself, %IkiWiki::pagectime is not populated that early, -> is it? So I waited until the last possible moment to look at -> the time information. -> ->> Actually, since my big rewrite of the rendering path a few months ago, ->> ikiwiki scans and populates almost all page information before starting ->> to render any page. This includes %pagectime, and even %links. So you ->> shouldn't need to worry about running it late. - -* The way that it defaults to the current year and current month - is a little bit tricky, because of course the wiki might not get - updated in a particular time period, and even if it is updated, only - iff a page containing a calendar is rebuilt for some other reason will - the calendar get updated, and change what year or month it shows. This - is essentially the same problem described in - [[todo/tagging_with_a_publication_date]], - although I don't think it will affect the calendar plugin very badly. - Still, the docs probably need to be clear about this. 
-
-> I use it on the sidebar; and the blog pages are almost always
-> rebuilt, which is where the calendar is looked at most often. Oh,
-> and I also cheat, I have ikiwiki --setup foo as a @daily cronjob, so
-> my wiki is always built daily from scratch.
->
-> I think it should be mentioned, yes.
-
-* There seems to be something a bit wrong with the year-to-year
- navigation in the calendar, based on the example in your blog. If I'm
- on the page for 2006, there's an arrow pointing left which takes me to
- 2005. If I'm on 2005, the arrow points left, but goes to 2006, not
- 2004.
-
-> I need to look into this.
-
-* AIUI, the archivebase setting makes a directory rooted at the top of
- the wiki, so you can have only one set of archives per wiki, in
- /archives/. It would be good if it were possible to have multiple
- archives for different blogs in the same wiki at multiple locations.
- Though since the archives contain calendars, the archive location
- can't just be relative to the page with the calendar. But perhaps
- archivebase could be a configurable parameter that can be specified in
- the directive for the calendar? (It would be fine to keep the global
- location as a default.)
-
-> OK, this is simple enough to implement. I'll do that (well,
-> perhaps not before Xmas, I have a family dinner to cook) and send in
-> another patch.
-
-
-----
-
-And that's all I've heard so far. Hoping I didn't miss another patch?
-
---[[Joey]]
diff --git a/doc/patchqueue/clickable-openid-urls-in-logs.mdwn b/doc/patchqueue/clickable-openid-urls-in-logs.mdwn
deleted file mode 100644
index 997bc7492..000000000
--- a/doc/patchqueue/clickable-openid-urls-in-logs.mdwn
+++ /dev/null
@@ -1,19 +0,0 @@
-OpenID URLs aren't clickable in the ViewVC logs because they're directly followed by a colon. At the expense of, um, proper grammar, here's a patch for SVN. If this is OK, I'll patch the other RCS modules, too.
- -> Reasonable, but probably needs to modify the wiki\_commit\_regexp to -> recognise such commit messages when parsing the logs. Do that and extend -> to the other modules and I'll accept it. --[[Joey]] - -
---- IkiWiki/Rcs/svn.pm  (revision 2650)
-+++ IkiWiki/Rcs/svn.pm  (working copy)
-@@ -71,7 +71,7 @@
-        my $ipaddr=shift;
- 
-        if (defined $user) {
--               $message="web commit by $user".(length $message ? ": $message" : "");
-+               $message="web commit by $user ".(length $message ? ": $message" : "");
-        }
-        elsif (defined $ipaddr) {
-                $message="web commit from $ipaddr".(length $message ? ": $message" : "");
-
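The behavioural change in the patch above is a single extra space before the colon, so a log viewer's URL auto-linker no longer swallows the trailing punctuation when `$user` is an OpenID URL. A standalone sketch of both formats (the helper name is made up for illustration; the real code lives inline in the RCS modules):

```perl
#!/usr/bin/perl
use warnings;
use strict;

# Hypothetical helper mirroring the logic in IkiWiki/Rcs/svn.pm: with
# the patch ($patched true) an OpenID URL used as the user name is
# followed by " :" instead of ":", so auto-linkers don't fold the colon
# into the link.
sub format_commit_message {
	my ($user, $message, $patched) = @_;
	my $sep = $patched ? " " : "";
	return "web commit by $user$sep" .
		(length $message ? ": $message" : "");
}

print format_commit_message("http://joey.example.com/", "fix typo", 0), "\n";
# -> web commit by http://joey.example.com/: fix typo
print format_commit_message("http://joey.example.com/", "fix typo", 1), "\n";
# -> web commit by http://joey.example.com/ : fix typo
```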
diff --git a/doc/patchqueue/darcs.mdwn b/doc/patchqueue/darcs.mdwn
deleted file mode 100644
index 13bd82513..000000000
--- a/doc/patchqueue/darcs.mdwn
+++ /dev/null
@@ -1,464 +0,0 @@
-Here's Thomas Schwinge's unfinished darcs support for ikiwiki.
-
-(Finishing this has been suggested as a [[soc]] project.)
-
-> I haven't been working on this for months and also won't in the near
-> future. Feel free to use what I have done so
-> far and bring it into a usable state! Also, feel free to contact me
-> if there are questions.
-
--- [Thomas Schwinge](mailto:tschwinge@gnu.org)
-
-[[toggle text="show"]]
-[[toggleable text="""
- # Support for the darcs rcs, .
- # Copyright (C) 2006 Thomas Schwinge
- #
- # This program is free software; you can redistribute it and/or modify it
- # under the terms of the GNU General Public License as published by the
- # Free Software Foundation; either version 2 of the License, or (at your
- # option) any later version.
- #
- # This program is distributed in the hope that it will be useful, but
- # WITHOUT ANY WARRANTY; without even the implied warranty of
- # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- # General Public License for more details.
- #
- # You should have received a copy of the GNU General Public License along
- # with this program; if not, write to the Free Software Foundation, Inc.,
- # 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
-
-
- # We're guaranteed to be the only instance of ikiwiki running at a given
- # time. It is essential that only ikiwiki is working on a particular
- # repository. That means one instance of ikiwiki and it also means that
- # you must not `darcs push' into this repository, as this might create
- # race conditions, as I understand it.
-
-
- use warnings;
- use strict;
- use IkiWiki;
-
- package IkiWiki;
-
-
- # Which darcs executable to use.
- my $darcs = ($ENV{DARCS} or 'darcs');
-
-
- # Internal functions.
-
- sub darcs_info ($$$) {
- my $field = shift;
- my $repodir = shift;
- my $file = shift; # Relative to the repodir.
-
- my $child = open(DARCS_CHANGES, "-|");
- if (! $child) {
- exec($darcs, 'changes', '--repo=' . $repodir, '--xml-output', $file) or
- error('failed to run `darcs changes\'');
- }
-
- # Brute force for now. :-/
- while (<DARCS_CHANGES>) {
- last if /^<\/created_as>$/;
- }
- ($_) = <DARCS_CHANGES> =~ /$field=\'([^\']+)/;
- $field eq 'hash' and s/\.gz//; # Strip away the `.gz' from `hash'es.
-
- close(DARCS_CHANGES) or error('`darcs changes\' exited ' . $?);
-
- return $_;
- }
-
-
- # Exported functions.
-
- sub rcs_update () {
- # Not needed.
- }
-
- sub rcs_prepedit ($) {
- # Prepares to edit a file under revision control. Returns a token that
- # must be passed to rcs_commit() when the file is to be committed. For us,
- # this token is the hash value of the latest patch that modifies the file,
- # i.e. something like its current revision. If the file is not yet added
- # to the repository, we return TODO: the empty string.
-
- my $file = shift; # Relative to the repodir.
-
- my $hash = darcs_info('hash', $config{srcdir}, $file);
- return defined $hash ? $hash : "";
- }
-
- sub rcs_commit ($$$) {
- # Commit the page. Returns `undef' on success and a version of the page
- # with conflict markers on failure.
-
- my $file = shift; # Relative to the repodir.
- my $message = shift;
- my $rcstoken = shift;
-
- # Compute if the ``revision'' of $file changed.
- my $changed = darcs_info('hash', $config{srcdir}, $file) ne $rcstoken;
-
- # Yes, the following is a bit convoluted.
- if ($changed) {
- # TODO. Invent a better, non-conflicting name.
- rename("$config{srcdir}/$file", "$config{srcdir}/$file.save") or
- error("failed to rename $file to $file.save: $!");
-
- # Roll the repository back to $rcstoken.
-
- # TODO. Can we be sure that no changes are lost? I think that
- # we can, if we make sure that the `darcs push' below will always
- # succeed.
- - # We need to revert everything as `darcs obliterate' might choke - # otherwise. - # TODO: `yes | ...' needed? Doesn't seem so. - system($darcs, "revert", "--repodir=" . $config{srcdir}, "--all") and - error("`darcs revert' failed"); - # Remove all patches starting at $rcstoken. - # TODO. Something like `yes | darcs obliterate ...' seems to be needed. - system($darcs, "obliterate", "--quiet", "--repodir" . $config{srcdir}, - "--match", "hash " . $rcstoken) and - error("`darcs obliterate' failed"); - # Restore the $rcstoken one. - system($darcs, "pull", "--quiet", "--repodir=" . $config{srcdir}, - "--match", "hash " . $rcstoken, "--all") and - error("`darcs pull' failed"); - - # We're back at $rcstoken. Re-install the modified file. - rename("$config{srcdir}/$file.save", "$config{srcdir}/$file") or - error("failed to rename $file.save to $file: $!"); - } - - # Record the changes. - # TODO: What if $message is empty? - writefile("$file.log", $config{srcdir}, $message); - system($darcs, 'record', '--repodir=' . $config{srcdir}, '--all', - '--logfile=' . "$config{srcdir}/$file.log", - '--author=' . 'web commit ', $file) and - error('`darcs record\' failed'); - - # Update the repository by pulling from the default repository, which is - # master repository. - system($darcs, "pull", "--quiet", "--repodir=" . $config{srcdir}, - "--all") and error("`darcs pull' failed\n"); - - # If this updating yields any conflicts, we'll record them now to resolve - # them. If nothing is recorded, there are no conflicts. - $rcstoken = darcs_info('hash', $config{srcdir}, $file); - # TODO: Use only the first line here, i.e. only the patch name? - writefile("$file.log", $config{srcdir}, 'resolve conflicts: ' . $message); - system($darcs, 'record', '--repodir=' . $config{srcdir}, '--all', - '--logfile=' . "$config{srcdir}/$file.log", - '--author=' . 
'web commit ', $file) and - error('`darcs record\' failed'); - my $conflicts = darcs_info('hash', $config{srcdir}, $file) ne $rcstoken; - unlink("$config{srcdir}/$file.log") or - error("failed to remove `$file.log'"); - - # Push the changes to the main repository. - system($darcs, 'push', '--quiet', '--repodir=' . $config{srcdir}, '--all') - and error('`darcs push\' failed'); - # TODO: darcs send? - - if ($conflicts) { - my $document = readfile("$config{srcdir}/$file"); - # Try to leave everything in a consistent state. - # TODO: `yes | ...' needed? Doesn't seem so. - system($darcs, "revert", "--repodir=" . $config{srcdir}, "--all") and - warn("`darcs revert' failed.\n"); - return $document; - } else { - return undef; - } - } - - sub rcs_add ($) { - my $file = shift; # Relative to the repodir. - - # Intermediate directories will be added automagically. - system($darcs, 'add', '--quiet', '--repodir=' . $config{srcdir}, - '--boring', $file) and error('`darcs add\' failed'); - } - - sub rcs_recentchanges ($) { - warn('rcs_recentchanges() is not implemented'); - return 'rcs_recentchanges() is not implemented'; - } - - sub rcs_notify () { - warn('rcs_notify() is not implemented'); - } - - sub rcs_getctime () { - warn('rcs_getctime() is not implemented'); - } - - 1 -"""]] - -This is my ([bma](bma@bmalee.eu)) darcs.pm - it's messy (my Perl isn't up to much) but seems to work. It uses just one repo, like the mercurial plugin (unlike the above version, which AIUI uses two). - -`rcs_commit()` uses backticks instead of `system()`, to prevent darcs' output being sent to the browser and mucking with the HTTP headers (`darcs record` has no --quiet option). And `rcs_recentchanges()` uses regexes rather than parsing darcs' XML output. 
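The backticks-versus-`system()` point made above can be illustrated in isolation: in a CGI, `system()` lets the child process inherit stdout, so a chatty command like `darcs record` (which has no `--quiet` option) writes in front of the HTTP headers and corrupts the response, while backticks capture the output instead. A minimal sketch, using `echo` as a stand-in for the chatty command:

```perl
#!/usr/bin/perl
use warnings;
use strict;

# With system(), the child inherits stdout and its output goes straight
# to the client, ahead of any headers the CGI has yet to print:
#
#   system("echo", "Finished recording patch");
#
# With backticks (or open '-|'), the output lands in a variable that the
# caller can discard or log instead.
my $output = `echo "Finished recording patch"`;
chomp $output;
print "captured: $output\n";
```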
-
-[[toggle text="show" id="bma"]]
-[[toggleable id="bma" text="""
-
- #!/usr/bin/perl
-
- use warnings;
- use strict;
- use IkiWiki;
- use Date::Parse;
- use open qw{:utf8 :std};
-
- package IkiWiki;
-
- sub rcs_update () { #{{{
- # Do nothing - there's nowhere to update *from*.
- } #}}}
-
- sub rcs_prepedit ($) { #{{{
- } #}}}
-
- sub rcs_commit ($$$;$$) { #{{{
- my ($file, $message, $rcstoken, $user, $ipaddr) = @_;
-
- # $user should probably be a name and an email address, by darcs
- # convention.
- if (defined $user) {
- $user = possibly_foolish_untaint($user);
- }
- elsif (defined $ipaddr) {
- $user = "Anonymous from $ipaddr";
- }
- else {
- $user = "Anonymous";
- }
-
- $message = possibly_foolish_untaint($message);
-
- # BUG: this outputs one line of text, and there's not a -q or --quiet
- # option. Redirecting output to /dev/null works, but I still get the
- # HTTP status and location headers displayed in the browser - is that
- # darcs' fault or ikiwiki's?
- # Doing it in backticks *works*, but I'm sure it could be done better.
- my @cmdline = ("darcs", "record", "--repodir", "$config{srcdir}",
- "-a", "-m", "$message", "--author", "$user", $file);
- `darcs record --repodir "$config{srcdir}" -a -m "$message" --author "$user" $file`; # Return value? Output? Who needs 'em?
- #if (system(@cmdline) != 0) {
- # warn "'@cmdline' failed: $!";
- #}
-
- return undef; # success
- } #}}}
-
- sub rcs_add ($) { # {{{
- my ($file) = @_;
-
- my @cmdline = ("darcs", "add", "--repodir", "$config{srcdir}", "-a", "-q", "$file");
- if (system(@cmdline) != 0) {
- warn "'@cmdline' failed: $!";
- }
- } #}}}
-
- sub rcs_recentchanges ($) { #{{{
- # TODO: This is horrible code. It doesn't work perfectly, and uses regexes
- # rather than parsing Darcs' XML output.
- my $num=shift; - my @ret; - - return unless -d "$config{srcdir}/_darcs"; - - my $changelog = `darcs changes --xml --summary --repodir "$config{srcdir}"`; - $changelog = join("", split(/\s*\n\s*/, $changelog)); - my @changes = split(/<\/patch>.*?(.*?)<\/name>/g; - my @message = {line => $1}; - foreach my $match ($change =~ m/(.*?)<\/comment>/gm) { - push @message, {line => $1}; - } - - my @pages; - foreach my $match ($change =~ m/<.*?_(file|directory)>(.*?)(<(added|removed)_lines.*\/>)*<\/.*?_(file|directory)>/g) { - # My perl-fu is weak. I'm probably going about this all wrong, anyway. - push @pages, {page => pagename($match)} if ( -f $config{srcdir}."/".$match || -d $config{srcdir}."/".$match) and not $match =~ m/^$/; - } - push @ret, { rev => $rev, - user => $user, - committype => $committype, - when => $when, - message => [@message], - pages => [@pages], - } - } - return @ret; - } #}}} - - sub rcs_notify () { #{{{ - # TODO - } #}}} - - sub rcs_getctime ($) { #{{{ - error gettext("getctime not implemented"); - } #}}} - - 1 - - - -"""]] - ---- - -Well, here's my version too. It only does getctime -- using a real XML parser, instead of regexp ugliness -- and maybe recentchanges, but that may be bitrotted, or maybe I never finished it, as I only need the getctime. As for actual commits, I have previously voiced my opinion, that this should be done by the plugin generating a patch bundle, and forwarding it to darcs in some way (`darcs apply` or even email to another host, possibly moderated), instead of the hacky direct modification of a working copy. It could also be faster to getctime in a batch. Just reading in all the changes the first time they're needed, might not be a big improvement in many cases, but if we got a batch request from ikiwiki, we could keep reaing the changes until all the files in this batch request have been met. --[[tuomov]] - -[[toggle text="show" id="tuomov"]] -[[toggleable id="tuomov" text=""" -
-#!/usr/bin/perl
-# Stubs for no revision control.
-
-use warnings;
-use strict;
-use IkiWiki;
-
-package IkiWiki;
-
-sub rcs_update () {
-}
-
-sub rcs_prepedit ($) {
-	return ""
-}
-
-sub rcs_commit ($$$) {
-	return undef # success
-}
-
-sub rcs_add ($) {
-}
-
-sub rcs_recentchanges ($) {
-	my $num=shift;
-	my @ret;
-	
-	eval q{use Date::Parse};
-	eval q{use XML::Simple};
-	
-	my $repodir=$config{srcdir};
-	
-	if (-d "$config{srcdir}/_darcs") {
-		my $child = open(LOG, "-|");
-		if (! $child) {
-			exec("darcs", "changes", "--xml", 
-			     "--repodir", "$repodir",
-			     "--last", "$num")
-			|| error("darcs changes failed to run");
-		}
-		my $data=<LOG>;
-		close LOG;
-		
-		my $log = XMLin($data, ForceArray => 1);
-		
-		foreach my $patch (@{$log->{patch}}) {
-			my $date=$patch->{local_date};
-			my $hash=$patch->{hash};
-			my $when=concise(ago(time - str2time($date)));
-			my @pages;
-			
-			my $child = open(SUMMARY, "-|");
-			if (! $child) {
-				exec("darcs", "annotate", "-s", "--xml", 
-				     "--match", "hash: $hash",
-				     "--repodir", "$repodir")
-				|| error("darcs annotate failed to run");
-			}
-			my $data=<SUMMARY>;
-			close SUMMARY;
-		
-			my $summary = XMLin("$data", ForceArray => 1);
-
-			# TODO: find @pages
-			
-			push @ret, {
-				#rev => $rev,
-				user => $patch->{author},
-				#committype => $committype,
-				when => $when, 
-				#message => [@message],
-				pages => [@pages],
-			}; # if @pages;
-			return @ret if @ret >= $num;
-		}
-	}
-	
-	return @ret;
-}
-
-sub rcs_notify () {
-}
-
-sub rcs_getctime ($) {
-	my $file=shift;
-	
-	eval q{use Date::Parse};
-	eval q{use XML::Simple};
-	local $/=undef;
-	
-	# Sigh... doing things the hard way again
-	my $repodir=$config{srcdir};
-	
-	my $filer=substr($file, length($repodir));
-	$filer =~ s:^[/]+::;
-	
-	my $child = open(LOG, "-|");
-	if (! $child) {
-		exec("darcs", "changes", "--xml", "--reverse",
-		     "--repodir", "$repodir", "$filer")
-		|| error("darcs changes $filer failed to run");
-	}
-	
-	my $data=<LOG>;
-	close LOG;
-	
-	my $log = XMLin($data, ForceArray => 1);
-	
-	my $datestr=$log->{patch}[0]->{local_date};
-	
-	if (! defined $datestr) {
-		warn "failed to get ctime for $filer";
-		return 0;
-	}
-	
-	my $date=str2time($datestr);
-	
-	debug("found ctime ".localtime($date)." for $file");
-	
-	return $date;
-}
-
-1
-
-"""]] diff --git a/doc/patchqueue/datearchives-plugin.mdwn b/doc/patchqueue/datearchives-plugin.mdwn deleted file mode 100644 index b8566f8cf..000000000 --- a/doc/patchqueue/datearchives-plugin.mdwn +++ /dev/null @@ -1,70 +0,0 @@ -I'll be using IkiWiki primarily as a blog, so I want a way to view entries by date. A URL of the form `/date/YYYY/MM/DD.html` (or `/date/YYYY/MM/DD/` when using the `use_dirs` patch) should show posts from that period. ATM, I have this: - -
-Index: IkiWiki/Plugin/datearchives.pm
-===================================================================
---- IkiWiki/Plugin/datearchives.pm      (revision 0)
-+++ IkiWiki/Plugin/datearchives.pm      (revision 0)
-@@ -0,0 +1,31 @@
-+#!/usr/bin/perl
-+
-+package IkiWiki::Plugin::datearchives;
-+
-+use warnings;
-+use strict;
-+use IkiWiki;
-+
-+sub import { #{{{
-+    hook(type => "pagetemplate", id => "datearchives", call => \&pagetemplate, scan => 1);
-+} # }}}
-+
-+sub pagetemplate (@) { #{{{
-+    my %args = @_;
-+    my $dt;
-+    eval {
-+        use DateTime;
-+        $dt = DateTime->from_epoch(epoch => $IkiWiki::pagectime{ $args{page} });
-+    };
-+    return if $@;
-+    my $base = $config{datearchives_base} || 'date';
-+    my $link = $base.'/'.$dt->strftime('%Y/%m/%d');
-+    push @{$links{$args{page}}}, $link;
-+    my $template = $args{template};
-+       if ($template->query(name => "ctime")) {
-+        $template->param(ctime => htmllink( $args{page}, $args{destpage}, $link, 0, 0,
-+                                            $template->param('ctime')));
-+       }
-+} # }}}
-+
-+1
-
-
-This works (although accessing `%IkiWiki::pagectime` is not too clever), but it would be far more useful if the date pages were automatically created and populated with the relevant posts. A [[Pagespec]] works perfectly for displaying the relevant content, but we're still left with the issue of actually creating the page. What's the Right Way to do this? We could create them in the RCS working copy and check them in, or create them directly in the output directory... (I'd also like to create an option for the tags plugin to auto-create its targets in the same way). Any opinions? :-)
-
-> Ok, first, I don't understand what your plugin does. Maybe I need to get
-> some sleep, but a better explanation might help. :-) It seems to make
-> links from pages to the archive pages? But I don't understand why you
-> want such links .. wouldn't a sidebar with links to the available archive
-> pages work better? Or something else, depending on personal preference.
->
-> Secondly, you're certainly not the first to want to do date-based archive
-> pages. So far I have successfully punted the issue of creating these
-> pages out of ikiwiki by pointing out that everyone wants them to be
-> _different_, and suggesting people set up cron jobs or other machinery to
-> generate the kinds of archives that they like. This makes me happy
-> because generalizing all the possible ways people might want to do date
-> based archives and somehow bolting support for creating them onto the
-> side of ikiwiki seems to be a recipe for a mess.
->
-> A few examples of ikiwiki sites with date archives:
-> and
-> --[[Joey]]
-
->> Yeah, it wasn't much of a description, was it? ;-) It's an attempt to emulate the style of Wordpress and other popular blog platforms, which can link a post's creation date to YYYY/MM/DD archive pages, which then list all the relevant posts. My use-case is on a blog page which in-lines (via pagespecs) recent blog posts.
-
->> I agree with not adding this kind of functionality to the core. :-) I simply didn't want to have broken links when I convert to IkiWiki. I guess I'll just play around with the page-creation thing myself then. Feel free to delete this from the queue. :-) --Ben
-
->>> Ah, I get it, I hadn't realized it was making the date into a link.
->>> No reason to delete this from the queue, it's a reasonable plugin. I
->>> might move it to the contributed plugins directory as it's a bit
->>> specialised to be included in ikiwiki though. --[[Joey]]
diff --git a/doc/patchqueue/enable-htaccess-files.mdwn b/doc/patchqueue/enable-htaccess-files.mdwn
deleted file mode 100644
index ed968b195..000000000
--- a/doc/patchqueue/enable-htaccess-files.mdwn
+++ /dev/null
@@ -1,28 +0,0 @@
- Index: IkiWiki.pm
- ===================================================================
- --- IkiWiki.pm (revision 2981)
- +++ IkiWiki.pm (working copy)
- @@ -26,7 +26,7 @@
- memoize("file_pruned");
-
- sub defaultconfig () { #{{{
- - wiki_file_prune_regexps => [qr/\.\./, qr/^\./, qr/\/\./,
- + wiki_file_prune_regexps => [qr/\.\./, qr/^\.(?!htaccess)/, qr/\/\.(?!htaccess)/,
- qr/\.x?html?$/, qr/\.ikiwiki-new$/,
- qr/(^|\/).svn\//, qr/.arch-ids\//, qr/{arch}\//],
- wiki_link_regexp => qr/\[\[(?:([^\]\|]+)\|)?([^\s\]#]+)(?:#([^\s\]]+))?\]\]/,
-
-
-This lets the site administrator have a `.htaccess` file in their underlay
-directory, say, then get it copied over when the wiki is built. Without
-this, installations that are located at the root of a domain don't get the
-benefit of `.htaccess` such as improved directory listings, IP blocking,
-URL rewriting, authorisation, etc.
-
-> I'm concerned about security ramifications of this patch. While ikiwiki
-> won't allow editing such a .htaccess file in the web interface, it would
-> be possible for a user who has svn commit access to the wiki to use it to
-> add a .htaccess file that does $EVIL.
-> -> Perhaps this should be something that is configurable via the setup file -> instead. --[[Joey]] diff --git a/doc/patchqueue/format_escape.mdwn b/doc/patchqueue/format_escape.mdwn deleted file mode 100644 index ba65b7072..000000000 --- a/doc/patchqueue/format_escape.mdwn +++ /dev/null @@ -1,225 +0,0 @@ -Since some preprocessor directives insert raw HTML, it would be good to -specify, per-format, how to pass HTML so that it goes through the format -OK. With Markdown we cross our fingers; with reST we use the "raw" -directive. - -I added an extra named parameter to the htmlize hook, which feels sort of -wrong, since none of the other hooks take parameters. Let me know what -you think. --Ethan - -Seems fairly reasonable, actually. Shouldn't the `$type` come from `$page` -instead of `$destpage` though? Only other obvious change is to make the -escape parameter optional, and only call it if set. --[[Joey]] - -> I couldn't figure out what to make it from, but thinking it through, -> yeah, it should be $page. Revised patch follows. --Ethan - ->> I've updated the patch some more, but I think it's incomplete. ikiwiki ->> emits raw html when expanding WikiLinks too, and it would need to escape ->> those. Assuming that escaping html embedded in the middle of a sentence ->> works.. --[[Joey]] - ->>> Revised again. I get around this by making another hook, htmlescapelink, ->>> which is called to generate links in whatever language. In addition, it ->>> doesn't (can't?) generate ->>> spans, and it doesn't handle inlineable image links. If these were ->>> desired, the approach to take would probably be to use substitution ->>> definitions, which would require generating two bits of code for each ->>> link/html snippet, and putting one at the end of the paragraph (or maybe ->>> the document?). ->>> To specify that (for example) Discussion links are meant to be HTML and ->>> not rst or whatever, I added a "genhtml" parameter to htmllink. 
It seems ->>> to work -- see for an example. ->>> --Ethan - -
-Index: debian/changelog
-===================================================================
---- debian/changelog	(revision 3197)
-+++ debian/changelog	(working copy)
-@@ -24,6 +24,9 @@
-     than just a suggests, since OpenID is enabled by default.
-   * Fix a bug that caused link(foo) to succeed if page foo did not exist.
-   * Fix tags to page names that contain special characters.
-+  * Based on a patch by Ethan, add a new htmlescape hook, that is called
-+    when a preprocssor directive emits inline html. The rst plugin uses this
-+    hook to support inlined raw html.
- 
-   [ Josh Triplett ]
-   * Use pngcrush and optipng on all PNG files.
-Index: IkiWiki/Render.pm
-===================================================================
---- IkiWiki/Render.pm	(revision 3197)
-+++ IkiWiki/Render.pm	(working copy)
-@@ -96,7 +96,7 @@
- 		if ($page !~ /.*\/\Q$discussionlink\E$/ &&
- 		   (length $config{cgiurl} ||
- 		    exists $links{$page."/".$discussionlink})) {
--			$template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1));
-+			$template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1, genhtml => 1));
- 			$actions++;
- 		}
- 	}
-Index: IkiWiki/Plugin/rst.pm
-===================================================================
---- IkiWiki/Plugin/rst.pm	(revision 3197)
-+++ IkiWiki/Plugin/rst.pm	(working copy)
-@@ -30,15 +30,36 @@
- html = publish_string(stdin.read(), writer_name='html', 
-        settings_overrides = { 'halt_level': 6, 
-                               'file_insertion_enabled': 0,
--                              'raw_enabled': 0 }
-+                              'raw_enabled': 1 }
- );
- print html[html.find('')+6:html.find('')].strip();
- ";
- 
- sub import { #{{{
- 	hook(type => "htmlize", id => "rst", call => \&htmlize);
-+	hook(type => "htmlescape", id => "rst", call => \&htmlescape);
-+	hook(type => "htmlescapelink", id => "rst", call => \&htmlescapelink);
- } # }}}
- 
-+sub htmlescapelink ($$;@) { #{{{
-+	my $url = shift;
-+	my $text = shift;
-+	my %params = @_;
-+
-+	if ($params{broken}){
-+		return "`? <$url>`_\ $text";
-+	}
-+	else {
-+		return "`$text <$url>`_";
-+	}
-+} # }}}
-+
-+sub htmlescape ($) { #{{{
-+	my $html=shift;
-+	$html=~s/^/  /mg;
-+	return ".. raw:: html\n\n".$html;
-+} # }}}
-+
- sub htmlize (@) { #{{{
- 	my %params=@_;
- 	my $content=$params{content};
-Index: doc/plugins/write.mdwn
-===================================================================
---- doc/plugins/write.mdwn	(revision 3197)
-+++ doc/plugins/write.mdwn	(working copy)
-@@ -121,6 +121,26 @@
- The function is passed named parameters: "page" and "content" and should
- return the htmlized content.
- 
-+### htmlescape
-+
-+	hook(type => "htmlescape", id => "ext", call => \&htmlescape);
-+
-+Some markup languages do not allow raw html to be mixed in with the markup
-+language, and need it to be escaped in some way. This hook is a companion
-+to the htmlize hook, and is called when ikiwiki detects that a preprocessor
-+directive is inserting raw html. It is passed the chunk of html in
-+question, and should return the escaped chunk.
-+
-+### htmlescapelink
-+
-+	hook(type => "htmlescapelink", id => "ext", call => \&htmlescapelink);
-+
-+Some markup languages have special syntax to link to other pages. This hook
-+is a companion to the htmlize and htmlescape hooks, and it is called when a
-+link is inserted. It is passed the target of the link and the text of the 
-+link, and an optional named parameter "broken" if a broken link is being
-+generated. It should return the correctly-formatted link.
-+
- ### pagetemplate
- 
- 	hook(type => "pagetemplate", id => "foo", call => \&pagetemplate);
-@@ -355,6 +375,7 @@
- * forcesubpage  - set to force a link to a subpage
- * linktext - set to force the link text to something
- * anchor - set to make the link include an anchor
-+* genhtml - set to generate HTML and not escape for correct format
- 
- #### `readfile($;$)`
- 
-Index: doc/plugins/rst.mdwn
-===================================================================
---- doc/plugins/rst.mdwn	(revision 3197)
-+++ doc/plugins/rst.mdwn	(working copy)
-@@ -10,10 +10,8 @@
- Note that this plugin does not interoperate very well with the rest of
- ikiwiki. Limitations include:
- 
--* reStructuredText does not allow raw html to be inserted into
--  documents, but ikiwiki does so in many cases, including
--  [[WikiLinks|WikiLink]] and many
--  [[PreprocessorDirectives|PreprocessorDirective]].
-+* Some bits of ikiwiki may still assume that markdown is used or embed html
-+  in ways that break reStructuredText. (Report bugs if you find any.)
- * It's slow; it forks a copy of python for each page. While there is a
-   perl version of the reStructuredText processor, it is not being kept in
-   sync with the standard version, so is not used.
-Index: IkiWiki.pm
-===================================================================
---- IkiWiki.pm	(revision 3197)
-+++ IkiWiki.pm	(working copy)
-@@ -469,6 +469,10 @@
- 	my $page=shift; # the page that will contain the link (different for inline)
- 	my $link=shift;
- 	my %opts=@_;
-+	# we are processing $lpage and so we need to format things in accordance
-+	# with the formatting language of $lpage. inline generates HTML so links
-+	# will be escaped seperately.
-+	my $type=pagetype($pagesources{$lpage});
- 
- 	my $bestlink;
- 	if (! $opts{forcesubpage}) {
-@@ -494,12 +498,17 @@
- 	}
- 	if (! grep { $_ eq $bestlink } map { @{$_} } values %renderedfiles) {
- 		return $linktext unless length $config{cgiurl};
--		return " "create",
--				page => pagetitle(lc($link), 1),
--				from => $lpage
--			).
-+		my $url = cgiurl(
-+				 do => "create",
-+				 page => pagetitle(lc($link), 1),
-+				 from => $lpage
-+				);
-+
-+		if ($hooks{htmlescapelink}{$type} && ! $opts{genhtml}){
-+			return $hooks{htmlescapelink}{$type}{call}->($url, $linktext,
-+							       broken => 1);
-+		}
-+		return "?$linktext"
- 	}
- 	
-@@ -514,6 +523,9 @@
- 		$bestlink.="#".$opts{anchor};
- 	}
- 
-+	if ($hooks{htmlescapelink}{$type} && !$opts{genhtml}) {
-+	  return $hooks{htmlescapelink}{$type}{call}->($bestlink, $linktext);
-+	}
- 	return "$linktext";
- } #}}}
- 
-@@ -628,6 +640,14 @@
- 				preview => $preprocess_preview,
- 			);
- 			$preprocessing{$page}--;
-+
-+			# Handle escaping html if the htmlizer needs it.
-+			if ($ret =~ /[<>]/ && $pagesources{$page}) {
-+				my $type=pagetype($pagesources{$page});
-+				if ($hooks{htmlescape}{$type}) {
-+					return $hooks{htmlescape}{$type}{call}->($ret);
-+				}
-+			}
- 			return $ret;
- 		}
- 		else {
-
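The heart of the rst `htmlescape` hook in the patch above is a two-line transformation: indent every line of the raw html by two spaces and prepend a reST `raw` directive, whose body must be indented relative to the directive line. Extracted as a standalone function for illustration:

```perl
#!/usr/bin/perl
use warnings;
use strict;

# Same transformation as the htmlescape hook in the patch: reST cannot
# carry inline html directly, so the chunk is wrapped in a "raw"
# directive with its body indented two spaces.
sub rst_escape_html {
	my $html = shift;
	$html =~ s/^/  /mg;              # indent every line two spaces
	return ".. raw:: html\n\n" . $html;
}

print rst_escape_html("<ul>\n<li>one</li>\n</ul>\n");
```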
diff --git a/doc/patchqueue/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn b/doc/patchqueue/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn deleted file mode 100644 index 1efa5361f..000000000 --- a/doc/patchqueue/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn +++ /dev/null @@ -1,92 +0,0 @@ -Hi, - -some operating systems use PREFIX/man instead of PREFIX/share/man as the base -directory for man pages and PREFIX/libexec/ instead of PREFIX/lib/ for files -like CGI programs. -At the moment the location of the installed man pages and the w3m cgi wrapper -is hard-coded in Makefile.PL. -The patch below makes it possible to install those files to alternative directories -while the default stays as it is now. - -> It should be possible to use the existing MakeMaker variables such as -> INSTALLMAN1DIR (though MakeMaker lacks one for man8). I'd prefer not -> adding new variables where MakeMaker already has them. --[[Joey]] - -
-
-  - Introduce two variables, IKI_MANDIR and IKI_W3MCGIDIR, to be set from
-    the command line. This enables locations for man pages and the w3m
-    cgi wrapper other than the hard-coded defaults in Makefile.PL.
-
---- Makefile.PL.orig    2007-05-20 03:03:58.000000000 +0200
-+++ Makefile.PL
-@@ -3,9 +3,32 @@ use warnings;
- use strict;
- use ExtUtils::MakeMaker;
- 
-+my %params = ( 'IKI_MANDIR' => '$(PREFIX)/share/man',
-+               'IKI_W3MCGIDIR' => '$(PREFIX)/lib/w3m/cgi-bin'
-+             );
-+
-+@ARGV = grep {
-+  my ($key, $value) = split(/=/, $_, 2);
-+  if ( exists $params{$key} ) {
-+    $params{$key} = $value;
-+    print "Using $params{$key} for $key.\n";
-+    0
-+  } else {
-+    1
-+  }
-+} @ARGV;
-+
-+
- # Add a few more targets.
- sub MY::postamble {
--q{
-+  package MY;
-+
-+  my $scriptvars = <<"EOSCRIPTVARS";
-+IKI_MANDIR = $params{'IKI_MANDIR'}
-+IKI_W3MCGIDIR = $params{'IKI_W3MCGIDIR'}
-+EOSCRIPTVARS
-+
-+  my $script = q{
- all:: extra_build
- clean:: extra_clean
- install:: extra_install
-@@ -56,23 +79,24 @@ extra_install:
-                done; \
-        done
- 
--       install -d $(DESTDIR)$(PREFIX)/share/man/man1
--       install -m 644 ikiwiki.man $(DESTDIR)$(PREFIX)/share/man/man1/ikiwiki.1
-+       install -d $(DESTDIR)$(IKI_MANDIR)/man1
-+       install -m 644 ikiwiki.man $(DESTDIR)$(IKI_MANDIR)/man1/ikiwiki.1
-        
--       install -d $(DESTDIR)$(PREFIX)/share/man/man8
--       install -m 644 ikiwiki-mass-rebuild.man $(DESTDIR)$(PREFIX)/share/man/man8/ikiwiki-mass-rebuild.8
-+       install -d $(DESTDIR)$(IKI_MANDIR)/man8
-+       install -m 644 ikiwiki-mass-rebuild.man $(DESTDIR)$(IKI_MANDIR)/man8/ikiwiki-mass-rebuild.8
-        
-        install -d $(DESTDIR)$(PREFIX)/sbin
-        install ikiwiki-mass-rebuild $(DESTDIR)$(PREFIX)/sbin
- 
--       install -d $(DESTDIR)$(PREFIX)/lib/w3m/cgi-bin
--       install ikiwiki-w3m.cgi $(DESTDIR)$(PREFIX)/lib/w3m/cgi-bin
-+       install -d $(DESTDIR)$(IKI_W3MCGIDIR)
-+       install ikiwiki-w3m.cgi $(DESTDIR)$(IKI_W3MCGIDIR)
- 
-        install -d $(DESTDIR)$(PREFIX)/bin
-        install ikiwiki.out $(DESTDIR)$(PREFIX)/bin/ikiwiki
- 
-        $(MAKE) -C po install PREFIX=$(PREFIX)
--}
-+};
-+  return $scriptvars.$script;
- }
- 
- WriteMakefile(
-
-
diff --git a/doc/patchqueue/index.html_allowed.mdwn b/doc/patchqueue/index.html_allowed.mdwn deleted file mode 100644 index f8bf15ac4..000000000 --- a/doc/patchqueue/index.html_allowed.mdwn +++ /dev/null @@ -1,104 +0,0 @@ -This page used to be used for two patches, one of which is applied -providing the usedirs option for output. The remaining patch, discussed -below, concerns wanting to use foo/index.mdwn source files and get an -output page name of foo, rather than foo/index. --[[Joey]] - ---- - -I independently implemented a similar, but smaller patch. -(It's smaller because I only care about rendering; not CGI, for example.) -The key to this patch is that "A/B/C" is treated as equivalent -to "A/B/C/index". -Here it is: --Per Bothner - - --- IkiWiki/Render.pm~ 2007-01-11 15:01:51.000000000 -0800 - +++ IkiWiki/Render.pm 2007-02-02 22:24:12.000000000 -0800 - @@ -60,9 +60,9 @@ - foreach my $dir (reverse split("/", $page)) { - if (! $skip) { - $path.="../"; - - unshift @ret, { url => $path.htmlpage($dir), page => pagetitle($dir) }; - + unshift @ret, { url => abs2rel(htmlpage(bestlink($page, $dir)), dirname($page)), page => pagetitle($dir) }; - } - - else { - + elsif ($dir ne "index") { - $skip=0; - } - } - - --- IkiWiki.pm~ 2007-01-12 12:47:09.000000000 -0800 - +++ IkiWiki.pm 2007-02-02 18:02:16.000000000 -0800 - @@ -315,6 +315,12 @@ - elsif (exists $pagecase{lc $l}) { - return $pagecase{lc $l}; - } - + else { - + my $lindex = $l . "/index"; - + if (exists $links{$lindex}) { - + return $lindex; - + } - + } - } while $cwd=~s!/?[^/]+$!!; - - if (length $config{userdir} && exists $links{"$config{userdir}/".lc($link)}) { - -Note I handle setting the url; slightly differently. -Also note that an initial "index" is ignored. I.e. a -page "A/B/index.html" is treated as "A/B". - -> Actually, your patch is shorter because it's more elegant and better :) -> I'm withdrawing my old patch, because yours is much more in line with -> ikiwiki's design and architecture. 
-> I would like to make one suggestion to your patch, which is: - - diff -urX ignorepats clean-ikidev/IkiWiki/Plugin/inline.pm ikidev/IkiWiki/Plugin/inline.pm - --- clean-ikidev/IkiWiki/Plugin/inline.pm 2007-02-25 12:26:54.099113000 -0800 - +++ ikidev/IkiWiki/Plugin/inline.pm 2007-02-25 14:55:21.163340000 -0800 - @@ -154,7 +154,7 @@ - $link=htmlpage($link) if defined $type; - $link=abs2rel($link, dirname($params{destpage})); - $template->param(pageurl => $link); - - $template->param(title => pagetitle(basename($page))); - + $template->param(title => titlename($page)); - $template->param(ctime => displaytime($pagectime{$page})); - - if ($actions) { - @@ -318,7 +318,7 @@ - my $pcontent = absolute_urls(get_inline_content($p, $page), $url); - - $itemtemplate->param( - - title => pagetitle(basename($p), 1), - + title => titlename($p, 1), - url => $u, - permalink => $u, - date_822 => date_822($pagectime{$p}), - diff -urX ignorepats clean-ikidev/IkiWiki/Render.pm ikidev/IkiWiki/Render.pm - --- clean-ikidev/IkiWiki/Render.pm 2007-02-25 12:26:54.745833000 -0800 - +++ ikidev/IkiWiki/Render.pm 2007-02-25 14:54:01.564715000 -0800 - @@ -110,7 +110,7 @@ - $template->param( - title => $page eq 'index' - ? $config{wikiname} - - : pagetitle(basename($page)), - + : titlename($page), - wikiname => $config{wikiname}, - parentlinks => [parentlinks($page)], - content => $content, - diff -urX ignorepats clean-ikidev/IkiWiki.pm ikidev/IkiWiki.pm - --- clean-ikidev/IkiWiki.pm 2007-02-25 12:26:58.812850000 -0800 - +++ ikidev/IkiWiki.pm 2007-02-25 15:05:22.328852000 -0800 - @@ -192,6 +192,12 @@ - return $untainted; - } #}}} - - +sub titlename($;@) { #{{{ - + my $page = shift; - + $page =~ s!/index$!!; - + return pagetitle(basename($page), @_); - +} #}}} - + - sub basename ($) { #{{{ - my $file=shift; - - -> This way foo/index gets "foo" as its title, not "index". 
--Ethan diff --git a/doc/patchqueue/l10n.mdwn b/doc/patchqueue/l10n.mdwn deleted file mode 100644 index 3369bec11..000000000 --- a/doc/patchqueue/l10n.mdwn +++ /dev/null @@ -1,61 +0,0 @@ -From [[Recai]]: -> Here is my initial work on ikiwiki l10n infrastructure (I'm sending it -> before finalizing, there may be errors). - -I've revised the patches (tested OK): - -- $config{lang} patch: - - - - + Support for CGI::FormBuilder. - + Modify Makefile.PL for l10n. - -- l10n infrastructure from Koha project. (This patch must be applied with - '-p1', also, it needs a 'chmod +x l10n/*.pl' after patching.) - - + Leave templates dir untouched, use a temporary translations directory - instead. - + Fix Makefile (it failed to update templates). - - http://people.debian.org/~roktas/patches/ikiwiki/ikiwiki-l10n.diff - -However... - -> fine. Also a final note, I haven't examined the quality of generated -> templates yet. - -Looks like, tmpl_process3 cannot preserve line breaks in template files. -For example, it processed the following template: - - Someone[1], possibly you, requested that you be emailed the password for -user - on [2]. - - The password is: - - -- - ikiwiki - - [1] The user requesting the password was at IP address - [2] Located at - -as (in Turkish): - -Birisi[1], ki muhtemelen bu sizsiniz, [2] üzerindeki - kullanıcısına ait parolanın epostalanması isteğinde -bulundu. Parola: -- ikiwiki [1] Parolayı isteyen -kullanıcının ait IP adresi: [2] - -> Looks like, tmpl_process3 cannot preserve line breaks in template files. -> For example, it processed the following template: - -This could be easily worked around in tmpl_process3, but I wouldn't like to -maintain a separate utility. - ----- - -As to the hardcoded strings in ikiwiki, I've internationalized the program, -and there is a po/ikiwiki.pot in the source that can be translated. 
---[[Joey]] diff --git a/doc/patchqueue/more_class__61____34____34___for_css.mdwn b/doc/patchqueue/more_class__61____34____34___for_css.mdwn deleted file mode 100644 index 49affd29b..000000000 --- a/doc/patchqueue/more_class__61____34____34___for_css.mdwn +++ /dev/null @@ -1,59 +0,0 @@ -I'm writing my own CSS for ikiwiki. During this effort I often found the need of adding more class="" attributes to the default ikiwiki templates. This way more presentational aspects of visual formatting can be delegated to CSS and removed from the HTML structure. - -In this patch I plan to collect changes in this direction. - -The first, one-liner, patch is to use a "div" element with a -class="actions" attribute for inline page as is done with non-inlined page. -This way the same CSS formatting can be applied to div.actions in the CSS, -while at the moment it must be duplicated for a span.actions (which I -believe is also incorrect, since it will contain a "ul" element, not sure -though). In case the markup should be differentiated it will still be -possible relying on the fact that a div.actions is contained or not in a -div.inlinepage. - -Here's the one-liner: - -> applied --[[Joey]] - -The following adds a div element with class="trailer" around the meta-information -added after an inlined page (namely: the post date, the tags, and the actions): - - --- inlinepage.tmpl.orig 2006-12-28 16:56:49.000000000 +0100 - +++ inlinepage.tmpl 2006-12-28 17:02:06.000000000 +0100 - @@ -17,6 +17,8 @@ - - - - +
- + - - Posted - - @@ -44,3 +46,5 @@ - - -
- + - + - -> Unfortunately, the inlinepage content passes through markdown, and markdown -> gets confused by these nested div's and puts p's around one of them, generating -> broken html. If you can come up with a way to put in the div that passes -> the test suite, or a fix to markdown, I will accept it, but the above patch -> fails the test suite. --[[Joey]] - ->> Just a note... This discrepancy doesn't exist in [pandoc](http://code.google.com/p/pandoc/) as ->> demonstrated in the relevant [page](http://code.google.com/p/pandoc/wiki/PandocVsMarkdownPl). ->> Pandoc is a _real parser_ for markdown (contrasting the regexp based implementation of ->> markdown.pl). I've almost finished the Debian packaging. John is working on a `--strict` mode ->> which will hopefully make pandoc a drop-in replacement for markdown. I'll upload pandoc after ->> his work has finished. Whether it could be used in IkiWiki is an open question, but having ->> alternatives is always a good thing and perhaps, the fact that pandoc can make markdown->LaTeX ->> conversion may lead to new possibilities. --[[Roktas]] - ->>> I confirm that this ([[debbug 405058]]) has just been fixed in markdown ->>> [`1.0.2b7`](http://packages.debian.org/experimental/web/markdown) (BTW, thanks to your bug ->>> report Joey). FYI, I've observed some performance drop with `1.0.2b7` compared to `1.0.1`, ->>> especially noticable with big files. This was also confirmed by someone else, for example, ->>> see this [thread](http://six.pairlist.net/pipermail/markdown-discuss/2006-August/000152.html) ->>> --[[Roktas]] \ No newline at end of file diff --git a/doc/patchqueue/move_page.mdwn b/doc/patchqueue/move_page.mdwn deleted file mode 100644 index 21be9ba6b..000000000 --- a/doc/patchqueue/move_page.mdwn +++ /dev/null @@ -1,274 +0,0 @@ -This is my second cut at a feature like that requested in [[todo/Moving_Pages]]. -It can also be found [here](http://ikidev.betacantrips.com/patches/move.patch). 
- -A few shortcomings exist: - -* No precautions whatsoever are made to protect against race conditions or failures - in the rcs\_move function. I didn't even do the `cgi_editpage` thing where I hold - the lock and render afterwards (mostly because the copy I was editing was not - up-to-date enough to have that code). Although FAILED_SAVE is in movepage.tmpl, - no code activates it yet. -* Some code is duplicated between cgi\_movepage and cgi\_editpage, as well - as rcs\_commit and rcs\_move. -* The user interface is pretty lame. I couldn't figure out a good way to let - the user specify which directory to move things to without implementing a - FileChooser thing. -* No redirect pages like those mentioned on [[todo/Moving_Pages]] exist yet, - so none are created. -* I added a Move link to page.tmpl but it may belong better someplace else -- - maybe editpage.tmpl? Not sure. -* from is redundant with page so far -- but since the Move links could someday - come from someplace other than the page itself I kept it around. -* If I move foo.mdwn to bar.mdwn, foo/* should move too, probably. - -> Looks like a good start, although I agree about many of the points above, -> and also feel that something needs to be done about rcses that don't -> implement a move operation -- falling back to an add and delete. -> --[[Joey]] - -Hmm. Shouldn't that be done on a by-RCS basis, though? (i.e. implemented -by backends in the `rcs_move` function) - -> Probably, yes, but maybe there's a way to avoid duplicating code for that -> in several of them. - -Also, how should ikiwiki react if a page is edited (say, by another user) -before it is moved? Bail, or shrug and proceed? - -> The important thing is to keep in mind that the page could be edited, -> moved, deleted, etc in between the user starting the move and the move -> happening. So, the code really needs to deal with all of these cases in -> some way. It seems fine to me to go ahead with the move even if the page -> was edited. 
If the page was deleted or moved, it seems reasonable to exit -> with an error. - - diff -urNX ignorepats ikiwiki/IkiWiki/CGI.pm ikidev/IkiWiki/CGI.pm - --- ikiwiki/IkiWiki/CGI.pm 2007-02-14 18:17:12.000000000 -0800 - +++ ikidev/IkiWiki/CGI.pm 2007-02-22 18:54:23.194982000 -0800 - @@ -561,6 +561,106 @@ - } - } #}}} - - +sub cgi_movepage($$) { - + my $q = shift; - + my $session = shift; - + eval q{use CGI::FormBuilder}; - + error($@) if $@; - + my @fields=qw(do from rcsinfo page newdir newname comments); - + my @buttons=("Rename Page", "Cancel"); - + - + my $form = CGI::FormBuilder->new( - + fields => \@fields, - + header => 1, - + charset => "utf-8", - + method => 'POST', - + action => $config{cgiurl}, - + template => (-e "$config{templatedir}/movepage.tmpl" ? - + {template_params("movepage.tmpl")} : ""), - + ); - + run_hooks(formbuilder_setup => sub { - + shift->(form => $form, cgi => $q, session => $session); - + }); - + - + decode_form_utf8($form); - + - + # This untaint is safe because if the page doesn't exist, bail. - + my $page = $form->field('page'); - + $page = possibly_foolish_untaint($page); - + if (! exists $pagesources{$page}) { - + error("page does not exist"); - + } - + my $file=$pagesources{$page}; - + my $type=pagetype($file); - + - + my $from; - + if (defined $form->field('from')) { - + ($from)=$form->field('from')=~/$config{wiki_file_regexp}/; - + } - + - + $form->field(name => "do", type => 'hidden'); - + $form->field(name => "from", type => 'hidden'); - + $form->field(name => "rcsinfo", type => 'hidden'); - + $form->field(name => "newdir", type => 'text', size => 80); - + $form->field(name => "page", value => $page, force => 1); - + $form->field(name => "newname", type => "text", size => 80); - + $form->field(name => "comments", type => "text", size => 80); - + $form->tmpl_param("can_commit", $config{rcs}); - + $form->tmpl_param("indexlink", indexlink()); - + $form->tmpl_param("baseurl", baseurl()); - + - + if (! 
$form->submitted) { - + $form->field(name => "rcsinfo", value => rcs_prepedit($file), - + force => 1); - + } - + - + if ($form->submitted eq "Cancel") { - + redirect($q, "$config{url}/".htmlpage($page)); - + return; - + } - + - + if (! $form->submitted || ! $form->validate) { - + check_canedit($page, $q, $session); - + $form->tmpl_param("page_select", 0); - + $form->field(name => "page", type => 'hidden'); - + $form->field(name => "type", type => 'hidden'); - + $form->title(sprintf(gettext("moving %s"), pagetitle($page))); - + my $pname = basename($page); - + my $dname = dirname($page); - + if (! defined $form->field('newname') || - + ! length $form->field('newname')) { - + $form->field(name => "newname", - + value => pagetitle($pname, 1), force => 1); - + } - + if (! defined $form->field('newdir') || - + ! length $form->field('newdir')) { - + $form->field(name => "newdir", - + value => pagetitle($dname, 1), force => 1); - + } - + print $form->render(submit => \@buttons); - + } - + else{ - + # This untaint is safe because titlepage removes any problematic - + # characters. - + my ($newname)=$form->field('newname'); - + $newname=titlepage(possibly_foolish_untaint($newname)); - + my ($newdir)=$form->field('newdir'); - + $newdir=titlepage(possibly_foolish_untaint($newdir)); - + if (! defined $newname || ! length $newname || file_pruned($newname, $config{srcdir}) || $newname=~/^\//) { - + error("bad page name"); - + } - + check_canedit($page, $q, $session); - + - + my $newpage = ($newdir?"$newdir/":"") . $newname; - + my $newfile = $newpage . 
".$type"; - + my $message = $form->field('comments'); - + unlockwiki(); - + rcs_move($file, $newfile, $message, $form->field("rcsinfo"), - + $session->param("name"), $ENV{REMOTE_ADDR}); - + redirect($q, "$config{url}/".htmlpage($newpage)); - + } - +} - + - sub cgi_getsession ($) { #{{{ - my $q=shift; - - @@ -656,6 +756,9 @@ - elsif (defined $session->param("postsignin")) { - cgi_postsignin($q, $session); - } - + elsif ($do eq 'move') { - + cgi_movepage($q, $session); - + } - elsif ($do eq 'prefs') { - cgi_prefs($q, $session); - } - diff -urNX ignorepats ikiwiki/IkiWiki/Rcs/svn.pm ikidev/IkiWiki/Rcs/svn.pm - --- ikiwiki/IkiWiki/Rcs/svn.pm 2007-01-27 16:04:48.000000000 -0800 - +++ ikidev/IkiWiki/Rcs/svn.pm 2007-02-22 01:51:29.923626000 -0800 - @@ -60,6 +60,34 @@ - } - } #}}} - - +sub rcs_move ($$$$;$$) { - + my $file=shift; - + my $newname=shift; - + my $message=shift; - + my $rcstoken=shift; - + my $user=shift; - + my $ipaddr=shift; - + if (defined $user) { - + $message="web commit by $user".(length $message ? ": $message" : ""); - + } - + elsif (defined $ipaddr) { - + $message="web commit from $ipaddr".(length $message ? ": $message" : ""); - + } - + - + chdir($config{srcdir}); # svn merge wants to be here - + - + if (system("svn", "move", "--quiet", - + "$file", "$newname") != 0) { - + return 1; - + } - + if (system("svn", "commit", "--quiet", - + "--encoding", "UTF-8", "-m", - + possibly_foolish_untaint($message)) != 0) { - + return 1; - + } - + return undef # success - +} - + - sub rcs_commit ($$$;$$) { #{{{ - # Tries to commit the page; returns undef on _success_ and - # a version of the page with the rcs's conflict markers on failure. 
- diff -urNX ignorepats ikiwiki/IkiWiki/Render.pm ikidev/IkiWiki/Render.pm - --- ikiwiki/IkiWiki/Render.pm 2007-02-14 17:00:05.000000000 -0800 - +++ ikidev/IkiWiki/Render.pm 2007-02-22 18:30:00.451755000 -0800 - @@ -80,6 +80,7 @@ - - if (length $config{cgiurl}) { - $template->param(editurl => cgiurl(do => "edit", page => $page)); - + $template->param(moveurl => cgiurl(do => "move", page => $page)); - $template->param(prefsurl => cgiurl(do => "prefs")); - if ($config{rcs}) { - $template->param(recentchangesurl => cgiurl(do => "recentchanges")); - diff -urNX ignorepats ikiwiki/templates/movepage.tmpl ikidev/templates/movepage.tmpl - --- ikiwiki/templates/movepage.tmpl 1969-12-31 16:00:00.000000000 -0800 - +++ ikidev/templates/movepage.tmpl 2007-02-22 18:40:39.751763000 -0800 - @@ -0,0 +1,44 @@ - + - + - + - + - + - +<TMPL_VAR FORM-TITLE> - + - + - + - + - + - + - + - + - +

- +Failed to save your changes. - +

- +

- +Your changes were not able to be saved to disk. The system gave the error: - +

- + - +
- +Your changes are preserved below, and you can try again to save them. - +

- +
- + - +
- +/ - +
- + - + - + - + - +New location: / - +
- + - +Optional comment about this change:
- +
- +
- + - + - + - + - diff -urNX ignorepats ikiwiki/templates/page.tmpl ikidev/templates/page.tmpl - --- ikiwiki/templates/page.tmpl 2006-12-28 12:27:01.000000000 -0800 - +++ ikidev/templates/page.tmpl 2007-02-22 01:52:33.078464000 -0800 - @@ -32,6 +32,9 @@ - -
  • Edit
  • -
    - + - +
  • Move
  • - +
    - -
  • RecentChanges
  • -
    diff --git a/doc/patchqueue/rcs___40__third-party_plugin__41__.mdwn b/doc/patchqueue/rcs___40__third-party_plugin__41__.mdwn deleted file mode 100644 index 977022cf8..000000000 --- a/doc/patchqueue/rcs___40__third-party_plugin__41__.mdwn +++ /dev/null @@ -1,23 +0,0 @@ -Here is a beginning of a rcs plugin that uses rcsmerge, rcs, ci, co and rlog. -I have used it probably over hundred times but needs some work. - - - -> Clearly needs some cleanup and perhaps some of the missing stubs -> implemented, before it can be included into ikiwiki. -> -> Notes on individual functions: -> -> * rcs_prepedit - I'm not sure why you do the locking since the comment -> notes that the locking does no good.. -> -> * rcs_getctime - You ask why this would be better than mtime. It's -> because with something like subversion, a file's modification time or -> ctime is not necessarily accurate WRT when the file was first checked -> into the repo. -> ---[[Joey]] - -Also here is a quick script to browse the RCS history to use for "historyurl". - - diff --git a/doc/patchqueue/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn b/doc/patchqueue/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn deleted file mode 100644 index 41f92d554..000000000 --- a/doc/patchqueue/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn +++ /dev/null @@ -1,233 +0,0 @@ -varioki - Add variables for use in ikiwiki templates - -This plugin attempts to provide a means to add templates for use in ikiwiki templates, based on a hash variable set in the ikiwiki configuration file. The motivation for this plugin was to provide an easy way for end users to add information to be used in templates -- for example, my "Blosxom" blog entry template does fancy things with the date components of the entry, and there was no easy way to get that information into the template. 
Or if one wants to have a different page template for the top level index page than for the rest of the pages inthe wiki (for example, to only put special content, like, say, 'last.fm" play lists, only on the front page). - -This plugin hooks itsef into the "pagetemplate" hook, and adds parameters to the appropriate templates based on the type. For example, the following inserted into "ikiwiki.setup" creates "TMPL_VAR MOTTO" and "TOPLVL" which can then be used in your templates. - - varioki => { - ’motto’ => ’"Manoj\’s musings"’, - ’toplvl’ => ’sub {return $page eq "index"}’ - }, - -For every key in the configured hash, the corresponding value is evaluated. Based on whether the value was a stringified scalar, code, array, or hash, the value of the template parameter is generated on the fly. The available variables are whatever is available to "pagetemplate" hook scripts, namely, $page, $destpage, and $template. Additionally, the global variables and functions as defined in the Ikiwiki documentation () may be used. - -ManojSrivastava - -> I think you could now implement "toplvl" using [[conditionals|/plugins/conditional]]: -> -> \[[if test="destpage(/index)" then="""...""" else="""..."""]] -> -> --[[JoshTriplett]] - -> Here's a dump of the file Manoj sent me, for reference. -> -> My take on this is that simple plugins can do the same sort of things, this is -> kind of wanting to avoid the plugin mechanism and just use templates and -> stuff in the config file. Not too thrilled about that. --[[Joey]] - ----- - -
    -* looking for srivasta@debian.org--2006-misc/ikiwiki--upstream--1.0--patch-488 to compare with
    -* comparing to srivasta@debian.org--2006-misc/ikiwiki--upstream--1.0--patch-488: ................................................................ done.
    -
    -* added files
    -
    ---- /dev/null
    -+++ mod/IkiWiki/Plugin/.arch-ids/varioki.pm.id
    -@@ -0,0 +1 @@
    -+Manoj Srivastava  Thu Dec  7 12:59:07 2006 12659.0
    ---- /dev/null
    -+++ mod/IkiWiki/Plugin/varioki.pm
    -@@ -0,0 +1,190 @@
    -+#!/usr/bin/perl
    -+#                              -*- Mode: Cperl -*- 
    -+# varioki.pm --- 
    -+# Author           : Manoj Srivastava ( srivasta@glaurung.internal.golden-gryphon.com ) 
    -+# Created On       : Wed Dec  6 22:25:44 2006
    -+# Created On Node  : glaurung.internal.golden-gryphon.com
    -+# Last Modified By : Manoj Srivastava
    -+# Last Modified On : Thu Dec  7 13:07:36 2006
    -+# Last Machine Used: glaurung.internal.golden-gryphon.com
    -+# Update Count     : 127
    -+# Status           : Unknown, Use with caution!
    -+# HISTORY          : 
    -+# Description      : 
    -+# 
    -+# arch-tag: 6961717b-156f-4ab2-980f-0d6a973aea21
    -+#
    -+# Copyright (c) 2006 Manoj Srivastava 
    -+#
    -+# This program is free software; you can redistribute it and/or modify
    -+# it under the terms of the GNU General Public License as published by
    -+# the Free Software Foundation; either version 2 of the License, or
    -+# (at your option) any later version.
    -+#
    -+# This program is distributed in the hope that it will be useful,
    -+# but WITHOUT ANY WARRANTY; without even the implied warranty of
    -+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    -+# GNU General Public License for more details.
    -+#
    -+# You should have received a copy of the GNU General Public License
    -+# along with this program; if not, write to the Free Software
    -+# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
    -+#
    -+
    -+require 5.002;
    -+
    -+package IkiWiki::Plugin::varioki;
    -+
    -+use warnings;
    -+use strict;
    -+use IkiWiki '1.00';
    -+
    -+our $VERSION = "0.1";
    -+my $file = __FILE__;
    -+
    -+
    -+=head1 NAME
    -+
    -+varioki - Add variables for use in ikiwiki templates
    -+
    -+=cut
    -+
    -+=head1 DESCRIPTION
    -+
    -+This plugin attempts to provide a means to add templates for use in
    -+ikiwiki templates, based on a hash variable set in the ikiwiki
    -+configuration file. The motivation for this plugin was to provide an
    -+easy way for end users to add information to be used in templates --
-+for example, my C<Blosxom> blog entry template does fancy things with
    -+the date components of the entry, and there was no easy way to get
    -+that information into the template. Or if one wants to have a
    -+different page template for the top level index page than for the rest
    -+of the pages in the wiki (for example, to only put special content,
-+like, say, C<last.fm> play lists, only on the front page).
    -+
-+This plugin hooks itself into the C<pagetemplate> hook, and adds
    -+parameters to the appropriate templates based on the type. For
-+example, the following inserted into C<ikiwiki.setup> creates
-+C<motto>, C<toplvl>, C<arrayvar> and C<hashvar> which can
    -+then be used in your templates. The array and hash variables are only
    -+for completeness; I suspect that the first two forms are all that are
    -+really required.
    -+
    -+ varioki => {
    -+   'motto'    => '"Manoj\'s musings"',
    -+   'toplvl'   => 'sub {return $page eq "index"}',
    -+   'arrayvar' => '[0, 1, 2, 3]',
    -+   'hashvar'  => '{1, 1, 2, 2}'
    -+ },
    -+
    -+Please note that the values in the hash must be simple strings which
    -+are then eval'd, so a string value has to be double quoted, as above
    -+(the eval strips off the outer quotes).  
    -+
    -+=cut
    -+
    -+
    -+sub import { #{{{
    -+	hook(type => "pagetemplate", id => "varioki", call => \&pagetemplate);
    -+} # }}}
    -+
    -+
    -+=pod
    -+
    -+For every key in the configured hash, the corresponding value is
    -+evaluated.  Based on whether the value was a stringified scalar, code,
    -+array, or hash, the value of the template parameter is generated on
    -+the fly.  The available variables are whatever is available to
-+C<pagetemplate> hook scripts, namely, C<$page>, C<$destpage>, and
    -+C<$template>.  Additionally, the global variables and functions as
    -+defined in the Ikiwiki documentation
    -+(L) may be used.
    -+
    -+=cut
    -+
    -+sub pagetemplate (@) { #{{{
    -+	my %params=@_;
    -+	my $page=$params{page};
    -+	my $template=$params{template};
    -+        
    -+        return unless defined $config{varioki};
    -+         for my $var (keys %{$config{varioki}}) {
    -+           my $value;
    -+           my $foo;
    -+           eval "\$foo=$config{varioki}{$var}";
    -+           if (ref($foo) eq "CODE") {
    -+             $value = $foo->();
    -+           }
    -+           elsif (ref($foo) eq "SCALAR") {
    -+             $value = $foo;
    -+           }
    -+           elsif (ref($foo) eq "ARRAY") {
    -+             $value = join ' ', @$foo;
    -+           }
    -+           elsif (ref($foo) eq "HASH") {
    -+             for my $i (values %$foo ) {
    -+               $value .= ' ' . "$i";
    -+             }
    -+           }
    -+           else {
    -+             $value = $foo;
    -+           }
    -+           warn "$page $var $value\n";
    -+           if ($template->query(name => "$var")) {
    -+             $template->param("$var" =>"$value");
    -+           }
    -+        }
    -+} # }}}
    -+
    -+1;
    -+
    -+=head1 CAVEATS
    -+
    -+This is very inchoate, at the moment, and needs testing. Also, there
    -+is no good way to determine how to handle hashes as values --
    -+currently, the code just joins all hash values with spaces, but it
    -+would be easier for the user to just use an anonymous sub instead of
    -+passing in a hash or an array.
    -+
    -+=cut
    -+
    -+=head1 BUGS
    -+
-+Since C<ikiwiki> evals the configuration file, the values all have to be
    -+on a single physical line. This is the reason we need to use strings
    -+and eval, instead of just passing in real anonymous sub references,
    -+since the eval pass converts the coderef into a string of the form
    -+"(CODE 12de345657)" which can't be dereferenced.
    -+
    -+=cut
    -+
    -+=head1 AUTHOR
    -+
    -+Manoj Srivastava 
    -+
    -+=head1 COPYRIGHT AND LICENSE
    -+
    -+This script is a part of the Devotee package, and is 
    -+
    -+Copyright (c) 2002 Manoj Srivastava 
    -+
    -+This program is free software; you can redistribute it and/or modify
    -+it under the terms of the GNU General Public License as published by
    -+the Free Software Foundation; either version 2 of the License, or
    -+(at your option) any later version.
    -+
    -+This program is distributed in the hope that it will be useful,
    -+but WITHOUT ANY WARRANTY; without even the implied warranty of
    -+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    -+GNU General Public License for more details.
    -+
    -+You should have received a copy of the GNU General Public License
    -+along with this program; if not, write to the Free Software
    -+Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
    -+
    -+=cut
    -+
    -+1;
    -+
    -+__END__
    -+
    -
    diff --git a/doc/patchqueue/various_fixes.mdwn b/doc/patchqueue/various_fixes.mdwn deleted file mode 100644 index 318e7e941..000000000 --- a/doc/patchqueue/various_fixes.mdwn +++ /dev/null @@ -1,172 +0,0 @@ -Sorry if you don't appreciate me lumping all of these patches together. - -These are various fixes I had to make when installing Ikiwiki. Some are -due it being a non-Debian system, the others are actual bugs. - - --- upstream/IkiWiki/Rcs/svn.pm 2006-09-16 01:11:55.000000000 +0100 - +++ main/IkiWiki/Rcs/svn.pm 2006-09-16 01:12:50.000000000 +0100 - @@ -34,7 +34,7 @@ - my $field=shift; - my $file=shift; - - - my $info=`LANG=C svn info $file`; - + my $info=`svn info $file`; - my ($ret)=$info=~/^$field: (.*)$/m; - return $ret; - } #}}} - @@ -140,7 +140,7 @@ - if $svn_version =~ /\d\.(\d)\.\d/ && $1 >= 2; - - my $svn_url=svn_info("URL", $config{srcdir}); - - my $xml = XMLin(scalar `svn $svn_limit --xml -v log '$svn_url'`, - + my $xml = XMLin(scalar `svn $svn_limit --xml -v log '$svn_url' --config-dir /tmp`, - ForceArray => [ 'logentry', 'path' ], - GroupTags => { paths => 'path' }, - KeyAttr => { path => 'content' }, - -The first hunk of this patch is strange. It just failed to work with this -in place, and it took me a long time to figure it out. I realise what you -are trying to do, and it still works here as I use a similar LANG anyway. - -For reference svn version 1.3.1 (r19032), my $LANG=en_GB.utf8, but I'm not -sure what the CGI was running under. - -> That's strange. Is the problem to do with setting LANG=C or to do -> with the way it's set and exported on the same line as the svn info call? -> Can you reproduce the problem running svn info outside of ikiwiki? -> --[[Joey]] - ->> I've now managed to reproduce the problem. I'll try and give some more information. 
->> When going to the Recent Changes link I get - - [Sat Sep 16 15:16:08 2006] [error] [client xxxx] svn: Can't check path '/home/jw2328/.subversion': Permission denied, referer: http://xxxxx/test/sandbox.html - [Sat Sep 16 15:16:08 2006] [error] [client xxxx] Use of uninitialized value in concatenation (.) or string at /usr/lib/perl5/site_perl/5.8.3/IkiWiki/Rcs/svn.pm line 145., referer: http://xxxx/test/sandbox.html - [Sat Sep 16 15:16:08 2006] [error] [client xxxxx] svn: Can't check path '/home/jw2328/.subversion': Permission denied, referer: http://xxxx/test/sandbox.html - [Sat Sep 16 15:16:09 2006] [error] [client xxxx] File does not exist: at /usr/lib/perl5/site_perl/5.8.3/IkiWiki/Rcs/svn.pm line 145, referer: http://xxxx/test/sandbox.html - [Sat Sep 16 15:16:09 2006] [error] [client xxxx] Premature end of script headers: ikitest, referer: http://xxxx/test/sandbox.html - ->> which the $svn_url is causing the uninitialised value, due to the ->> LANG=C it seems, as if I remove it it goes away. ->> The file does not exist is due to the unreadable .subversion. ->> echoing the LANG before it is set shows that the variable is normally ->> empty for the user that is running it. - -The second removes problems with cannot access /home/$user/.svnsomething in -the logs. I think this problem was also fatal (I should have reported these -sooner). - -I can try and debug these problems if you could suggest some way to do so, -but I am probably losing the server in a couple of days, so I can't be of too -much help I'm afraid. - -> I imagine that passing --config-dir /tmp would be either insecure or -> would limit ikiwiki use to one user per machine. -> `--config-dir /etc/subversion` might be better, though still a hack, -> since a user's real ~/.subversion might be needed to access some repos. -> -> Maybe I didn't notice this problem since I have a ~/.subversion -> everywhere that I've used ikiwiki. 
Hmm, no, I don't reproduce it, svn -> happily recreated the directory during an ikiwiki run after I moved it -> out of the way here. Maybe an issue with old versions of svn? Although -> AFIACR, svn has auto-created ~/.subversion for years. -> -> What's the error message? --[[Joey]] - ->> `svn: Can't check path '/home/jw2328/.subversion': Permission denied,` ->> where jw2328 is my usual user. ->> I have restrictive permissions of 0700 on home dirs on the server, ->> and the CGI is running as uid apache, euid root. (Not my setup anymore). ->> The way I had it set up, was jw2328 owning thesource dir, and the svn repo, ->> with g+sw on them both. I ran sudo ikiwiki --setup though, as I was reluctant ->> to adjust permissions on my cgi-dir. This seems to be the root of the ->> problem. - ->>> Ah, I think it's better to keep the permissions of the repository ->>> and source directory sane (755) and make the cgi suid to your user, ->>> which is how it's designed to work. - ->>>> I realise that now, and I now have a much more sane setup that works. - ----- - - --- IkiWiki.pm - +++ IkiWiki.pm - @@ -734,7 +734,18 @@ - my $page=shift; - my $spec=shift; - - - return eval pagespec_translate($spec); - + my $pagespec = pagespec_translate($spec); - + - + my $newpagespec; - + - + local($1); - + if ($pagespec =~ /(.*)/) { - + $newpagespec = $1; - + } else { - + die "oh"; - + } - + - + return eval $newpagespec; - } #}}} - - sub match_glob ($$) { #{{{ - -This works around a silly, but extremely annoying, taint bug in older -versions of perl. I'm not sure of the details, but it means that either -values become tainted from nowhere, or they don't get untainted possibly. -This also affects backports to sarge. `"oh"` is not going to be very -informative if that code path is ever taken, but I hope that it never is. - -> You're not the first person to report a problem here with older versions -> of perl and pagespec tainting. 
I suspect that this would work around it: - return eval possibly_foolish_untaint(pagespec_translate($spec)); -> I'm _very_ uncomfortable putting that in the shipping version of ikiwiki, -> because pagespecs are potentially _insanely_ dangerous, given how they're -> evaled and all. The tainting is the only sanity check there is that -> `pagespec_translate` manages to clean up any possibly harmful perl code -> in a pagespec. It's good to have belt and suspenders here. -> -> For all I know, older versions of perl are keeping it tainted because -> `pagespec_translate` is somehow broken under old versions of perl and is -> in fact not fully untainting the pagespec. Ok, probably not, it's more -> likely that some of the regexps in there don't manage to clear the taint -> flag with old versions of perl, while still doing a perfectly ok job of -> sanitising the pagespec. -> -> I suppose that the version of perl ($^V) could be checked and the untaint -> only be called for the old version. Though it seems it would be better -> to try to debug this some first. Maybe instrumenting `pagespec_translate` -> with calls to Scalar::Utils's tainted() function and seeing which parts -> of pagespecs arn't getting untainted would be a good start. -> -> --[[Joey]] - ->> It seems like it is always the (with instrumentation) - - elsif ($word =~ /^(link|backlink|created_before|created_after|creation_month|creation_year|creation_day)\((.+)\)$/) { - warn("\$1 tainted=".tainted($1).", \$2 tainted=".tainted($2)." \$code tainted=".tainted($code)); - $code.=" match_$1(\$page, ".safequote($2).")"; - warn("\$1 tainted=".tainted($1).", \$2 tainted=".tainted($2)." \$code tainted=".tainted($code)); - warn("safequote tainted=".tainted(safequote($2))); - } - ->> bit that causes it. With the following trace: - - $1 tainted=0, $2 tainted=0 $code tainted=0 at IkiWiki.pm line 718. - $1 tainted=0, $2 tainted=0 $code tainted=1 at IkiWiki.pm line 720. - safequote tainted=0 at IkiWiki.pm line 721. 
- ->> which shows that `$code` appears to become tainted from nowhere. ->> ->> is what pointed me to find the problem/workaround. - ->>> Given that verification, an untaint contingent on the value of $^V ->>> sounds reasonable and I'd accept such a patch. I'm not quite sure which ->>> version(s) of perl it should check for. - ->>>> I'm not going to write one though. I don't know what versions either, ->>>> but I think the evil of the special case is too much in this case. If ->>>> you are happy to insist on a newer version of perl then I will leave ->>>> it at that and sort something out locally. If you want the patch I will ->>>> code it though, as I realise you may want to support sarge installs. diff --git a/doc/plugins/orphans.mdwn b/doc/plugins/orphans.mdwn index a78016f98..895fd18f6 100644 --- a/doc/plugins/orphans.mdwn +++ b/doc/plugins/orphans.mdwn @@ -14,5 +14,6 @@ orphans. [[if test="enabled(orphans)" then=""" Here's a list of orphaned pages on this wiki: -[[orphans pages="* and !news/* and !todo/* and !bugs/* and !patchqueue/* and !users/* and !examples/*"]] +[[orphans pages="* and !news/* and !todo/* and !bugs/* and !users/* and +!examples/* and !tips/*"]] """]] diff --git a/doc/roadmap/discussion.mdwn b/doc/roadmap/discussion.mdwn index b9ad5d10e..0b69867bf 100644 --- a/doc/roadmap/discussion.mdwn +++ b/doc/roadmap/discussion.mdwn @@ -12,7 +12,7 @@ them to be absolute, but I definitely remember tripping over absolute pagespecs a few times when I was just starting out. Thus I think we've learned to accept it as natural, where a new user wouldn't. -* bugs, todo, news, blog, users, sandbox, and patchqueue +* bugs, todo, news, blog, users, and sandbox are all at "toplevel", so they are equivalent whether pagespecs are absolute or relative. * soc doesn't refer to any pages explicitly so it doesn't matter @@ -28,4 +28,4 @@ learned to accept it as natural, where a new user wouldn't. 
right now) Maybe inline should use relative pagespecs by default, and other plugins -don't? --Ethan \ No newline at end of file +don't? --Ethan diff --git a/doc/sitemap.mdwn b/doc/sitemap.mdwn index 8255c8df2..939f20a74 100644 --- a/doc/sitemap.mdwn +++ b/doc/sitemap.mdwn @@ -1,5 +1,5 @@ This map excludes discussion pages, as well as subpages that are in feeds. [[map pages="* and !*/discussion -and !bugs/* and !examples/*/* and !news/* and !patchqueue/* and !plugins/* and !sandbox/* and !todo/* and !users/* +and !bugs/* and !examples/*/* and !news/* and !tips/* and !plugins/* and !sandbox/* and !todo/* and !users/* and !*.css and !*.ico and !*.png and !*.svgz and !*.gif"]] diff --git a/doc/soc.mdwn b/doc/soc.mdwn index 2ed1305b6..f82dbdc89 100644 --- a/doc/soc.mdwn +++ b/doc/soc.mdwn @@ -14,10 +14,10 @@ the following projects will be worked on: (See [[todo/fileupload/soc-proposal]]) * Wiki WYSIWYG Editor by [[TaylorKillian]] - (See [[patchqueue/Wikiwyg_Plugin_for_IkiWiki]]) + (See [[todo/wikiwyg]]) * Creating a gallery of a bunch of images by [[ArpitJain]] - (See [[patchqueue/Gallery_Plugin_for_Ikiwiki]]) + (See [[todo/Gallery]]) Students working on these projects are encouraged to add links to any blogs, patch repositories, etc that they will use. diff --git a/doc/todo.mdwn b/doc/todo.mdwn index 01c4cf158..71c01d63c 100644 --- a/doc/todo.mdwn +++ b/doc/todo.mdwn @@ -3,6 +3,6 @@ Welcome to ikiwiki's todo list. Link items to [[todo/done]] when done. Also see the [[wishlist]] and [[bugs]] pages. 
    [[inline pages="todo/* and !todo/done and !link(todo/done) and -!link(wishlist) and !todo/*/*" +!link(patch) and !link(wishlist) and !todo/*/*" feedpages="created_after(todo/supporting_comments_via_disussion_pages)" actions=yes rootpage="todo" postformtext="Add a new todo item titled:" show=0]] diff --git a/doc/todo/Gallery.mdwn b/doc/todo/Gallery.mdwn index ef4c55d74..a53b77d82 100644 --- a/doc/todo/Gallery.mdwn +++ b/doc/todo/Gallery.mdwn @@ -20,4 +20,35 @@ That's one way to do it, and it has some nice benefits, like being able to edit --[[Joey]] -[[tag soc]] \ No newline at end of file +[[tag soc]] + +[[tag wishlist]] + +---- + +I have implemented the first version of the Gallery Plugin for Ikiwiki as part of [[soc]]. This plugin creates a nice-looking gallery of images once the directory containing them is specified, along with some additional parameters. It is built on top of the img plugin. + +The plugin can be downloaded from [here](http://myweb.unomaha.edu/~ajain/gallery.tar). + +It can be used as:
    
    +\[[gallery imagedir="images" thumbnailsize="200x200" cols="3" alt="Can not be displayed" title="My Pictures"]] + +where-
    +* imagedir => Directory containing images. It will scan all the files with a jpg|png|gif extension from the directory and will put them in the gallery.
    
    +* thumbnailsize(optional) => Size of the thumbnail that you want to generate for the gallery.
    +* alt(optional) => If the image cannot be displayed, the text contained in the alt argument will be displayed instead.
    
    +* cols(optional) => Number of columns of thumbnails that you want to generate.
    +* title(optional) => Title of the gallery.
    + +Features of the Gallery Plugin:
    +* You can go to the next image by clicking on the right side of the image or by pressing 'n'.
    
    +* Similarly, you can go to the previous image by clicking on the left side of the image or by pressing 'p'.
    
    +* Press esc to close the gallery.
    +* While viewing an image, nearby images are preloaded in the background to make browsing faster.
    
    +Right now, it features only one template, namely [Lightbox](http://www.hudddletogether.com). Later on, I will add a few more templates.
    
    +For any feedback or query, feel free to mail me at arpitjain11 [AT] gmail.com + +Additional details are available [here](http://myweb.unomaha.edu/~ajain/ikiwikigallery.html). + +[[tag patch]] diff --git a/doc/todo/Moving_Pages.mdwn b/doc/todo/Moving_Pages.mdwn index 62dda204a..2e0603ca7 100644 --- a/doc/todo/Moving_Pages.mdwn +++ b/doc/todo/Moving_Pages.mdwn @@ -35,4 +35,281 @@ Brad > to edit the new page, only the call to redirect. > --Ethan -Note that there is a partial implementation in the [[patchqueoe|patchqueue/move_page]]. +----- + +[[tag patch]] + +This is my second cut at a feature like that requested here. +It can also be found [here](http://ikidev.betacantrips.com/patches/move.patch). + +A few shortcomings exist: + +* No precautions whatsoever are made to protect against race conditions or failures + in the rcs\_move function. I didn't even do the `cgi_editpage` thing where I hold + the lock and render afterwards (mostly because the copy I was editing was not + up-to-date enough to have that code). Although FAILED_SAVE is in movepage.tmpl, + no code activates it yet. +* Some code is duplicated between cgi\_movepage and cgi\_editpage, as well + as rcs\_commit and rcs\_move. +* The user interface is pretty lame. I couldn't figure out a good way to let + the user specify which directory to move things to without implementing a + FileChooser thing. +* No redirect pages like those mentioned on [[todo/Moving_Pages]] exist yet, + so none are created. +* I added a Move link to page.tmpl but it may belong better someplace else -- + maybe editpage.tmpl? Not sure. +* from is redundant with page so far -- but since the Move links could someday + come from someplace other than the page itself I kept it around. +* If I move foo.mdwn to bar.mdwn, foo/* should move too, probably. 
+ +> Looks like a good start, although I agree about many of the points above, +> and also feel that something needs to be done about rcses that don't +> implement a move operation -- falling back to an add and delete. +> --[[Joey]] + +Hmm. Shouldn't that be done on a by-RCS basis, though? (i.e. implemented +by backends in the `rcs_move` function) + +> Probably, yes, but maybe there's a way to avoid duplicating code for that +> in several of them. + +Also, how should ikiwiki react if a page is edited (say, by another user) +before it is moved? Bail, or shrug and proceed? + +> The important thing is to keep in mind that the page could be edited, +> moved, deleted, etc in between the user starting the move and the move +> happening. So, the code really needs to deal with all of these cases in +> some way. It seems fine to me to go ahead with the move even if the page +> was edited. If the page was deleted or moved, it seems reasonable to exit +> with an error. + + diff -urNX ignorepats ikiwiki/IkiWiki/CGI.pm ikidev/IkiWiki/CGI.pm + --- ikiwiki/IkiWiki/CGI.pm 2007-02-14 18:17:12.000000000 -0800 + +++ ikidev/IkiWiki/CGI.pm 2007-02-22 18:54:23.194982000 -0800 + @@ -561,6 +561,106 @@ + } + } #}}} + + +sub cgi_movepage($$) { + + my $q = shift; + + my $session = shift; + + eval q{use CGI::FormBuilder}; + + error($@) if $@; + + my @fields=qw(do from rcsinfo page newdir newname comments); + + my @buttons=("Rename Page", "Cancel"); + + + + my $form = CGI::FormBuilder->new( + + fields => \@fields, + + header => 1, + + charset => "utf-8", + + method => 'POST', + + action => $config{cgiurl}, + + template => (-e "$config{templatedir}/movepage.tmpl" ? + + {template_params("movepage.tmpl")} : ""), + + ); + + run_hooks(formbuilder_setup => sub { + + shift->(form => $form, cgi => $q, session => $session); + + }); + + + + decode_form_utf8($form); + + + + # This untaint is safe because if the page doesn't exist, bail. 
+ + my $page = $form->field('page'); + + $page = possibly_foolish_untaint($page); + + if (! exists $pagesources{$page}) { + + error("page does not exist"); + + } + + my $file=$pagesources{$page}; + + my $type=pagetype($file); + + + + my $from; + + if (defined $form->field('from')) { + + ($from)=$form->field('from')=~/$config{wiki_file_regexp}/; + + } + + + + $form->field(name => "do", type => 'hidden'); + + $form->field(name => "from", type => 'hidden'); + + $form->field(name => "rcsinfo", type => 'hidden'); + + $form->field(name => "newdir", type => 'text', size => 80); + + $form->field(name => "page", value => $page, force => 1); + + $form->field(name => "newname", type => "text", size => 80); + + $form->field(name => "comments", type => "text", size => 80); + + $form->tmpl_param("can_commit", $config{rcs}); + + $form->tmpl_param("indexlink", indexlink()); + + $form->tmpl_param("baseurl", baseurl()); + + + + if (! $form->submitted) { + + $form->field(name => "rcsinfo", value => rcs_prepedit($file), + + force => 1); + + } + + + + if ($form->submitted eq "Cancel") { + + redirect($q, "$config{url}/".htmlpage($page)); + + return; + + } + + + + if (! $form->submitted || ! $form->validate) { + + check_canedit($page, $q, $session); + + $form->tmpl_param("page_select", 0); + + $form->field(name => "page", type => 'hidden'); + + $form->field(name => "type", type => 'hidden'); + + $form->title(sprintf(gettext("moving %s"), pagetitle($page))); + + my $pname = basename($page); + + my $dname = dirname($page); + + if (! defined $form->field('newname') || + + ! length $form->field('newname')) { + + $form->field(name => "newname", + + value => pagetitle($pname, 1), force => 1); + + } + + if (! defined $form->field('newdir') || + + ! 
length $form->field('newdir')) { + + $form->field(name => "newdir", + + value => pagetitle($dname, 1), force => 1); + + } + + print $form->render(submit => \@buttons); + + } + + else{ + + # This untaint is safe because titlepage removes any problematic + + # characters. + + my ($newname)=$form->field('newname'); + + $newname=titlepage(possibly_foolish_untaint($newname)); + + my ($newdir)=$form->field('newdir'); + + $newdir=titlepage(possibly_foolish_untaint($newdir)); + + if (! defined $newname || ! length $newname || file_pruned($newname, $config{srcdir}) || $newname=~/^\//) { + + error("bad page name"); + + } + + check_canedit($page, $q, $session); + + + + my $newpage = ($newdir?"$newdir/":"") . $newname; + + my $newfile = $newpage . ".$type"; + + my $message = $form->field('comments'); + + unlockwiki(); + + rcs_move($file, $newfile, $message, $form->field("rcsinfo"), + + $session->param("name"), $ENV{REMOTE_ADDR}); + + redirect($q, "$config{url}/".htmlpage($newpage)); + + } + +} + + + sub cgi_getsession ($) { #{{{ + my $q=shift; + + @@ -656,6 +756,9 @@ + elsif (defined $session->param("postsignin")) { + cgi_postsignin($q, $session); + } + + elsif ($do eq 'move') { + + cgi_movepage($q, $session); + + } + elsif ($do eq 'prefs') { + cgi_prefs($q, $session); + } + diff -urNX ignorepats ikiwiki/IkiWiki/Rcs/svn.pm ikidev/IkiWiki/Rcs/svn.pm + --- ikiwiki/IkiWiki/Rcs/svn.pm 2007-01-27 16:04:48.000000000 -0800 + +++ ikidev/IkiWiki/Rcs/svn.pm 2007-02-22 01:51:29.923626000 -0800 + @@ -60,6 +60,34 @@ + } + } #}}} + + +sub rcs_move ($$$$;$$) { + + my $file=shift; + + my $newname=shift; + + my $message=shift; + + my $rcstoken=shift; + + my $user=shift; + + my $ipaddr=shift; + + if (defined $user) { + + $message="web commit by $user".(length $message ? ": $message" : ""); + + } + + elsif (defined $ipaddr) { + + $message="web commit from $ipaddr".(length $message ? 
": $message" : ""); + + } + + + + chdir($config{srcdir}); # svn merge wants to be here + + + + if (system("svn", "move", "--quiet", + + "$file", "$newname") != 0) { + + return 1; + + } + + if (system("svn", "commit", "--quiet", + + "--encoding", "UTF-8", "-m", + + possibly_foolish_untaint($message)) != 0) { + + return 1; + + } + + return undef # success + +} + + + sub rcs_commit ($$$;$$) { #{{{ + # Tries to commit the page; returns undef on _success_ and + # a version of the page with the rcs's conflict markers on failure. + diff -urNX ignorepats ikiwiki/IkiWiki/Render.pm ikidev/IkiWiki/Render.pm + --- ikiwiki/IkiWiki/Render.pm 2007-02-14 17:00:05.000000000 -0800 + +++ ikidev/IkiWiki/Render.pm 2007-02-22 18:30:00.451755000 -0800 + @@ -80,6 +80,7 @@ + + if (length $config{cgiurl}) { + $template->param(editurl => cgiurl(do => "edit", page => $page)); + + $template->param(moveurl => cgiurl(do => "move", page => $page)); + $template->param(prefsurl => cgiurl(do => "prefs")); + if ($config{rcs}) { + $template->param(recentchangesurl => cgiurl(do => "recentchanges")); + diff -urNX ignorepats ikiwiki/templates/movepage.tmpl ikidev/templates/movepage.tmpl + --- ikiwiki/templates/movepage.tmpl 1969-12-31 16:00:00.000000000 -0800 + +++ ikidev/templates/movepage.tmpl 2007-02-22 18:40:39.751763000 -0800 + @@ -0,0 +1,44 @@ + + + + + + + + + + + +<TMPL_VAR FORM-TITLE> + + + + + + + + + + + + + + + + + +

    + +Failed to save your changes. + +

    + +

    + +Your changes were not able to be saved to disk. The system gave the error: + +

    + + + +
    + +Your changes are preserved below, and you can try again to save them. + +

    + +
    + + + +
    + +/ + +
    + + + + + + + + + +New location: / + +
    + + + +Optional comment about this change:
    + +
    + +
    + + + + + + + + + diff -urNX ignorepats ikiwiki/templates/page.tmpl ikidev/templates/page.tmpl + --- ikiwiki/templates/page.tmpl 2006-12-28 12:27:01.000000000 -0800 + +++ ikidev/templates/page.tmpl 2007-02-22 01:52:33.078464000 -0800 + @@ -32,6 +32,9 @@ + +
  • Edit
  • +
    + + + +
  • Move
  • + +
    + +
  • RecentChanges
  • +
    diff --git a/doc/todo/Wrapper_config_with_multiline_regexp.mdwn b/doc/todo/Wrapper_config_with_multiline_regexp.mdwn new file mode 100644 index 000000000..c0311bc92 --- /dev/null +++ b/doc/todo/Wrapper_config_with_multiline_regexp.mdwn @@ -0,0 +1,36 @@ +Turning the wikilink regexp into an extended regexp on the svn trunk seems +to have broken the setuid wrapper on my system, because of two reasons: +First, the wrapper generator should turn each newline in $configstring into +`\n` in the C code rather than `\` followed by a newline in the C code. +Second, the untainting of $configstring should allow newlines. + +> Both of these problems were already dealt with in commit r3714, on June +> 3rd. Confused why you're posting patches for them now. [[done]] --[[Joey]] + + Modified: wiki-meta/perl/IkiWiki.pm + ============================================================================== + --- wiki-meta/perl/IkiWiki.pm (original) + +++ wiki-meta/perl/IkiWiki.pm Mon Jun 11 10:52:07 2007 + @@ -205,7 +205,7 @@ + + sub possibly_foolish_untaint ($) { #{{{ + my $tainted=shift; + - my ($untainted)=$tainted=~/(.*)/; + + my ($untainted)=$tainted=~/(.*)/s; + return $untainted; + } #}}} + + + Modified: wiki-meta/perl/IkiWiki/Wrapper.pm + ============================================================================== + --- wiki-meta/perl/IkiWiki/Wrapper.pm (original) + +++ wiki-meta/perl/IkiWiki/Wrapper.pm Mon Jun 11 10:52:07 2007 + @@ -62,7 +62,7 @@ + } + $configstring=~s/\\/\\\\/g; + $configstring=~s/"/\\"/g; + - $configstring=~s/\n/\\\n/g; + + $configstring=~s/\n/\\n/g; + + #translators: The first parameter is a filename, and the second is + #translators: a (probably not translated) error message. 
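    The behaviour the Wrapper.pm fix above addresses can be seen in isolation. Here is a minimal standalone sketch (not part of the patch; function names are illustrative) of why the untainting regexp needs the `/s` modifier when the string spans multiple lines:

    ```perl
    #!/usr/bin/perl
    # Sketch: without /s, "." in the untainting regexp stops at the first
    # newline, silently truncating a multi-line config string. With /s it
    # matches the whole string, as in the patched possibly_foolish_untaint.
    use strict;
    use warnings;

    sub untaint_oneline   { my ($u) = $_[0] =~ /(.*)/;  return $u; }
    sub untaint_multiline { my ($u) = $_[0] =~ /(.*)/s; return $u; }

    my $configstring = "line1\nline2\n";
    print length(untaint_oneline($configstring)), "\n";   # 5: truncated
    print length(untaint_multiline($configstring)), "\n"; # 12: intact
    ```

    Under real taint mode the successful capture is also what clears the taint flag; the truncation behaviour shown here is the same either way.
    
    
    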
diff --git a/doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn b/doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn new file mode 100644 index 000000000..d52772a67 --- /dev/null +++ b/doc/todo/calendar_--_archive_browsing_via_a_calendar_frontend.mdwn @@ -0,0 +1,667 @@ +I am serving notice that I am starting work on a calendar plugin inspired by Blosxom's calendar plugin. The current plan is to create a plugin that looks through all the source files matching a certain pagespec, and optionally spit out a month view for the specified month (default to current), or spit out a year view for a given year (defaulting to the current year), of a list of year with posts in them. The output would be a table, with the same CSS directives that the Blosxom plugin used to use (so that I can just reuse my css file). The links would be created to a $config{archivedir}/$year or $config{archivedir}/$year-$month file, which can just have + + \[[inline pages="blog/* and !*/Discussion and creation_year($year) and creation_month($month)" rss="no" atom="no" show="0"]] + +or some thing to generate a archive of postings. + +Roland Mas suggested a separate cron job to generate these archive indices automatically, but that is another thread. + +ManojSrivastava + +This plugin is inspired by the calendar plugin for Blosxom, but derivesno code from it. This plugin is essentially a fancy front end to archives of previous pages, usually used for blogs. It can produce a calendar for a given month, or a list of months for a given year. To invoke the calendar, just use the preprocessor directive: + + \[[calendar ]] + +or + + \[[calendar type="month" pages="blog/* and !*/Discussion"]] + +or + + \[[calendar type="year" year="2005" pages="blog/* and !*/Discussion"]] + + +The year and month entities in the out put have links to archive index pages, which are supposed to exist already. 
The idea is to create an archives hierarchy, rooted in the subdirectory specified in the site-wide customization variable, archivebase. archivebase defaults to "archives". Links are created to pages "$archivebase/$year" and "$archivebase/$year/$month". The idea is to create annual and monthly indices, for example, by using something like this sample from my archives/2006/01.mdwn + + \[[meta title="Archives for 2006/01"]] + \[[inline rootpage="blog" atom="no" rss="no" show="0" pages="blog/* and !*/Discussion and creation_year(2006) and creation_month(01)" ]] + +I'll send in the patch via email. + +ManojSrivastava + +------ + +Since this is a little bit er, stalled, I'll post here the stuff Manoj +mailed me, and my response to it. --[[Joey]] + +[[tag patch]] + +
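    The archive link scheme described above can be sketched as a pair of small helpers (hypothetical standalone code, assuming the documented default `archivebase` of "archives" and the zero-padded month format the plugin uses):

    ```perl
    #!/usr/bin/perl
    # Sketch of the archive page naming scheme: annual indices live at
    # $archivebase/$year, monthly archive pages at $archivebase/$year/$month
    # with the month zero-padded. "archives" is the documented default.
    use strict;
    use warnings;

    my $archivebase = 'archives';

    sub year_archive_page  { my ($y) = @_; return "$archivebase/$y"; }
    sub month_archive_page {
        my ($y, $m) = @_;
        return "$archivebase/$y/" . sprintf("%02d", $m);
    }

    print year_archive_page(2006), "\n";      # archives/2006
    print month_archive_page(2006, 1), "\n";  # archives/2006/01
    ```

    The page named by `month_archive_page` is where the sample `\[[inline ...]]` directive above would live.
    
    
    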
    +#! /usr/bin/perl
    +#                              -*- Mode: Cperl -*- 
    +# calendar.pm --- 
    +# Author           : Manoj Srivastava ( srivasta@glaurung.internal.golden-gryphon.com ) 
    +# Created On       : Fri Dec  8 16:05:48 2006
    +# Created On Node  : glaurung.internal.golden-gryphon.com
    +# Last Modified By : Manoj Srivastava
    +# Last Modified On : Sun Dec 10 01:53:22 2006
    +# Last Machine Used: glaurung.internal.golden-gryphon.com
    +# Update Count     : 139
    +# Status           : Unknown, Use with caution!
    +# HISTORY          : 
    +# Description      : 
    +# 
    +# arch-tag: 2aa737c7-3d62-4918-aaeb-fd85b4b1384c
    +#
    +# Copyright (c) 2006 Manoj Srivastava 
    +#
    +# This program is free software; you can redistribute it and/or modify
    +# it under the terms of the GNU General Public License as published by
    +# the Free Software Foundation; either version 2 of the License, or
    +# (at your option) any later version.
    +#
    +# This program is distributed in the hope that it will be useful,
    +# but WITHOUT ANY WARRANTY; without even the implied warranty of
    +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    +# GNU General Public License for more details.
    +#
    +# You should have received a copy of the GNU General Public License
    +# along with this program; if not, write to the Free Software
    +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
    +#
    +
    +require 5.002;
    +package IkiWiki::Plugin::calendar;
    +
    +use warnings;
    +use strict;
    +use IkiWiki '1.00';
    +use Time::Local;
    +
    +our $VERSION = "0.1";
    +my $file = __FILE__;
    +
    +my %calpages;
    +my %cache;
    +my %linkcache;
    +
    +my $index=1;
    +my @now=localtime();
    +
    +=head1 NAME
    +
    +calendar - Add links for the current month's, current year's, and older archived postings
    +
    +=cut
    +
    +=head1 SYNOPSIS
    +
    +To invoke the calendar, just use the preprocessor directive (options
    +and variations are detailed below):
    +
    +  [[calendar ]]
    +
    +or
    +
    +  [[calendar type="month" pages="blog/* and !*/Discussion"]]
    +
    +or
    +
    +  [[calendar type="year"  year="2005" pages="blog/* and !*/Discussion"]]
    +
    +=cut
    +
    +
    +=head1 DESCRIPTION
    +
    +This plugin is inspired by the calendar plugin for Blosxom, but
    +derives no code from it. This plugin is essentially a fancy front end
    +to archives of previous pages, usually used for blogs. It can produce
    +a calendar for a given month, or a list of months for a given year. 
    +
    +The year and month entities in the output have links to archive index
    +pages, which are supposed to exist already. The idea is to create an
    +archives hierarchy, rooted in the subdirectory specified in the
    +site-wide customization variable, I<archivebase>. I<archivebase>
    +defaults to C<archives>.  Links are created to pages
    +C<$archivebase/$year> and C<$archivebase/$year/$month>. If one creates
    +annual and monthly indices, for example, by using something like this
    +sample from my I<archives/2006/01.mdwn> (warning: line split for
    
    +readability):
    +
    +   \[[meta title="Archives for 2006/01"]]
    +   \[[inline rootpage="blog" atom="no" rss="no" show="0"
    +     pages="blog/* and !*/Discussion and creation_year(2006)
    +            and creation_month(01)"
    +   ]]
    +
    +=cut
    +
    +=head1 OPTIONS
    +
    +=over
    +
    +=item B<type>
    +
    +Used to specify the type of calendar wanted. Can be one of C<month> or
    +C<year>. The default is a month view calendar.
    
    +
    +=item B<pages>
    +
    +Specifies the C<PageSpec> used to get pages to match for
    +linking. Usually this should be something like C<blog/* and !*/Discussion>.
    +Defaults to C<*>.
    
    +
    +=item B<year>
    
    +
    +The year for which the calendar is requested. Defaults to the current year.
    +
    +=item B<month>
    
    +
    +The numeric month for which the calendar is requested, in the range
    +1..12. Used only for the month view calendar, and defaults to the
    +current month.
    +
    +=item B<week_start_day>
    
    +
    +A number, in the range 0..6, which represents the day of the week that
    +the month calendar starts with. 0 is Sunday, 1 is Monday, and so
    +on. Defaults to 0, which is Sunday.
    +
    +=item B<months_per_row>
    
    +
    +In the annual calendar, number of months to place in each row. Defaults to 3.
    +
    +=back
    +
    +=cut
    +
    +=head1 Classes for CSS control
    +
    +The output is liberally sprinkled with classes, for fine grained CSS
    +customization.
    +
    +=over
    +
    +=item C<month-calendar>
    
    +
    +The month calendar as a whole
    +
    +=item C
    +
    +The head of the month calendar (ie, "March"), localized to the environment.
    
    +
    +=item C
    +
    +A column head in the month calendar (ie, a day-of-week abbreviation),
    +localized.
    +
    +=item C, C,
    +  C, C,
    +  C 
    +
    +The day squares on the month calendar, for days that don't exist
    +(before or after the month itself), that don't have stories, that do
    +have stories, that are in the future, or that are currently selected
    +(today), respectively.
    
    +
    +=item Day-of-week-name
    +
    +Each day square is also given a class matching its day of week; this
    +can be used to highlight weekends. This is also localized.
    
    +
    +=item C
    +
    +The year calendar as a whole
    +
    +=item C
    +
    +The head of the year calendar (ie, "2006")
    +
    +=item C
    +
    +For example, "Months"
    +
    +=item C, C,
    +  C, C
    +
    +The month squares on the year calendar, for months with stories,
    +without, in the future, and currently selected, respectively.
    +
    +=back
    +
    +=cut
    +
    +
    +sub import {
    +  hook(type => "preprocess", id => "calendar", call => \&preprocess);
    +  hook(type => "format", id => "calendar", call => \&format);
    +}
    +
    +sub preprocess (@) {
    +  my %params=@_;
    +  $params{pages} = "*"            unless defined $params{pages};
    +  $params{type}  = "month"        unless defined $params{type};
    +  $params{year}  = 1900 + $now[5] unless defined $params{year};
    +  $params{month} = sprintf("%02d", $params{month}) if defined  $params{month};
    +  $params{month} = 1    + $now[4] unless defined $params{month};
    +  $params{week_start_day} = 0     unless defined $params{week_start_day};
    +  $params{months_per_row} = 3     unless defined $params{months_per_row};
    +
    +  # Store parameters (could be multiple calls per page)
    +  $calpages{$params{destpage}}{$index} = \%params;
    +
    +  return "\n
    " . $index++ . "
    \n"; +} + +sub is_leap_year (@) { + my %params=@_; + return ($params{year} % 4 == 0 && (($params{year} % 100 != 0) || $params{year} % 400 ==0)) ; +} + + +sub month_days { + my %params=@_; + my $days_in_month = (31,28,31,30,31,30,31,31,30,31,30,31)[$params{month}-1]; + if ($params{month} == 2 && is_leap_year(%params)) { + $days_in_month++; + } + return $days_in_month; +} + + +sub format_month (@) { + my %params=@_; + my $pagespec = $params{pages}; + my $year = $params{year}; + my $month = $params{month}; + + my $calendar="\n"; + + # When did this month start? + my @monthstart = localtime(timelocal(0,0,0,1,$month-1,$year-1900)); + + my $future_dom = 0; + my $today = 0; + $future_dom = $now[3]+1 if ($year == $now[5]+1900 && $month == $now[4]+1); + $today = $now[3] if ($year == $now[5]+1900 && $month == $now[4]+1); + + # Calculate month names for next month, and previous months + my $pmonth = $month - 1; + my $nmonth = $month + 1; + my $pyear = $year; + my $nyear = $year; + + # Adjust for January and December + if ($month == 1) { $pmonth = 12; $pyear--; } + if ($month == 12) { $nmonth = 1; $nyear++; } + + # Find out month names for this, next, and previous months + my $monthname=POSIX::strftime("%B", @monthstart); + my $pmonthname= + POSIX::strftime("%B", localtime(timelocal(0,0,0,1,$pmonth-1,$pyear-1900))); + my $nmonthname= + POSIX::strftime("%B", localtime(timelocal(0,0,0,1,$nmonth-1,$nyear-1900))); + + # Calculate URL's for monthly archives, and article counts + my $archivebase = 'archives'; + $archivebase = $config{archivebase} if defined $config{archivebase}; + + my ($url, $purl, $nurl)=("$monthname",'',''); + my ($count, $pcount, $ncount) = (0,0,0); + + if (exists $cache{$pagespec}{"$year/$month"}) { + $url = htmllink($params{page}, $params{destpage}, + "$archivebase/$year/" . 
sprintf("%02d", $month), + 0,0," $monthname "); + } + + if (exists $cache{$pagespec}{"$pyear/$pmonth"}) { + $purl = htmllink($params{page}, $params{destpage}, + "$archivebase/$pyear/" . sprintf("%02d", $pmonth), + 0,0," $pmonthname "); + } + if (exists $cache{$pagespec}{"$nyear/$nmonth"}) { + $nurl = htmllink($params{page}, $params{destpage}, + "$archivebase/$nyear/" . sprintf("%02d", $nmonth), + 0,0," $nmonthname "); + } + + # Start producing the month calendar + $calendar=< + + $purl + $url + $nurl + + +EOF + # Suppose we want to start the week with day $week_start_day + # If $monthstart[6] == 1 + my $week_start_day = $params{week_start_day}; + + my $start_day = 1 + (7 - $monthstart[6] + $week_start_day) % 7; + my %downame; + my %dowabbr; + for my $dow ($week_start_day..$week_start_day+6) { + my @day=localtime(timelocal(0,0,0,$start_day++,$month-1,$year-1900)); + my $downame = POSIX::strftime("%A", @day); + my $dowabbr = POSIX::strftime("%a", @day); + $downame{$dow % 7}=$downame; + $dowabbr{$dow % 7}=$dowabbr; + $calendar.= + qq{ $dowabbr\n}; + } + + $calendar.=< +EOF + + my $wday; + # we start with a week_start_day, and skip until we get to the first + for ($wday=$week_start_day; $wday != $monthstart[6]; $wday++, $wday %= 7) { + $calendar.=qq{ \n} if $wday == $week_start_day; + $calendar.= + qq{  \n}; + } + + # At this point, either the first is a week_start_day, in which case nothing + # has been printed, or else we are in the middle of a row. + for (my $day = 1; $day <= month_days(year => $year, month => $month); + $day++, $wday++, $wday %= 7) { + # At tihs point, on a week_start_day, we close out a row, and start a new + # one -- unless it is week_start_day on the first, where we do not close a + # row -- since none was started. 
+ if ($wday == $week_start_day) { + $calendar.=qq{ \n} unless $day == 1; + $calendar.=qq{ \n}; + } + my $tag; + my $mtag = sprintf("%02d", $month); + if (defined $cache{$pagespec}{"$year/$mtag/$day"}) { + if ($day == $today) { $tag='month-calendar-day-this-day'; } + else { $tag='month-calendar-day-link'; } + $calendar.=qq{ }; + $calendar.= + htmllink($params{page}, $params{destpage}, + pagename($linkcache{"$year/$mtag/$day"}), + 0,0,"$day"); + $calendar.=qq{\n}; + } + else { + if ($day == $today) { $tag='month-calendar-day-this-day'; } + elsif ($day == $future_dom) { $tag='month-calendar-day-future'; } + else { $tag='month-calendar-day-nolink'; } + $calendar.=qq{ $day\n}; + } + } + # finish off the week + for (; $wday != $week_start_day; $wday++, $wday %= 7) { + $calendar.=qq{  \n}; + } + $calendar.=< + +EOF + + return $calendar; +} + +sub format_year (@) { + my %params=@_; + my $pagespec = $params{pages}; + my $year = $params{year}; + my $month = $params{month}; + my $calendar="\n"; + my $pyear = $year - 1; + my $nyear = $year + 1; + my $future_month = 0; + $future_month = $now[4]+1 if ($year == $now[5]+1900); + + # calculate URL's for previous and next years + my $archivebase = 'archives'; + $archivebase = $config{archivebase} if defined $config{archivebase}; + my ($url, $purl, $nurl)=("$year",'',''); + if (exists $cache{$pagespec}{"$year"}) { + $url = htmllink($params{page}, $params{destpage}, + "$archivebase/$year", + 0,0,"$year"); + } + + if (exists $cache{$pagespec}{"$pyear"}) { + $purl = htmllink($params{page}, $params{destpage}, + "$archivebase/$pyear", + 0,0,"\←"); + } + if (exists $cache{$pagespec}{"$nyear"}) { + $nurl = htmllink($params{page}, $params{destpage}, + "$archivebase/$nyear", + 0,0,"\→"); + } + # Start producing the year calendar + $calendar=< + + $purl + $url + $nurl + + + Months + +EOF + + for ($month = 1; $month <= 12; $month++) { + my @day=localtime(timelocal(0,0,0,15,$month-1,$year-1900)); + my $murl; + my $monthname = 
POSIX::strftime("%B", @day); + my $monthabbr = POSIX::strftime("%b", @day); + $calendar.=qq{ \n} if ($month % $params{months_per_row} == 1); + my $tag; + my $mtag=sprintf("%02d", $month); + if ($month == $params{month}) { + if ($cache{$pagespec}{"$year/$mtag"}) {$tag = 'this_month_link'} + else {$tag = 'this_month_nolink'} + } + elsif ($cache{$pagespec}{"$year/$mtag"}) {$tag = 'month_link'} + elsif ($future_month && $month >=$future_month){$tag = 'month_future'} + else {$tag = 'month_nolink'} + if ($cache{$pagespec}{"$year/$mtag"}) { + $murl = htmllink($params{page}, $params{destpage}, + "$archivebase/$year/$mtag", + 0,0,"$monthabbr"); + $calendar.=qq{ }; + $calendar.=$murl; + $calendar.=qq{\n}; + } + else { + $calendar.=qq{ $monthabbr\n}; + } + $calendar.=qq{ \n} if ($month % $params{months_per_row} == 0); + } + $calendar.=< +EOF + + return $calendar; +} + + +sub format (@) { + my %params=@_; + my $content=$params{content}; + return $content unless exists $calpages{$params{page}}; + + # Restore parameters for each invocation + foreach my $index (keys %{$calpages{$params{page}}}) { + my $calendar="\n"; + my %saved = %{$calpages{$params{page}}{$index}}; + my $pagespec=$saved{pages}; + + if (! defined $cache{$pagespec}) { + for my $page (sort keys %pagesources) { + next unless pagespec_match($page,$pagespec); + my $mtime; + my $src = $pagesources{$page}; + if (! 
exists $IkiWiki::pagectime{$page}) { + $mtime=(stat(srcfile($src)))[9]; + } + else { + $mtime=$IkiWiki::pagectime{$page} + } + my @date = localtime($mtime); + my $mday = $date[3]; + my $month = $date[4] + 1; + my $year = $date[5] + 1900; + my $mtag = sprintf("%02d", $month); + $linkcache{"$year/$mtag/$mday"} = "$src"; + $cache{$pagespec}{"$year"}++; + $cache{$pagespec}{"$year/$mtag"}++; + $cache{$pagespec}{"$year/$mtag/$mday"}++; + } + } + # So, we have cached data for the current pagespec at this point + if ($saved{type} =~ /month/i) { + $calendar=format_month(%saved); + } + elsif ($saved{type} =~ /year/i) { + $calendar=format_year(%saved); + } + $content =~ s/(
    \s*.?\s*$index\b)/
    $calendar/ms; + } + return $content; +} + + + +=head1 CAVEATS + +In the month calendar, for days in which there is more than one +posting, the link created randomly selects one of them. Since there is +no easy way in B to automatically generate index pages, and +pregenerating daily index pages seems too much of an overhead, we have +to live with this. All postings can still be viewed in the monthly or +annual indices, of course. This can be an issue for very prolific +scriveners. + +=cut + +=head1 BUGS + +None Known so far. + +=head1 BUGS + +Since B eval's the configuration file, the values have to all +on a single physical line. This is the reason we need to use strings +and eval, instead of just passing in real anonymous sub references, +since the eval pass converts the coderef into a string of the form +"(CODE 12de345657)" which can't be dereferenced. + +=cut + +=head1 AUTHOR + +Manoj Srivastava + +=head1 COPYRIGHT AND LICENSE + +This script is a part of the Devotee package, and is + +Copyright (c) 2002 Manoj Srivastava + +This program is free software; you can redistribute it and/or modify +it under the terms of the GNU General Public License as published by +the Free Software Foundation; either version 2 of the License, or +(at your option) any later version. + +This program is distributed in the hope that it will be useful, +but WITHOUT ANY WARRANTY; without even the implied warranty of +MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +GNU General Public License for more details. + +You should have received a copy of the GNU General Public License +along with this program; if not, write to the Free Software +Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA + +=cut + +1; + +__END__ +
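The leading-blank-cell arithmetic in `format_month` above (stepping `$wday` from `week_start_day` until it reaches the month's starting day of week, emitting empty squares along the way) can be sketched with Python's standard `calendar` module. This is a hedged illustration of the grid layout only, not the plugin's actual code; note one convention difference: the plugin's `week_start_day` uses `localtime()` numbering (0 = Sunday), while Python's `calendar` uses 0 = Monday.

```python
import calendar

# Sketch of the month-grid layout that format_month() builds by hand:
# rows of 7 cells, with empty cells (the plugin's
# "month-calendar-day-noday" squares) before the 1st and after the
# last day of the month.
def month_grid(year, month, firstweekday=0):
    # firstweekday here follows Python's convention: 0 = Monday.
    cal = calendar.Calendar(firstweekday=firstweekday)
    # monthdayscalendar() returns one list per week; out-of-month
    # cells are 0, playing the role of the "noday" squares.
    return cal.monthdayscalendar(year, month)

# March 2006 starts on a Wednesday, so a Monday-first grid pads
# two leading cells before the 1st.
grid = month_grid(2006, 3)
```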
+
+------
+
+I've been looking over the calendar plugin. Some items:
+
+* Why did you need to use a two-stage generation with a format hook?
+  That approach should only be needed if adding something to a page that
+  would be removed by the htmlscrubber, and as far as I can tell, the
+  calendars don't involve anything that would be a problem. It seems
+  that emitting the whole calendar in the preprocess hook would simplify
+  things and you'd not need to save state about calendars.
+
+> I am scared of the html scrubber, and have never turned it on,
+> and did not look too deeply into what would be scrubbed out --ManojSrivastava
+>> Unless you're using javascript, a few annoyances like , or inline
+>> css, it's unlikely to object to any html you might write. The list of
+>> allowed tags and attributes is easy to find near the top of the plugin.
+
+> In case the option that gets the ctime of the pages from the
+> SCM itself, %IkiWiki::pagectime is not populated that early,
+> is it? So I waited until the last possible moment to look at
+> the time information.
+>
+>> Actually, since my big rewrite of the rendering path a few months ago,
+>> ikiwiki scans and populates almost all page information before starting
+>> to render any page. This includes %pagectime, and even %links. So you
+>> shouldn't need to worry about running it late.
+
+* The way that it defaults to the current year and current month
+  is a little bit tricky, because of course the wiki might not get
+  updated in a particular time period, and even if it is updated, only
+  if a page containing a calendar is rebuilt for some other reason will
+  the calendar get updated, and change what year or month it shows. This
+  is essentially the same problem described in
+  [[todo/tagging_with_a_publication_date]],
+  although I don't think it will affect the calendar plugin very badly.
+  Still, the docs probably need to be clear about this.
+
+> I use it on the sidebar; and the blog pages are almost always
+> rebuilt, which is where the calendar is looked at most often. Oh,
+> and I also cheat, I have ikiwiki --setup foo as a @daily cronjob, so
+> my wiki is always built daily from scratch.
+>
+> I think it should be mentioned, yes.
+
+* There seems to be something a bit wrong with the year-to-year
+  navigation in the calendar, based on the example in your blog. If I'm
+  on the page for 2006, there's an arrow pointing left which takes me to
+  2005. If I'm on 2005, the arrow points left, but goes to 2006, not
+  2004.
+
+> I need to look into this.
+
+* AIUI, the archivebase setting makes a directory rooted at the top of
+  the wiki, so you can have only one set of archives per wiki, in
+  /archives/. It would be good if it were possible to have multiple
+  archives for different blogs in the same wiki at multiple locations.
+  Though since the archives contain calendars, the archive location
+  can't just be relative to the page with the calendar. But perhaps
+  archivebase could be a configurable parameter that can be specified in
+  the directive for the calendar? (It would be fine to keep the global
+  location as a default.)
+
+> OK, this is simple enough to implement. I'll do that (well,
+> perhaps not before Xmas, I have a family dinner to cook) and send in
+> another patch.
+
+
+----
+
+And that's all I've heard so far. Hoping I didn't miss another patch?
+
+--[[Joey]]
diff --git a/doc/todo/clickable-openid-urls-in-logs.mdwn b/doc/todo/clickable-openid-urls-in-logs.mdwn
new file mode 100644
index 000000000..acf2c2d49
--- /dev/null
+++ b/doc/todo/clickable-openid-urls-in-logs.mdwn
@@ -0,0 +1,23 @@
+OpenID URLs aren't clickable in the ViewVC logs because they're directly
+followed by a colon. At the expense of, um, proper grammar, here's a patch
+for SVN. If this is OK, I'll patch the other RCS modules, too.
+
+> Reasonable, but probably needs to modify the wiki\_commit\_regexp to
+> recognise such commit messages when parsing the logs. Do that and extend
+> to the other modules and I'll accept it. --[[Joey]]
+
+[[tag patch]]
+
+
    +--- IkiWiki/Rcs/svn.pm  (revision 2650)
    ++++ IkiWiki/Rcs/svn.pm  (working copy)
    +@@ -71,7 +71,7 @@
    +        my $ipaddr=shift;
    + 
    +        if (defined $user) {
    +-               $message="web commit by $user".(length $message ? ": $message" : "");
    ++               $message="web commit by $user ".(length $message ? ": $message" : "");
    +        }
    +        elsif (defined $ipaddr) {
    +                $message="web commit from $ipaddr".(length $message ? ": $message" : "");
    +
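The problem this patch works around can be reproduced with any naive URL autolinker: a greedy "non-whitespace" match swallows the colon that `svn.pm` appends after the username, yielding a broken link. The following is a generic stand-in, not ViewVC's actual pattern (which is an assumption here):

```python
import re

# A simplistic URL autolinker of the kind log viewers use; the greedy
# \S+ is a stand-in for ViewVC's real regexp and shows the failure mode.
URL_RE = re.compile(r'https?://\S+')

unpatched = "web commit by http://joey.example.com/: fix typo"
patched   = "web commit by http://joey.example.com/ : fix typo"

# Without the space, the trailing colon is swallowed into the matched
# URL; with the patch's extra space the bare URL is matched instead.
broken = URL_RE.search(unpatched).group()
clean  = URL_RE.search(patched).group()
```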
diff --git a/doc/todo/darcs.mdwn b/doc/todo/darcs.mdwn
new file mode 100644
index 000000000..13bd82513
--- /dev/null
+++ b/doc/todo/darcs.mdwn
@@ -0,0 +1,464 @@
+Here's Thomas Schwinge's unfinished darcs support for ikiwiki.
+
+(Finishing this has been suggested as a [[soc]] project.)
+
+> I haven't been working on this for months and also won't in the near
+> future. Feel free to use what I have done so
+> far and bring it into a usable state! Also, feel free to contact me
+> if there are questions.
+
+-- [Thomas Schwinge](mailto:tschwinge@gnu.org)
+
+[[toggle text="show"]]
+[[toggleable text="""
+	# Support for the darcs rcs, .
+	# Copyright (C) 2006 Thomas Schwinge
+	#
+	# This program is free software; you can redistribute it and/or modify it
+	# under the terms of the GNU General Public License as published by the
+	# Free Software Foundation; either version 2 of the License, or (at your
+	# option) any later version.
+	#
+	# This program is distributed in the hope that it will be useful, but
+	# WITHOUT ANY WARRANTY; without even the implied warranty of
+	# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+	# General Public License for more details.
+	#
+	# You should have received a copy of the GNU General Public License along
+	# with this program; if not, write to the Free Software Foundation, Inc.,
+	# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
+
+
+	# We're guaranteed to be the only instance of ikiwiki running at a given
+	# time.  It is essential that only ikiwiki is working on a particular
+	# repository.  That means one instance of ikiwiki and it also means that
+	# you must not `darcs push' into this repository, as this might create
+	# race conditions, as I understand it.
+
+
+	use warnings;
+	use strict;
+	use IkiWiki;
+
+	package IkiWiki;
+
+
+	# Which darcs executable to use.
+	my $darcs = ($ENV{DARCS} or 'darcs');
+
+
+	# Internal functions.
+
+	sub darcs_info ($$$) {
+		my $field = shift;
+		my $repodir = shift;
+		my $file = shift; # Relative to the repodir.
+
+		my $child = open(DARCS_CHANGES, "-|");
+		if (! $child) {
+			exec($darcs, 'changes', '--repo=' . $repodir, '--xml-output', $file) or
+				error('failed to run `darcs changes\'');
+		}
+
+		# Brute force for now.  :-/
+		while (<DARCS_CHANGES>) {
+			last if /^<\/created_as>$/;
+		}
+		($_) = <DARCS_CHANGES> =~ /$field=\'([^\']+)/;
+		$field eq 'hash' and s/\.gz//; # Strip away the `.gz' from `hash'es.
+
+		close(DARCS_CHANGES) or error('`darcs changes\' exited ' . $?);
+
+		return $_;
+	}
+
+
+	# Exported functions.
+
+	sub rcs_update () {
+		# Not needed.
+	}
+
+	sub rcs_prepedit ($) {
+		# Prepares to edit a file under revision control.  Returns a token that
+		# must be passed to rcs_commit() when the file is to be commited.  For us,
+		# this token is the hash value of the latest patch that modifies the file,
+		# i.e. something like its current revision.  If the file is not yet added
+		# to the repository, we return TODO: the empty string.
+
+		my $file = shift; # Relative to the repodir.
+
+		my $hash = darcs_info('hash', $config{srcdir}, $file);
+		return defined $hash ? $hash : "";
+	}
+
+	sub rcs_commit ($$$) {
+		# Commit the page.  Returns `undef' on success and a version of the page
+		# with conflict markers on failure.
+
+		my $file = shift; # Relative to the repodir.
+		my $message = shift;
+		my $rcstoken = shift;
+
+		# Compute if the ``revision'' of $file changed.
+		my $changed = darcs_info('hash', $config{srcdir}, $file) ne $rcstoken;
+
+		# Yes, the following is a bit convoluted.
+		if ($changed) {
+			# TODO.  Invent a better, non-conflicting name.
+			rename("$config{srcdir}/$file", "$config{srcdir}/$file.save") or
+				error("failed to rename $file to $file.save: $!");
+
+			# Roll the repository back to $rcstoken.
+
+			# TODO.  Can we be sure that no changes are lost?  I think that
+			# we can, if we make sure that the `darcs push' below will always
+			# succeed.
+ + # We need to revert everything as `darcs obliterate' might choke + # otherwise. + # TODO: `yes | ...' needed? Doesn't seem so. + system($darcs, "revert", "--repodir=" . $config{srcdir}, "--all") and + error("`darcs revert' failed"); + # Remove all patches starting at $rcstoken. + # TODO. Something like `yes | darcs obliterate ...' seems to be needed. + system($darcs, "obliterate", "--quiet", "--repodir" . $config{srcdir}, + "--match", "hash " . $rcstoken) and + error("`darcs obliterate' failed"); + # Restore the $rcstoken one. + system($darcs, "pull", "--quiet", "--repodir=" . $config{srcdir}, + "--match", "hash " . $rcstoken, "--all") and + error("`darcs pull' failed"); + + # We're back at $rcstoken. Re-install the modified file. + rename("$config{srcdir}/$file.save", "$config{srcdir}/$file") or + error("failed to rename $file.save to $file: $!"); + } + + # Record the changes. + # TODO: What if $message is empty? + writefile("$file.log", $config{srcdir}, $message); + system($darcs, 'record', '--repodir=' . $config{srcdir}, '--all', + '--logfile=' . "$config{srcdir}/$file.log", + '--author=' . 'web commit ', $file) and + error('`darcs record\' failed'); + + # Update the repository by pulling from the default repository, which is + # master repository. + system($darcs, "pull", "--quiet", "--repodir=" . $config{srcdir}, + "--all") and error("`darcs pull' failed\n"); + + # If this updating yields any conflicts, we'll record them now to resolve + # them. If nothing is recorded, there are no conflicts. + $rcstoken = darcs_info('hash', $config{srcdir}, $file); + # TODO: Use only the first line here, i.e. only the patch name? + writefile("$file.log", $config{srcdir}, 'resolve conflicts: ' . $message); + system($darcs, 'record', '--repodir=' . $config{srcdir}, '--all', + '--logfile=' . "$config{srcdir}/$file.log", + '--author=' . 
'web commit ', $file) and + error('`darcs record\' failed'); + my $conflicts = darcs_info('hash', $config{srcdir}, $file) ne $rcstoken; + unlink("$config{srcdir}/$file.log") or + error("failed to remove `$file.log'"); + + # Push the changes to the main repository. + system($darcs, 'push', '--quiet', '--repodir=' . $config{srcdir}, '--all') + and error('`darcs push\' failed'); + # TODO: darcs send? + + if ($conflicts) { + my $document = readfile("$config{srcdir}/$file"); + # Try to leave everything in a consistent state. + # TODO: `yes | ...' needed? Doesn't seem so. + system($darcs, "revert", "--repodir=" . $config{srcdir}, "--all") and + warn("`darcs revert' failed.\n"); + return $document; + } else { + return undef; + } + } + + sub rcs_add ($) { + my $file = shift; # Relative to the repodir. + + # Intermediate directories will be added automagically. + system($darcs, 'add', '--quiet', '--repodir=' . $config{srcdir}, + '--boring', $file) and error('`darcs add\' failed'); + } + + sub rcs_recentchanges ($) { + warn('rcs_recentchanges() is not implemented'); + return 'rcs_recentchanges() is not implemented'; + } + + sub rcs_notify () { + warn('rcs_notify() is not implemented'); + } + + sub rcs_getctime () { + warn('rcs_getctime() is not implemented'); + } + + 1 +"""]] + +This is my ([bma](bma@bmalee.eu)) darcs.pm - it's messy (my Perl isn't up to much) but seems to work. It uses just one repo, like the mercurial plugin (unlike the above version, which AIUI uses two). + +`rcs_commit()` uses backticks instead of `system()`, to prevent darcs' output being sent to the browser and mucking with the HTTP headers (`darcs record` has no --quiet option). And `rcs_recentchanges()` uses regexes rather than parsing darcs' XML output. 
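bma's point about backticks versus `system()` generalizes: in a CGI, a child process that writes to stdout before the HTTP headers are printed corrupts the response, so the command's output must be captured rather than shared. A hedged sketch of the same idea in Python, with `echo` standing in for the chatty `darcs record` command (which has no `--quiet` option):

```python
import subprocess

# Capture the child's output (the Perl-backticks approach) instead of
# letting it write to our stdout (the Perl system() approach), so
# nothing leaks into the CGI response ahead of the HTTP headers.
# "echo" is a stand-in for `darcs record` here.
def run_captured(cmdline):
    result = subprocess.run(cmdline, capture_output=True, text=True)
    return result.returncode, result.stdout

status, output = run_captured(["echo", "Finished recording patch 'web commit'"])
```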
+ +[[toggle text="show" id="bma"]] +[[toggleable id="bma" text=""" + + #!/usr/bin/perl + + use warnings; + use strict; + use IkiWiki; + use Date::Parse; + use open qw{:utf8 :std}; + + package IkiWiki; + + sub rcs_update () { #{{{ + # Do nothing - there's nowhere to update *from*. + } #}}} + + sub rcs_prepedit ($) { #{{{ + } #}}} + + sub rcs_commit ($$$;$$) { #{{{ + my ($file, $message, $rcstoken, $user, $ipaddr) = @_; + + # $user should probably be a name and an email address, by darcs + # convention. + if (defined $user) { + $user = possibly_foolish_untaint($user); + } + elsif (defined $ipaddr) { + $user = "Anonymous from $ipaddr"; + } + else { + $user = "Anonymous"; + } + + $message = possibly_foolish_untaint($message); + + # BUG: this outputs one line of text, and there's not a -q or --quiet + # option. Redirecting output to /dev/null works, but I still get the + # HTTP status and location headers displayed in the browser - is that + # darcs' fault or ikiwiki's? + # Doing it in backticks *works*, but I'm sure it could be done better. + my @cmdline = ("darcs", "record", "--repodir", "$config{srcdir}", + "-a", "-m", "$message", "--author", "$user", $file); + `darcs record --repodir "$config{srcdir}" -a -m "$message" --author "$user" $file`; # Return value? Output? Who needs 'em? + #if (system(@cmdline) != 0) { + # warn "'@cmdline' failed: $!"; + #} + + return undef; # success + + sub rcs_add ($) { # {{{ + my ($file) = @_; + + my @cmdline = ("darcs", "add", "--repodir", "$config{srcdir}", "-a", "-q", "$file"); + if (system(@cmdline) != 0) { + warn "'@cmdline' failed: $!"; + } + } #}}} + + sub rcs_recentchanges ($) { #{{{ + # TODO: This is horrible code. It doesn't work perfectly, and uses regexes + # rather than parsing Darcs' XML output. 
+ my $num=shift; + my @ret; + + return unless -d "$config{srcdir}/_darcs"; + + my $changelog = `darcs changes --xml --summary --repodir "$config{srcdir}"`; + $changelog = join("", split(/\s*\n\s*/, $changelog)); + my @changes = split(/<\/patch>.*?(.*?)<\/name>/g; + my @message = {line => $1}; + foreach my $match ($change =~ m/(.*?)<\/comment>/gm) { + push @message, {line => $1}; + } + + my @pages; + foreach my $match ($change =~ m/<.*?_(file|directory)>(.*?)(<(added|removed)_lines.*\/>)*<\/.*?_(file|directory)>/g) { + # My perl-fu is weak. I'm probably going about this all wrong, anyway. + push @pages, {page => pagename($match)} if ( -f $config{srcdir}."/".$match || -d $config{srcdir}."/".$match) and not $match =~ m/^$/; + } + push @ret, { rev => $rev, + user => $user, + committype => $committype, + when => $when, + message => [@message], + pages => [@pages], + } + } + return @ret; + } #}}} + + sub rcs_notify () { #{{{ + # TODO + } #}}} + + sub rcs_getctime ($) { #{{{ + error gettext("getctime not implemented"); + } #}}} + + 1 + + + +"""]] + +--- + +Well, here's my version too. It only does getctime -- using a real XML parser, instead of regexp ugliness -- and maybe recentchanges, but that may be bitrotted, or maybe I never finished it, as I only need the getctime. As for actual commits, I have previously voiced my opinion, that this should be done by the plugin generating a patch bundle, and forwarding it to darcs in some way (`darcs apply` or even email to another host, possibly moderated), instead of the hacky direct modification of a working copy. It could also be faster to getctime in a batch. Just reading in all the changes the first time they're needed, might not be a big improvement in many cases, but if we got a batch request from ikiwiki, we could keep reaing the changes until all the files in this batch request have been met. --[[tuomov]] + +[[toggle text="show" id="tuomov"]] +[[toggleable id="tuomov" text=""" +
    +#!/usr/bin/perl
    +# Stubs for no revision control.
    +
    +use warnings;
    +use strict;
    +use IkiWiki;
    +
    +package IkiWiki;
    +
    +sub rcs_update () {
    +}
    +
    +sub rcs_prepedit ($) {
    +	return ""
    +}
    +
    +sub rcs_commit ($$$) {
    +	return undef # success
    +}
    +
    +sub rcs_add ($) {
    +}
    +
    +sub rcs_recentchanges ($) {
    +	my $num=shift;
    +	my @ret;
    +	
    +	eval q{use Date::Parse};
    +	eval q{use XML::Simple};
    +	
    +	my $repodir=$config{srcdir};
    +	
    +	if (-d "$config{srcdir}/_darcs") {
    +		my $child = open(LOG, "-|");
    +		if (! $child) {
    +			exec("darcs", "changes", "--xml", 
    +			     "--repodir", "$repodir",
    +			     "--last", "$num")
    +			|| error("darcs changes failed to run");
    +		}
    +		my $data=;
    +		close LOG;
    +		
    +		my $log = XMLin($data, ForceArray => 1);
    +		
    +		foreach my $patch ($log->{patch}) {
    +			my $date=$patch->{local_date};
    +			my $hash=$patch->{hash};
    +			my $when=concise(ago(time - str2time($date)));
    +			my @pages;
    +			
    +			my $child = open(SUMMARY, "-|");
    +			if (! $child) {
    +				exec("darcs", "annotate", "-s", "--xml", 
    +				     "--match", "hash: $hash",
    +				     "--repodir", "$repodir")
    +				|| error("darcs annotate failed to run");
    +			}
    +			my $data=;
    +			close SUMMARY;
    +		
    +			my $summary = XMLin("$data", ForceArray => 1);
    +
    +			# TODO: find @pages
    +			
    +			push @ret, {
    +				#rev => $rev,
    +				user => $patch->{author},
    +				#committype => $committype,
    +				when => $when, 
    +				#message => [@message],
    +				pages => [@pages],
    +			}; # if @pages;
    +			return @ret if @ret >= $num;
    +		}
    +	}
    +	
    +	return @ret;
    +}
    +
    +sub rcs_notify () {
    +}
    +
    +sub rcs_getctime ($) {
    +	my $file=shift;
    +	
    +	eval q{use Date::Parse};
    +	eval q{use XML::Simple};
    +	local $/=undef;
    +	
    +	# Sigh... doing things the hard way again
    +	my $repodir=$config{srcdir};
    +	
    +	my $filer=substr($file, length($repodir));
    +	$filer =~ s:^[/]+::;
    +	
    +	my $child = open(LOG, "-|");
    +	if (! $child) {
    +		exec("darcs", "changes", "--xml", "--reverse",
    +		     "--repodir", "$repodir", "$filer")
    +		|| error("darcs changes $filer failed to run");
    +	}
    +	
    +	my $data=;
    +	close LOG;
    +	
    +	my $log = XMLin($data, ForceArray => 1);
    +	
    +	my $datestr=$log->{patch}[0]->{local_date};
    +	
    +	if (! defined $datestr) {
    +		warn "failed to get ctime for $filer";
    +		return 0;
    +	}
    +	
    +	my $date=str2time($datestr);
    +	
    +	debug("found ctime ".localtime($date)." for $file");
    +	
    +	return $date;
    +}
    +
    +1
    +
    +"""]] diff --git a/doc/todo/datearchives-plugin.mdwn b/doc/todo/datearchives-plugin.mdwn new file mode 100644 index 000000000..8c3faf9ca --- /dev/null +++ b/doc/todo/datearchives-plugin.mdwn @@ -0,0 +1,75 @@ +I'll be using IkiWiki primarily as a blog, so I want a way to view entries +by date. A URL of the form `/date/YYYY/MM/DD.html` (or `/date/YYYY/MM/DD/` +when using the `use_dirs` patch) should show posts from that period. ATM, I +have this: + +
    +Index: IkiWiki/Plugin/datearchives.pm
    +===================================================================
    +--- IkiWiki/Plugin/datearchives.pm      (revision 0)
    ++++ IkiWiki/Plugin/datearchives.pm      (revision 0)
    +@@ -0,0 +1,31 @@
    ++#!/usr/bin/perl
    ++
    ++package IkiWiki::Plugin::datearchives;
    ++
    ++use warnings;
    ++use strict;
    ++use IkiWiki;
    ++
    ++sub import { #{{{
    ++    hook(type => "pagetemplate", id => "datearchives", call => \&pagetemplate, scan => 1);
    ++} # }}}
    ++
    ++sub pagetemplate (@) { #{{{
    ++    my %args = @_;
    ++    my $dt;
    ++    eval {
    ++        use DateTime;
    ++        $dt = DateTime->from_epoch(epoch => $IkiWiki::pagectime{ $args{page} });
    ++    };
    ++    return if $@;
    ++    my $base = $config{datearchives_base} || 'date';
    ++    my $link = $base.'/'.$dt->strftime('%Y/%m/%d');
    ++    push @{$links{$args{page}}}, $link;
    ++    my $template = $args{template};
    ++       if ($template->query(name => "ctime")) {
    ++        $template->param(ctime => htmllink( $args{page}, $args{destpage}, $link, 0, 0,
    ++                                            $template->param('ctime')));
    ++       }
    ++} # }}}
    ++
    ++1
    +
+
+This works (although accessing `%IkiWiki::pagectime` is not too clever), but
+it would be far more useful if the date pages were automatically created and
+populated with the relevant posts. A [[Pagespec]] works perfectly for
+displaying the relevant content, but we're still left with the issue of
+actually creating the page. What's the Right Way to do this? We could create
+them in the RCS working copy and check them in, or create them directly in
+the output directory... (I'd also like to create an option for the tags
+plugin to auto-create its targets in the same way). Any opinions? :-)
+
+> Ok, first, I don't understand what your plugin does. Maybe I need to get
+> some sleep, but a better explanation might help. :-) It seems to make
+> links from pages to the archive pages? But I don't understand why you
+> want such links .. wouldn't a sidebar with links to the available archive
+> pages work better? Or something else, depending on personal preference.
+>
+> Secondly, you're certainly not the first to want to do date based archive
+> pages. So far I have successfully punted the issue of creating these
+> pages out of ikiwiki by pointing out that everyone wants them to be
+> _different_, and suggesting people set up cron jobs or other machinery to
+> generate the kinds of archives that they like. This makes me happy
+> because generalizing all the possible ways people might want to do date
+> based archives and somehow bolting support for creating them onto the
+> side of ikiwiki seems to be a recipe for a mess.
+>
+> A few examples of ikiwiki sites with date archives:
+> and
+> --[[Joey]]
+
+>> Yeah, it wasn't much of a description, was it? ;-) It's an attempt to
+>> emulate the style of Wordpress and other popular blog platforms, which
+>> can link a post's creation date to YYYY/MM/DD archive pages, which then
+>> list all the relevant posts. My use-case is on a blog page which
+>> in-lines (via pagespecs) recent blog posts.
+ +>> I agree with not adding this kind of functionality to the core. :-) I simply didn't want to have break links when I convert to IkiWiki. I guess I'll just play around with the page-creation thing myself then. Feel free to delete this from the queue. :-) --Ben + +>>> Ah, I get it, I hadn't realized it was making the date into a link. +>>> No reason to delete this from the queue, it's a reasonable plugin. I +>>> might move it to the contributed plugins directory as it's a bit +>>> specialised to be included in ikiwiki though. --[[Joey]] + +[[tag patch]] diff --git a/doc/todo/enable-htaccess-files.mdwn b/doc/todo/enable-htaccess-files.mdwn new file mode 100644 index 000000000..accd96bd7 --- /dev/null +++ b/doc/todo/enable-htaccess-files.mdwn @@ -0,0 +1,29 @@ + Index: IkiWiki.pm + =================================================================== + --- IkiWiki.pm (revision 2981) + +++ IkiWiki.pm (working copy) + @@ -26,7 +26,7 @@ + memoize("file_pruned"); + + sub defaultconfig () { #{{{ + - wiki_file_prune_regexps => [qr/\.\./, qr/^\./, qr/\/\./, + + wiki_file_prune_regexps => [qr/\.\./, qr/^\.(?!htaccess)/, qr/\/\.(?!htaccess)/, + qr/\.x?html?$/, qr/\.ikiwiki-new$/, + qr/(^|\/).svn\//, qr/.arch-ids\//, qr/{arch}\//], + wiki_link_regexp => qr/\[\[(?:([^\]\|]+)\|)?([^\s\]#]+)(?:#([^\s\]]+))?\]\]/, + +[[tag patch]] + +This lets the site administrator have a `.htaccess` file in their underlay +directory, say, then get it copied over when the wiki is built. Without +this, installations that are located at the root of a domain don't get the +benefit of `.htaccess` such as improved directory listings, IP blocking, +URL rewriting, authorisation, etc. + +> I'm concerned about security ramifications of this patch. While ikiwiki +> won't allow editing such a .htaccess file in the web interface, it would +> be possible for a user who has svn commit access to the wiki to use it to +> add a .htaccess file that does $EVIL. 
+> +> Perhaps this should be something that is configurable via the setup file +> instead. --[[Joey]] diff --git a/doc/todo/format_escape.mdwn b/doc/todo/format_escape.mdwn new file mode 100644 index 000000000..f8ea789ec --- /dev/null +++ b/doc/todo/format_escape.mdwn @@ -0,0 +1,227 @@ +Since some preprocessor directives insert raw HTML, it would be good to +specify, per-format, how to pass HTML so that it goes through the format +OK. With Markdown we cross our fingers; with reST we use the "raw" +directive. + +I added an extra named parameter to the htmlize hook, which feels sort of +wrong, since none of the other hooks take parameters. Let me know what +you think. --Ethan + +Seems fairly reasonable, actually. Shouldn't the `$type` come from `$page` +instead of `$destpage` though? Only other obvious change is to make the +escape parameter optional, and only call it if set. --[[Joey]] + +> I couldn't figure out what to make it from, but thinking it through, +> yeah, it should be $page. Revised patch follows. --Ethan + +>> I've updated the patch some more, but I think it's incomplete. ikiwiki +>> emits raw html when expanding WikiLinks too, and it would need to escape +>> those. Assuming that escaping html embedded in the middle of a sentence +>> works.. --[[Joey]] + +>>> Revised again. I get around this by making another hook, htmlescapelink, +>>> which is called to generate links in whatever language. In addition, it +>>> doesn't (can't?) generate +>>> spans, and it doesn't handle inlineable image links. If these were +>>> desired, the approach to take would probably be to use substitution +>>> definitions, which would require generating two bits of code for each +>>> link/html snippet, and putting one at the end of the paragraph (or maybe +>>> the document?). +>>> To specify that (for example) Discussion links are meant to be HTML and +>>> not rst or whatever, I added a "genhtml" parameter to htmllink. It seems +>>> to work -- see for an example. 
+>>> --Ethan + +[[tag patch]] + +
    +Index: debian/changelog
    +===================================================================
    +--- debian/changelog	(revision 3197)
    ++++ debian/changelog	(working copy)
    +@@ -24,6 +24,9 @@
    +     than just a suggests, since OpenID is enabled by default.
    +   * Fix a bug that caused link(foo) to succeed if page foo did not exist.
    +   * Fix tags to page names that contain special characters.
    ++  * Based on a patch by Ethan, add a new htmlescape hook, that is called
    ++    when a preprocessor directive emits inline html. The rst plugin uses this
    ++    hook to support inlined raw html.
    + 
    +   [ Josh Triplett ]
    +   * Use pngcrush and optipng on all PNG files.
    +Index: IkiWiki/Render.pm
    +===================================================================
    +--- IkiWiki/Render.pm	(revision 3197)
    ++++ IkiWiki/Render.pm	(working copy)
    +@@ -96,7 +96,7 @@
    + 		if ($page !~ /.*\/\Q$discussionlink\E$/ &&
    + 		   (length $config{cgiurl} ||
    + 		    exists $links{$page."/".$discussionlink})) {
    +-			$template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1));
    ++			$template->param(discussionlink => htmllink($page, $page, gettext("Discussion"), noimageinline => 1, forcesubpage => 1, genhtml => 1));
    + 			$actions++;
    + 		}
    + 	}
    +Index: IkiWiki/Plugin/rst.pm
    +===================================================================
    +--- IkiWiki/Plugin/rst.pm	(revision 3197)
    ++++ IkiWiki/Plugin/rst.pm	(working copy)
    +@@ -30,15 +30,36 @@
    + html = publish_string(stdin.read(), writer_name='html', 
    +        settings_overrides = { 'halt_level': 6, 
    +                               'file_insertion_enabled': 0,
    +-                              'raw_enabled': 0 }
    ++                              'raw_enabled': 1 }
    + );
    + print html[html.find('<body>')+6:html.find('</body>')].strip();
    + ";
    + 
    + sub import { #{{{
    + 	hook(type => "htmlize", id => "rst", call => \&htmlize);
    ++	hook(type => "htmlescape", id => "rst", call => \&htmlescape);
    ++	hook(type => "htmlescapelink", id => "rst", call => \&htmlescapelink);
    + } # }}}
    + 
    ++sub htmlescapelink ($$;@) { #{{{
    ++	my $url = shift;
    ++	my $text = shift;
    ++	my %params = @_;
    ++
    ++	if ($params{broken}){
    ++		return "`? <$url>`_\ $text";
    ++	}
    ++	else {
    ++		return "`$text <$url>`_";
    ++	}
    ++} # }}}
    ++
    ++sub htmlescape ($) { #{{{
    ++	my $html=shift;
    ++	$html=~s/^/  /mg;
    ++	return ".. raw:: html\n\n".$html;
    ++} # }}}
    ++
    + sub htmlize (@) { #{{{
    + 	my %params=@_;
    + 	my $content=$params{content};
    +Index: doc/plugins/write.mdwn
    +===================================================================
    +--- doc/plugins/write.mdwn	(revision 3197)
    ++++ doc/plugins/write.mdwn	(working copy)
    +@@ -121,6 +121,26 @@
    + The function is passed named parameters: "page" and "content" and should
    + return the htmlized content.
    + 
    ++### htmlescape
    ++
    ++	hook(type => "htmlescape", id => "ext", call => \&htmlescape);
    ++
    ++Some markup languages do not allow raw html to be mixed in with the markup
    ++language, and need it to be escaped in some way. This hook is a companion
    ++to the htmlize hook, and is called when ikiwiki detects that a preprocessor
    ++directive is inserting raw html. It is passed the chunk of html in
    ++question, and should return the escaped chunk.
    ++
    ++### htmlescapelink
    ++
    ++	hook(type => "htmlescapelink", id => "ext", call => \&htmlescapelink);
    ++
    ++Some markup languages have special syntax to link to other pages. This hook
    ++is a companion to the htmlize and htmlescape hooks, and it is called when a
    ++link is inserted. It is passed the target of the link and the text of the 
    ++link, and an optional named parameter "broken" if a broken link is being
    ++generated. It should return the correctly-formatted link.
    ++
    + ### pagetemplate
    + 
    + 	hook(type => "pagetemplate", id => "foo", call => \&pagetemplate);
    +@@ -355,6 +375,7 @@
    + * forcesubpage  - set to force a link to a subpage
    + * linktext - set to force the link text to something
    + * anchor - set to make the link include an anchor
    ++* genhtml - set to generate HTML and not escape for correct format
    + 
    + #### `readfile($;$)`
    + 
    +Index: doc/plugins/rst.mdwn
    +===================================================================
    +--- doc/plugins/rst.mdwn	(revision 3197)
    ++++ doc/plugins/rst.mdwn	(working copy)
    +@@ -10,10 +10,8 @@
    + Note that this plugin does not interoperate very well with the rest of
    + ikiwiki. Limitations include:
    + 
    +-* reStructuredText does not allow raw html to be inserted into
    +-  documents, but ikiwiki does so in many cases, including
    +-  [[WikiLinks|WikiLink]] and many
    +-  [[PreprocessorDirectives|PreprocessorDirective]].
    ++* Some bits of ikiwiki may still assume that markdown is used or embed html
    ++  in ways that break reStructuredText. (Report bugs if you find any.)
    + * It's slow; it forks a copy of python for each page. While there is a
    +   perl version of the reStructuredText processor, it is not being kept in
    +   sync with the standard version, so is not used.
    +Index: IkiWiki.pm
    +===================================================================
    +--- IkiWiki.pm	(revision 3197)
    ++++ IkiWiki.pm	(working copy)
    +@@ -469,6 +469,10 @@
    + 	my $page=shift; # the page that will contain the link (different for inline)
    + 	my $link=shift;
    + 	my %opts=@_;
    ++	# we are processing $lpage and so we need to format things in accordance
    ++	# with the formatting language of $lpage. inline generates HTML so links
    ++	# will be escaped separately.
    ++	my $type=pagetype($pagesources{$lpage});
    + 
    + 	my $bestlink;
    + 	if (! $opts{forcesubpage}) {
    +@@ -494,12 +498,17 @@
    + 	}
    + 	if (! grep { $_ eq $bestlink } map { @{$_} } values %renderedfiles) {
    + 		return $linktext unless length $config{cgiurl};
    +-		return "<a href=\"".
    +-			cgiurl(
    +-				do => "create",
    +-				page => pagetitle(lc($link), 1),
    +-				from => $lpage
    +-			).
    +-			"\">?</a>$linktext";
    ++		my $url = cgiurl(
    ++				 do => "create",
    ++				 page => pagetitle(lc($link), 1),
    ++				 from => $lpage
    ++				);
    ++
    ++		if ($hooks{htmlescapelink}{$type} && ! $opts{genhtml}){
    ++			return $hooks{htmlescapelink}{$type}{call}->($url, $linktext,
    ++							       broken => 1);
    ++		}
    ++		return "<a href=\"$url\">?</a>$linktext";
    + 	}
    + 	
    +@@ -514,6 +523,9 @@
    + 		$bestlink.="#".$opts{anchor};
    + 	}
    + 
    ++	if ($hooks{htmlescapelink}{$type} && !$opts{genhtml}) {
    ++	  return $hooks{htmlescapelink}{$type}{call}->($bestlink, $linktext);
    ++	}
    + 	return "<a href=\"$bestlink\">$linktext</a>";
    + } #}}}
    + 
    +@@ -628,6 +640,14 @@
    + 				preview => $preprocess_preview,
    + 			);
    + 			$preprocessing{$page}--;
    ++
    ++			# Handle escaping html if the htmlizer needs it.
    ++			if ($ret =~ /[<>]/ && $pagesources{$page}) {
    ++				my $type=pagetype($pagesources{$page});
    ++				if ($hooks{htmlescape}{$type}) {
    ++					return $hooks{htmlescape}{$type}{call}->($ret);
    ++				}
    ++			}
    + 			return $ret;
    + 		}
    + 		else {
    +
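The behaviour of the two rst hooks discussed above can be sketched outside of Perl. Here is an illustrative Python analogue (an assumption-laden sketch for readers, not the patch's actual code): `htmlescape` wraps raw HTML in a reST `.. raw:: html` directive, and `htmlescapelink` emits reST external-link syntax, with the `? <url>` form for broken links.

```python
def htmlescape(html):
    # Mirror of the patch's s/^/  /mg: indent every line by two spaces
    # so the HTML nests under the raw directive.
    indented = "".join("  " + line for line in html.splitlines(True))
    return ".. raw:: html\n\n" + indented

def htmlescapelink(url, text, broken=False):
    # reST external links: `text <url>`_ ; for a broken link the patch
    # emits a "?" link followed by an escaped space and the plain text.
    if broken:
        return "`? <%s>`_\\ %s" % (url, text)
    return "`%s <%s>`_" % (text, url)
```

For example, `htmlescapelink("sandbox.html", "SandBox")` yields the reST source `` `SandBox <sandbox.html>`_ ``.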
    diff --git a/doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn b/doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn new file mode 100644 index 000000000..2ef231dde --- /dev/null +++ b/doc/todo/hard-coded_location_for_man_pages_and_w3m_cgi_wrapper.mdwn @@ -0,0 +1,94 @@ +Hi, + +some operating systems use PREFIX/man instead of PREFIX/share/man as the base +directory for man pages and PREFIX/libexec/ instead of PREFIX/lib/ for files +like CGI programs. +At the moment the location of the installed man pages and the w3m cgi wrapper +is hard-coded in Makefile.PL. +The patch below makes it possible to install those files to alternative directories +while the default stays as it is now. + +> It should be possible to use the existing MakeMaker variables such as +> INSTALLMAN1DIR (though MakeMaker lacks one for man8). I'd prefer not +> adding new variables where MakeMaker already has them. --[[Joey]] + +[[tag patch]] + +
    +
    +  - Introduce two variables, IKI_MANDIR and IKI_W3MCGIDIR, to be set from
    +    the command line. This enables locations for man pages and the w3m
    +    cgi wrapper other than the hard-coded defaults in Makefile.PL.
    +
    +--- Makefile.PL.orig    2007-05-20 03:03:58.000000000 +0200
    ++++ Makefile.PL
    +@@ -3,9 +3,32 @@ use warnings;
    + use strict;
    + use ExtUtils::MakeMaker;
    + 
    ++my %params = ( 'IKI_MANDIR' => '$(PREFIX)/share/man',
    ++               'IKI_W3MCGIDIR' => '$(PREFIX)/lib/w3m/cgi-bin'
    ++             );
    ++
    ++@ARGV = grep {
    ++  my ($key, $value) = split(/=/, $_, 2);
    ++  if ( exists $params{$key} ) {
    ++    $params{$key} = $value;
    ++    print "Using $params{$key} for $key.\n";
    ++    0
    ++  } else {
    ++    1
    ++  }
    ++} @ARGV;
    ++
    ++
    + # Add a few more targets.
    + sub MY::postamble {
    +-q{
    ++  package MY;
    ++
    ++  my $scriptvars = <<"EOSCRIPTVARS";
    ++IKI_MANDIR = $params{'IKI_MANDIR'}
    ++IKI_W3MCGIDIR = $params{'IKI_W3MCGIDIR'}
    ++EOSCRIPTVARS
    ++
    ++  my $script = q{
    + all:: extra_build
    + clean:: extra_clean
    + install:: extra_install
    +@@ -56,23 +79,24 @@ extra_install:
    +                done; \
    +        done
    + 
    +-       install -d $(DESTDIR)$(PREFIX)/share/man/man1
    +-       install -m 644 ikiwiki.man $(DESTDIR)$(PREFIX)/share/man/man1/ikiwiki.1
    ++       install -d $(DESTDIR)$(IKI_MANDIR)/man1
    ++       install -m 644 ikiwiki.man $(DESTDIR)$(IKI_MANDIR)/man1/ikiwiki.1
    +        
    +-       install -d $(DESTDIR)$(PREFIX)/share/man/man8
    +-       install -m 644 ikiwiki-mass-rebuild.man $(DESTDIR)$(PREFIX)/share/man/man8/ikiwiki-mass-rebuild.8
    ++       install -d $(DESTDIR)$(IKI_MANDIR)/man8
    ++       install -m 644 ikiwiki-mass-rebuild.man $(DESTDIR)$(IKI_MANDIR)/man8/ikiwiki-mass-rebuild.8
    +        
    +        install -d $(DESTDIR)$(PREFIX)/sbin
    +        install ikiwiki-mass-rebuild $(DESTDIR)$(PREFIX)/sbin
    + 
    +-       install -d $(DESTDIR)$(PREFIX)/lib/w3m/cgi-bin
    +-       install ikiwiki-w3m.cgi $(DESTDIR)$(PREFIX)/lib/w3m/cgi-bin
    ++       install -d $(DESTDIR)$(IKI_W3MCGIDIR)
    ++       install ikiwiki-w3m.cgi $(DESTDIR)$(IKI_W3MCGIDIR)
    + 
    +        install -d $(DESTDIR)$(PREFIX)/bin
    +        install ikiwiki.out $(DESTDIR)$(PREFIX)/bin/ikiwiki
    + 
    +        $(MAKE) -C po install PREFIX=$(PREFIX)
    +-}
    ++};
    ++  return $scriptvars.$script;
    + }
    + 
    + WriteMakefile(
    +
    +
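The `@ARGV` filtering the patch adds to Makefile.PL can be illustrated with a small Python sketch (hypothetical, for explanation only): known `KEY=VALUE` arguments override the hard-coded defaults and are removed from the argument list, while everything else is passed through untouched.

```python
def apply_overrides(argv, params):
    """Record KEY=VALUE overrides for known keys; return remaining args."""
    remaining = []
    for arg in argv:
        key, sep, value = arg.partition("=")
        if sep and key in params:
            params[key] = value      # override the hard-coded default
        else:
            remaining.append(arg)    # leave for MakeMaker to handle
    return remaining
```

So `IKI_MANDIR=/usr/man` on the command line would replace the default, while unrecognised arguments such as `PREFIX=/opt` still reach MakeMaker.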
diff --git a/doc/todo/index.html_allowed.mdwn b/doc/todo/index.html_allowed.mdwn new file mode 100644 index 000000000..9c09eec5a --- /dev/null +++ b/doc/todo/index.html_allowed.mdwn @@ -0,0 +1,106 @@ +This page used to be used for two patches, one of which is applied +providing the usedirs option for output. The remaining patch, discussed +below, concerns wanting to use foo/index.mdwn source files and get an +output page name of foo, rather than foo/index. --[[Joey]] + +[[tag patch]] + +--- + +I independently implemented a similar, but smaller patch. +(It's smaller because I only care about rendering; not CGI, for example.) +The key to this patch is that "A/B/C" is treated as equivalent +to "A/B/C/index". +Here it is: --Per Bothner + + --- IkiWiki/Render.pm~ 2007-01-11 15:01:51.000000000 -0800 + +++ IkiWiki/Render.pm 2007-02-02 22:24:12.000000000 -0800 + @@ -60,9 +60,9 @@ + foreach my $dir (reverse split("/", $page)) { + if (! $skip) { + $path.="../"; + - unshift @ret, { url => $path.htmlpage($dir), page => pagetitle($dir) }; + + unshift @ret, { url => abs2rel(htmlpage(bestlink($page, $dir)), dirname($page)), page => pagetitle($dir) }; + } + - else { + + elsif ($dir ne "index") { + $skip=0; + } + } + + --- IkiWiki.pm~ 2007-01-12 12:47:09.000000000 -0800 + +++ IkiWiki.pm 2007-02-02 18:02:16.000000000 -0800 + @@ -315,6 +315,12 @@ + elsif (exists $pagecase{lc $l}) { + return $pagecase{lc $l}; + } + + else { + + my $lindex = $l . "/index"; + + if (exists $links{$lindex}) { + + return $lindex; + + } + + } + } while $cwd=~s!/?[^/]+$!!; + + if (length $config{userdir} && exists $links{"$config{userdir}/".lc($link)}) { + +Note I handle setting the url slightly differently. +Also note that an initial "index" is ignored. I.e. a +page "A/B/index.html" is treated as "A/B". + +> Actually, your patch is shorter because it's more elegant and better :) +> I'm withdrawing my old patch, because yours is much more in line with +> ikiwiki's design and architecture.
+> I would like to make one suggestion to your patch, which is: + + diff -urX ignorepats clean-ikidev/IkiWiki/Plugin/inline.pm ikidev/IkiWiki/Plugin/inline.pm + --- clean-ikidev/IkiWiki/Plugin/inline.pm 2007-02-25 12:26:54.099113000 -0800 + +++ ikidev/IkiWiki/Plugin/inline.pm 2007-02-25 14:55:21.163340000 -0800 + @@ -154,7 +154,7 @@ + $link=htmlpage($link) if defined $type; + $link=abs2rel($link, dirname($params{destpage})); + $template->param(pageurl => $link); + - $template->param(title => pagetitle(basename($page))); + + $template->param(title => titlename($page)); + $template->param(ctime => displaytime($pagectime{$page})); + + if ($actions) { + @@ -318,7 +318,7 @@ + my $pcontent = absolute_urls(get_inline_content($p, $page), $url); + + $itemtemplate->param( + - title => pagetitle(basename($p), 1), + + title => titlename($p, 1), + url => $u, + permalink => $u, + date_822 => date_822($pagectime{$p}), + diff -urX ignorepats clean-ikidev/IkiWiki/Render.pm ikidev/IkiWiki/Render.pm + --- clean-ikidev/IkiWiki/Render.pm 2007-02-25 12:26:54.745833000 -0800 + +++ ikidev/IkiWiki/Render.pm 2007-02-25 14:54:01.564715000 -0800 + @@ -110,7 +110,7 @@ + $template->param( + title => $page eq 'index' + ? $config{wikiname} + - : pagetitle(basename($page)), + + : titlename($page), + wikiname => $config{wikiname}, + parentlinks => [parentlinks($page)], + content => $content, + diff -urX ignorepats clean-ikidev/IkiWiki.pm ikidev/IkiWiki.pm + --- clean-ikidev/IkiWiki.pm 2007-02-25 12:26:58.812850000 -0800 + +++ ikidev/IkiWiki.pm 2007-02-25 15:05:22.328852000 -0800 + @@ -192,6 +192,12 @@ + return $untainted; + } #}}} + + +sub titlename($;@) { #{{{ + + my $page = shift; + + $page =~ s!/index$!!; + + return pagetitle(basename($page), @_); + +} #}}} + + + sub basename ($) { #{{{ + my $file=shift; + + +> This way foo/index gets "foo" as its title, not "index". 
--Ethan diff --git a/doc/todo/l10n.mdwn b/doc/todo/l10n.mdwn new file mode 100644 index 000000000..3369bec11 --- /dev/null +++ b/doc/todo/l10n.mdwn @@ -0,0 +1,61 @@ +From [[Recai]]: +> Here is my initial work on ikiwiki l10n infrastructure (I'm sending it +> before finalizing, there may be errors). + +I've revised the patches (tested OK): + +- $config{lang} patch: + + + + + Support for CGI::FormBuilder. + + Modify Makefile.PL for l10n. + +- l10n infrastructure from Koha project. (This patch must be applied with + '-p1', also, it needs a 'chmod +x l10n/*.pl' after patching.) + + + Leave templates dir untouched, use a temporary translations directory + instead. + + Fix Makefile (it failed to update templates). + + http://people.debian.org/~roktas/patches/ikiwiki/ikiwiki-l10n.diff + +However... + +> fine. Also a final note, I haven't examined the quality of generated +> templates yet. + +Looks like, tmpl_process3 cannot preserve line breaks in template files. +For example, it processed the following template: + + Someone[1], possibly you, requested that you be emailed the password for +user + on [2]. + + The password is: + + -- + ikiwiki + + [1] The user requesting the password was at IP address + [2] Located at + +as (in Turkish): + +Birisi[1], ki muhtemelen bu sizsiniz, [2] üzerindeki + kullanıcısına ait parolanın epostalanması isteğinde +bulundu. Parola: -- ikiwiki [1] Parolayı isteyen +kullanıcının ait IP adresi: [2] + +> Looks like, tmpl_process3 cannot preserve line breaks in template files. +> For example, it processed the following template: + +This could be easily worked around in tmpl_process3, but I wouldn't like to +maintain a separate utility. + +---- + +As to the hardcoded strings in ikiwiki, I've internationalized the program, +and there is a po/ikiwiki.pot in the source that can be translated. 
+--[[Joey]] diff --git a/doc/todo/more_class__61____34____34___for_css.mdwn b/doc/todo/more_class__61____34____34___for_css.mdwn new file mode 100644 index 000000000..064b6b35d --- /dev/null +++ b/doc/todo/more_class__61____34____34___for_css.mdwn @@ -0,0 +1,61 @@ +I'm writing my own CSS for ikiwiki. During this effort I often found the need of adding more class="" attributes to the default ikiwiki templates. This way more presentational aspects of visual formatting can be delegated to CSS and removed from the HTML structure. + +In this patch I plan to collect changes in this direction. + +The first, one-liner, patch is to use a "div" element with a +class="actions" attribute for inline page as is done with non-inlined page. +This way the same CSS formatting can be applied to div.actions in the CSS, +while at the moment it must be duplicated for a span.actions (which I +believe is also incorrect, since it will contain a "ul" element, not sure +though). In case the markup should be differentiated it will still be +possible relying on the fact that a div.actions is contained or not in a +div.inlinepage. + +Here's the one-liner: + +> applied --[[Joey]] + +The following adds a div element with class="trailer" around the meta-information +added after an inlined page (namely: the post date, the tags, and the actions): + + --- inlinepage.tmpl.orig 2006-12-28 16:56:49.000000000 +0100 + +++ inlinepage.tmpl 2006-12-28 17:02:06.000000000 +0100 + @@ -17,6 +17,8 @@ + + + + +
    + + + + Posted + + @@ -44,3 +46,5 @@ + + +
    + + + + + +[[tag patch]] + +> Unfortunately, the inlinepage content passes through markdown, and markdown +> gets confused by these nested div's and puts p's around one of them, generating +> broken html. If you can come up with a way to put in the div that passes +> the test suite, or a fix to markdown, I will accept it, but the above patch +> fails the test suite. --[[Joey]] + +>> Just a note... This discrepancy doesn't exist in [pandoc](http://code.google.com/p/pandoc/) as +>> demonstrated in the relevant [page](http://code.google.com/p/pandoc/wiki/PandocVsMarkdownPl). +>> Pandoc is a _real parser_ for markdown (contrasting the regexp based implementation of +>> markdown.pl). I've almost finished the Debian packaging. John is working on a `--strict` mode +>> which will hopefully make pandoc a drop-in replacement for markdown. I'll upload pandoc after +>> his work has finished. Whether it could be used in IkiWiki is an open question, but having +>> alternatives is always a good thing and perhaps, the fact that pandoc can make markdown->LaTeX +>> conversion may lead to new possibilities. --[[Roktas]] + +>>> I confirm that this ([[debbug 405058]]) has just been fixed in markdown +>>> [`1.0.2b7`](http://packages.debian.org/experimental/web/markdown) (BTW, thanks to your bug +>>> report Joey). FYI, I've observed some performance drop with `1.0.2b7` compared to `1.0.1`, +>>> especially noticeable with big files. This was also confirmed by someone else, for example, +>>> see this [thread](http://six.pairlist.net/pipermail/markdown-discuss/2006-August/000152.html) +>>> --[[Roktas]] diff --git a/doc/todo/rcs___40__third-party_plugin__41__.mdwn b/doc/todo/rcs___40__third-party_plugin__41__.mdwn new file mode 100644 index 000000000..3793f7533 --- /dev/null +++ b/doc/todo/rcs___40__third-party_plugin__41__.mdwn @@ -0,0 +1,25 @@ +Here is a beginning of an rcs plugin that uses rcsmerge, rcs, ci, co and rlog. +I have used it probably over a hundred times, but it needs some work.
    + + + +[[tag patch]] + +> Clearly needs some cleanup and perhaps some of the missing stubs +> implemented, before it can be included into ikiwiki. +> +> Notes on individual functions: +> +> * rcs_prepedit - I'm not sure why you do the locking since the comment +> notes that the locking does no good.. +> +> * rcs_getctime - You ask why this would be better than mtime. It's +> because with something like subversion, a file's modification time or +> ctime is not necessarily accurate WRT when the file was first checked +> into the repo. +> +--[[Joey]] + +Also here is a quick script to browse the RCS history to use for "historyurl". + + diff --git a/doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn b/doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn new file mode 100644 index 000000000..9b3e015a5 --- /dev/null +++ b/doc/todo/varioki_--_add_template_variables___40__with_closures_for_values__41___in_ikiwiki.setup.mdwn @@ -0,0 +1,235 @@ +varioki - Add variables for use in ikiwiki templates + +This plugin attempts to provide a means to add variables for use in ikiwiki templates, based on a hash variable set in the ikiwiki configuration file. The motivation for this plugin was to provide an easy way for end users to add information to be used in templates -- for example, my "Blosxom" blog entry template does fancy things with the date components of the entry, and there was no easy way to get that information into the template. Or if one wants to have a different page template for the top level index page than for the rest of the pages in the wiki (for example, to only put special content, like, say, "last.fm" play lists, only on the front page). + +This plugin hooks itself into the "pagetemplate" hook, and adds parameters to the appropriate templates based on the type.
For example, the following inserted into "ikiwiki.setup" creates "TMPL_VAR MOTTO" and "TOPLVL" which can then be used in your templates. + + varioki => { + 'motto' => '"Manoj\'s musings"', + 'toplvl' => 'sub {return $page eq "index"}' + }, + +For every key in the configured hash, the corresponding value is evaluated. Based on whether the value was a stringified scalar, code, array, or hash, the value of the template parameter is generated on the fly. The available variables are whatever is available to "pagetemplate" hook scripts, namely, $page, $destpage, and $template. Additionally, the global variables and functions as defined in the Ikiwiki documentation () may be used. + +ManojSrivastava + +> I think you could now implement "toplvl" using [[conditionals|/plugins/conditional]]: +> +> \[[if test="destpage(/index)" then="""...""" else="""..."""]] +> +> --[[JoshTriplett]] + +> Here's a dump of the file Manoj sent me, for reference. +> +> My take on this is that simple plugins can do the same sort of things, this is +> kind of wanting to avoid the plugin mechanism and just use templates and +> stuff in the config file. Not too thrilled about that. --[[Joey]] + +----
    +* looking for srivasta@debian.org--2006-misc/ikiwiki--upstream--1.0--patch-488 to compare with
    +* comparing to srivasta@debian.org--2006-misc/ikiwiki--upstream--1.0--patch-488: ................................................................ done.
    +
    +* added files
    +
    +--- /dev/null
    ++++ mod/IkiWiki/Plugin/.arch-ids/varioki.pm.id
    +@@ -0,0 +1 @@
    ++Manoj Srivastava  Thu Dec  7 12:59:07 2006 12659.0
    +--- /dev/null
    ++++ mod/IkiWiki/Plugin/varioki.pm
    +@@ -0,0 +1,190 @@
    ++#!/usr/bin/perl
    ++#                              -*- Mode: Cperl -*- 
    ++# varioki.pm --- 
    ++# Author           : Manoj Srivastava ( srivasta@glaurung.internal.golden-gryphon.com ) 
    ++# Created On       : Wed Dec  6 22:25:44 2006
    ++# Created On Node  : glaurung.internal.golden-gryphon.com
    ++# Last Modified By : Manoj Srivastava
    ++# Last Modified On : Thu Dec  7 13:07:36 2006
    ++# Last Machine Used: glaurung.internal.golden-gryphon.com
    ++# Update Count     : 127
    ++# Status           : Unknown, Use with caution!
    ++# HISTORY          : 
    ++# Description      : 
    ++# 
    ++# arch-tag: 6961717b-156f-4ab2-980f-0d6a973aea21
    ++#
    ++# Copyright (c) 2006 Manoj Srivastava 
    ++#
    ++# This program is free software; you can redistribute it and/or modify
    ++# it under the terms of the GNU General Public License as published by
    ++# the Free Software Foundation; either version 2 of the License, or
    ++# (at your option) any later version.
    ++#
    ++# This program is distributed in the hope that it will be useful,
    ++# but WITHOUT ANY WARRANTY; without even the implied warranty of
    ++# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    ++# GNU General Public License for more details.
    ++#
    ++# You should have received a copy of the GNU General Public License
    ++# along with this program; if not, write to the Free Software
    ++# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
    ++#
    ++
    ++require 5.002;
    ++
    ++package IkiWiki::Plugin::varioki;
    ++
    ++use warnings;
    ++use strict;
    ++use IkiWiki '1.00';
    ++
    ++our $VERSION = "0.1";
    ++my $file = __FILE__;
    ++
    ++
    ++=head1 NAME
    ++
    ++varioki - Add variables for use in ikiwiki templates
    ++
    ++=cut
    ++
    ++=head1 DESCRIPTION
    ++
    ++This plugin attempts to provide a means to add variables for use in
    ++ikiwiki templates, based on a hash variable set in the ikiwiki
    ++configuration file. The motivation for this plugin was to provide an
    ++easy way for end users to add information to be used in templates --
    ++for example, my C<Blosxom> blog entry template does fancy things with
    ++the date components of the entry, and there was no easy way to get
    ++that information into the template. Or if one wants to have a
    ++different page template for the top level index page than for the rest
    ++of the pages in the wiki (for example, to only put special content,
    ++like, say, C<last.fm> play lists, only on the front page).
    ++
    ++This plugin hooks itself into the C<pagetemplate> hook, and adds
    ++parameters to the appropriate templates based on the type. For
    ++example, the following inserted into C<ikiwiki.setup> creates
    ++C<TMPL_VAR MOTTO>, C<TOPLVL>, C<ARRAYVAR> and C<HASHVAR> which can
    ++then be used in your templates. The array and hash variables are only
    ++for completeness; I suspect that the first two forms are all that are
    ++really required.
    ++
    ++ varioki => {
    ++   'motto'    => '"Manoj\'s musings"',
    ++   'toplvl'   => 'sub {return $page eq "index"}',
    ++   'arrayvar' => '[0, 1, 2, 3]',
    ++   'hashvar'  => '{1, 1, 2, 2}'
    ++ },
    ++
    ++Please note that the values in the hash must be simple strings which
    ++are then eval'd, so a string value has to be double quoted, as above
    ++(the eval strips off the outer quotes).  
    ++
    ++=cut
    ++
    ++
    ++sub import { #{{{
    ++	hook(type => "pagetemplate", id => "varioki", call => \&pagetemplate);
    ++} # }}}
    ++
    ++
    ++=pod
    ++
    ++For every key in the configured hash, the corresponding value is
    ++evaluated.  Based on whether the value was a stringified scalar, code,
    ++array, or hash, the value of the template parameter is generated on
    ++the fly.  The available variables are whatever is available to
    ++C<pagetemplate> hook scripts, namely, C<$page>, C<$destpage>, and
    ++C<$template>.  Additionally, the global variables and functions as
    ++defined in the Ikiwiki documentation
    ++(L) may be used.
    ++
    ++=cut
    ++
    ++sub pagetemplate (@) { #{{{
    ++	my %params=@_;
    ++	my $page=$params{page};
    ++	my $template=$params{template};
    ++        
    ++        return unless defined $config{varioki};
    ++         for my $var (keys %{$config{varioki}}) {
    ++           my $value;
    ++           my $foo;
    ++           eval "\$foo=$config{varioki}{$var}";
    ++           if (ref($foo) eq "CODE") {
    ++             $value = $foo->();
    ++           }
    ++           elsif (ref($foo) eq "SCALAR") {
    ++             $value = $foo;
    ++           }
    ++           elsif (ref($foo) eq "ARRAY") {
    ++             $value = join ' ', @$foo;
    ++           }
    ++           elsif (ref($foo) eq "HASH") {
    ++             for my $i (values %$foo ) {
    ++               $value .= ' ' . "$i";
    ++             }
    ++           }
    ++           else {
    ++             $value = $foo;
    ++           }
    ++           warn "$page $var $value\n";
    ++           if ($template->query(name => "$var")) {
    ++             $template->param("$var" =>"$value");
    ++           }
    ++        }
    ++} # }}}
    ++
    ++1;
    ++
    ++=head1 CAVEATS
    ++
    ++This is very inchoate, at the moment, and needs testing. Also, there
    ++is no good way to determine how to handle hashes as values --
    ++currently, the code just joins all hash values with spaces, but it
    ++would be easier for the user to just use an anonymous sub instead of
    ++passing in a hash or an array.
    ++
    ++=cut
    ++
    ++=head1 BUGS
    ++
    ++Since C<ikiwiki> evals the configuration file, the values have to all be
    ++on a single physical line. This is the reason we need to use strings
    ++and eval, instead of just passing in real anonymous sub references,
    ++since the eval pass converts the coderef into a string of the form
    ++"(CODE 12de345657)" which can't be dereferenced.
    ++
    ++=cut
    ++
    ++=head1 AUTHOR
    ++
    ++Manoj Srivastava 
    ++
    ++=head1 COPYRIGHT AND LICENSE
    ++
    ++This script is a part of the Devotee package, and is 
    ++
    ++Copyright (c) 2002 Manoj Srivastava 
    ++
    ++This program is free software; you can redistribute it and/or modify
    ++it under the terms of the GNU General Public License as published by
    ++the Free Software Foundation; either version 2 of the License, or
    ++(at your option) any later version.
    ++
    ++This program is distributed in the hope that it will be useful,
    ++but WITHOUT ANY WARRANTY; without even the implied warranty of
    ++MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    ++GNU General Public License for more details.
    ++
    ++You should have received a copy of the GNU General Public License
    ++along with this program; if not, write to the Free Software
    ++Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA
    ++
    ++=cut
    ++
    ++1;
    ++
    ++__END__
    ++
    +
    +
    +[[tag patch]]
diff --git a/doc/todo/wikiwyg.mdwn b/doc/todo/wikiwyg.mdwn
index 5e3430dc3..f8f04c245 100644
--- a/doc/todo/wikiwyg.mdwn
+++ b/doc/todo/wikiwyg.mdwn
@@ -25,3 +25,20 @@ off in WikiText mode.
 
 [[tag soc]]
 [[tag wishlist]]
+
+[[tag patch]]
+
+Project IkiWiki::WIKIWYG v0.8 -
+===========================================================
+
+[Wikiwyg][] is a "What you see is what you get" editor for wikis. It will allow you to double click on the text in a wiki and save it without reloading the page. The IkiWiki version will allow you to edit your wiki in Markdown or WYSIWYG.
+
+The plugin can be downloaded from
+
+### Current Issues
+
+* Code sections starting with 4 spaces do not work
+* Adding links in the WYSIWYG editor is difficult
+* Double lists don't work
+
+[Wikiwyg]: http://www.wikiwyg.net/
diff --git a/doc/todo/wikiwyg/discussion.mdwn b/doc/todo/wikiwyg/discussion.mdwn
new file mode 100644
index 000000000..93b9c8ce1
--- /dev/null
+++ b/doc/todo/wikiwyg/discussion.mdwn
@@ -0,0 +1,33 @@
+Very nice! There are some rough spots yes, but this looks exactly as I'd
+hoped it would, and seems close to being ready for merging.
+
+A few observations, in approximate order of priority:
+
+* What's the copyright and license of showdown? Please include that from
+  the original zip file.
+* What happens if there are concurrent edits? The CGI.pm modification to
+  save an edited wikiwyg part doesn't seem to check if the source file has
+  changed in the meantime, so if the part has moved around, it might
+  replace the wrong part on saving. I've not tested this.
+* The stuff you have in destdir now really belongs in basewiki so it's
+  copied over to any destdir.
+* Personally, I'm not sure if I need double-click to edit a section in my
+  wiki, but I'd love it if the edit form in the cgi could use wikiwyg. Seems
+  like both of these could be independent options. Doable, I'm sure?
+* It would be good to move as much as possible of the inlined javascript in
+  wikiwyg.tmpl out to a separate .js file to save space in the rendered
+  pages.
+* Both this plugin and the [[Gallery_Plugin_for_Ikiwiki]] are turning out
+  to need to add a bunch of pages to the basewiki. I wonder what would be a
+  good way to do this, without bloating the basewiki when the plugins arn't
+  used. Perhaps the underlaydir concept needs to be expanded so it's a set
+  of directories, which plugins can add to. Perhaps you should work with
+  arpitjain on this so both plugins can benefit. (The smiley plugin would
+  also benefit from this..)
+* Is there any way of only loading enough of wikiwyg by default to catch
+  the section double-clicks, and have it load the rest on the fly? I'm
+  thinking about initial page load time when visiting a wikiwyg-using wiki
+  for the first time. I count 230k or so of data that a browser downloads
+  in that case..
+
+--[[Joey]]
diff --git a/doc/translation.mdwn b/doc/translation.mdwn
index b224d7031..58a8f4b48 100644
--- a/doc/translation.mdwn
+++ b/doc/translation.mdwn
@@ -24,7 +24,7 @@ essentailly three peices needed for a complete translation:
 
 1. The templates also need to be translated. Some work has been done on
    an infrastructure for maintaining translated templates, as documented in
-   [[patchqueue/l10n]], but until that's complete, you'd need to copy and
+   [[todo/l10n]], but until that's complete, you'd need to copy and
    translate the templates by hand.
 
 1. The [[basewiki]] itself needs to be translated. Whether to only translate
diff --git a/doc/users/KarlMW/discussion.mdwn b/doc/users/KarlMW/discussion.mdwn
index 037635122..9117abcab 100644
--- a/doc/users/KarlMW/discussion.mdwn
+++ b/doc/users/KarlMW/discussion.mdwn
@@ -13,7 +13,7 @@ things that need changing then I will probably need help/guidance.
 --[[KarlMW]]
 
 > The main problem I see is the html escaping issue. This is not really
-> unique to asciidoc, see [[patchqueue/format_escape]]. I wonder if the
+> unique to asciidoc, see [[todo/format_escape]]. I wonder if the
 > technique provided by that patch could be used to let your plugin
 > automatically handle the escaping. Unfortunatey, I have not yet gotten
 > around to reviewing/applying the patch. --[[Joey]]
@@ -22,4 +22,4 @@ things that need changing then I will probably need help/guidance.
 
 >> I suspect that asciidoc can't really be made to play nice to the extent that I would want casual users/abusers to have it as a markup option on a live wiki - it's fine for a personal site where you can look at the output before putting it online, but I think it would be a hideously gaping integrity hole for anything more than that. However, for a personal site (as I am using it), it does seem to have its uses.
 
->> I'll keep an eye on the format_escape plugin, and assuming it is accepted into ikiwiki, will see if I can apply it to asciidoc. --[[KarlMW]]
\ No newline at end of file
+>> I'll keep an eye on the format_escape plugin, and assuming it is accepted into ikiwiki, will see if I can apply it to asciidoc. --[[KarlMW]]
diff --git a/doc/wishlist.mdwn b/doc/wishlist.mdwn
index 0c67fa975..6c3c601b1 100644
--- a/doc/wishlist.mdwn
+++ b/doc/wishlist.mdwn
@@ -3,4 +3,4 @@ improvements people would like to see in ikiwiki. Good patches for any of
 these will likely be accepted.
 
 [[inline pages="todo/* and !todo/done and !link(todo/done) and
-link(wishlist) and !todo/*/*" archive=yes show=0]]
+link(wishlist) and !link(patch) and !link(wishlist) and !todo/*/*" archive=yes show=0]]