`; or even just
+>
+> \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtable]]
+>
+> where tagtable.tmpl looks like
+>
+>
+>     your tag here
+>
+> I don't think you're deriving much benefit from Markdown's table syntax
+> if you have to mix it with HTML::Template and ikiwiki directives,
+> and be pathologically careful with whitespace. "Right tool for the job"
+> and all that :-)
+>
+> When I edited this page I was amused to find that you used HTML,
+> not Markdown, as its format. It seems oddly appropriate to my answer, but
+> I've converted it to Markdown and adjusted the formatting, for easier
+> commenting.
+> --[[smcv]]
diff --git a/doc/bugs/Please_update_highlight_plugin_for_highlight_3.18.mdwn b/doc/bugs/Please_update_highlight_plugin_for_highlight_3.18.mdwn
new file mode 100644
index 000000000..e98f66881
--- /dev/null
+++ b/doc/bugs/Please_update_highlight_plugin_for_highlight_3.18.mdwn
@@ -0,0 +1,12 @@
+I have put two patches at
+
+ git://pivot.cs.unb.ca/ikiwiki.git -b master
+
+The first works around a highlight API change, and the second supports the new(ish)
+feature of having multiple directories with language definitions for highlight.
+
+The corresponding version of libhighlight-perl is in Debian experimental if you want to test.
+
+[[!tag patch]]
+
+> [[done]] thanks --[[Joey]]
diff --git a/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn b/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn
index 70deda2ab..7e7548657 100644
--- a/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn
+++ b/doc/bugs/__91____91____33__inline_postform__61__no__93____93___doesn__39__t_disable_it.mdwn
@@ -1,3 +1,8 @@
+[[!tag patch users/smcv/ready]]
+[[!template id=gitbranch branch=smcv/ready/postform-no
+author="[[Simon McVittie|smcv]]"
+browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/postform-no]]
+
The [[ikiwiki/directive/inline]] directive generates a form if
it has either rootpage, or postform with a "yes-like" value. This
means that
@@ -9,4 +14,10 @@ mentioning rootpage there is useless).
See also [[forum/How_to_disable_"Add_a_new_post_titled:"_submission_form?]].
+My `ready/postform-no` branch also contains a trivial regression test for
+`inline`. So far the only thing it really tests is that this bug was fixed,
+not the actual inlining of pages, but it's a start.
+
--[[smcv]]
+
+>> this looks simple, straightforward and good to me --[[chrysn]]
diff --git a/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn b/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
index b55605245..627b2c827 100644
--- a/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
+++ b/doc/bugs/can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream....mdwn
@@ -56,19 +56,36 @@ Weird... --[[anarcat]]
> >
> > --[[anarcat]]
+> > > [[!template id=gitbranch branch=ready/more-magic author="[[smcv]]" browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/ready/more-magic]]
> > > If the regex match isn't necessary and it's just about deleting the
-> > > parameters, I think I'd prefer something like
+> > > parameters, I think I'd prefer
> > >
> > > if (! defined $mimetype) {
> > > ...
> > > }
> > > $mimetype =~ s/;.*//;
> > >
-> > > but I'd be hesitant to do that without knowing why Joey implemented it
-> > > the way it is. If it's about catching a result from file(1) that
+> > > as done in my `ready/more-magic` branch.
+> > >
+> > > I'm a little hesitant to do that without knowing why Joey implemented it
+> > > the way it is, but as far as I can tell it's just an oversight.
+> > >
+> > > Or, if the result of the s/// is checked for a reason, and it's
+> > > about catching a result from file(1) that
> > > is not, in fact, a MIME type at all (empty string or error message
> > > or something), maybe something more like this?
> > >
> > > if (! defined $mimetype || $mimetype !~ s{[-\w]+/[-\w]+(?:;.*)?}{})
> > >
> > > (or whatever the allowed characters in MIME types are). --[[smcv]]
+
+> > > > I don't mind either way, but i feel this should be fixed for the next release, as I need to reapply this patch at every upgrade now. -- [[anarcat]]
+
+> > > > > This is still a problem in 3.20140831. -- [[anarcat]]
+
+> > > > > > I still don't think appending a semicolon is the right answer:
+> > > > > > at best it's equivalent to what I suggested, and at worst it's
+> > > > > > disabling a check that does have some reason behind it.
+> > > > > > I've turned the version I suggested above into a proper branch.
+> > > > > > Review by someone who can commit to ikiwiki.git would be appreciated.
+> > > > > > --[[smcv]]
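The two options under discussion — just stripping parameters, versus also validating that what's left looks like a MIME type at all — can be sketched as follows (Python for illustration only; `normalize_mimetype` is a hypothetical helper, and the regexes mirror the Perl above, including smcv's caveat about which characters MIME types really allow):

```python
import re

def normalize_mimetype(mimetype):
    # strip parameters, as in the Perl  $mimetype =~ s/;.*//
    # then accept the result only if it looks like type/subtype,
    # guarding against file(1) emitting an error message instead
    if mimetype is None:
        return None
    mimetype = re.sub(r";.*", "", mimetype, flags=re.S)
    return mimetype if re.fullmatch(r"[-\w]+/[-\w]+", mimetype) else None
```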
diff --git a/doc/bugs/conditional_preprocess_during_scan.mdwn b/doc/bugs/conditional_preprocess_during_scan.mdwn
index 1ba142331..739be8286 100644
--- a/doc/bugs/conditional_preprocess_during_scan.mdwn
+++ b/doc/bugs/conditional_preprocess_during_scan.mdwn
@@ -55,3 +55,58 @@ reprocessed is done so in the same conditions as the original call.
>> with vicious conditional dependency circles that would break/unbreak
>> depending on which pass we are in. And I believe this is an intrinsic
>> limitation of the system, which cannot be solved at all.
+
+>>> One way forward that I can think of for this issue is to
+>>> have a way to tell `\[[!if]]` which answer it should assume for
+>>> scanning purposes, so it would assume that answer when running
+>>> in the scan phase, and really evaluate the pagespec when running
+>>> in the render phase. For instance:
+>>>
+>>> \[[!if test="enabled(foo)" scan_assume=yes then="""
+>>> \[[!foo]]
+>>> """]]
+>>>
+>>> could maybe scan \[[!foo]] unconditionally.
+>>>
+>>> This makes me wonder whether `\[[!if]]` was too general: by having
+>>> the full generality of pagespecs, it reduces its possible uses to
+>>> "those contexts where pagespecs work".
+>>>
+>>> Another possibility might be to have "complex" pagespecs and sort
+>>> orders (those whose correct answer requires scanning to have completed,
+>>> like `link()` and sorting by `meta(title)`) throw an error when used in
+>>> the scan phase, but simple pagespecs like `enabled()` and `glob()`, and
+>>> simple sort orders like `title` and `path`, could continue to work?
+>>> My `wip-too-soon` work-in-progress branch is heading in this direction,
+>>> although it currently makes `pagespec_match` fail completely and does
+>>> not even allow "simple" pagespecs and sort orders.
+>>>
+>>> At the moment, if a pagespec cannot be evaluated, `\[[!if]]` will
+>>> produce neither the `then` clause nor the `else` clause. This could
+>>> get pretty confusing if it is run during the scan phase and produces
+>>> an error, then run during the render phase and succeeds: if you had,
+>>> say,
+>>>
+>>> \[[!if run_during_scan=1 test="link(foo)" then="""
+>>> there is a link to foo
+>>> \[[!tag there_is_a_link_to_foo]]
+>>> """ else="""
+>>> there is no link to foo
+>>> \[[!tag there_is_no_link_to_foo]]
+>>> """]]
+>>>
+>>> then the resulting page would contain one of the snippets of text,
+>>> but its metadata would contain neither of the tags. Perhaps the plugin
+>>> would have to remember that it failed during the scan phase, so that
+>>> it could warn about the failure during the render phase instead of,
+>>> or in addition to, producing its normal output?
+>>>
+>>> Of the conditional-specific tests, `included()` and `destpage(glob)`
+>>> can never match during scan.
+>>>
+>>> Does anyone actually use `\[[!if]]` in ways that they would want to
+>>> be active during scan, other than an `enabled(foo)` test?
+>>> I'm increasingly tempted to add `\[[!ifenabled foo]]` to solve
+>>> that single case, and call that a solution to this bug...
+>>>
+>>> --[[smcv]]
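The "simple vs. complex pagespec" split could look something like this (a sketch with hypothetical names, not the `wip-too-soon` branch's actual code): function-style terms whose answer needs a completed scan are rejected during the scan phase, while `enabled()` and friends go through.

```python
import re

# illustrative, not exhaustive: terms answerable before links are collected
SCAN_SAFE = {"enabled", "glob"}

def assert_scan_safe(pagespec, phase):
    # find function-style terms like link(foo); bare globs are always safe
    funcs = set(re.findall(r"(\w+)\(", pagespec))
    needs_scan = funcs - SCAN_SAFE
    if phase == "scan" and needs_scan:
        raise ValueError(
            "pagespec needs completed scan: " + ", ".join(sorted(needs_scan)))
```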
diff --git a/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn b/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn
index 4b22fd06c..de4296000 100644
--- a/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn
+++ b/doc/bugs/cutpaste.pm:_missing_filter_call.mdwn
@@ -1,7 +1,7 @@
Consider this:
- $ wget http://schwinge.homeip.net/~thomas/tmp/cutpaste_filter.tar.bz2
- $ wget http://schwinge.homeip.net/~thomas/tmp/cutpaste_filter.patch
+ $ wget http://nic-nac-project.de/~schwinge/ikiwiki/cutpaste_filter.tar.bz2
+ $ wget http://nic-nac-project.de/~schwinge/ikiwiki/0001-cutpaste.pm-missing-filter-call.patch
$ tar -xj < cutpaste_filter.tar.bz2
$ cd cutpaste_filter/
diff --git a/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn b/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
new file mode 100644
index 000000000..f83f960ce
--- /dev/null
+++ b/doc/bugs/debwiki_shortcut_creates_buggy_URLs_to_subpages.mdwn
@@ -0,0 +1,5 @@
+E.g. [[!debwiki Derivatives/Guidelines]].
+
+Maybe we should use `%S` instead of `%s` in the shortcut definition?
+
+> seems reasonable, [[done]] --[[smcv]]
diff --git a/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn b/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn
index d8af150c1..c7d0ffbe2 100644
--- a/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn
+++ b/doc/bugs/editing_gitbranch_template_is_really_slow.mdwn
@@ -29,3 +29,37 @@ of the problem is that it evaluates the pagespec
`backlink(plugins/goodstuff)` up to a million times, with various pages and locations.
--[[smcv]]
+
+> [[!template id=gitbranch branch=smcv/ready/perf
+author="[[Simon McVittie|smcv]]"
+browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/perf]]
+> [[!tag patch users/smcv/ready]]
+>
+> Previously, if a page like `plugins/trail` contained a conditional like
+>
+> \[[!if test="backlink(plugins/goodstuff)" all=no]]
+>
+> (which it gets via `templates/gitbranch`), then the
+> [[plugins/conditional]] plugin would give `plugins/trail` a dependency on
+> `(backlink(plugins/goodstuff)) and plugins/trail`. This dependency is
+> useless: that pagespec can never match any page other than
+> `plugins/trail`, but if `plugins/trail` has been modified or deleted,
+> then it's going to be rendered or deleted *anyway*, so there's no point
+> in spending time evaluating match_backlink for it.
+>
+> Conversely, the influences from the result were not taken into account,
+> so `plugins/trail` did not have the
+> `{ "plugins/goodstuff" => $DEPEND_LINKS }` dependency that it should.
+>
+> We should invert that, depending on the influences but not on the test.
+>
+> This is at least an order of magnitude faster: when I edit the docwiki
+> as described above, a refresh takes 37s with nytprof overhead, compared
+> with 458s with nytprof overhead before this change. Without nytprof,
+> that refresh takes 14s, which is faster than the 24s rebuild again.
+> I didn't record how long the refresh took without nytprof before this
+> change, but it was something like 200s.
+>
+> `bestlink` is still the single most expensive function in this refresh
+> at ~ 9.5s, with `match_glob` at ~ 5.2s as the runner-up.
+> --[[smcv]]
diff --git a/doc/bugs/enabling_or_disabling_plugin_x_does_not_rebuild_pages_that_use_enabled__40__x__41__.mdwn b/doc/bugs/enabling_or_disabling_plugin_x_does_not_rebuild_pages_that_use_enabled__40__x__41__.mdwn
new file mode 100644
index 000000000..4b4adb2c6
--- /dev/null
+++ b/doc/bugs/enabling_or_disabling_plugin_x_does_not_rebuild_pages_that_use_enabled__40__x__41__.mdwn
@@ -0,0 +1,11 @@
+If you have a page like
+
+ \[[!if test="enabled(smileys)" then=":-P"]]
+
+then enabling or disabling the smileys plugin will not rebuild it.
+
+Unfortunately, I can't think of a good way to solve this without
+introducing a special case for `enabled()` in Render.pm, either a
+new dependency type `"enabled(smileys)" => $DEPENDS_ENABLED`
+or a special case that treats `"enabled(smileys)" => $DEPENDS_PRESENCE`
+differently. --[[smcv]]
diff --git a/doc/bugs/error_handlers_with_gettext_can_clobber___36____64__.mdwn b/doc/bugs/error_handlers_with_gettext_can_clobber___36____64__.mdwn
index 00656c1f0..719c1ef25 100644
--- a/doc/bugs/error_handlers_with_gettext_can_clobber___36____64__.mdwn
+++ b/doc/bugs/error_handlers_with_gettext_can_clobber___36____64__.mdwn
@@ -25,3 +25,5 @@ for fixing this would be to depend on something like Try::Tiny,
which is already indirectly recommended by ikiwiki, because
[[!cpan RPC::XML]], [[!cpan XML::Feed]], etc., depend on it.
--[[smcv]]
+
+[[fixed in 3.20140227|done]] --s
diff --git a/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
new file mode 100644
index 000000000..657b86baa
--- /dev/null
+++ b/doc/bugs/garbled_non-ascii_characters_in_body_in_web_interface.mdwn
@@ -0,0 +1,126 @@
+since my latest jessie upgrade here, charsets are all broken when editing a page. the page i'm trying to edit is [this wishlist](http://anarc.at/wishlist/), and it used to work fine. now, instead of:
+
+`Voici des choses que vous pouvez m'acheter si vous êtes le Père Nowel (yeah right):`
+
+... as we see in the rendered body right now, when i edit the page i see:
+
+`Voici des choses que vous pouvez m'acheter si vous �tes le P�re Nowel (yeah right):`
+
+... a typical double-encoding nightmare. The actual binary data is this for the word "Père" according to `hd`:
+
+~~~~
+anarcat@marcos:ikiwiki$ echo "Père" | hd
+00000000 50 c3 a8 72 65 0a |P..re.|
+00000006
+anarcat@marcos:ikiwiki$ echo "P�re" | hd
+00000000 50 ef bf bd 72 65 0a |P...re.|
+00000007
+~~~~
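For what it's worth, those captured bytes can be checked directly: `c3 a8` is a correctly encoded `è`, while `ef bf bd` is the UTF-8 encoding of U+FFFD REPLACEMENT CHARACTER — so the broken copy isn't an alternate encoding of the same character, the original data was already destroyed before it reached `hd`. A quick check of the byte values above:

```python
# byte sequences exactly as captured with hd above, minus the newline
good = bytes.fromhex("50 c3 a8 72 65")
bad = bytes.fromhex("50 ef bf bd 72 65")

assert good.decode("utf-8") == "Père"
# ef bf bd is U+FFFD, the "information already lost" marker
assert bad.decode("utf-8") == "P\N{REPLACEMENT CHARACTER}re"
```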
+
+> I don't know what that is, but it isn't the usual double-UTF-8 encoding:
+>
+> >>> u'è'.encode('utf-8')
+> '\xc3\xa8'
+> >>> u'è'.encode('utf-8').decode('latin-1').encode('utf-8')
+> '\xc3\x83\xc2\xa8'
+>
+> A packet capture of the incorrect HTTP request/response headers and body
+> might be enlightening? --[[smcv]]
+>
+> > Here are the headers according to chromium:
+> >
+> > ~~~~
+> > GET /ikiwiki.cgi?do=edit&page=wishlist HTTP/1.1
+> > Host: anarc.at
+> > Connection: keep-alive
+> > Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
+> > User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36
+> > Referer: http://anarc.at/wishlist/
+> > Accept-Encoding: gzip,deflate,sdch
+> > Accept-Language: fr,en-US;q=0.8,en;q=0.6
+> > Cookie: openid_provider=openid; ikiwiki_session_anarcat=XXXXXXXXXXXXXXXXXXXXXXX
+> >
+> > HTTP/1.1 200 OK
+> > Date: Mon, 08 Sep 2014 21:22:24 GMT
+> > Server: Apache/2.4.10 (Debian)
+> > Set-Cookie: ikiwiki_session_anarcat=XXXXXXXXXXXXXXXXXXXXXXX; path=/; HttpOnly
+> > Vary: Accept-Encoding
+> > Content-Encoding: gzip
+> > Content-Length: 4093
+> > Keep-Alive: timeout=5, max=100
+> > Connection: Keep-Alive
+> > Content-Type: text/html; charset=utf-8
+> > ~~~~
+> >
+> > ... which seem fairly normal... getting more data than this is a little inconvenient since the data is gzip-encoded and i'm kind of lazy extracting that from the stream. Chromium does seem to auto-detect it as utf8 according to the menus however... not sure what's going on here. I would focus on the following error however, since it's clearly emanating from the CGI... --[[anarcat]]
+
+Clicking on the Cancel button yields the following warning:
+
+~~~~
+Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215.
+~~~~
+
+> Looks as though you might be able to get a Python-style backtrace for this
+> by setting `$Carp::Verbose = 1`.
+>
+> The error is that we're taking some string (which string? only a backtrace
+> would tell you) that is already flagged as Unicode, and trying to decode
+> it from byte-blob to Unicode again, analogous to this Python:
+>
+> some_bytes.decode('utf-8').decode('utf-8')
+>
+> --[[smcv]]
+> >
+> > I couldn't figure out where to set that Carp thing - it doesn't work simply by setting it in /usr/bin/ikiwiki - so i am not sure how to use this. However, with some debugging code in Encode.pm, i was able to find a case of double-encoding - in the left menu, for example, which is the source of the Encode.pm crash.
+> >
+> > It seems that some unicode semantics changed in Perl 5.20, or more precisely, in Encode.pm 2.53, according to [this](https://code.activestate.com/lists/perl-unicode/3314/). 5.20 does have significant Unicode changes, but I am not sure they are related (see [perldelta](https://metacpan.org/pod/distribution/perl/pod/perldelta.pod)). Doing more archeology, it seems that Encode.pm is indeed where the problem started, all the way back in [commit 8005a82](https://github.com/dankogai/p5-encode/commit/8005a82d8aa83024d72b14e66d9eb97d82029eeb#diff-f3330aa405ffb7e3fec2395c1fc953ac) (august 2013), taken from [pull request #11](https://github.com/dankogai/p5-encode/pull/11) which expressly forbids double-decoding, in effect failing like python does in the above example you gave (Perl used to silently succeed instead, a rather big change if you ask me).
+> >
+> > So stepping back, it seems that this would be a bug in Ikiwiki. It could be in any of those places:
+> >
+> > ~~~~
+> > anarcat@marcos:ikiwiki$ grep -r decode_utf8 IkiWiki* | wc -l
+> > 31
+> > ~~~~
+> >
+> > Now the fun part is to determine which one should be turned off... or should we duplicate the logic that was removed in decode_utf8, or make a safe_decode_utf8 for ourselves? --[[anarcat]]
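The "safe_decode_utf8" idea anarcat floats — decode only when the string has not already been decoded — is easy to state in Python terms, where the str/bytes distinction makes Perl's `is_utf8` flag explicit (a sketch of the idea, not ikiwiki's actual code):

```python
def safe_decode_utf8(value):
    # decode only when we really have bytes; an already-decoded string
    # is returned untouched, avoiding the Encode.pm-style double decode
    if isinstance(value, bytes):
        return value.decode("utf-8")
    return value
```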
+
+The apache logs yield:
+
+~~~~
+[Mon Sep 08 16:17:43.995827 2014] [cgi:error] [pid 2609] [client 192.168.0.3:47445] AH01215: Died at /usr/share/perl5/IkiWiki/CGI.pm line 467., referer: http://anarc.at/ikiwiki.cgi?do=edit&page=wishlist
+~~~~
+
+Interestingly enough, I can't reproduce the bug here (at least in this page). Also, editing the page through git works fine.
+
+I had put ikiwiki on hold during the last upgrade, so it was upgraded separately. The bug happens both with 3.20140613 and 3.20140831. The major thing that happened today is the upgrade from perl 5.18 to 5.20. Here's the output of `egrep '[0-9] (remove|purge|install|upgrade)' /var/log/dpkg.log | pastebinit -b paste.debian.net` to give an idea of what was upgraded today:
+
+http://paste.debian.net/plain/119944
+
+This is a major bug which should probably be fixed before jessie, yet i can't seem to find a severity statement in reportbug that would justify blocking the release based on this - unless we consider non-english speakers as "most" users (i don't know the demographics well enough). It certainly makes ikiwiki completely unusable for my users that operate on the web interface in french... --[[anarcat]]
+
+Note that on this one page, i can't even get the textarea to display and i immediately get `Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215`: http://anarc.at/ikiwiki.cgi?do=edit&page=hardware%2Fserver%2Fmarcos.
+
+Also note that this is the same as [[forum/"Error: cannot decode string with wide characters" on Mageia Linux x86-64 Cauldron]], I believe. The backtrace I get here is:
+
+~~~~
+Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215. Encode::decode_utf8("**Menu**\x{d}\x{a}\x{d}\x{a} * [[\x{fffd} propos|index]]\x{d}\x{a} * [[Logiciels|software]]"...)
+called at /usr/share/perl5/IkiWiki/CGI.pm line 117 IkiWiki::decode_form_utf8(CGI::FormBuilder=HASH(0x2ad63b8))
+called at /usr/share/perl5/IkiWiki/Plugin/editpage.pm line 90 IkiWiki::cgi_editpage(CGI=HASH(0xd514f8), CGI::Session=HASH(0x27797e0))
+called at /usr/share/perl5/IkiWiki/CGI.pm line 443 IkiWiki::__ANON__(CODE(0xfaa460))
+called at /usr/share/perl5/IkiWiki.pm line 2101 IkiWiki::run_hooks("sessioncgi", CODE(0x2520138))
+called at /usr/share/perl5/IkiWiki/CGI.pm line 443 IkiWiki::cgi()
+called at /usr/bin/ikiwiki line 192 eval {...}
+called at /usr/bin/ikiwiki line 192 IkiWiki::main()
+called at /usr/bin/ikiwiki line 231
+~~~~
+
+so this would explain the error on cancel, but doesn't explain the weird encoding i get when editing the page...
+
+... and that leads me to this crazy patch which fixes all the above issue, by avoiding double-decoding... go figure that shit out...
+
+[[!template id=gitbranch branch=anarcat/dev/safe_unicode author="[[anarcat]]"]]
+
+> [[Looks good to me|users/smcv/ready]] although I'm not sure how valuable
+> the `$] < 5.02 || ` test is - I'd be tempted to just call `is_utf8`. --[[smcv]]
+
+>> [[merged|done]] --[[smcv]]
diff --git a/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn b/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn
index c535f88a4..6425c1ece 100644
--- a/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn
+++ b/doc/bugs/image_rescaling_distorts_with_small_pictures.mdwn
@@ -1 +1,49 @@
If you use the rescaling feature of the directive [[ikiwiki/directive/img/]] with a smaller image it will distort. E.g. an image with 150x250 rescaled into size=200x200. --bastla
+
+> More specifically: `img` normally preserves aspect ratio:
+> `size=200x200` normally means "as large as possible, keeping
+> the width 200px or less, the height 200px or less, and the
+> aspect ratio correct". So a 4:3 image with `size=200x200`
+> would actually come out 200px wide and 150px tall.
+>
+> However, when (desired width is specified) && (desired height is specified)
+> && ((width > desired width) || (height > desired height)),
+> it uses exactly the desired size, without preserving aspect ratio.
+> --smcv
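The aspect-preserving behaviour smcv describes — and the correct answer for the 150x250 case in the report — can be written down in a few lines (illustrative only, not the `img` plugin's actual code):

```python
def fit_within(width, height, max_w, max_h):
    # scale down (never up), preserving aspect ratio
    scale = min(max_w / width, max_h / height, 1.0)
    return round(width * scale), round(height * scale)

# a 4:3 image with size=200x200 comes out 200x150, as described above;
# the 150x250 image from the report should come out 120x200, not 200x200
```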
+
+>> [[!template id=gitbranch branch=chrysn/imgforpdf-and-more author="[[chrysn]]"]]
+>>
+>> [[!tag patch]]
+>>
+>> i've implemented a fix for this along with a unit test.
+>>
+>> the patch branch is based on the imgforpdf branch
+>> ([[bugs/svg and pdf conversion fails]]), because it would not cleanly merge.
+>> the branch also enhances on how images are handled in preview, falling back
+>> to data: urls if the image has not been rendered in a saved version. please
+>> review. --[[chrysn]]
+
+>>> Mostly [[looks good to me|users/smcv/ready]].
+>>>
+>>> Minor things, which wouldn't stop me merging it if I could:
+>>>
+>>> * `$imgdatalink = "data:image/".$im->Get("magick").";base64,".encode_base64($blob[0]);`:
+>>> is the ImageMagick file type always valid as the second part of
+>>> a MIME type?
+>>> * In this code:
+>>>
+>>> +open (my $outhtmlfd, "<", "$outpath.html");
+>>> +local $/=undef;
+>>> +my $outhtml = <$outhtmlfd>;
+>>> +close $outhtmlfd;
+>>>
+>>> no block is closed, so the "local" is ineffective, so the `<>` operator
+>>> remains in read-entire-file mode afterwards. To avoid odd side-effects,
+>>> I would suggest using `readfile()` like `t/trail.t` does.
+>>>
+>>> [[!template id=gitbranch branch=smcv/ready/imgforpdf-and-more author="[[chrysn]], [[smcv]]"
+>>> browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/imgforpdf-and-more]]
+>>> I've used `readfile()` (but not done anything about the ImageMagick file type)
+>>> in my copy of the branch.
+>>>
+>>> --[[smcv]]
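On the MIME-type question: the Perl concatenation under review amounts to the following, and the doubt is exactly whether ImageMagick's `Get("magick")` value is always a valid subtype (SVG's real type is `image/svg+xml`, for instance). A sketch for illustration, with `data_url` a hypothetical helper:

```python
import base64

def data_url(mimetype, blob):
    # mirrors the Perl string concatenation above; the caller must
    # supply a mimetype that is actually valid, which is the open
    # question about ImageMagick's Get("magick") result
    return "data:{};base64,{}".format(
        mimetype, base64.b64encode(blob).decode("ascii"))
```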
diff --git a/doc/bugs/linkmap_displays_underscore_escapes.mdwn b/doc/bugs/linkmap_displays_underscore_escapes.mdwn
index 66bffc159..14164d076 100644
--- a/doc/bugs/linkmap_displays_underscore_escapes.mdwn
+++ b/doc/bugs/linkmap_displays_underscore_escapes.mdwn
@@ -17,5 +17,19 @@ the attached [[!taglink patch]] fixes this; from its commit message:
the output will look much better (at least in my wikis) with the "[[bugs/pagetitle function does not respect meta titles]]" issue fixed.
+> [[Looks good to me|users/smcv/ready]].
+>
+> I don't think it's correct for `pagetitle()` to output `\[[!meta title]]`
+> though, as discussed on the linked bug: it appears in an assortment of
+> contexts where the full formal title of the page seems inappropriate.
+> If you want linkmap to use `\[[!meta title]]`, I think it would be
+> better to give it a `show` parameter, like `\[[!map]]` has?
+> --[[smcv]]
+
+>> sounds good; i'll have a look at it the next time i touch the linkmap
+>> plugin. the patch at hand would be a starting point for that. --[[chrysn]]
+
the patch is stored in [[the patch.pl]] as created by git-format-patch, and can
be pulled from the abovementioned branch.
+
+> update 2014-06-29: branch still merges cleanly and works. --[[chrysn]]
diff --git a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
index b462948eb..ad52d780a 100644
--- a/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
+++ b/doc/bugs/listdirectives_doesn__39__t_register_a_link.mdwn
@@ -94,3 +94,21 @@ The [[ikiwiki/directive/listdirectives]]` directive doesn't register a link betw
>>> "add_reachable". On the other hand, maybe that's too computationally
>>> intensive to actually do; I haven't tried it.
>>> --[[smcv]]
+>>>>
+>>>> (I'll interpret Joey's silence as a good sign ;-). Is there a difference between "link to it" and "path to it"? If we assume autoindex produces bona fide "first class" links, there shouldn't be one!?
+>>>>
+>>>> So far your idea sounds great, says me without any knowledge of the source. I'll try to grok it. Is there a medium for silly questions? A wiki seems not the right fit for that. -- [[holger]]
+>>>>> Yes, there *has* to be a difference between a first class wikilink
+>>>>> and the thing to which `map` and `inline` can contribute.
+>>>>> `map` and `inline` use a pagespec to decide what they include,
+>>>>> and pagespecs can't be evaluated and get a correct answer until the
+>>>>> set of links has been collected, because their results often depend
+>>>>> on the set of links. Otherwise, suppose you had a page `foo` whose only
+>>>>> contents were this:
+>>>>>
+>>>>> \[[!inline pages="!backlink(foo)"]]
+>>>>>
+>>>>> If `inline` generated links, it would inline exactly those pages that
+>>>>> it doesn't inline. That's never going to end well :-) --[[smcv]]
+>>>>>> We have to differentiate between what users of ikiwiki consider first class links and what internally is happening. For the user any link contributing to the structured access tree is first class. The code on the other hand has to differentiate between the static links, then generated links, then orphan links. Three "passes", even your proposed solution could be seen as adding another pass since the orphan plugin has to run after all the plugins generating (first class user) links. -- [[holger]]
+
diff --git a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
index 90e2c7900..dd5016619 100644
--- a/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
+++ b/doc/bugs/notifyemail_fails_with_some_openid_providers.mdwn
@@ -65,3 +65,27 @@ It would probably be better to add a comment on the field as indicated above, bu
Any other ideas? --[[anarcat]]
> Note: it seems that my email *is* given by my OpenID provider, no idea why this is not working, but the fix proposed in my branch works. --[[anarcat]]
+
+>> Note: this is one of two patches i need to apply at every upgrade. The other being [[can__39__t_upload_a_simple_png_image:_prohibited_by_allowed__95__attachments___40__file_MIME_type_is_application__47__octet-stream...]]. --[[anarcat]]
+
+>>> Is there any sort of check that the owner of the given email address
+>>> wants to receive email from us, or way for the owner of that email
+>>> address to stop getting the emails?
+>>>
+>>> With passwordauth, if someone maliciously subscribes my email
+>>> address to high-traffic pages or something (by using it as the
+>>> email address of their wiki login), I can at least use
+>>> password-recovery to hijack their account and unsubscribe myself.
+>>> If they're signing in with an OpenID not associated with my
+>>> email address and then changing the email address in the userdb
+>>> to point to me, I don't think I can do that.
+>>>
+>>> With OpenID, I think we're just trusting that the OpenID provider
+>>> wouldn't give us an unverified email address, which also seems
+>>> a little unwise.
+>>>
+>>> It might be better to give ikiwiki a concept of verifying an
+>>> email address (the usual send-magic-token flow) and only be
+>>> willing to send notifications to a verified address?
+>>>
+>>> --[[smcv]]
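The send-magic-token flow suggested above is the standard one: derive an unguessable token from the address plus a site secret, mail it out, and only mark the address verified (and eligible for notifications) when it comes back unchanged. A sketch with hypothetical names, not ikiwiki code:

```python
import hashlib
import hmac

def make_token(email, site_secret):
    # token mailed to `email` as part of the verification link
    return hmac.new(site_secret, email.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def verify_token(email, site_secret, token):
    # constant-time comparison; only verified addresses get notifications
    return hmac.compare_digest(make_token(email, site_secret), token)
```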
diff --git a/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn b/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn
new file mode 100644
index 000000000..073c10d14
--- /dev/null
+++ b/doc/bugs/openid_login_fails_wirth_Could_not_determine_ID_provider_from_URL.mdwn
@@ -0,0 +1,200 @@
+On some ikiwikis that I run, I get the following error on OpenID logins:
+
+ no_identity_server: Could not determine ID provider from URL.
+
+> Is this fixed now that [[!debbug 738493]] has been fixed? --[[smcv]]
+
+> > No, it isn't. I still get: `no_identity_server: Could not determine ID provider from URL.` from the latest ikiwiki in jessie (3.20140831), with liblwpx-paranoidagent-perl 1.10-3. Debugging tells me it's still related to the `500 Can't verify SSL peers without knowing which Certificate Authorities to trust` error, so probably because `Mozilla::CA` is not packaged ([[!debbug 702124]]). I still had to apply the patch to disable SSL verification at the end of this file. However, setting `$ENV{PERL_LWP_SSL_CA_PATH} = '/etc/ssl/certs';` seems to work now, so the following dumb patch works:
+> >
+> > ~~~~
+> > --- /usr/bin/ikiwiki.orig 2014-09-08 15:48:35.715868902 -0400
+> > +++ /usr/bin/ikiwiki 2014-09-08 15:50:29.666779878 -0400
+> > @@ -225,4 +225,5 @@
+> > }
+> > }
+> >
+> > +$ENV{PERL_LWP_SSL_CA_PATH} = '/etc/ssl/certs';
+> > main;
+> > ~~~~
+> >
+> > may not be the best place to fiddle around with this, but then again it makes sense that it applies to the whole program. it should probably be reported upstream as well. also in my git repo. -- [[anarcat]]
+> >
+> > > This seems Debian-specific. I would be inclined to consider this to be
+> > > a packaging/system-integration (i.e. non-upstream) bug in
+> > > `liblwpx-paranoidagent-perl` rather than an upstream bug in IkiWiki;
+> > > it certainly seems inappropriate to put this Debian-specific path
+> > > in upstream IkiWiki. If it can't be fixed in LWPX::ParanoidAgent for
+> > > whatever reason, applying it via some sort of sed in ikiwiki's
+> > > `debian/rules` might be more reasonable? --[[smcv]]
+> > >
+> > > > by "upstream", i did mean `liblwpx-paranoidagent-perl`. so yeah, maybe this should be punted back into that package's court again. :( --[[anarcat]]
+> > > >
+> > > > done, by bumping the severity of [[!debbug 744404]] to release-critical. --[[anarcat]]
+> > > >
+> > > > > ooh cool, the bug was fixed already with an upload, so this should probably be considered [[done]] at this point, even without the patch below! great! -- [[anarcat]]
+
+[[!template id=gitbranch branch=anarcat/dev/ssl_ca_path author="[[anarcat]]"]]
+
+I seem to recall having that error before, and fixing it, but it always seems to come back and I forget how to fix it. So I'll just open this bug and document it if i can figure it out... -- [[users/anarcat]]
+
+The Perl module manual says:
+
+> "no_identity_server"
+> (CV) Tried to do discovery on a URL that does not seem to have any providers at all.
+
+Yet on the server side, I see no request coming in on the OpenID provider...
+
+Adding debugging helps in figuring out wtf is going on:
+
+~~~~
+anarcat@marcos:~$ diff -u ~/src/ikiwiki/IkiWiki/Plugin/openid.pm /usr/share/perl5/IkiWiki/Plugin/openid.pm
+--- /home/anarcat/src/ikiwiki/IkiWiki/Plugin/openid.pm 2014-02-03 20:21:09.502878631 -0500
++++ /usr/share/perl5/IkiWiki/Plugin/openid.pm 2014-04-13 11:45:25.413297420 -0400
+@@ -257,6 +256,7 @@
+ return Net::OpenID::Consumer->new(
+ ua => $ua,
+ args => $q,
++ debug => 1,
+ consumer_secret => sub { return shift()+$secret },
+ required_root => auto_upgrade_https($q, $cgiurl),
+ );
+~~~~
+
+In my case, I see:
+
+~~~~
+[Sun Apr 13 11:45:35.796531 2014] [cgi:error] [pid 7299] [client 162.223.3.24:39547] AH01215: [DEBUG Net::OpenID::Consumer] Cache MISS for https://id.koumbit.net/anarcat, referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:45:35.842520 2014] [cgi:error] [pid 7299] [client 162.223.3.24:39547] AH01215: [DEBUG Net::OpenID::Consumer] Cache MISS for https://id.koumbit.net/anarcat, referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:45:35.845603 2014] [cgi:error] [pid 7299] [client 162.223.3.24:39547] AH01215: [DEBUG Net::OpenID::Consumer] semantic info (https://id.koumbit.net/anarcat) = , referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:45:35.845672 2014] [cgi:error] [pid 7299] [client 162.223.3.24:39547] AH01215: [DEBUG Net::OpenID::Consumer] fail(no_identity_server) Could not determine ID provider from URL., referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+~~~~
+
+There are three places in the code where the original error message can be raised:
+
+* Net::OpenID::claimed_identity
+* Net::OpenID::verified_identity
+* Net::OpenID::_find_openid_server
+
+We'll look at the last one because it's where the URL data is actually fetched.
+
+[[!format perl """
+sub _find_openid_server {
+ my Net::OpenID::Consumer $self = shift;
+ my $url = shift;
+ my $final_url_ref = shift;
+
+ my $sem_info = $self->_find_semantic_info($url, $final_url_ref) or
+ return;
+
+ return $self->_fail("no_identity_server") unless $sem_info->{"openid.server"};
+ $sem_info->{"openid.server"};
+}
+"""]]
+
+From there we look at `_find_semantic_info()`, which is supposed to hit the OpenID server, but somehow doesn't... By cranking up the debugging further, we can see that the consumer fails to verify the HTTPS certificate of the host:
+
+~~~~
+[Sun Apr 13 11:58:30.284511 2014] [cgi:error] [pid 11141] [client 162.223.3.24:39563] AH01215: [DEBUG Net::OpenID::Consumer] url dump (https://id.koumbit.net/anarcat, SCALAR(0x3275ac0)) = 500 Can't verify SSL peers without knowing which Certificate Authorities to trust, referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:58:30.284551 2014] [cgi:error] [pid 11141] [client 162.223.3.24:39563] AH01215: , referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:58:30.284573 2014] [cgi:error] [pid 11141] [client 162.223.3.24:39563] AH01215: This problem can be fixed by either setting the PERL_LWP_SSL_CA_FILE, referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:58:30.284593 2014] [cgi:error] [pid 11141] [client 162.223.3.24:39563] AH01215: envirionment variable or by installing the Mozilla::CA module., referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+[Sun Apr 13 11:58:30.284597 2014] [cgi:error] [pid 11141] [client 162.223.3.24:39563] AH01215: , referer: http://cats.orangeseeds.org/ikiwiki.cgi?do=signin&action=verify&openid_identifier=https%3A%2F%2Fid.koumbit.net%2Fanarcat
+~~~~
+
+To get this little wonder, I had to change `_find_semantic_info()` as follows:
+
+[[!format perl """
+sub _find_semantic_info {
+ my Net::OpenID::Consumer $self = shift;
+ my $url = shift;
+ my $final_url_ref = shift;
+
+ my $doc = $self->_get_url_contents($url, $final_url_ref);
+ $self->_debug("url dump ($url, $final_url_ref) = " . $doc) if $self->{debug};
+ my $info = _document_to_semantic_info($doc);
+ $self->_debug("semantic info ($url) = " . join(", ", map { $_.' => '.$info->{$_} } keys %$info)) if $self->{debug};
+
+ return $info;
+}
+"""]]
+
+A minimal test case would be:
+
+~~~~
+perl -e 'use LWPx::ParanoidAgent;
+ print $LWPx::ParanoidAgent::VERSION, " $]: ";
+ print length(LWPx::ParanoidAgent->new->get
+ ("https://id.koumbit.net/anarcat")
+ ->decoded_content), "\n";'
+~~~~
+
+And the results vary according to the version of perl:
+
+* wheezy: 1.07 5.014002: 5720
+* jessie: 1.10 5.018002: 398
+
+Thanks [jwz](http://www.jwz.org/blog/2014/03/apple-broke-lwp-in-a-new-and-exciting-way-on-10-9-2/) for that. Mozilla::CA *could* have been packaged in Debian, but it overlaps with the `ca-certificates` package, so it was [basically barred entry](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=702124).
+
+I tried the workaround of hardcoding the path to the CA root, using `PERL_LWP_SSL_CA_PATH=/etc/ssl/certs`, but then I hit *another* bug in LWP: [#738493](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=738493).
+
+Note that this bug is similar to [[bugs/ssl_certificates_not_checked_with_openid/]], but backwards: here the SSL certificates *are* checked, but the check fails because no CA bundle can be found.
+
+I filed this bug in the Debian BTS as [#744404](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=744404). Downgrading to wheezy's version of LWPx::ParanoidAgent doesn't fix the problem; instead I get this error:
+
+    500 Can't read entity body: Resource temporarily unavailable
+
+... yet the commandline client works fine... I'm out of ideas for this sucker.
+
+Update: I found a way to reproduce the problem even with LWPx::ParanoidAgent 1.07:
+
+~~~~
+$ perl -e 'use LWPx::ParanoidAgent;
+ print $LWPx::ParanoidAgent::VERSION, " $]\n";
+ $ua = new LWPx::ParanoidAgent; for (my $i = 0; $i< 10 ; $i++) { $c = LWPx::ParanoidAgent->new->get
+ ("https://id.koumbit.net/anarcat")
+ ->decoded_content; if (length($c) < 100) { print $c; } else { print length($c),"\n";}}'
+1.07 5.018002
+5720
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+500 Can't read entity body: Ressource temporairement non disponible
+~~~~
+
+Workaround: force the fallback from `LWPx::ParanoidAgent` to plain `LWP::UserAgent`:
+
+~~~~
+--- /home/anarcat/src/ikiwiki/IkiWiki/Plugin/openid.pm 2014-02-03 20:21:09.502878631 -0500
++++ /usr/share/perl5/IkiWiki/Plugin/openid.pm 2014-04-13 16:00:06.875744596 -0400
+@@ -237,7 +237,7 @@
+
+ my $ua;
+ eval q{use LWPx::ParanoidAgent};
+- if (! $@) {
++ if (! $@ && 0) {
+ $ua=LWPx::ParanoidAgent->new;
+ }
+ else {
+~~~~
+
+> I get the same trouble with OpenID and some locally installed versions of IkiWiki on Debian wheezy (server) as well as on Ubuntu 13.10 (laptop). To be precise, I hit the *other* bug in LWP: [#738493](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=738493).
+>
+> My only workaround for now was to set `PERL_LWP_SSL_VERIFY_HOSTNAME` to 0 directly in `ikiwiki` :-( -- [[users/bbb]]
+
+~~~~
+--- /usr/bin/ikiwiki.orig 2014-09-08 15:48:35.715868902 -0400
++++ /usr/bin/ikiwiki 2014-09-08 15:48:38.895947911 -0400
+@@ -225,4 +225,5 @@
+ }
+ }
+
++$ENV{PERL_LWP_SSL_VERIFY_HOSTNAME} = 0;
+ main;
+~~~~
+
diff --git a/doc/bugs/password_reset_fails_with___34__Wide_character_in_subroutine_entry__34__.mdwn b/doc/bugs/password_reset_fails_with___34__Wide_character_in_subroutine_entry__34__.mdwn
new file mode 100644
index 000000000..b9452a5ef
--- /dev/null
+++ b/doc/bugs/password_reset_fails_with___34__Wide_character_in_subroutine_entry__34__.mdwn
@@ -0,0 +1,29 @@
+Similar to [[bugs/syslog_fails_with_non-ASCII_wikinames]], this bug happens when the site name contains non-ASCII characters. In my case, it contains the "Câ¶TS" string.
+
+We get the following error in a password reset:
+
+    Error: Wide character in subroutine entry at /usr/share/perl5/Mail/Sendmail.pm line 308.
+
+Help! :) --[[anarcat]]
+
+> I assume this means Mail::Sendmail doesn't know how to send Unicode
+> strings, so any string passed to it (or any message body, or something?)
+> will need to be passed through `encode_utf8()`. It looks as though
+> Mail::Sendmail also defaults to
+>
+>     Content-Type: 'text/plain; charset="iso-8859-1"'
+>
+> so it'll need a `'Content-Type' => 'text/plain; charset="utf-8"'`
+> too.
+>
+> I'm disappointed to see how many of the library modules used by ikiwiki
+> are not Unicode-clean... but then again, Mail::Sendmail was last released
+> in 2003 so it's hardly surprising. I wonder whether [[!cpan Email::Sender]]
+> is any better?
+>
+> (If you know Python 2, the analogous situation would be "doesn't
+> know how to send unicode objects, so you have to get a str object
+> with `a_unicode_object.encode('utf-8')`".) --[[smcv]]
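To make the analogy concrete, here is a minimal Python 3 sketch of the same encode-before-sending idea (illustrative only; this is not code from ikiwiki or Mail::Sendmail):

```python
# A text (unicode) string must be encoded to bytes before being handed
# to a byte-oriented mailer, and the declared charset must match the
# encoding actually used.
body = "Wiki name: schön"
payload = body.encode("utf-8")
content_type = 'text/plain; charset="utf-8"'

# Decoding with the declared charset restores the original text.
assert payload.decode("utf-8") == body
# An iso-8859-1 default, like Mail::Sendmail's, yields different bytes here.
assert body.encode("iso-8859-1") != payload
```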
+
+>> Shameless plug: [[todo/passwordauth:_sendmail_interface]]. Though, I have
+>> no idea whether that is UTF-8-safe. --[[tschwinge]]
diff --git a/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn b/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn
new file mode 100644
index 000000000..bb6cd17d3
--- /dev/null
+++ b/doc/bugs/possible_to_post_comments_that_will_not_be_displayed.mdwn
@@ -0,0 +1,32 @@
+[[!template id=gitbranch branch=smcv/ready/comments author="[[smcv]]"
+browse="http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/comments"]]
+[[!tag patch users/smcv/ready]]
+
+The ability to post comments depends on several factors:
+
+* `comments_pagespec` controls whether comments on a particular
+ page will be displayed
+* `comments_closed_pagespec` controls whether comments on
+ a particular page are allowed
+* the `check_canedit` call controls whether comments are allowed
+ for a particular combination of page and user
+
+If `check_canedit` says that a user can post a comment
+(in particular, if [[plugins/opendiscussion]] is enabled or
+[[plugins/lockedit]] is disabled or permissive),
+and `comments_closed_pagespec` does not contradict it,
+then users who construct a `do=comment` CGI URL manually
+can post comments that will not be displayed. I don't think
+this is a security flaw as such, which is why I'm not
+reporting it privately, but it violates least-astonishment.
+
+My `ready/comments` branch fixes this, by changing the test
+at submission time from (pseudocode)
+
+    !comments_closed_pagespec && check_canedit
+
+to
+
+    comments_pagespec && !comments_closed_pagespec && check_canedit
+
+--[[smcv]]
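The change can be sketched as plain boolean predicates (the names here are illustrative stand-ins for the pagespec matches and the `check_canedit` call; this is not ikiwiki's actual Perl):

```python
def old_check(displayed, closed, can_edit):
    # Submission-time test before the fix: ignores whether the
    # comment would ever be displayed.
    return (not closed) and can_edit

def new_check(displayed, closed, can_edit):
    # Submission-time test after the fix: also requires that
    # comments_pagespec matches, i.e. the comment will be shown.
    return displayed and (not closed) and can_edit

# A page whose comments are never displayed, but not explicitly closed:
assert old_check(displayed=False, closed=False, can_edit=True)       # accepted
assert not new_check(displayed=False, closed=False, can_edit=True)   # rejected
```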
diff --git a/doc/bugs/preprocessing_loop_control_too_tight.mdwn b/doc/bugs/preprocessing_loop_control_too_tight.mdwn
index 807d6b7ef..7cf92af57 100644
--- a/doc/bugs/preprocessing_loop_control_too_tight.mdwn
+++ b/doc/bugs/preprocessing_loop_control_too_tight.mdwn
@@ -18,6 +18,6 @@ index 75c9579..ad0f8b0 100644
[[!tag patch]]
-> [[Seems reasonable|users/smcv/yesplease]] --smcv
+> [[Seems reasonable|users/smcv/ready]] --smcv
>> [[done]] --[[Joey]]
diff --git a/doc/bugs/pythonproxy-utf8_again.mdwn b/doc/bugs/pythonproxy-utf8_again.mdwn
new file mode 100644
index 000000000..cc6d11de7
--- /dev/null
+++ b/doc/bugs/pythonproxy-utf8_again.mdwn
@@ -0,0 +1,68 @@
+[[!template id=gitbranch branch=chrysn/more-proxy-utf8-fail author="[[chrysn]]"]]
+[[!template id=gitbranch author="[[chrysn]], [[smcv]]" branch=smcv/ready/more-proxy-utf8-fail
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/more-proxy-utf8-fail]]
+
+the recently introduced fixes for [[crashes in the python proxy even if disabled]]
+caused the typical python2 implicit conversion failures ("'ascii' codec
+can't...") on my debian sid system -- to fix it, i had to revert commit 154c4ea9e.
+
+i did not dig down all the way to the xml / xmlrpc modules, but my impression
+is that some module changed its behavior between stable and sid and now
+generates `unicode` strings instead of `str`.
+
+a [[patch]] to allow both versions by inspecting the types and en-/decoding on
+demand should work both for anarcat's and my case. i did not test the python3
+version, but i'm pretty sure it was already broken after the abovementioned
+patch.
+
+-- [[chrysn]]
+
+> update 2014-06-29: the problem persists, but i found it is not trivial to
+> reproduce. to demonstrate, use this test plugin:
+>
+>     #!/usr/bin/env python
+>     # -*- coding: utf-8 -*-
+>
+>     from proxy import IkiWikiProcedureProxy
+>
+>     def preprocess(self, proxy, *args):
+>         return repr(self.rpc('pagetype', 'schön'))
+>
+>     proxy = IkiWikiProcedureProxy(__name__)
+>     proxy.hook('preprocess', preprocess, id='testdirective')
+>     proxy.run()
+>
+> note that when the 'schön' is stored in a variable, the exception changes --
+> it seems to me that the issue is related to the way exceptions are encoded.
+>
+> the suggested patch still applies and solves the issue. --[[chrysn]]
+
+>> In this patch band:
+>>
+>>     - xml = _IkiWikiExtPluginXMLRPCHandler._read(in_fd).decode('utf8')
+>>     + response = _IkiWikiExtPluginXMLRPCHandler._read(in_fd)
+>>     + if isinstance(response, unicode):
+>>     +     xml = response.encode('utf8')
+>>
+>> I think you mean `response.decode`, not `response.encode`.
+>>
+>> Other than that it looks good to me. I like the use of `repr` in debug
+>> messages. --[[smcv]]
+
+>>> afaict, encode is fine there -- the relevant methods in python2 are
+>>> `unicode.encode` which gives a `str`, and `str.decode` which usually gives
+>>> a `unicode`. (i'd happily ditch python2 and port all plugins to python3,
+>>> where this is all easier, but my [[todo/vCard rendering]] still uses an
+>>> ancient module.) --[[chrysn]]
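The distinction can be restated as a small helper (a sketch of the isinstance dispatch under discussion, written in Python 3 terms where `str` is text and `bytes` plays the role of Python 2's `str`; not the actual proxy.py code):

```python
def to_wire(message):
    # Always hand UTF-8 bytes to the pipe, whatever string type the
    # XML-RPC layer produced on this particular Python version.
    if isinstance(message, str):
        return message.encode("utf-8")   # text -> bytes
    return message                       # already bytes

assert to_wire("schön") == b"sch\xc3\xb6n"
assert to_wire(b"sch\xc3\xb6n") == b"sch\xc3\xb6n"
```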
+
+>>>> You were right about this, `encode` is appropriate to go from `unicode`
+>>>> to `str` under Python 2. However, Python 3 is still broken.
+>>>>
+>>>> My `ready/more-proxy-utf8-fail` branch, based on yours,
+>>>> [[fixes the `rst` test when run under Python 3|bugs/rst_plugin_hangs_when_used_with_Python_3]]
+>>>> and hopefully also fixes this one. Please check that it still
+>>>> fixes your test-case too.
+>>>>
+>>>> Joey, I think this is [[ready for merge|users/smcv/ready]] even if it
+>>>> doesn't fix chrysn's bug - it does fix Python 3 support
+>>>> in general. --[[smcv]]
diff --git a/doc/bugs/redirect.mdwn b/doc/bugs/redirect.mdwn
index 6296c3df1..87f6a67e7 100644
--- a/doc/bugs/redirect.mdwn
+++ b/doc/bugs/redirect.mdwn
@@ -24,3 +24,30 @@ then the following command should print 302
> In current ikiwiki, you can get a broadly similar effect by either
> using \[[!meta redir=foo]] (which does a HTML `<meta>` redirect)
> or reconfiguring the web server. --[[smcv]]
+
+>> The CGI spec (http://www.ietf.org/rfc/rfc3875) says that a CGI can cause a redirect by returning a Location: header.
+>> So it's possible; whether it's desirable (given your point about conflicting with git-annex support) is a different matter.
+
+>>> One of the major things that separates ikiwiki from other wiki software
+>>> is that ikiwiki is a wiki compiler: ordinary page-views are purely
+>>> static HTML, and the CGI only gets involved when you do something
+>>> that really has to be dynamic (like an edit).
+>>>
+>>> However, there is no server-independent static content that ikiwiki
+>>> could write out to the destdir that would result in that redirect.
+>>>
+>>> If you're OK with requiring the [[plugins/404]] plugin (and a
+>>> web server where it works, which I think still means Apache) then
+>>> it would be possible to write a plugin that detected symlinks,
+>>> stored them in the `%wikistate`, and used them to make the
+>>> [[plugins/404]] plugin (or its own hook similar to the one
+>>> in that plugin) do a 302 redirect instead of a 404.
+>>> Similarly, a plugin that assumed a suitable Apache
+>>> configuration with fairly broad `AllowOverrides`,
+>>> and wrote out `.htaccess` files, would be a feasible thing
+>>> for someone to write.
+>>>
+>>> I don't think this is a bug; I think it's a request for a
+>>> feature that not everyone will want. The solution to those
+>>> is for someone who wants the feature to
+>>> [[write a plugin|plugins/write]]. --[[smcv]]
diff --git a/doc/bugs/rst_plugin_fails_with___34__uncaught_exception:___39__ascii__39___codec_can__39__t_encode_character__34__.mdwn b/doc/bugs/rst_plugin_fails_with___34__uncaught_exception:___39__ascii__39___codec_can__39__t_encode_character__34__.mdwn
new file mode 100644
index 000000000..1893e7089
--- /dev/null
+++ b/doc/bugs/rst_plugin_fails_with___34__uncaught_exception:___39__ascii__39___codec_can__39__t_encode_character__34__.mdwn
@@ -0,0 +1,40 @@
+I get this error when enabling the `rst` plugin. I am running IkiWiki
+3.20130904.1ubuntu1 on Ubuntu 14.04 in a non-English UTF-8 locale; the
+pages can also contain characters in UTF-8 encoding.
+
+    uncaught exception: 'ascii' codec can't encode character u'\xa9' in position 13: ordinal not in range(128)
+    Traceback (most recent call last):
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 309, in run
+        self._in_fd, self._out_fd)
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 192, in handle_rpc
+        ret = self._dispatcher.dispatch(method, params)
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 84, in dispatch
+        return self._dispatch(method, params)
+      File "/usr/lib/python2.7/SimpleXMLRPCServer.py", line 420, in _dispatch
+        return func(*params)
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 253, in hook_proxy
+        "{0} hook `{1}' returned: [{2}]".format(type, name, ret))
+    UnicodeEncodeError: 'ascii' codec can't encode character u'\xa9' in position 13: ordinal not in range(128)
+
+    Traceback (most recent call last):
+      File "/usr/lib/ikiwiki/plugins/rst", line 86, in <module>
+        proxy.run()
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 317, in run
+        self.error('uncaught exception: {0}\n{1}'.format(e, tb))
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 298, in error
+        self.rpc('error', msg)
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 233, in rpc
+        *args, **kwargs)
+      File "/usr/lib/ikiwiki/plugins/proxy.py", line 173, in send_rpc
+        raise GoingDown()
+    proxy.py.GoingDown
+
+A fix akin to the earlier one is to change `...format(type, name, ret)` in
+`proxy.py` line 253 to `...format(type, name, repr(ret))`, which should not
+hurt since the message is for debugging purposes only.
+
+
+> this is [[fixed|done]] in commit [154c4ea9](http://source.ikiwiki.branchable.com/?p=source.git;a=commit;h=154c4ea9e65d033756330a7f8c5c0fa285380bf0)
+> (november 2013), which is included in 3.20140227. --[[chrysn]]
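For reference, the reason `repr()` avoids the crash can be seen in miniature; in Python 3 terms, `ascii()` plays the role that `repr()` played for Python 2 unicode objects (an illustration, not the plugin's code):

```python
s = "schön"
r = ascii(s)   # escapes everything outside ASCII: "'sch\\xf6n'"
assert r == "'sch\\xf6n'"
assert all(ord(c) < 128 for c in r)
# Interpolating r into a byte-oriented log message therefore can no
# longer trigger an 'ascii' codec UnicodeEncodeError.
```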
diff --git a/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn b/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn
new file mode 100644
index 000000000..ca0738ad5
--- /dev/null
+++ b/doc/bugs/rst_plugin_hangs_when_used_with_Python_3.mdwn
@@ -0,0 +1,35 @@
+During the ikiwiki make phase, the rst process hangs:
+
+* [ps output](http://dpaste.com/21TQQKT)
+* [gdb backtrace 1](http://dpaste.com/0VQBW6D)
+* [gdb backtrace 2](http://dpaste.com/1VHS88Y)
+
+Working with Python 2.7:
+[http://dpaste.com/0985A91](http://dpaste.com/0985A91)
+
+Not working with Python 3.3/3.4:
+[http://dpaste.com/0ACNK3W](http://dpaste.com/0ACNK3W)
+
+> Retitled this bug report since it seems to be specific to Python 3.
+>
+> The `rst` plugin is probably more commonly used with Python 2.
+> It seems likely that there is some Python-3-specific bug in `proxy.py`,
+> perhaps introduced by [commit 154c4ea "properly encode and decode from/to
+> utf8 when sending rpc to ikiwiki"](http://source.ikiwiki.branchable.com/?p=source.git;a=commitdiff;h=154c4ea9e65d033756330a7f8c5c0fa285380bf0).
+>
+> I can reproduce this on Debian by installing `python3-docutils`
+> and changing the first line of `plugins/proxy.py`, the first
+> line of `plugins/pythondemo`, the first line of `plugins/rst`
+> and the `system()` call in `t/rst.t` to use `python3` instead
+> of `python`. --[[smcv]]
+
+looks like the problem is in proxy.py:
+
+    xml = _IkiWikiExtPluginXMLRPCHandler._read(in_fd).decode('utf8')
+
+without the `decode('utf8')` it works
+
+> That call was introduced
+> [[to fix a bug under Python 2|bugs/crashes_in_the_python_proxy_even_if_disabled]]
+> so it cannot just be removed, but I've put a proposed branch on
+> [[this related bug|bugs/pythonproxy-utf8_again]]. [[!tag patch]] --smcv
+
+tested and fixed with patch [http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/38bd51bc1bab0cabd97dfe3cb598220a2c02550a](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/38bd51bc1bab0cabd97dfe3cb598220a2c02550a) and patch [http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/81506fae8a6d5360f6d830b0e07190e60a7efd1c](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/81506fae8a6d5360f6d830b0e07190e60a7efd1c)
diff --git a/doc/bugs/svg_and_pdf_conversion_fails.mdwn b/doc/bugs/svg_and_pdf_conversion_fails.mdwn
new file mode 100644
index 000000000..ac18fe8aa
--- /dev/null
+++ b/doc/bugs/svg_and_pdf_conversion_fails.mdwn
@@ -0,0 +1,58 @@
+[[!template id=gitbranch branch=chrysn/imgforpdf author="[[chrysn]]"]]
+
+when using the [[img plugin|plugins/img]] with an svg file, it is supposed to
+convert it into a png for display in all browsers, and because the typical use
+case is rendering small preview versions.
+
+this currently doesn't work (at least with graphicsmagick-libmagick-dev-compat
+1.3.18-1) due to the sequence in which the imagemagick options are set, needs
+an extension to work for pdfs (or any other imagemagick compatible file) too,
+and should have an additional parameter for page selection.
+
+i've provided a series of [[!taglink patch]]es in the chrysn/imgforpdf [[git]]
+branch.
+
+i'd prefer to go a step further, and not only convert pdf and svg files to png,
+but everything (with the possible exception of jpg files), as most other image
+formats can't be displayed in a browser anyway -- but i didn't in this patch
+series, as it would alter the file names of existing images, i don't know if
+that needs special care or breaks something i don't use; this way, my patches
+should be safe for inclusion.
+
+--[[chrysn]]
+
+> update 2014-06-29: the patch still applies and fixes the issue. in the
+> meantime, i noticed that the desired effect doesn't happen when no explicit
+> size is set. as scalable graphics don't necessarily have a natural size
+> anyway, i don't consider that a showstopper. --[[chrysn]]
+
+>> This all looks good in principle, but I would like to do a more detailed
+>> review, and test it with "real ImageMagick" in case its behaviour differs
+>> from GraphicsMagick.
+>>
+>> An automated regression test for the desired behaviour in `t/` would
+>> be great. There are SVGs and PNGs in the docwiki already; there are no
+>> JPEGs or PDFs, but perhaps you could add a trivially small example
+>> of each to `t/`? Imitating `t/tag.t` or `t/trail.t`, and skipping the
+>> test if the required modules are missing like `t/podcast.t` does,
+>> seems like it would work best.
+>>
+>> I agree that everything not in an interoperable web format should be
+>> converted to PNG when it's scaled down, but yes, that's more likely
+>> to be a breaking change, so it seems best to do that as a separate
+>> branch. In practice I think this means JPEG -> JPEG and everything
+>> else -> PNG, since JPEG is commonly used for photos and photo-like
+>> images that don't compress well under lossless compression. --[[smcv]]
+
+>>> i've added a unit test and tested it with the [[!debsid perlmagick]]
+>>> package, the [[!debsid graphicsmagick-libmagick-dev-compat]] package and
+>>> the experimental [[!debpts libimage-magick-perl]] package (where the
+>>> [[!debpts libmagickcore-6.q16-2-extra]] package is required too), in the
+>>> meantime filing [[!debbug 753770]]. (why is it that it sometimes seems i
+>>> find more bugs in ikiwiki's dependencies than in itself when working with
+>>> it?)
+>>>
+>>> the unit test also checks for file removal when it is not created any more,
+>>> which works, so my biggest fear about the all-to-png change is unwarranted.
+>>> i'll have a look at that some time, but i think as things are, this is
+>>> ready now, please review again. --[[chrysn]]
diff --git a/doc/bugs/syslog_fails_with_non-ASCII_wikinames.mdwn b/doc/bugs/syslog_fails_with_non-ASCII_wikinames.mdwn
index b641f2db2..0d40d232a 100644
--- a/doc/bugs/syslog_fails_with_non-ASCII_wikinames.mdwn
+++ b/doc/bugs/syslog_fails_with_non-ASCII_wikinames.mdwn
@@ -18,10 +18,15 @@ Yet I am not sure how to fix that kind of problem in Perl... --[[anarcat]]
>
> Error: Wide character in syswrite at /usr/lib/perl/5.14/Sys/Syslog.pm line 485.
>
-> I have improved a little the error handling in log_message() so that we see *something* when syslog fails, see the branch documented above. I can also confirm that reverting [[todo/syslog_should_show_wiki_name]] fixes the bug. Finally, I have a unit test that reproduces the problem in git, and a working [[!taglink patch]] for the bug, again in git.
+> I have improved a little the error handling in log_message() so that we see *something* when syslog fails, see the branch documented above. I can also confirm that reverting [[todo/syslog_should_show_wiki_name]] fixes the bug. Finally, I have a unit test that reproduces the problem in git, and a working patch for the bug, again in git.
>
> > One last note: I noticed that this problem also happens elsewhere in ikiwiki. For example, the [[plugins/notifyemail]] plugin will silently fail to send notifications if the pages contain unicode. The [[plugins/notifychanges]] plugin I am working on (in [[todo/option to send only the diff in notifyemail]]) seems to be working around the issue so far, but there's no telling which similar problem are out there.
->> [[I'd merge it|/users/smcv/yesplease]]. --[[smcv]]
+>> I'd merge it. --[[smcv]]
>>> I've merged it, but I don't feel it fixes this bug. --[[Joey]]
+
+>>>> (I removed the patch tag to take it off the patches list.)
+>>>>
+>>>> What else is needed? Systematic classification of outputs into
+>>>> those that do and don't cope with Unicode? --[[smcv]]
diff --git a/doc/bugs/template_creation_error.mdwn b/doc/bugs/template_creation_error.mdwn
index f14652ed8..d1fb788f5 100644
--- a/doc/bugs/template_creation_error.mdwn
+++ b/doc/bugs/template_creation_error.mdwn
@@ -194,27 +194,77 @@ Please, let me know what to do to avoid this kind of error.
>>>>>
>>>>> --[[chrysn]]
->>>>>> [[!template id=gitbranch author="[[smcv]]" branch=smcv/ready/templatebody
- browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/templatebody]]
->>>>>> [[!tag patch]]
->>>>>> Branch and directive renamed to `ready/templatebody` as chrysn suggested.
->>>>>> It's on-by-default now (or will be if that branch is merged).
->>>>>> Joey, any chance you could review this?
->>>>>>
->>>>>> There is one known buglet: `template_syntax.t` asserts that the entire
->>>>>> file is a valid HTML::Template, whereas it would ideally be doing the
->>>>>> same logic as IkiWiki itself. I don't think that's serious. --[[smcv]]
-
->>>>>>> Looking over this, I notice it adds a hash containing all scanned
->>>>>>> files. This seems to me to be potentially a scalability problem on
->>>>>>> rebuild of a site with many pages. Ikiwiki already keeps a lot
->>>>>>> of info in memory, and this adds to it, for what is a fairly
->>>>>>> minor reason. It seems to me there should be a way to avoid this. --[[Joey]]
-
->>>>>>>> Maybe. Are plugins expected to cope with scanning the same
->>>>>>>> page more than once? If so, it's just a tradeoff between
->>>>>>>> "spend more time scanning the template repeatedly" and
->>>>>>>> "spend more memory on avoiding it", and it would be OK to
->>>>>>>> omit that, or reduce it to a set of scanned *templates*
->>>>>>>> (in practice that would mean scanning each template twice
->>>>>>>> in a rebuild). --s
+----
+
+[[!template id=gitbranch author="[[smcv]]" branch=smcv/ready/templatebody
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/templatebody]]
+[[!tag patch users/smcv/ready]]
+Branch and directive renamed to `ready/templatebody` as chrysn suggested.
+It's on-by-default now (or will be if that branch is merged).
+Joey, any chance you could review this?
+
+There is one known buglet: `template_syntax.t` asserts that the entire
+file is a valid HTML::Template, whereas it would ideally be doing the
+same logic as IkiWiki itself. I don't think that's serious. --[[smcv]]
+
+> Looking over this, I notice it adds a hash containing all scanned
+> files. This seems to me to be potentially a scalability problem on
+> rebuild of a site with many pages. Ikiwiki already keeps a lot
+> of info in memory, and this adds to it, for what is a fairly
+> minor reason. It seems to me there should be a way to avoid this. --[[Joey]]
+
+>> Maybe. Are plugins expected to cope with scanning the same
+>> page more than once? If so, it's just a tradeoff between
+>> "spend more time scanning the template repeatedly" and
+>> "spend more memory on avoiding it", and it would be OK to
+>> omit that, or reduce it to a set of scanned *templates*
+>> (in practice that would mean scanning each template twice
+>> in a rebuild). --s
+>>> [Commit f7303db5](http://source.ikiwiki.branchable.com/?p=source.git;a=commitdiff;h=f7303db5)
+>>> suggests that scanning the same page more than once is problematic,
+>>> so that solution is probably not going to work.
+>>>
+>>> The best idea I've come up with so far is to track whether
+>>> we're in the scan or render phase. If we're in the scan
+>>> phase, I think we do need to keep track of which pages
+>>> we've scanned, so we don't do them again? (Or perhaps that's
+>>> unnecessary - commit f7303db5 removed a scan call that's in
+>>> the render phase.) If we're in the render phase, we can assume
+>>> that all changed pages have been scanned already, so we can
+>>> drop the contents of `%scanned` and rely on a single boolean
+>>> flag instead.
+>>>
+>>> `%scanned` is likely to be no larger than `%rendered`, which
+>>> we already track, and whose useful lifetime does not overlap
+>>> with `%scanned` now. I was tempted to merge them both and call
+>>> the result `%done_in_this_phase`, but that would lead to really
+>>> confusing situations if a bug led to `render` being called sooner
+>>> than it ought to be.
+>>>
+>>> My ulterior motive here is that I would like to formalize
+>>> the existence of different phases of wiki processing - at the
+>>> moment there are at least two phases, namely "it's too soon to
+>>> match pagespecs reliably" and "everything has been scanned,
+>>> you may use pagespecs now", but those phases don't have names,
+>>> so [[plugins/write]] doesn't describe them.
+>>>
+>>> I'm also considering adding warnings
+>>> if people try to match a pagespec before scanning has finished,
+>>> which can't possibly guarantee the right result, as discussed in
+>>> [[conditional_preprocess_during_scan]]. My `wip-too-soon` branch
+>>> is a start towards that; the docwiki builds successfully, but
+>>> the tests that use IkiWiki internals also need updating to
+>>> set `$phase = PHASE_RENDER` before they start preprocessing. --s
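The proposed phase tracking can be sketched like this (Python for brevity; ikiwiki itself is Perl, and every name here is an illustrative stand-in):

```python
import warnings

PHASE_SCAN, PHASE_RENDER = "scan", "render"

class Wiki:
    def __init__(self):
        self.phase = PHASE_SCAN
        self.scanned = set()   # only meaningful during the scan phase

    def scan(self, page):
        # Per the discussion: scanning during the render phase is an error.
        if self.phase == PHASE_RENDER:
            raise RuntimeError("scan() called during render phase")
        self.scanned.add(page)  # avoid scanning the same page twice

    def finish_scanning(self):
        # Drop %scanned and keep only the phase flag.
        self.phase = PHASE_RENDER
        self.scanned.clear()

    def match_pagespec(self, spec):
        # Matching before scanning finishes cannot be reliable.
        if self.phase != PHASE_RENDER:
            warnings.warn("pagespec matched before scanning finished")
```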
+
+>>>> reviewing those modifications, i think this is a good way to go. along
+>>>> with warning about pagespecs evaluated in scan phase, i think it should be
+>>>> an error to invoke scan in the render phase; that would mean that
+>>>> `readtemplate` needs to check whether it's invoked as a scan or not to
+>>>> decide whether to scan the template page, but would be generally more
+>>>> robust for future plugin writing.
+>>>>
+>>>> **addendum**: if the new phase state is used to create warnings/errors
+>>>> about improper ikiwiki api use of plugins (which is something i'd
+>>>> advocate), that should likewise warn if `add_link` actually adds a link in
+>>>> the render phase. such a warning would have helped spotting the
+>>>> link-related [[template evaluation oddities]] earlier. --[[chrysn]]
diff --git a/doc/bugs/template_evaluation_oddities.mdwn b/doc/bugs/template_evaluation_oddities.mdwn
new file mode 100644
index 000000000..06ef57375
--- /dev/null
+++ b/doc/bugs/template_evaluation_oddities.mdwn
@@ -0,0 +1,67 @@
+[[ikiwiki/directive/template]]s expose odd behavior when it comes to composing
+links and directives:
+
+* the parameters are passed through the preprocessor twice, once on
+ per-parameter basis and once for the final result (which usually contains the
+ preprocessed parameters).
+
+ one of the results is that you have to write:
+
+ \[[!template id="infobox" body="""
+ Just use the \\\[[!template]] directive!
+ """]]
+
+ (that'd be three backslashes in front of the opening [.)
+
+
+
+ this also means that parts which are not used by the template at all still
+ have their side effects, without their output showing.
+
+ furthermore, the evaluation sequence is hard to predict. this might or might
+ not be a problem, depending on whether someone comes up with a less contrived
+ example (this one assumes a ``\[[!literal value]]`` directive that just
+ returns value but protects it from the preprocessor):
+
+ we can use `\[[!literal """[[!invalid example]]"""]]`, but we can't use
+ `\[[!template id=literalator value="""[[!invalid example]]"""]]` with a
+ 'literalator' template `\[[!literal """<TMPL_VAR value>"""]]` because then the `invalid` directive comes into action in
+ the first (per-argument) preprocessor run
+
+* links in templates are not stored at all; they appear, but the backlinks
+ don't work unless the link is explicit in one of the arguments.
+
+ \[[!template id="linker" destination="foo"]]
+
+ with a 'linker' template like
+
+ Go to \[[<TMPL_VAR destination>]]!
+
+ would result in a link to 'destination', but would not be registered in the
+ scan phase and thus not show a backlink from 'foo'.
+
+ (a ``\[[!link to=...]]`` directive, as suggested in
+ [[todo/flexible relationships between pages]], does get evaluated properly
+ though.)
+
+ this seems to be due to linkification being called before preprocess rather
+ than as a part of it, or (if that is on purpose) by the template plugin not
+ running linkification as an extra step (not even once).
+
+(nb: there is a way to include the ``raw_`` value of a directive, but that only
+refers to htmlification, not directive evaluation.)
+
+both those behaviors are non-intuitive and afaict undocumented. personally, i'd
+swap them out for passing the parameters as-is to the template, then running
+the linkifier and preprocessor on the final result. that would be as if all
+parameters were queried `raw_` -- then again, i don't see where `raw_` makes
+anything not work that worked originally, so obviously i'm missing something.
+
+
+i think it boils down to one question: are those behaviors necessary for
+compatibility reasons, and if yes, why?
+
+--[[chrysn]]
diff --git a/doc/bugs/trails_depend_on_everything.mdwn b/doc/bugs/trails_depend_on_everything.mdwn
new file mode 100644
index 000000000..babb1e361
--- /dev/null
+++ b/doc/bugs/trails_depend_on_everything.mdwn
@@ -0,0 +1,14 @@
+[[!template id=gitbranch branch=smcv/ready/trail-sort
+author="[[Simon McVittie|smcv]]"
+browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/trail-sort]]
+[[!tag patch users/smcv/ready]]
+
+On [[trail's discussion page|plugins/trail/discussion]], [[kjs]] pointed out
+that [[plugins/trail]] and [[plugins/contrib/album]] get excessive
+dependencies on `internal(*)`. I tracked this down to their (ab)use of
+`pagespec_match_list` with the pagespec `internal(*)` to sort a pre-existing
+list of pages.
+
+They should just sort the pages instead; they'll already have all the
+dependencies they need. My branch adds `IkiWiki::sort_pages` but does not
+make it plugin API just yet. --[[smcv]]
diff --git a/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
index 702608831..0673aa674 100644
--- a/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
+++ b/doc/bugs/transient_autocreated_tagbase_is_not_transient_autoindexed.mdwn
@@ -6,9 +6,11 @@
Shouldn't `ikiwiki-tag-test/raw/.ikiwiki/transient/tag.mdwn` and `ikiwiki-tag-test/rendered/tag/index.html` exist?
-[[!tag patch]]
-[[!template id=gitbranch branch=smcv/ready/autoindex author=smcv]]
-[[!template id=gitbranch branch=smcv/ready/autoindex-more-often author=smcv]]
+[[!tag patch users/smcv/ready]]
+[[!template id=gitbranch branch=smcv/ready/autoindex author=smcv
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/autoindex]]
+[[!template id=gitbranch branch=smcv/ready/autoindex-more-often author=smcv
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/autoindex-more-often]]
> To have a starting point to (maybe) change this, my `ready/autoindex`
> branch adds a regression test for the current behaviour, both with
@@ -29,3 +31,44 @@ Shouldn't `ikiwiki-tag-test/raw/.ikiwiki/transient/tag.mdwn` and `ikiwiki-tag-te
> git repositories any more? My `autoindex-more` branch changes
> the logic so it will do what you want in the `autoindex_commit => 0`
> case, and amends the appropriate regression test. --[[smcv]]
+
+>> the autoindex-more-often branch looks good to me in general.
+>>
+>> i do have doubts about the 3ba2ef1a patch ("remove unnecessary special case
+>> for transient underlay"): now that we consider the complete transient
+>> directory as well, the sequence in which the refresh hooks are called starts
+>> to matter, and pages created by other plugins in a similar fashion as by
+>> autoindex will only be included the next time refresh gets called.
+>>
+>> *addendum:* i just found where i discussed the issue of fighting transient
+>> pages last, it was on [[todo/alias directive]]. the example cited there
+>> (conflicts with autotag) would probably work here as well. (imagine a
+>> `tags/project/completed` and a `tags/project/inprogress` exist, and a page
+>> is tagged `tags/project`. will that be an autoindex or an autotag?)
+>>
+>> --[[chrysn]]
+
+>>> That's a fair point. I think what happens is down to commit vs. refresh
+>>> timing.
+>>>
+>>> If pages tagged t/p/c, t/p/i and t/p are all created between one
+>>> refresh and the next, with none of those tag pages existing, I think the
+>>> answer is that they would all be autotags, because until t/p/c and
+>>> t/p/i are created, there's no reason to need t/p as an autoindex.
+>>>
+>>> If there were already pages tagged t/p/c and t/p/i at the previous
+>>> refresh, then t/p would already be an autoindex, and that's a
+>>> valid page, so autotagging wouldn't touch it.
+>>>
+>>> I can't see much reason to prefer one over the other; the ideal answer
+>>> is probably to have a tag-cloud *and* a list of child pages, but this
+>>> seems a weird enough thing to do that I'd be OK with a wiki user
+>>> having to disambiguate it themselves. "Whichever automatic process
+>>> happens first, happens" is at least easy to explain, and I consider
+>>> both autoindices and autotags to be time-saving conveniences rather
+>>> than something fundamental. --s
+
+>>>> i think a behavior that does the right thing when there is a right thing
+>>>> and *something* when there is ambiguity is ok for now; especially, it's
+>>>> not up to the autoindex branch to come up with a solution to the general
+>>>> problem. --[[chrysn]]
diff --git a/doc/contact.mdwn b/doc/contact.mdwn
index 7d31ddf10..dab092549 100644
--- a/doc/contact.mdwn
+++ b/doc/contact.mdwn
@@ -7,5 +7,4 @@ developers monitor [[RecentChanges]] closely, via the webpage, email,
and IRC, and respond in a timely fashion.
You could also drop by the IRC channel `#ikiwiki` on
-[OFTC](http://www.oftc.net/) (`irc.oftc.net`), or use the
-[identi.ca ikiwiki group](http://identi.ca/group/ikiwiki).
+[OFTC](http://www.oftc.net/) (`irc.oftc.net`).
diff --git a/doc/css_market.mdwn b/doc/css_market.mdwn
index c9c6694e7..376f81b8b 100644
--- a/doc/css_market.mdwn
+++ b/doc/css_market.mdwn
@@ -48,6 +48,8 @@ gnomes will convert them to css files..)
templates.
[[!meta stylesheet="bma"]]
+* **<http://blog.lastlog.de/>**, contributed by joachim schiele; please feel free to copy.
+
* **[blankoblues.css][1]**, contributed by [[Blanko]]. Can be seen on [Blankoblues Demo][2]. Local.css and templates available [here][3].
* **[contraste.css][4]**, contributed by [[Blanko]]. Can be seen on [Contraste Demo][5]. Local.css and templates available [here][6].
@@ -60,9 +62,9 @@ gnomes will convert them to css files..)
* **[ikiwiked gray-orange](https://github.com/AntPortal/ikiwiked/raw/master/theme/gray-orange/local.css)**, contributed by [Danny Castonguay](https://antportal.com/). Can be seen in action at [antportal.com/wiki](https://antportal.com/wiki/). Feel free to modify and contribute on [Github](https://github.com/AntPortal/ikiwiked)
- [1]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/src/local.css (Download Blankoblues CSS)
- [2]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/htdocs/ (Take a tour on Blankoblues Demo)
- [3]: http://blankoworld.homelinux.com/demo/ikiwiki/blankoblues/blankoblues.tar.gz (Download local.css and templates for Blankoblues theme)
- [4]: http://blankoworld.homelinux.com/demo/ikiwiki/contraste/src/local.css (Download Contraste CSS)
- [5]: http://blankoworld.homelinux.com/demo/ikiwiki/contraste/htdocs/ (Take a tour on Contraste Demo)
- [6]: http://blankoworld.homelinux.com/demo/ikiwiki/contraste/contraste.tar.gz (Download local.css and templates for Contraste theme)
+ [1]: http://olivier.dossmann.net/demo/ikiwiki/blankoblues/src/local.css (Download Blankoblues CSS)
+ [2]: http://olivier.dossmann.net/demo/ikiwiki/blankoblues/htdocs/ (Take a tour on Blankoblues Demo)
+ [3]: http://olivier.dossmann.net/demo/ikiwiki/blankoblues/blankoblues.tar.gz (Download local.css and templates for Blankoblues theme)
+ [4]: http://olivier.dossmann.net/demo/ikiwiki/contraste/src/local.css (Download Contraste CSS)
+ [5]: http://olivier.dossmann.net/demo/ikiwiki/contraste/htdocs/ (Take a tour on Contraste Demo)
+ [6]: http://olivier.dossmann.net/demo/ikiwiki/contraste/contraste.tar.gz (Download local.css and templates for Contraste theme)
diff --git a/doc/forum/Adding_a_custom_header_and_footer.mdwn b/doc/forum/Adding_a_custom_header_and_footer.mdwn
new file mode 100644
index 000000000..d9bdedc6a
--- /dev/null
+++ b/doc/forum/Adding_a_custom_header_and_footer.mdwn
@@ -0,0 +1,13 @@
+I want to do some things that I think are easiest accomplished
+by allowing me to add arbitrary HTML to be embedded on all pages
+in the site. Specifically, I want to add meta tags to the top of
+the page so that it renders pretty-like in things like Twitter,
+and I want to add Piwik tracking to the bottom of the page.
+
+So how do I do that?
+
+I could write a whole new template for the site, but I suspect
+that there's a more modular approach that is advised. And if you
+have ideas of totally different ways do do this, do tell.
+
+Thanks
diff --git a/doc/forum/Adding_a_custom_header_and_footer/comment_1_e82dbfef77ff222a7fa07aab0a19fb18._comment b/doc/forum/Adding_a_custom_header_and_footer/comment_1_e82dbfef77ff222a7fa07aab0a19fb18._comment
new file mode 100644
index 000000000..d10961c19
--- /dev/null
+++ b/doc/forum/Adding_a_custom_header_and_footer/comment_1_e82dbfef77ff222a7fa07aab0a19fb18._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="spalax"
+ ip="82.216.247.172"
+ subject="Use page.tmpl"
+ date="2014-05-16T17:11:01Z"
+ content="""
+I think the right thing to do is to copy the default `page.tmpl` to your wiki (in your template directory), and add the code you wish.
+
+-- [[Louis|spalax]]
+"""]]
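A minimal sketch of that approach. The shipped-template path below is where Debian installs ikiwiki's defaults (an assumption; adjust for your system), and the demo falls back to a stub file so the commands are safe to try anywhere:

```shell
set -e
# Copy the shipped page.tmpl into a wiki-local template directory,
# then point the setup file's templatedir at it.
shipped=/usr/share/ikiwiki/templates/page.tmpl   # Debian location (assumption)
templatedir=$(mktemp -d)                         # stand-in for ~/mywiki/templates
if [ -f "$shipped" ]; then
    cp "$shipped" "$templatedir/page.tmpl"
else
    printf '<TMPL_VAR CONTENT>\n' > "$templatedir/page.tmpl"  # stub for the demo
fi
ls "$templatedir"
# In the real setup file:
#   templatedir: /home/you/mywiki/templates
# then rebuild with: ikiwiki --setup mywiki.setup --rebuild
```

Once the local copy exists, only it is used; upstream improvements to the shipped page.tmpl are no longer picked up automatically, which is the main cost of this approach.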
diff --git a/doc/forum/Can__39__t_call_method___34__distribution__34___on_an_undefined_value_at_FirstTime.pm.html b/doc/forum/Can__39__t_call_method___34__distribution__34___on_an_undefined_value_at_FirstTime.pm.html
new file mode 100644
index 000000000..b68395856
--- /dev/null
+++ b/doc/forum/Can__39__t_call_method___34__distribution__34___on_an_undefined_value_at_FirstTime.pm.html
@@ -0,0 +1,64 @@
+This really looks like a general Perl problem, but a Google search returns no relevant results at all for the undefined method "distribution" in FirstTime.pm. A pointer on where to look for an answer is appreciated too. Using Perl 5.18 on NetBSD 6.1.
+
+
+$ PERL5LIB=`pwd`/ikiwiki:`pwd`/ikiwiki/cpan:`pwd`/lib/perl5 PERL_MM_USE_DEFAULT=1 perl -MCPAN -e 'CPAN::Shell->install("Bundle::IkiWiki")'
+perl: warning: Setting locale failed.
+perl: warning: Please check that your locale settings:
+ LC_ALL = "en_US.UTF-8",
+ LANG = "en_US.UTF-8"
+ are supported and installed on your system.
+perl: warning: Falling back to the standard locale ("C").
+perl: warning: Setting locale failed.
+perl: warning: Please check that your locale settings:
+ LC_ALL = "en_US.UTF-8",
+ LANG = "en_US.UTF-8"
+ are supported and installed on your system.
+perl: warning: Falling back to the standard locale ("C").
+
+CPAN.pm requires configuration, but most of it can be done automatically.
+If you answer 'no' below, you will enter an interactive dialog for each
+configuration option instead.
+
+Would you like to configure as much as possible automatically? [yes] yes
+
+
+
+Warning: You do not have write permission for Perl library directories.
+
+To install modules, you need to configure a local Perl library directory or
+escalate your privileges. CPAN can help you by bootstrapping the local::lib
+module or by configuring itself to use 'sudo' (if available). You may also
+resolve this problem manually if you need to customize your setup.
+
+What approach do you want? (Choose 'local::lib', 'sudo' or 'manual')
+ [local::lib] local::lib
+
+Autoconfigured everything but 'urllist'.
+
+Now you need to choose your CPAN mirror sites. You can let me
+pick mirrors for you, you can select them from a list or you
+can enter them by hand.
+
+Would you like me to automatically choose some CPAN mirror
+sites for you? (This means connecting to the Internet) [yes] yes
+Trying to fetch a mirror list from the Internet
+Fetching with LWP:
+http://www.perl.org/CPAN/MIRRORED.BY
+
+Looking for CPAN mirrors near you (please be patient)
+.......................... done!
+
+New urllist
+ http://cpan.llarian.net/
+ http://mirrors.syringanetworks.net/CPAN/
+ http://noodle.portalus.net/CPAN/
+
+Autoconfiguration complete.
+
+Attempting to bootstrap local::lib...
+
+Writing /arpa/tz/w/weiwu/.local/share/.cpan/CPAN/MyConfig.pm for bootstrap...
+commit: wrote '/arpa/tz/w/weiwu/.local/share/.cpan/CPAN/MyConfig.pm'
+Can't call method "distribution" on an undefined value at /usr/pkg/lib/perl5/5.18.0/CPAN/FirstTime.pm line 1257.
+$ rm -r /arpa/tz/w/weiwu/.local/share/.cpan/
+
diff --git a/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__.mdwn b/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__.mdwn
new file mode 100644
index 000000000..44d25af70
--- /dev/null
+++ b/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__.mdwn
@@ -0,0 +1,15 @@
+I returned to one of my old ikiwiki blogs and received the above error message after entering (on the web interface of the blog) a title for a new post.
+
+I found the following three locks in the .ikiwiki directory of the blog:
+
+    -rw-r--r-- 1 zoidberg zoidberg 0 May 23 15:10 cgilock
+    -rw-r--r-- 1 zoidberg zoidberg 0 May 23 15:20 lockfile
+    -rw------- 1 zoidberg zoidberg 0 May 23 15:10 sessions.db.lck
+
+When I delete these and again try to create a new post, the above error message reappears and the locks have been recreated.
+
+Re-running 'ikiwiki --setup myblog.setup' disclosed a couple of permission problems (files owned by root - bah), but fixing them has had no effect on the behavior of the blog.
+
+I really would like to rehab this ikiwiki blog!
+
+*Thanks!*
diff --git a/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__/comment_1_dc99a921813d4f8adf797a900ee0a2c1._comment b/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__/comment_1_dc99a921813d4f8adf797a900ee0a2c1._comment
new file mode 100644
index 000000000..7aed7a21e
--- /dev/null
+++ b/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__/comment_1_dc99a921813d4f8adf797a900ee0a2c1._comment
@@ -0,0 +1,18 @@
+[[!comment format=mdwn
+ username="http://smcv.pseudorandom.co.uk/"
+ nickname="smcv"
+ subject="comment 1"
+ date="2014-05-23T22:02:07Z"
+ content="""
+I believe that error message indicates that the [[plugins/lockedit]]
+plugin is preventing editing. Either make the user account you're
+trying to use into a wiki admin via the `adminuser` setting:
+
+ # either or both of these
+ adminuser:
+ - yourname
+ - http://your-openid.example.com/
+
+or allow that user to edit pages by altering the `locked_pages`
+setting.
+"""]]
diff --git a/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__/comment_2_48daf77f097ed94bf78cf97b0c027129._comment b/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__/comment_2_48daf77f097ed94bf78cf97b0c027129._comment
new file mode 100644
index 000000000..f94e7785e
--- /dev/null
+++ b/doc/forum/Error___34__is_locked_and_cannot_be_edited__34__/comment_2_48daf77f097ed94bf78cf97b0c027129._comment
@@ -0,0 +1,14 @@
+[[!comment format=mdwn
+ username="http://bob-bernstein.myopenid.com/"
+ nickname="bernstein"
+ subject="comment 2"
+ date="2014-05-24T02:04:07Z"
+ content="""
+Thanks. Your prompt reply encouraged me to poke around a bit more. I found a Perl module was missing (how, I cannot imagine): XML/Writer.pm. Installing the relevant .deb seemed to fix things up.
+
+nb This is a rather old install of ikiwiki. It dates from 2009.
+
+ps Your use of ikiwiki for your homepage is quite impressive, and tasteful!
+
+*Thanks!*
+"""]]
diff --git a/doc/forum/Export_images_when_building_the_wiki.mdwn b/doc/forum/Export_images_when_building_the_wiki.mdwn
new file mode 100644
index 000000000..9802ea4ed
--- /dev/null
+++ b/doc/forum/Export_images_when_building_the_wiki.mdwn
@@ -0,0 +1,16 @@
+My repository contains image sources made with tools like Inkscape, Dia, LibreOffice, Gimp and so on.
+
+Instead of pushing the images themselves into git or manually exporting them to PNG/SVG,
+I'd like to keep just the sources in git, and have ikiwiki compile them into the final
+images just like it compiles Markdown into HTML. Is it possible to add new file types
+and tell ikiwiki how to compile them?
+
+(After reading some plugin docs...)
+
+I just read 'perlintro' yesterday in an unrelated context, but...
+could it maybe be done by writing a plugin like the
+[[textile plugin|plugins/textile]], which compiles Textile?
+
+
+
+-- [[fr33domlover]]
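For context, these are the kinds of invocations such a plugin would have to wrap. They are illustrative only, not taken from any existing plugin; the flags assume Inkscape 1.x, Dia's CLI, and LibreOffice in headless mode, so verify them against your installed versions:

```shell
# Hypothetical export commands a source-to-image plugin could shell out to:
inkscape drawing.svg --export-type=png --export-filename=drawing.png
dia --export=diagram.png diagram.dia
soffice --headless --convert-to png sketch.odg
```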
diff --git a/doc/forum/Export_images_when_building_the_wiki/comment_1_f7328be9b201f3eea6b90c269781fd0b._comment b/doc/forum/Export_images_when_building_the_wiki/comment_1_f7328be9b201f3eea6b90c269781fd0b._comment
new file mode 100644
index 000000000..182be88f3
--- /dev/null
+++ b/doc/forum/Export_images_when_building_the_wiki/comment_1_f7328be9b201f3eea6b90c269781fd0b._comment
@@ -0,0 +1,18 @@
+[[!comment format=mdwn
+ username="spalax"
+ ip="82.216.247.172"
+ subject="Other plugins to study"
+ date="2014-05-16T10:38:17Z"
+ content="""
+Several plugins process data using external programs. You may have a look at:
+
+- [[plugins/teximg]] which calls LaTeX;
+- [[plugins/graphviz]] which calls graphviz;
+- [[plugins/contrib/pandoc]] which calls pandoc.
+
+The first and second plugins I mentioned create an image using an external
+tool, and integrate it in the page. It may be exactly what you want.
+
+-- [[Louis|spalax]]
+
+"""]]
diff --git a/doc/forum/Export_images_when_building_the_wiki/comment_2_99a592c8ff9d2c2094132edd27356922._comment b/doc/forum/Export_images_when_building_the_wiki/comment_2_99a592c8ff9d2c2094132edd27356922._comment
new file mode 100644
index 000000000..0c270987e
--- /dev/null
+++ b/doc/forum/Export_images_when_building_the_wiki/comment_2_99a592c8ff9d2c2094132edd27356922._comment
@@ -0,0 +1,18 @@
+[[!comment format=mdwn
+ username="fr33domlover"
+ ip="85.65.55.38"
+ subject="comment 2"
+ date="2014-05-17T11:10:35Z"
+ content="""
+Thanks, I already saw those.
+
+I need a plugin of exactly the same kind, but which calls other tools, such as Dia and Inkscape.
+In addition, embedding into a page means the same image may end up being generated
+many times. So it's best to generate the image as an attachment of some page, and then
+all other pages in the wiki can use it. What do you think?
+
+Also, if I write a plugin (and test it of course), where do I publish it so people can
+see and enjoy it? Is [[plugins]] moderated?
+
+-- [[fr33domlover]]
+"""]]
diff --git a/doc/forum/Export_images_when_building_the_wiki/comment_3_7f5a1ef639453c83748405d2b3b0b880._comment b/doc/forum/Export_images_when_building_the_wiki/comment_3_7f5a1ef639453c83748405d2b3b0b880._comment
new file mode 100644
index 000000000..48aec10ec
--- /dev/null
+++ b/doc/forum/Export_images_when_building_the_wiki/comment_3_7f5a1ef639453c83748405d2b3b0b880._comment
@@ -0,0 +1,27 @@
+[[!comment format=mdwn
+ username="spalax"
+ ip="82.216.247.172"
+ subject="comment 3"
+ date="2014-05-17T13:49:14Z"
+ content="""
+> I need a plugin of exactly the same kind, but which calls other tools, such as Dia and Inkspace.
+> In addition, embedding into a page means the same image may end up being generated
+> many times. So it's best to generate the image as an attachment of some page, and then
+> all other pages in the wiki can use it. What do you think?
+
+Then the [[plugins/contrib/pandoc]] plugin may be a good start, since *you can configure it for Pandoc to take over processing of all .mkdn files, or only files with a different extension.* Have a look at it to make your plugin process files with a particular extension. Then, it will be possible to have several pages refer to the same file, generated only once (maybe by storing stuff in `%pagestate` or `%wikistate`).
+
+Have a look at [[plugins/write]] to write your plugin.
+
+> Also, if I write a plugin (and test it of course), where do I publish it so people can
+> see and enjoy it? Is [[plugins]] moderated?
+
+What is usually done is:
+
+- you publish your code somewhere (your server, or on github or something like that);
+- you advertise your plugin by creating a subpage of [[plugins/contrib]]. Use the [[templates/plugin]] [[template|templates]] (it generates the frame you can see on the right of [[one of my plugins|plugins/contrib/jscalendar]], for example):
+
+ \[[!template id=plugin name=YourFancyPlugin author=\"[[fr33domlover]]\"]]
+
+-- [[Louis|spalax]]
+"""]]
diff --git a/doc/forum/Export_images_when_building_the_wiki/comment_4_bd3b37fbee54f1bf510ef5fc6ba27e55._comment b/doc/forum/Export_images_when_building_the_wiki/comment_4_bd3b37fbee54f1bf510ef5fc6ba27e55._comment
new file mode 100644
index 000000000..5f5647a57
--- /dev/null
+++ b/doc/forum/Export_images_when_building_the_wiki/comment_4_bd3b37fbee54f1bf510ef5fc6ba27e55._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="fr33domlover"
+ ip="85.65.55.38"
+ subject="comment 4"
+ date="2014-05-17T14:46:15Z"
+ content="""
+Great, thanks. I'll take a look. But it's a bit different, because images are not HTML pages at all.
+
+Thanks for the quick replies :-)
+"""]]
diff --git a/doc/forum/File_wiki.setup.mdwn b/doc/forum/File_wiki.setup.mdwn
new file mode 100644
index 000000000..173988fd5
--- /dev/null
+++ b/doc/forum/File_wiki.setup.mdwn
@@ -0,0 +1,6 @@
+Hi,
+
+I'd like to know if there is a way to get the path of the user's directory where the wiki.setup file is, and the name of this file, because I'm working on an improvement of the userlist plugin, and I want to modify the .setup file, but I haven't found a way to get this file dynamically.
+
+> The [[plugins/websetup]] plugin rewrites the setup file. You may find your
+> answer in its code. [[Louis|spalax]]
diff --git a/doc/forum/Formatting_algorithms.mdwn b/doc/forum/Formatting_algorithms.mdwn
new file mode 100644
index 000000000..c7f4aaa76
--- /dev/null
+++ b/doc/forum/Formatting_algorithms.mdwn
@@ -0,0 +1,54 @@
+I'm using ikiwiki for a software project, and in the design process I sometimes write
+algorithms. It doesn't happen much, but for components of a functional nature it's very useful.
+
+I've been thinking how to write them in the wiki. I can use a numbered list and manually make
+keywords __bold__, but it's not optimal. I could also use plain text formatting and indent using tabs,
+but again there is no highlighting of any keywords or formatting of structures.
+Before I do that, I'd like to know if there are better options.
+
+One option I know is LaTeX, which has some very nice packages. You write pseudo-code which looks
+very much like source code, and the result looks great, very readable and high quality.
+
+I saw the [[plugins/teximg]] plugin, but the explanation there is poor: Does the plugin handle things
+that aren't formulas? Could it work with a LaTeX document or with an algorithm environment?
+
+Of course, if you have other suggestions I'll be happy to hear them. I want to make a careful choice before
+I start writing many algorithms :-)
+
+> You may try to see if you can select a pseudo-code language in one of the
+> highlight plugins ([[plugins/contrib/highlightcode]],
+> [[plugins/contrib/sourcehighlight]], [[plugins/highlight]], others?). The
+> list of supported languages with the [[plugins/highlight]] plugin is
+> [here](http://www.andre-simon.de/doku/highlight/en/langs.php), and if you
+> cannot find your language, I think you can define your own
+> [here](http://www.andre-simon.de/doku/highlight/en/plugins.php).
+>
+> -- [[Louis|spalax]]
+
+>> Thanks, I looked at it. I don't think there's any special language for algorithms
+>> (anyway I couldn't find any), but for the record I found the following possibilities:
+>>
+>> 1. LaTeX: Not very readable in source form, but could be highlighted, didn't try
+>> 2. Writing in a subset of Python/Pascal/Fortran and using their highlighting
+>> 3. Define a new highlight syntax
+>>
+>> What about [[plugins/teximg]]? If it can be used to generate algorithms from LaTeX, it would be
+>> an easy and excellent solution.
+>>
+>> --[[fr33domlover]]
+
+> [[plugins/teximg]] is the best thing that currently exists. Since it isn't
+> enabled on this wiki, and the author's ikiwiki has disappeared, I put one of
+> the test formulas into a private test wiki of mine. Here's a screenshot:
+>
+>
+> I think it would be great if someone [[wrote a
+> plugin for something nicer|todo/Add_nicer_math_formatting]]. -- [[Jon]]
+
+>> [[plugins/teximg]] is fine for math (at least for GUI browsers, I didn't try with w3m etc.),
+>> but what I'm looking for is a solution for formatting **algorithms**. If teximg can help
+>> with that, great, otherwise there's the 3 workarounds I mentioned above.
+>>
+>> Do you have any ideas not mentioned? :-)
+>>
+>> -- [[fr33domlover]]
diff --git a/doc/forum/How_to_properly_create_--_in_a_wiki_--____39__page__47__index.html__39___files.mdwn b/doc/forum/How_to_properly_create_--_in_a_wiki_--____39__page__47__index.html__39___files.mdwn
new file mode 100644
index 000000000..4b7f468bf
--- /dev/null
+++ b/doc/forum/How_to_properly_create_--_in_a_wiki_--____39__page__47__index.html__39___files.mdwn
@@ -0,0 +1,17 @@
+I am trying, **in a wiki**, to "manually" (i.e. not using the web interface) use the "page/index.html" type of page creation.
+
+In my working clone src dir I can use, in succession, commands such as:
+
+    mkdir MyNewPage
+    touch MyNewPage/index.mdwn
+    git add MyNewPage MyNewPage/index.mdwn
+
+[here I edit the new index.mdwn]
+
+    git commit -a
+    git push
+
+These are, roughly, the steps I have taken, and they seem to work. But surely there is a more elegant, **Ikiwiki-ish** solution.
diff --git a/doc/forum/How_to_properly_create_--_in_a_wiki_--____39__page__47__index.html__39___files/comment_1_d9ee358ded5d5307ba73a8c11f81549d._comment b/doc/forum/How_to_properly_create_--_in_a_wiki_--____39__page__47__index.html__39___files/comment_1_d9ee358ded5d5307ba73a8c11f81549d._comment
new file mode 100644
index 000000000..7412aa936
--- /dev/null
+++ b/doc/forum/How_to_properly_create_--_in_a_wiki_--____39__page__47__index.html__39___files/comment_1_d9ee358ded5d5307ba73a8c11f81549d._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="https://me.yahoo.com/a/eetjWe8B34ZeUsHyFzpwC5QvBcEuVxllSvpJHw--#376d7"
+ nickname="Bob"
+ subject="SOLVED. NEVER MIND. SORRY."
+ date="2014-06-03T00:33:59Z"
+ content="""
+I get it. All I have to do is create NewPage.mdwn, add, commit, git pull and then git push, and lo and behold, NewPage/index.html is in my destination dir.
+"""]]
diff --git a/doc/forum/How_to_rename_all_markdown_files_from___42__.mdwn_to___42__.md__63__.mdwn b/doc/forum/How_to_rename_all_markdown_files_from___42__.mdwn_to___42__.md__63__.mdwn
new file mode 100644
index 000000000..d11f7a3c5
--- /dev/null
+++ b/doc/forum/How_to_rename_all_markdown_files_from___42__.mdwn_to___42__.md__63__.mdwn
@@ -0,0 +1,3 @@
+GitHub does not take .mdwn as Markdown files: https://github.com/github/markup/blob/b865add2e053f8cea3d7f4d9dcba001bdfd78994/lib/github/markups.rb#L1
+
+I'd like to use filename extensions compliant with GitHub. My question is: after renaming all markdown files from *.mdwn to *.md, how do I correct all the dependencies? Does simply committing the renamed files to the source repository suffice?
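As for the mechanics of the rename itself, here is a sketch against a throwaway repository (run the loop from the top of your real checkout instead). Whether ikiwiki will then treat `.md` as Markdown depends on your version and setup file, so that half is left open:

```shell
set -e
# Demo repository so the loop is safe to try.
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com && git config user.name demo
mkdir -p posts
echo hi > index.mdwn
echo hi > posts/one.mdwn
git add . && git commit -qm import

# Rename every tracked .mdwn file to .md; git records these as renames.
# (Assumes no whitespace in filenames.)
for f in $(git ls-files -- '*.mdwn'); do
    git mv "$f" "${f%.mdwn}.md"
done
git commit -qm 'rename *.mdwn to *.md'
git ls-files
```

If ikiwiki accepts the new extension, internal \[[WikiLinks]] target page names rather than filenames, so links would be expected to survive a rebuild.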
diff --git a/doc/forum/How_to_rename_all_markdown_files_from___42__.mdwn_to___42__.md__63__/comment_1_c2720ebfe56ad816f241693d9e2e5072._comment b/doc/forum/How_to_rename_all_markdown_files_from___42__.mdwn_to___42__.md__63__/comment_1_c2720ebfe56ad816f241693d9e2e5072._comment
new file mode 100644
index 000000000..c458b5345
--- /dev/null
+++ b/doc/forum/How_to_rename_all_markdown_files_from___42__.mdwn_to___42__.md__63__/comment_1_c2720ebfe56ad816f241693d9e2e5072._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="fr33domlover"
+ ip="85.65.55.38"
+ subject="comment 1"
+ date="2014-05-16T08:49:30Z"
+ content="""
+I don't know, but I remember there's a setting in the setup file which sets the extension for Markdown files. I would create a dummy wiki for tests, where I'd create some files with .md extension and change that setting in the setup file. Then try rebuilding the wiki and see what happens.
+
+I'm just a user, I don't know beyond that.
+"""]]
diff --git a/doc/forum/I_do_not_know_anything_abut_git.mdwn b/doc/forum/I_do_not_know_anything_abut_git.mdwn
new file mode 100644
index 000000000..31358bbb1
--- /dev/null
+++ b/doc/forum/I_do_not_know_anything_abut_git.mdwn
@@ -0,0 +1,22 @@
+I want to learn how to use a text editor in addition to the web interface. I am stuck on pushing changes back to where they're supposed to go.
+
+I have done:
+
+ git clone Zoidwicky.git Zoidwicky.src
+
+and then, after editing sidebar.mdwn in that new Zoidwicky.src directory
+
+ git commit sidebar.mdwn
+
+Now I believe I must use git push to move that change to I am not sure where.
+
+I learn best by example. Would someone be good enough to post an example of what that 'git push" command might look like?
+
+Here are some samples of what I have tried:
+
+ $ git push sidebar.mdwn Zoidwicky.git
+ fatal: Invalid gitfile format: sidebar.mdwn
+
+ $ git push sidebar.mdwn /home/zoid/Zoidwicky.git/
+ fatal: remote part of refspec is not a valid name in /home/zoidberg/Zoidwicky.git
+
diff --git a/doc/forum/I_do_not_know_anything_abut_git/comment_1_2efdf8563bcdeba73b11282157aba72d._comment b/doc/forum/I_do_not_know_anything_abut_git/comment_1_2efdf8563bcdeba73b11282157aba72d._comment
new file mode 100644
index 000000000..7649feece
--- /dev/null
+++ b/doc/forum/I_do_not_know_anything_abut_git/comment_1_2efdf8563bcdeba73b11282157aba72d._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="http://kerravonsen.dreamwidth.org/"
+ ip="203.206.140.235"
+ subject="comment 1"
+ date="2014-05-24T23:30:43Z"
+ content="""
+Just use \"git push\" without any arguments at all.
+
+ git push
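+
+With no arguments, `git push` sends your commits to the default remote (usually called *origin*), which `git clone` set up to point back at the repository you cloned from; there is no need to name files or paths. To see where it will push:
+
+    git remote -v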
+"""]]
diff --git a/doc/forum/I_do_not_know_anything_abut_git/comment_2_3dd0fa0612a5fac785cc7d5ea23d42a5._comment b/doc/forum/I_do_not_know_anything_abut_git/comment_2_3dd0fa0612a5fac785cc7d5ea23d42a5._comment
new file mode 100644
index 000000000..18617ac9e
--- /dev/null
+++ b/doc/forum/I_do_not_know_anything_abut_git/comment_2_3dd0fa0612a5fac785cc7d5ea23d42a5._comment
@@ -0,0 +1,8 @@
+[[!comment format=mdwn
+ username="http://bob-bernstein.myopenid.com/"
+ nickname="bernstein"
+ subject="comment 2"
+ date="2014-05-25T03:39:41Z"
+ content="""
+Ah. That is simple enough even for me! Thank you so much!
+"""]]
diff --git a/doc/forum/Right-to-left_support.mdwn b/doc/forum/Right-to-left_support.mdwn
new file mode 100644
index 000000000..7ca4f9ad6
--- /dev/null
+++ b/doc/forum/Right-to-left_support.mdwn
@@ -0,0 +1,15 @@
+Does ikiwiki support RTL languages? I read somewhere it does, but I don't see
+any mention of that here (or anywhere else... that info may be wrong).
+
+I'd like to add RTL support to my wiki, for text direction and maybe for the
+page layout too. Before I edit my CSS, page.tmpl and possibly Perl for
+automatic direction setting - does ikiwiki support this in any way?
+
+On my wiki (ikiwiki version from Debian 7 stable) everything is aligned to
+the left, and unicode RTL characters cannot change that - the .tmpl and
+css files would need to be changed, it seems.
+
+I will happily share my insights and code, if I manage to get anything
+useful to work :-)
+
+--[[fr33domlover]]
diff --git a/doc/forum/Right-to-left_support/comment_1_5b2bf4d037ae8db940296e6f58884927._comment b/doc/forum/Right-to-left_support/comment_1_5b2bf4d037ae8db940296e6f58884927._comment
new file mode 100644
index 000000000..1e9558f96
--- /dev/null
+++ b/doc/forum/Right-to-left_support/comment_1_5b2bf4d037ae8db940296e6f58884927._comment
@@ -0,0 +1,21 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawmfcr1X7TXwuCju7vCBG6vii455SX1Qxro"
+ nickname="Mesar"
+ subject="comment 1"
+ date="2014-08-30T17:53:53Z"
+ content="""
+Hi,
+
+You need ikiwiki 3.20140227 or newer, which includes the patch to expose the language code and direction; see [[todo/expose_html_language_and_direction/]].
+After that, you need to modify the templates to make use of those tags.
+
+I'm currently running this on http://addons.nvda-project.org.
+The config/templates can be [found here](https://bitbucket.org/nvdaaddonteam/ikiwiki-ctl).
+
+I haven't investigated how this functions when the po plugin is disabled, but I am guessing that you can simply enable the po plugin, define your master language, and omit any slave languages.
+
+I would be interested to hear feedback on what you got to work. We are a bunch of blind people running the project above, so correct markup was the goal in our case; I haven't had any feedback on its visual appearance.
+
+
+--[[mhameed]]
+"""]]
diff --git a/doc/forum/Spaces_in_URLs.mdwn b/doc/forum/Spaces_in_URLs.mdwn
new file mode 100644
index 000000000..4749f4dc5
--- /dev/null
+++ b/doc/forum/Spaces_in_URLs.mdwn
@@ -0,0 +1,14 @@
+There is one file on my site that had a space in the name;
+on my old site, this link worked:
+[http://dada.pink/scarsdale/Statement 20140527.pdf](http://dada.pink/scarsdale/Statement 20140527.pdf)
+
+Now that I've moved to Ikiwiki, that doesn't work. So I moved the file here:
+[http://dada.pink/scarsdale/Statement_20140527.pdf](http://dada.pink/scarsdale/Statement_20140527.pdf)
+
+Is there a better approach to maintaining this link than setting an alias in Apache?
+
+> You can add the space character to the `wiki_file_chars` argument in your setup file. -- [[Jon]]
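+>
+> For example, in a YAML setup file, that would mean extending the value
+> with a trailing space, along these lines (the value shown is an
+> assumption; check your own setup file for its current value):
+>
+>     wiki_file_chars: "-[:alnum:]+/.:_ "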
+
+>> a space character is not allowed in a url; user agents that read one usually represent it in percent-encoded form (`%20`). if you use that encoding, things also work for direct links, which i assume is what caused the problem (both links in the original description do render correctly): `\[[http://dada.pink/scarsdale/Statement%2020140527.pdf]]` renders [[http://dada.pink/scarsdale/Statement%2020140527.pdf]].
+>>
+>> spaces are allowed in internal page names because wiki page names are not urls per se, but are mapped to urls using conversion rules -- this allows people not to think about url rules and just link to pages; when you're linking outside ikiwiki, though, some strings just aren't valid urls. --[[chrysn]]
diff --git a/doc/forum/Trail_plugin_links_with_Actiontabs_theme.mdwn b/doc/forum/Trail_plugin_links_with_Actiontabs_theme.mdwn
new file mode 100644
index 000000000..decaaa1bc
--- /dev/null
+++ b/doc/forum/Trail_plugin_links_with_Actiontabs_theme.mdwn
@@ -0,0 +1,47 @@
+I'm using the trail plugin with the actiontabs theme, and the prev/next links
+seem to appear in a strange way on the page.
+
+I use modified CSS, but it changes just the colors and some font sizes.
+Nothing related to positions and trails.
+
+Here's an example - the top prev/next links appear above the action tabs.
+Is this normal? I'm using the ikiwiki version from Debian 7 stable.
+
+- If you use OpenNIC:
+- If you don't (will work only until the IP changes):
+
+I can look at the CSS and try to figure this out, but I don't know much CSS or
+how the trail plugin works. If anyone uses trails, especially with actiontabs, and
+can help me - it will be great.
+
+Thanks in advance!
+
+--[[fr33domlover]]
+
+> I looked at the file *page.tmpl* and it seems I may be able to change
+> the trail link location if I edit that file. Would it be a good/possible solution to
+> edit it and put it in the git repo to be used instead of the default one?
+>
+> --[[fr33domlover]]
+
+>> That's how I intended trails to look with actiontabs:
+>> is
+>> another example.
+>>
+>> With the way the actiontabs theme works, if you want to move the
+>> trail bits down into the content area (white background in the
+>> unedited theme) you might have to alter both `page.tmpl`
+>> and the actiontabs CSS. You'll see that the actiontabs CSS
+>> has a special case for trails, because the tabs and the trail
+>> links would overlap otherwise - you might have to remove
+>> that special case. --[[smcv]]
+
+>>> Thanks, I'll try that. But I've been using those trails in the last
+>>> several hours and I'm beginning to get used to the current
+>>> layout. Maybe I'll just keep it :-)
+>>>
+>>> (Anyway the way trail links look on my wiki is valid, it's exactly
+>>> like on your link, only with different colors. I suppose it's
+>>> just a cosmetic issue then)
+>>>
+>>> --[[fr33domlover]]
diff --git a/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_3_f402fb426e0460ce927b7847246f699f._comment b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_3_f402fb426e0460ce927b7847246f699f._comment
new file mode 100644
index 000000000..8b976fac2
--- /dev/null
+++ b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_3_f402fb426e0460ce927b7847246f699f._comment
@@ -0,0 +1,19 @@
+[[!comment format=mdwn
+ username="amcalvo"
+ ip="78.53.114.169"
+ subject="Workaround for Nginx"
+ date="2014-05-05T21:49:10Z"
+ content="""
+Thank you for the analysis. I have worked around the issue by using the nginx `sub_filter` directive, something like:
+
+~~~
+location / {
+    # Proxy stuff...
+    sub_filter 'http://example.com' 'https://example.com';
+}
+~~~
+
+Best regards,
+amc.
+"""]]
diff --git a/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_4_db726bc81ec5feac76d17ea81f0f80a5._comment b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_4_db726bc81ec5feac76d17ea81f0f80a5._comment
new file mode 100644
index 000000000..d0b2952b0
--- /dev/null
+++ b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_4_db726bc81ec5feac76d17ea81f0f80a5._comment
@@ -0,0 +1,13 @@
+[[!comment format=mdwn
+ username="amcalvo"
+ ip="78.53.114.169"
+ subject="comment 4"
+ date="2014-05-05T21:56:36Z"
+ content="""
+A correction to the above comment: one needs to activate multiple replacements:
+
+~~~
+ sub_filter 'http://example.com' 'https://example.com';
+ sub_filter_once off;
+~~~
+"""]]
diff --git a/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_5_674f56100c0682eba36cc5327fbdae4a._comment b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_5_674f56100c0682eba36cc5327fbdae4a._comment
new file mode 100644
index 000000000..1546c67a0
--- /dev/null
+++ b/doc/forum/Using_reverse_proxy__59___base_URL_is_http_instead_of_https/comment_5_674f56100c0682eba36cc5327fbdae4a._comment
@@ -0,0 +1,61 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk6z7Jsfi_XWfzFJNZIjYUcjgrthg4aPUU"
+ nickname="Alejandro"
+ subject="Same Trick in Apache"
+ date="2014-09-10T18:58:24Z"
+ content="""
+I got it working with Apache 2.4 and virtual hosts, on both HTTP/1.1 and HTTPS (SNI). The procedure is somewhat analogous to the nginx procedure above, so here is my set-up, in the hope it will help others avoid this pain.
+
+## Set-up
+
+ CLIENT <---- HTTPS ----> REVERSE PROXY <---- HTTP ----> IKIWIKI
+
+
+## The HTTP to HTTPS Redirect
+
+To ensure that all your HTTP requests are redirected to HTTPS, I chose mod_rewrite, because a simple Redirect does not pass query parameters. You will want an HTTP VHost that redirects with something like the one below (notice the subtle ? before the query string). **Note: this will NOT rewrite ikiwiki's http:// URLs (base tag, etc.).** For that I use a content filter, as you will see below. This HTTP-to-HTTPS redirect is still required, though, both for security and for the /foo/?updated URI form in this set-up.
+
+
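+A sketch of what such a VHost could look like (a reconstruction, not the
+author's exact configuration; example.com is a placeholder):
+
+    <VirtualHost *:80>
+        ServerName example.com
+        RewriteEngine on
+        # the ? explicitly re-appends the query string to the target URL
+        RewriteRule ^/?(.*)$ https://example.com/$1?%{QUERY_STRING} [R=301,L]
+    </VirtualHost>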
+
+## The SSL Virtual Host
+
+This part is a bit more tricky. First, I am using SNI, as I don't care for non-SNI user agents. Second, you need to use a filter that replaces http:// with https:// before the response is sent. Note that this alone won't deal with ?updated, so you will need the HTTP-to-HTTPS set-up above anyway. Third, I use HTTP auth, so I don't know whether this will work with your particular auth set-up (although it should, IMHO); YMMV:
+
+
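+A sketch of what such a proxy-plus-filter VHost could look like, using
+mod_proxy with mod_substitute as the content filter (again a
+reconstruction, not the author's exact configuration; hostnames are
+placeholders):
+
+    <VirtualHost *:443>
+        ServerName example.com
+        SSLEngine on
+        ProxyPass / http://backend.internal/
+        ProxyPassReverse / http://backend.internal/
+        # rewrite http:// URLs in the HTML body to https://
+        AddOutputFilterByType SUBSTITUTE text/html
+        Substitute "s|http://example.com|https://example.com|n"
+    </VirtualHost>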
-This is some preformatted text. Each line is proceeded by four spaces.
+What follows is some preformatted text. Each line is preceded by four spaces.
Test
@@ -109,7 +114,7 @@ This is some preformatted text. Each line is proceeded by four spaces.
-...Now why doesn't it work like that on my copy of ikiwiki? :(
+...Now why doesn't it work like that on my own copy of ikiwiki? :(
Räksmörgås.
diff --git a/doc/sandbox/discussion.mdwn b/doc/sandbox/discussion.mdwn
new file mode 100644
index 000000000..ec651a5b3
--- /dev/null
+++ b/doc/sandbox/discussion.mdwn
@@ -0,0 +1,7 @@
+Whilst discussing Ikiwiki on IRC, someone pointed out that "This is the SandBox, a page anyone can edit to try out ikiwiki" is not strictly true, or is debatably so, since they must log in to edit. This proved to be enough of a barrier that said person didn't consider ikiwiki any further. -- [[Jon]]
+
+> I personally think we'd be better off with a separate demo wiki
+> (sandbox.ikiwiki.info?) that has its own git repo and
+> `nofollow` configuration, so edits to that wiki aren't archived
+> in ikiwiki's git history forever; perhaps with a cron job to
+> reset the sandbox every few days? --[[smcv]]
diff --git a/doc/sandbox/new__95__test.mdwn b/doc/sandbox/new__95__test.mdwn
new file mode 100644
index 000000000..90bfcb510
--- /dev/null
+++ b/doc/sandbox/new__95__test.mdwn
@@ -0,0 +1 @@
+this is a test
diff --git a/doc/shortcuts.mdwn b/doc/shortcuts.mdwn
index b4f6d8ef4..ca529c296 100644
--- a/doc/shortcuts.mdwn
+++ b/doc/shortcuts.mdwn
@@ -27,7 +27,7 @@ This page controls what shortcut links the wiki supports.
* [[!shortcut name=debrt url="https://rt.debian.org/Ticket/Display.html?id=%s"]]
* [[!shortcut name=debss url="http://snapshot.debian.org/package/%s/"]]
* Usage: `\[[!debss package]]` or `\[[!debss package/version]]`. See for details.
-* [[!shortcut name=debwiki url="https://wiki.debian.org/%s"]]
+* [[!shortcut name=debwiki url="https://wiki.debian.org/%S"]]
* [[!shortcut name=fdobug url="https://bugs.freedesktop.org/show_bug.cgi?id=%s" desc="freedesktop.org bug #%s"]]
* [[!shortcut name=fdolist url="http://lists.freedesktop.org/mailman/listinfo/%s" desc="%s@lists.freedesktop.org"]]
* [[!shortcut name=gnomebug url="https://bugzilla.gnome.org/show_bug.cgi?id=%s" desc="GNOME bug #%s"]]
@@ -55,7 +55,7 @@ This page controls what shortcut links the wiki supports.
* [[!shortcut name=whois url="http://reports.internic.net/cgi/whois?whois_nic=%s&type=domain"]]
* [[!shortcut name=cve url="https://cve.mitre.org/cgi-bin/cvename.cgi?name=%s"]]
* [[!shortcut name=flickr url="https://secure.flickr.com/photos/%s"]]
-* [[!shortcut name=man url="http://linux.die.net/man/%s"]]
+* [[!shortcut name=man url="http://manpages.debian.org/%s"]]
* [[!shortcut name=ohloh url="https://www.ohloh.net/p/%s"]]
* [[!shortcut name=cpanrt url="https://rt.cpan.org/Ticket/Display.html?id=%s" desc="CPAN RT#%s"]]
* [[!shortcut name=novellbug url="https://bugzilla.novell.com/show_bug.cgi?id=%s" desc="bug %s"]]
diff --git a/doc/spam_fighting.mdwn b/doc/spam_fighting.mdwn
index 6e04dcf8f..712eb0740 100644
--- a/doc/spam_fighting.mdwn
+++ b/doc/spam_fighting.mdwn
@@ -30,4 +30,6 @@ cba01c2 | 2013/09/15 | spain1001 | 80.187.106.136
702a3e5 | 2014/01/02 | Toni | 124.105.173.121
c2924ce | 2014/01/02 | domtheo9110 | 182.253.51.174
cd81b9f | 2014/01/03 | domtheo9110 | ?
+e3376ce | 2014/08/19 | Nng_L (OpenID) | 58.186.127.104
+104c606 | 2014/08/19 | tlevine (OpenID) | 82.153.13.48
"""]]
diff --git a/doc/templates/discussion.mdwn b/doc/templates/discussion.mdwn
index c7115e4d6..dddab48d4 100644
--- a/doc/templates/discussion.mdwn
+++ b/doc/templates/discussion.mdwn
@@ -25,3 +25,4 @@ Is there a list of all the available variables somewhere, or do I just grep the
I pulled a list of variables and posted it, its in the history for [[templates]] under my name. [[justint]]
+I am trying to override `page.tmpl` by providing `templates/page.tmpl` in my `srcdir` - this works, but now `templates/page.tmpl` is created in my `destdir` as well! Is this expected? Is there a way to avoid it? --chenz
diff --git a/doc/theme_market.mdwn b/doc/theme_market.mdwn
index e9bdaa056..4ac41cb0a 100644
--- a/doc/theme_market.mdwn
+++ b/doc/theme_market.mdwn
@@ -11,3 +11,5 @@ Feel free to add your own [[theme|themes]] here, but first consider writing a si
* **[[Night city theme|http://anarcat.ath.cx/night_city/README/]]**, contributed by [[anarcat]], see an example [[on his homepage|http://anarcat.ath.cx/]]
* **[[Bootstrap theme|http://anonscm.debian.org/gitweb/?p=users/jak/website.git;a=summary]]**, contributed by [[JAK LINUX|http://jak-linux.org/about/]], based on [[Twitter Bootstrap|http://twitter.github.com/bootstrap/]]
+
+* **[[Bootstrap 3|https://github.com/ramseydsilva/ikiwiki-bootstrap-theme]]**, contributed by [[ramsey]], based on [[Twitter Bootstrap 3|http://getbootstrap.com]]
diff --git a/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn b/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn
index 58940b89f..c1529c7a0 100644
--- a/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn
+++ b/doc/tips/Git_repository_and_web_server_on_different_hosts.mdwn
@@ -3,6 +3,8 @@ server located at different hosts. Here's a description for such
a setup, using password-less SSH as a way of communication between
these two hosts.
+[[!img separate-webserver.svg size=490x align=right]]
+
Git server
==========
diff --git a/doc/tips/Git_repository_and_web_server_on_different_hosts/separate-webserver.svg b/doc/tips/Git_repository_and_web_server_on_different_hosts/separate-webserver.svg
new file mode 100644
index 000000000..a9a428158
--- /dev/null
+++ b/doc/tips/Git_repository_and_web_server_on_different_hosts/separate-webserver.svg
@@ -0,0 +1,716 @@
+
+
+
+
diff --git a/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines.mdwn b/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines.mdwn
index 35feacb71..e6277d338 100644
--- a/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines.mdwn
+++ b/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines.mdwn
@@ -17,6 +17,7 @@ I assume the [[rcs]] used is [[rcs/git]], but it might be done for other rcs.
# Similar and related tips and problems
+- [[tips/distributed_wikis]] describes different ways of distributing wikis (including this one).
- [[http://www.icanttype.org/blog/ikiwiki_git_remote_repo/]] Similar to what I
am describing, excepted that you must be able to connect to the machine
hosting Ikiwiki using ssh.
@@ -37,6 +38,8 @@ it on a remote machine, and tell Ikiwiki to use it instead of its local one. We
will also ensure that the wiki is rendered whenever a commit is done to the git
repository.
+[[!img separate-web-git-servers.svg size=400x]]
+
# Conventions
- We are building a wiki called *SITE*.
@@ -143,14 +146,12 @@ the IkiWiki machine, and here is the deadlock. Explanations of the command:
## Going further
- *Web server on a third machine* It should be possible to use a third machine
- to host the web server. A hook might be used to export the rendered wiki on
- this server, or use a nfs repository as the destination repository of
- ikiwiki. However, allowing web modifications (using CGI) might be trickyâ¦
+ to host the web server, using [[this documentation|tips/Git_repository_and_web_server_on_different_hosts/]].
- *Using [[gitolite|https://github.com/sitaramc/gitolite]] to manage
repositories on the git machine* Simply replace the manipulations of git on
the git machine by the corresponding manipulations using gitolite.
* With gitolite, you can use this line in a `post-update` hook:
- `[ x"$GL_USER" = x"`*`gitolite-user`*`" ] || wget ...`
+ `[ x"$GL_USER" = x"`*`gitolite-user`*`" ] || wget ...` where *gitolite-user* is the name of the public key registered through gitolite.
- thus, you filter out precisely the events that originate from the server-to-be-pinged, no matter what the commit id says. (For example, if you push commits you created on a local CGI ikiwiki, they'd be called '@web' as well).
+ Thus, you filter out precisely the events that originate from the server-to-be-pinged, no matter what the commit id says. (For example, if you push commits you created on a local CGI ikiwiki, they'd be called '@web' as well).
diff --git a/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines/discussion.mdwn b/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines/discussion.mdwn
new file mode 100644
index 000000000..12565fd6a
--- /dev/null
+++ b/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines/discussion.mdwn
@@ -0,0 +1,14 @@
+It may be clear to experienced/technical gitolite users, but it confused me, so I'd like to ask:
+
+In the comment about gitolite mentioning the line with $GL_USER, I assume "gitolite-user"
+needs to be replaced with the name of the gitolite user with which ikiwiki pushes
+changes? For example, if I have a key 'ikiwiki.pub', I use "ikiwiki" in the hook.
+
+If that's what the comment means, I'd be happy if it was made clear, so it's easier
+to understand. Or I can edit it myself, once I make sure I really understand.
+
+--[[fr33domlover]]
+
+> You are right. I [[updated|http://source.ikiwiki.branchable.com/?p=source.git;a=blobdiff;f=doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines.mdwn;h=6bbaf3e6e818e2e286c0cf9d357c9b03f649e146;hp=af4438bd5f6ac4f64cb443c6cfa3ba52e12da4f0;hb=54d47eb26ae41ff23932b9c0e3f15e698cb56ada;hpb=fc24df96c10b804d3022eb92caf687729921adbb]] the page to make it more precise, but feel free to continue to improve it.
+>
+> -- [[Louis|spalax]]
diff --git a/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines/separate-web-git-servers.svg b/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines/separate-web-git-servers.svg
new file mode 100644
index 000000000..b6095a2b7
--- /dev/null
+++ b/doc/tips/Hosting_Ikiwiki_and_master_git_repository_on_different_machines/separate-web-git-servers.svg
@@ -0,0 +1,783 @@
+
+
+
+
diff --git a/doc/tips/Right-to-left___40__RTL__41___page_text.mdwn b/doc/tips/Right-to-left___40__RTL__41___page_text.mdwn
new file mode 100644
index 000000000..2b176c811
--- /dev/null
+++ b/doc/tips/Right-to-left___40__RTL__41___page_text.mdwn
@@ -0,0 +1,49 @@
+Here's a simple way to create pages in which the page body (or a part of it) goes right-to-left.
+This includes things you insert into the page, such as polls and blockquotes and
+lists and a progress bar and so on. Some things don't work perfectly, but if
+you want to have some RTL pages in your wiki, this will probably do.
+
+It does not modify the things around the body, such as the page header and the
+footer. Only what is rendered from the mdwn file is affected.
+
+# 1 Add an RTL Template
+
+Create a new template page *templates/rtl.mdwn* with the following content:
+
+
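+Something along these lines should work (a sketch; the parameter name
+`text` is an assumption):
+
+    <div dir="rtl">
+    <TMPL_VAR text>
+    </div>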
+
+
+
+ Use this template to insert RTL text into a page.
+ This template has one parameter:
+