--- /dev/null
+I'm trying to put a list of tags in a table, so I carefully make a newline-free taglist.tmpl and then do:
+
+ | \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=taglist]] |
+
+but there's a line in `inline.pm` that does:
+
+ return "<div class=\"inline\" id=\"$#inline\"></div>\n\n";
+
+And the extra newlines break the table. Can they be safely removed?
+
+> If you want an HTML table, I would suggest using an HTML table, which
+> should pass through Markdown without being interpreted further:
+>
+> <table><tr>
+> \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtd]]
+> </tr></table>
+>
+> where tagtd.tmpl is of the form `<td>your markup here</td>`; or even just
+>
+> \[[!inline pages="link(/category/env)" feeds=no archive=yes sort=title template=tagtable]]
+>
+> where tagtable.tmpl looks like
+>
+> <TMPL_IF FIRST>
+> <table><tr>
+> </TMPL_IF>
+>
+> <td>your tag here</td>
+>
+> <TMPL_IF LAST>
+> </tr></table>
+> </TMPL_IF>
+>
+> I don't think you're deriving much benefit from Markdown's table syntax
+> if you have to mix it with HTML::Template and ikiwiki directives,
+> and be pathologically careful with whitespace. "Right tool for the job"
+> and all that :-)
+>
+> When I edited this page I was amused to find that you used HTML,
+> not Markdown, as its format. It seems oddly appropriate to my answer, but
+> I've converted it to Markdown and adjusted the formatting, for easier
+> commenting.
+> --[[smcv]]
> >
> > --[[anarcat]]
+> > > [[!template id=gitbranch branch=ready/more-magic author="[[smcv]]" browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/ready/more-magic]]
> > > If the regex match isn't necessary and it's just about deleting the
-> > > parameters, I think I'd prefer something like
+> > > parameters, I think I'd prefer
> > >
> > > if (! defined $mimetype) {
> > > ...
> > > }
> > > $mimetype =~ s/;.*//;
> > >
-> > > but I'd be hesitant to do that without knowing why Joey implemented it
-> > > the way it is. If it's about catching a result from file(1) that
+> > > as done in my `ready/more-magic` branch.
+> > >
+> > > I'm a little hesitant to do that without knowing why Joey implemented it
+> > > the way it is, but as far as I can tell it's just an oversight.
+> > >
+> > > Or, if the result of the s/// is checked for a reason, and it's
+> > > about catching a result from file(1) that
> > > is not, in fact, a MIME type at all (empty string or error message
> > > or something), maybe something more like this?
> > >
> > > (or whatever the allowed characters in MIME types are). --[[smcv]]
> > > > I don't mind either way, but i feel this should be fixed for the next release, as I need to reapply this patch at every upgrade now. -- [[anarcat]]
+
+> > > > > This is still a problem in 3.20140831. -- [[anarcat]]
+
+> > > > > > I still don't think appending a semicolon is the right answer:
+> > > > > > at best it's equivalent to what I suggested, and at worst it's
+> > > > > > disabling a check that does have some reason behind it.
+> > > > > > I've turned the version I suggested above into a proper branch.
+> > > > > > Review by someone who can commit to ikiwiki.git would be appreciated.
+> > > > > > --[[smcv]]
--- /dev/null
+E.g. [[!debwiki Derivatives/Guidelines]].
+
+Maybe we should use `%S` instead of `%s` in the shortcut definition? (`%s` URL-escapes its whole argument, including the `/`, whereas `%S` inserts it unescaped.)
+
+> seems reasonable, [[done]] --[[smcv]]
--- /dev/null
+since my latest jessie upgrade here, charsets are all broken when editing a page. the page i'm trying to edit is [this wishlist](http://anarc.at/wishlist/), and it used to work fine. now, instead of:
+
+`Voici des choses que vous pouvez m'acheter si vous êtes le Père Nowel (yeah right):`
+
+... as we see in the rendered body right now, when i edit the page i see:
+
+`Voici des choses que vous pouvez m'acheter si vous �tes le P�re Nowel (yeah right):`
+
+... a typical double-encoding nightmare. The actual binary data is this for the word "Père" according to `hd`:
+
+~~~~
+anarcat@marcos:ikiwiki$ echo "Père" | hd
+00000000 50 c3 a8 72 65 0a |P..re.|
+00000006
+anarcat@marcos:ikiwiki$ echo "P�re" | hd
+00000000 50 ef bf bd 72 65 0a |P...re.|
+00000007
+~~~~
+
+> I don't know what that is, but it isn't the usual double-UTF-8 encoding:
+>
+> >>> u'è'.encode('utf-8')
+> '\xc3\xa8'
+> >>> u'è'.encode('utf-8').decode('latin-1').encode('utf-8')
+> '\xc3\x83\xc2\xa8'
+>
+> A packet capture of the incorrect HTTP request/response headers and body
+> might be enlightening? --[[smcv]]
+>
+> > Here are the headers according to chromium:
+> >
+> > ~~~~
+> > GET /ikiwiki.cgi?do=edit&page=wishlist HTTP/1.1
+> > Host: anarc.at
+> > Connection: keep-alive
+> > Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
+> > User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36
+> > Referer: http://anarc.at/wishlist/
+> > Accept-Encoding: gzip,deflate,sdch
+> > Accept-Language: fr,en-US;q=0.8,en;q=0.6
+> > Cookie: openid_provider=openid; ikiwiki_session_anarcat=XXXXXXXXXXXXXXXXXXXXXXX
+> >
+> > HTTP/1.1 200 OK
+> > Date: Mon, 08 Sep 2014 21:22:24 GMT
+> > Server: Apache/2.4.10 (Debian)
+> > Set-Cookie: ikiwiki_session_anarcat=XXXXXXXXXXXXXXXXXXXXXXX; path=/; HttpOnly
+> > Vary: Accept-Encoding
+> > Content-Encoding: gzip
+> > Content-Length: 4093
+> > Keep-Alive: timeout=5, max=100
+> > Connection: Keep-Alive
+> > Content-Type: text/html; charset=utf-8
+> > ~~~~
+> >
+> > ... which seem fairly normal... getting more data than this is a little inconvenient since the data is gzip-encoded and i'm kind of lazy extracting that from the stream. Chromium does seem to auto-detect it as utf8 according to the menus however... not sure what's going on here. I would focus on the following error however, since it's clearly emanating from the CGI... --[[anarcat]]
+
+Clicking on the Cancel button yields the following warning:
+
+~~~~
+Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215.
+~~~~
+
+> Looks as though you might be able to get a Python-style backtrace for this
+> by setting `$Carp::Verbose = 1`.
+>
+> The error is that we're taking some string (which string? only a backtrace
+> would tell you) that is already flagged as Unicode, and trying to decode
+> it from byte-blob to Unicode again, analogous to this Python:
+>
+> some_bytes.decode('utf-8').decode('utf-8')
+>
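+> or, putting the same thing in Perl terms (a sketch of what happens with
+> the Encode.pm shipped alongside perl 5.20, not code taken from ikiwiki):
+>
+>     use Encode;
+>
+>     my $bytes = "P\xc3\xa8re \xe2\x82\xac";  # UTF-8 bytes for "Père €"
+>     my $chars = decode_utf8($bytes);         # fine: bytes -> characters
+>     my $boom  = decode_utf8($chars);         # dies: "Cannot decode string
+>                                              # with wide characters"
+>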
+> --[[smcv]]
+> >
+> > I couldn't figure out where to set that Carp thing - it doesn't work simply by setting it in /usr/bin/ikiwiki - so i am not sure how to use this. However, with some debugging code in Encode.pm, i was able to find a case of double-encoding - in the left menu, for example, which is the source of the Encode.pm crash.
+> >
+> > It seems that some Unicode semantics changed in Perl 5.20, or more precisely, in Encode.pm 2.53, according to [this](https://code.activestate.com/lists/perl-unicode/3314/). 5.20 does have significant Unicode changes, but I am not sure they are related (see [perldelta](https://metacpan.org/pod/distribution/perl/pod/perldelta.pod)). Doing more archeology, it seems that Encode.pm is indeed where the problem started, all the way back in [commit 8005a82](https://github.com/dankogai/p5-encode/commit/8005a82d8aa83024d72b14e66d9eb97d82029eeb#diff-f3330aa405ffb7e3fec2395c1fc953ac) (August 2013), taken from [pull request #11](https://github.com/dankogai/p5-encode/pull/11), which expressly forbids double-decoding, in effect failing like Python does in the above example you gave (Perl used to silently succeed instead, a rather big change if you ask me).
+> >
+> > So stepping back, it seems that this would be a bug in Ikiwiki. It could be in any of those places:
+> >
+> > ~~~~
+> > anarcat@marcos:ikiwiki$ grep -r decode_utf8 IkiWiki* | wc -l
+> > 31
+> > ~~~~
+> >
+> > Now the fun part is to determine which one should be turned off... or should we duplicate the logic that was removed in decode_utf8, or make a safe_decode_utf8 for ourselves? --[[anarcat]]
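+> >
+> > (For concreteness, a `safe_decode_utf8` along those lines might look
+> > roughly like this; an untested sketch, not necessarily what the
+> > `anarcat/dev/safe_unicode` branch referenced below actually does:)
+> >
+> >     use Encode;
+> >
+> >     sub safe_decode_utf8 {
+> >         my $str = shift;
+> >         # newer Encode.pm dies when asked to decode a string that is
+> >         # already flagged as Unicode, so only decode byte strings
+> >         return Encode::is_utf8($str) ? $str : decode_utf8($str);
+> >     }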
+
+The apache logs yield:
+
+~~~~
+[Mon Sep 08 16:17:43.995827 2014] [cgi:error] [pid 2609] [client 192.168.0.3:47445] AH01215: Died at /usr/share/perl5/IkiWiki/CGI.pm line 467., referer: http://anarc.at/ikiwiki.cgi?do=edit&page=wishlist
+~~~~
+
+Interestingly enough, I can't reproduce the bug here (at least in this page). Also, editing the page through git works fine.
+
+I had put ikiwiki on hold during the last upgrade, so it was upgraded separately. The bug happens both with 3.20140613 and 3.20140831. The major thing that happened today is the upgrade from perl 5.18 to 5.20. Here's the output of `egrep '[0-9] (remove|purge|install|upgrade)' /var/log/dpkg.log | pastebinit -b paste.debian.net` to give an idea of what was upgraded today:
+
+http://paste.debian.net/plain/119944
+
+This is a major bug which should probably be fixed before jessie, yet i can't seem to find a severity statement in reportbug that would justify blocking the release based on this - unless we consider non-English speakers as "most" users (i don't know the demographics well enough). It certainly makes ikiwiki completely unusable for my users who use the web interface in French... --[[anarcat]]
+
+Note that on this one page, i can't even get the textarea to display and i immediately get `Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215`: http://anarc.at/ikiwiki.cgi?do=edit&page=hardware%2Fserver%2Fmarcos.
+
+Also note that this is the same as [[forum/"Error: cannot decode string with wide characters" on Mageia Linux x86-64 Cauldron]], I believe. The backtrace I get here is:
+
+~~~~
+Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215. Encode::decode_utf8("**Menu**\x{d}\x{a}\x{d}\x{a} * [[\x{fffd} propos|index]]\x{d}\x{a} * [[Logiciels|software]]"...)
+called at /usr/share/perl5/IkiWiki/CGI.pm line 117 IkiWiki::decode_form_utf8(CGI::FormBuilder=HASH(0x2ad63b8))
+called at /usr/share/perl5/IkiWiki/Plugin/editpage.pm line 90 IkiWiki::cgi_editpage(CGI=HASH(0xd514f8), CGI::Session=HASH(0x27797e0))
+called at /usr/share/perl5/IkiWiki/CGI.pm line 443 IkiWiki::__ANON__(CODE(0xfaa460))
+called at /usr/share/perl5/IkiWiki.pm line 2101 IkiWiki::run_hooks("sessioncgi", CODE(0x2520138))
+called at /usr/share/perl5/IkiWiki/CGI.pm line 443 IkiWiki::cgi()
+called at /usr/bin/ikiwiki line 192 eval {...}
+called at /usr/bin/ikiwiki line 192 IkiWiki::main()
+called at /usr/bin/ikiwiki line 231
+~~~~
+
+so this would explain the error on cancel, but doesn't explain the weird encoding i get when editing the page... <sigh>...
+
+... and that leads me to this crazy patch which fixes all the above issues by avoiding double-decoding... go figure that shit out...
+
+[[!template id=gitbranch branch=anarcat/dev/safe_unicode author="[[anarcat]]"]]
+
+> [[Looks good to me|users/smcv/ready]] although I'm not sure how valuable
+> the `$] < 5.02 || ` test is - I'd be tempted to just call `is_utf8`. --[[smcv]]
>>> remains in read-entire-file mode afterwards. To avoid odd side-effects,
>>> I would suggest using `readfile()` like `t/trail.t` does.
>>>
+>>> [[!template id=gitbranch branch=smcv/ready/imgforpdf-and-more author="[[chrysn]], [[smcv]]"
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/imgforpdf-and-more]]
+>>> I've used `readfile()` (but not done anything about the ImageMagick file type)
+>>> in my copy of the branch.
+>>>
>>> --[[smcv]]
> Is this fixed now that [[!debbug 738493]] has been fixed? --[[smcv]]
+> > No, it isn't. I still get: `no_identity_server: Could not determine ID provider from URL.` from the latest ikiwiki in jessie (3.20140831), with liblwpx-paranoidagent-perl 1.10-3. Debugging tells me it's still related to the `500 Can't verify SSL peers without knowing which Certificate Authorities to trust` error, so probably because `Mozilla::CA` is not packaged ([[!debbug 702124]]). I still had to apply the patch to disable SSL verification at the end of this file. However, setting `$ENV{PERL_LWP_SSL_CA_PATH} = '/etc/ssl/certs';` seems to work now, so the following dumb patch works:
+> >
+> > ~~~~
+> > --- /usr/bin/ikiwiki.orig 2014-09-08 15:48:35.715868902 -0400
+> > +++ /usr/bin/ikiwiki 2014-09-08 15:50:29.666779878 -0400
+> > @@ -225,4 +225,5 @@
+> > }
+> > }
+> >
+> > +$ENV{PERL_LWP_SSL_CA_PATH} = '/etc/ssl/certs';
+> > main;
+> > ~~~~
+> >
+> > may not be the best place to fiddle around with this, but then again it makes sense that it applies to the whole program. it should probably be reported upstream as well. also in my git repo. -- [[anarcat]]
+> >
+> > > This seems Debian-specific. I would be inclined to consider this to be
+> > > a packaging/system-integration (i.e. non-upstream) bug in
+> > > `liblwpx-paranoidagent-perl` rather than an upstream bug in IkiWiki;
+> > > it certainly seems inappropriate to put this Debian-specific path
+> > > in upstream IkiWiki. If it can't be fixed in LWPX::ParanoidAgent for
+> > > whatever reason, applying it via some sort of sed in ikiwiki's
+> > > `debian/rules` might be more reasonable? --[[smcv]]
+> > >
+> > > > by "upstream", i did mean `liblwpx-paranoidagent-perl`. so yeah, maybe this should be punted back into that package's court again. :( --[[anarcat]]
+> > > >
+> > > > done, by bumping the severity of [[!debbug 744404]] to release-critical. --[[anarcat]]
+> > > >
+> > > > > ooh cool, the bug was fixed already with an upload, so this should probably be considered [[done]] at this point, even without the patch below! great! -- [[anarcat]]
+
+[[!template id=gitbranch branch=anarcat/dev/ssl_ca_path author="[[anarcat]]"]]
+
I seem to recall having that error before, and fixing it, but it always seems to come back and I forget how to fix it. So I'll just open this bug and document it if i can figure it out... -- [[users/anarcat]]
The Perl module manual says:
> My only workaround for now was to fix `PERL_LWP_SSL_VERIFY_HOSTNAME` to 0 directly in `ikiwiki` :-( -- [[users/bbb]]
~~~~
-*** /home/bruno/opt/ikiwiki/bin/ikiwiki.bad 2014-04-17 15:41:38.868972152 +0200
---- /home/bruno/opt/ikiwiki/bin/ikiwiki 2014-04-17 15:04:56.524996905 +0200
-*************** sub main () {
-*** 226,229 ****
- }
- }
-
-! main;
---- 226,229 ----
+--- /usr/bin/ikiwiki.orig 2014-09-08 15:48:35.715868902 -0400
++++ /usr/bin/ikiwiki 2014-09-08 15:48:38.895947911 -0400
+@@ -225,4 +225,5 @@
}
- }
-
-! $ENV{PERL_LWP_SSL_VERIFY_HOSTNAME} = 0 ; main;
+ }
+
++$ENV{PERL_LWP_SSL_VERIFY_HOSTNAME} = 0;
+ main;
~~~~
[[!template id=gitbranch branch=chrysn/more-proxy-utf8-fail author="[[chrysn]]"]]
+[[!template id=gitbranch author="[[chrysn]], [[smcv]]" branch=smcv/ready/more-proxy-utf8-fail
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/more-proxy-utf8-fail]]
the recently introduced fixes for [[crashes in the python proxy even if disabled]]
caused the typical python2 implicit conversion failures ("'ascii' codec
>>> a `unicode`. (i'd happily ditch python2 and port all plugins to python3,
>>> where this is all easier, but my [[todo/vCard rendering]] still uses an
>>> ancient module.) --[[chrysn]]
+
+>>>> You were right about this, `encode` is appropriate to go from `unicode`
+>>>> to `str` under Python 2. However, Python 3 is still broken.
+>>>>
+>>>> My `ready/more-proxy-utf8-fail` branch, based on yours,
+>>>> [[fixes the `rst` test when run under Python 3|bugs/rst_plugin_hangs_when_used_with_Python_3]]
+>>>> and hopefully also fixes this one. Please check that it still
+>>>> fixes your test-case too.
+>>>>
+>>>> Joey, I think this is [[ready for merge|users/smcv/ready]] even if it
+>>>> doesn't fix chrysn's bug - it does fix Python 3 support
+>>>> in general. --[[smcv]]
> In current ikiwiki, you can get a broadly similar effect by either
> using \[[!meta redir=foo]] (which does a HTML `<meta>` redirect)
> or reconfiguring the web server. --[[smcv]]
+
+>> The CGI spec (http://www.ietf.org/rfc/rfc3875) says that a CGI can cause a redirect by returning a Location: header.
+>> So it's possible; desirable (due to your point about conflicting with git-annex support) is a different matter.
+
+>>> One of the major things that separates ikiwiki from other wiki software
+>>> is that ikiwiki is a wiki compiler: ordinary page-views are purely
+>>> static HTML, and the CGI only gets involved when you do something
+>>> that really has to be dynamic (like an edit).
+>>>
+>>> However, there is no server-independent static content that ikiwiki
+>>> could write out to the destdir that would result in that redirect.
+>>>
+>>> If you're OK with requiring the [[plugins/404]] plugin (and a
+>>> web server where it works, which I think still means Apache) then
+>>> it would be possible to write a plugin that detected symlinks,
+>>> stored them in the `%wikistate`, and used them to make the
+>>> [[plugins/404]] plugin (or its own hook similar to the one
+>>> in that plugin) do a 302 redirect instead of a 404.
+>>> Similarly, a plugin that assumed a suitable Apache
+>>> configuration with fairly broad `AllowOverrides`,
+>>> and wrote out `.htaccess` files, would be a feasible thing
+>>> for someone to write.
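+>>>
+>>> (Purely to illustrate the idea, here is a rough, untested sketch of
+>>> what the CGI half of such a hypothetical plugin could look like;
+>>> the plugin name and the `%wikistate` layout are made up:)
+>>>
+>>>     package IkiWiki::Plugin::symlinkredir;
+>>>
+>>>     use warnings;
+>>>     use strict;
+>>>     use IkiWiki 3.00;
+>>>
+>>>     sub import {
+>>>         hook(type => "cgi", id => "symlinkredir", call => \&cgi);
+>>>     }
+>>>
+>>>     sub cgi {
+>>>         my $cgi = shift;
+>>>         # assume a scan/refresh hook has already stored
+>>>         # old-path => target-page mappings in %wikistate
+>>>         my $target = $wikistate{symlinkredir}{$cgi->path_info()};
+>>>         if (defined $target) {
+>>>             IkiWiki::redirect($cgi, urlto($target));
+>>>             exit;
+>>>         }
+>>>     }
+>>>
+>>>     1;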
+>>>
+>>> I don't think this is a bug; I think it's a request for a
+>>> feature that not everyone will want. The solution to those
+>>> is for someone who wants the feature to
+>>> [[write a plugin|plugins/write]]. --[[smcv]]
for debugging purposes only).
-> this is fixed in commit [154c4ea9](http://source.ikiwiki.branchable.com/?p=source.git;a=commit;h=154c4ea9e65d033756330a7f8c5c0fa285380bf0)
+> this is [[fixed|done]] in commit [154c4ea9](http://source.ikiwiki.branchable.com/?p=source.git;a=commit;h=154c4ea9e65d033756330a7f8c5c0fa285380bf0)
> (november 2013), which is included in 3.20140227. --[[chrysn]]
--- /dev/null
+During the ikiwiki make phase, the rst process hangs:
+
+* [ps output](http://dpaste.com/21TQQKT)
+* [gdb backtrace 1](http://dpaste.com/0VQBW6D)
+* [gdb backtrace 2](http://dpaste.com/1VHS88Y)
+
+Working with Python 2.7:
+[http://dpaste.com/0985A91](http://dpaste.com/0985A91)
+
+Not working with Python 3.3/3.4:
+[http://dpaste.com/0ACNK3W](http://dpaste.com/0ACNK3W)
+
+> Retitled this bug report since it seems to be specific to Python 3.
+>
+> The `rst` plugin is probably more commonly used with Python 2.
+> It seems likely that there is some Python-3-specific bug in `proxy.py`,
+> perhaps introduced by [commit 154c4ea
+ "properly encode and decode from/to utf8 when sending rpc to ikiwiki"](
+http://source.ikiwiki.branchable.com/?p=source.git;a=commitdiff;h=154c4ea9e65d033756330a7f8c5c0fa285380bf0).
+>
+> I can reproduce this on Debian by installing `python3-docutils`
+> and changing the first line of `plugins/proxy.py`, the first
+> line of `plugins/pythondemo`, the first line of `plugins/rst`
+> and the `system()` call in `t/rst.t` to use `python3` instead
+> of `python`. --[[smcv]]
+
+Looks like the problem is in proxy.py:
+
+    ml = _IkiWikiExtPluginXMLRPCHandler._read(in_fd).decode('utf8')
+
+Without the `.decode('utf8')` it works.
+
+> That call was introduced
+> [[to fix a bug under Python 2|bugs/crashes_in_the_python_proxy_even_if_disabled]]
+> so it cannot just be removed, but I've put a proposed branch on
+> [[this related bug|bugs/pythonproxy-utf8_again]]. [[!tag patch]] --smcv
+
+Tested and fixed with patches [38bd51bc](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/38bd51bc1bab0cabd97dfe3cb598220a2c02550a) and [81506fae](http://git.pseudorandom.co.uk/smcv/ikiwiki.git/commitdiff/81506fae8a6d5360f6d830b0e07190e60a7efd1c).
--- /dev/null
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawk6z7Jsfi_XWfzFJNZIjYUcjgrthg4aPUU"
+ nickname="Alejandro"
+ subject="Same Trick in Apache"
+ date="2014-09-10T18:58:24Z"
+ content="""
+I got it working with Apache 2.4 and Virtual Hosts on both HTTP 1.1 and HTTPS (SNI). The procedure is somewhat analogous to the nginx procedure above. So here is my set-up, in the hope that it will help others avoid this pain.
+
+## Set-up
+
+ CLIENT <---- HTTPS ----> REVERSE PROXY <---- HTTP ----> IKIWIKI
+
+
+## The HTTP to HTTPS Redirect
+
+To ensure that all HTTP requests are redirected to HTTPS, I chose to use mod_rewrite, because a simple Redirect does not pass query parameters. You will want an HTTP VHost that redirects with something like the one below (notice the subtle `?` before the query string). **Note: this will NOT rewrite ikiwiki's http:// URLs (base tag, etc.)**. For that I use a content filter, as you will see below. This HTTP-to-HTTPS redirect is still required, though, both for security and for the /foo/?updated URI form in this set-up.
+
+<pre>
+
+<VirtualHost *:80>
+ ServerName imass.name
+ RewriteEngine On
+ RewriteCond %{HTTPS} off
+ RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI}?%{QUERY_STRING}
+ ErrorLog /var/log/imass.name-error.log
+ LogLevel warn
+ CustomLog /var/log/imass.name-access.log combined
+</VirtualHost>
+
+</pre>
+
+## The SSL Virtual Host
+
+This part is a bit trickier. First, I am using SNI, as I don't care about non-SNI user agents. Second, you need a filter that replaces http:// with https:// before the response is sent. Note that this alone won't deal with ?updated, so you will need the HTTP-to-HTTPS set-up above anyway. Third, I use HTTP Auth, so I don't know whether this will work with your particular auth set-up (although it should, IMHO); YMMV:
+
+<pre>
+
+<VirtualHost *:443>
+ ServerName imass.name
+ ProxyHTMLEnable On
+ ProxyHTMLExtended On
+ SSLEngine on
+ SSLCertificateFile XXX
+ SSLCertificateKeyFile XXX
+ SSLCertificateChainFile XXX
+ SSLOptions +StdEnvVars
+ ProxyPreserveHost On
+ ProxyHTMLURLMap http:// https://
+ ProxyPass / http://192.168.101.101/
+ ProxyPassReverse / http://192.168.101.101/
+ LogLevel warn
+ ErrorLog /var/log/imass.name-ssl-error.log
+ TransferLog \"/var/log/imass.name-ssl-access.log\"
+ CustomLog \"/var/log/imass.name-ssl-request.log\" \"%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \\"%r\\" %b\"
+</VirtualHost>
+
+</pre>
+
+
+
+"""]]
--- /dev/null
+[[!comment format=mdwn
+ username="https://id.koumbit.net/anarcat"
+ ip="72.0.72.144"
+ subject="comment 1"
+ date="2014-09-10T03:00:22Z"
+ content="""
+i had a similar issue, reported in [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]].
+"""]]
--- /dev/null
+There is an ongoing [effort to standardise Markdown][sm]; I think it would be nice to check whether this implementation is compliant with it.
+
+[sm]: http://standardmarkdown.com/
+
+> IkiWiki's [[plugins/mdwn]] plugin does not contain an implementation
+> of Markdown: it relies on external libraries. It can currently use
+> any of these, most-preferred first:
+>
+> * [[!cpan Text::MultiMarkdown]], only if explicitly requested via
+> `$config{multimarkdown}`
+> * [[!cpan Text::Markdown::Discount]], if not explicitly disabled
+> via `$config{nodiscount}`
+> * [[!cpan Text::Markdown]]
+> * [[!cpan Markdown]]
+> * `/usr/bin/markdown`
+>
+> In practice, Discount is the implementation pulled in by the
+> Debian package dependencies, and (I suspect) the most
+> commonly used with IkiWiki.
+>
+> If the selected external library (whatever it happens to be)
+> complies with a particular interpretation of Markdown, then
+> IkiWiki will too. If not, it won't. The only influence
+> IkiWiki has over its level of compliance with a particular
+> interpretation is in how we choose which external library
+> we prefer.
+>
+> As such, if you want IkiWiki to change its interpretation of
+> Markdown, the way to do that is to either change Discount's
+> interpretation of Markdown, or contribute a patch to make
+> `mdwn.pm` prefer a different (and presumably "more compliant")
+> Markdown implementation.
+>
+> IkiWiki has one syntax extension beyond Markdown, which is
+> that text enclosed in double-square-brackets is an IkiWiki
+> [[ikiwiki/wikilink]] or [[ikiwiki/directive]]. This applies
+> to any markup language used with IkiWiki, not just Markdown.
+>
+> (There also doesn't seem to be any consensus that labelling
+> any particular fork of Markdown as "standard" can make it the
+> truth, or that this particular fork is the Correct™ fork and not
+> just <https://xkcd.com/927/>; but that's between the authors of
+> Markdown implementations and those who want to standardize
+> Markdown, and it isn't IkiWiki's job to police that.)
+>
+> --[[smcv]]
+++ /dev/null
-ikiwiki 3.20140102 released with [[!toggle text="these changes"]]
-[[!toggleable text="""
- * aggregate: Improve display of post author.
- * poll: Fix behavior of poll buttons when inlined.
- * Fixed unncessary tight loop hash copy in saveindex where a pointer
- can be used instead. Can speed up refreshes by nearly 50% in some
- circumstances.
- * Optimized loadindex by caching the page name in the index.
- * Added only\_committed\_changes config setting, which speeds up wiki
- refresh by querying git to find the files that were changed, rather
- than looking at the work tree. Not enabled by default as it can
- break some setups where not all files get committed to git.
- * comments: Write pending moderation comments to the transient underlay
- to avoid conflict with only\_committed\_changes.
- * search: Added google\_search option, which makes it search google
- rather than using the internal xapain database.
- (googlesearch plugin is too hard to turn on when xapain databases
- corrupt themselves, which happens all too frequently).
- * osm: Remove invalid use of charset on embedded javascript tags.
- Closes: #[731197](http://bugs.debian.org/731197)
- * style.css: Add compatibility definitions for more block-level
- html5 elements. Closes: #[731199](http://bugs.debian.org/731199)
- * aggregrate: Fix several bugs in handling of empty and colliding
- titles when generating filenames."""]]
\ No newline at end of file
--- /dev/null
+ikiwiki 3.20140831 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * Make --no-gettime work in initial build. Closes: #[755075](http://bugs.debian.org/755075)"""]]
\ No newline at end of file
+
This is the [[SandBox]], a page anyone can edit to try out ikiwiki
(version [[!version ]]).
Send Mail</a>
</p>
-This is some preformatted text. Each line is proceeded by four spaces.
+What follows is some preformatted text. Each line is preceded by four spaces.
Test
* [[!shortcut name=debrt url="https://rt.debian.org/Ticket/Display.html?id=%s"]]
* [[!shortcut name=debss url="http://snapshot.debian.org/package/%s/"]]
* Usage: `\[[!debss package]]` or `\[[!debss package/version]]`. See <http://snapshot.debian.org/> for details.
-* [[!shortcut name=debwiki url="https://wiki.debian.org/%s"]]
+* [[!shortcut name=debwiki url="https://wiki.debian.org/%S"]]
* [[!shortcut name=fdobug url="https://bugs.freedesktop.org/show_bug.cgi?id=%s" desc="freedesktop.org bug #%s"]]
* [[!shortcut name=fdolist url="http://lists.freedesktop.org/mailman/listinfo/%s" desc="%s@lists.freedesktop.org"]]
* [[!shortcut name=gnomebug url="https://bugzilla.gnome.org/show_bug.cgi?id=%s" desc="GNOME bug #%s"]]
* [[!shortcut name=whois url="http://reports.internic.net/cgi/whois?whois_nic=%s&type=domain"]]
* [[!shortcut name=cve url="https://cve.mitre.org/cgi-bin/cvename.cgi?name=%s"]]
* [[!shortcut name=flickr url="https://secure.flickr.com/photos/%s"]]
-* [[!shortcut name=man url="http://linux.die.net/man/%s"]]
+* [[!shortcut name=man url="http://manpages.debian.org/%s"]]
* [[!shortcut name=ohloh url="https://www.ohloh.net/p/%s"]]
* [[!shortcut name=cpanrt url="https://rt.cpan.org/Ticket/Display.html?id=%s" desc="CPAN RT#%s"]]
* [[!shortcut name=novellbug url="https://bugzilla.novell.com/show_bug.cgi?id=%s" desc="bug %s"]]
> --[[smcv]]
>
> > Thank you for this review. -- [[Louis|spalax]]
+
+---
+
+[[smcv]], can you please go on reviewing this?
+
+> I don't think I'm really the reviewer you want, since I don't have commit
+> access (as you might be able to tell from the number of pending branches
+> I have)... but nobody with commit access seems to be available to do
+> reviews at the moment, so I'm probably the best you're going to get.
+>
+> + 0 0 * * * ikiwiki ~/ikiwiki.setup --refresh
+>
+> I think that should be `ikiwiki --setup ~/ikiwiki.setup --refresh`
+>
+> The indentation of some of the new code in `IkiWiki/Plugin/calendar.pm`
+> is weird. Please use one hard tab (U+0009) per indent step: you seem
+> to have used a mixture of one hard tab per indent or two spaces
+> per indent, which looks bizarre for anyone whose tab size is not
+> 2 spaces.
+>
+> + return unless $config{calendar_autocreate};
+>
+> This is checked in `gencalendaryear` but not in `gencalendarmonth`.
+> Shouldn't `gencalendarmonth` do it too? Alternatively, do the check
+> in `scan`, which calls `gencalendarmonth` directly.
+>
+> + my $year = $date[5] + 1900;
+>
+> You calculate this, but you don't seem to do anything with it?
+>
+> + if (not exists $changed{$params{year}}) {
+> + $changed{$params{year}} = ();
+> + }
+> + $changed{$params{year}}{$params{month}} = 1;
+>
+> `$changed{$params{year}}` is a scalar (you can tell because it starts with the
+> `$` sigil) but `()` is a list. I think you want `{}`
+> (a scalar that is a reference to an empty anonymous hash).
+>
+> However, that whole `if` block can be omitted, and you can just use
+> `$changed{$params{year}}{$params{month}} = 1;`, because Perl will automatically
+> create `$changed{$params{year}}` as a reference to an empty hash if necessary,
+> in order to put the pair `$params{month} => 1` in it (the term to look
+> up if you're curious is "autovivification").
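+>
+> For instance (a tiny standalone illustration, not code from the branch):
+>
+>     my %changed;
+>     # autovivification: Perl creates $changed{2014} as a reference to an
+>     # empty hash automatically, just because we store something inside it
+>     $changed{2014}{9} = 1;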
+>
+> --[[smcv]]
> >
> > Originally, I named that parameter `backwards_links`, but then it wouldn't make sense in the long term, and isn't exactly neutral: it assumes the current way is backwards! Your suggestion is interesting however, but I don't think the rtl/ltr nomenclature is problematic, with proper documentation of course... --[[anarcat]]
+> > > I still don't think `rtl`/`ltr` is the right terminology here. I think
+> > > the "API" should say what you mean: the distinction being made is
+> > > "text first" vs. "link first", so, say that.
+> > >
+> > > As far as I understand it, RTL languages like Arabic typically write
+> > > text files "in logical order" (i.e. reading/writing order - first
+> > > letter is first in the bytestream) and only apply RTL rendering on
+> > > display. IkiWiki is UTF-8-only, and Unicode specifies that all
+> > > Unicode text should be in logical order. The opposite of logical
+> > > order is "display order", which is how you would have to mangle
+> > > the file for it to appear correctly on a naive terminal that expects
+> > > LTR; that can only work correctly for hard-wrapped text, I think.
+> > >
+> > > IkiWiki will parse files
+> > > in logical order too; so if a link's text and destination are both
+> > > written in Arabic, in text-before-link order in the source code, an
+> > > Arabic reader starting from the right would still see the text
+> > > before the link. Similarly, in your proposed link-before-text
+> > > order, an Arabic reader would still see the link before the text
+> > > (which in their case means further to the right). So I don't think
+> > > it would make sense to suggest that
+> > > one order was more appropriate for RTL languages than the other: if
+> > > it's "more correct" (for whatever opinion of "correct") in English, then
+> > > it's "more correct" in Arabic too.
+> > >
+> > > (If the destination is written in Latin then it gets
+> > > more complicated, because the destination will be rendered LTR within an
+> > > otherwise RTL document. I think the order still works though.) --[[smcv]]
+
There's a caveat: we can't have a per-wiki backwards_links option, because of the underlay, common to all wikis, which needs to be converted. So the option doesn't make much sense. Not sure how to deal with this... Maybe this needs to be at the package level? --[[anarcat]]
> I've thought about adding a direction-neutral `\[[!link]]` directive -
1. left to right (text then link) can be considered more natural, and should therefore be supported
2. it is supported in markdown using regular markdown links. in the proposed branch, the underlay wikilinks are converted to use regular markdown links
+ > Joey explicitly rejected this for a valid reason (it breaks inlining). See above. --[[smcv]]
3. ikiwiki links break other markup plugins, like mediawiki and creole, as those work right to left.
4. those are recognized "standards" used by other popular sites, like Wikipedia, or any wiki supporting the Creole markup, which is [most wikis](http://www.wikicreole.org/wiki/Engines)
>> better way based on that, maybe global configuration in `$config`.
>> --[[smcv]]
->>> [[!template id=gitbranch branch=smcv/ready/edittemplate
- browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/edittemplate
+>>> [[!template id=gitbranch branch=smcv/ready/edittemplate2
+ browse=http://git.pseudorandom.co.uk/smcv/ikiwiki.git/shortlog/refs/heads/ready/edittemplate2
author="Jonathon Anderson, [[smcv]]"]]
>>> Here is a version of that branch that I [[would merge|users/smcv/ready]] if I could.
>>> Changes since Jonathon's version:
>>>> html5 would leave old evaluations of displaytime around in the repository.
>>>> (example template: `\[[!meta date="<TMPL_VAR time>"]]I wrote this post at
>>>> \[[!displaytime "<TMPL_VAR time>"]]`). --[[chrysn]]
+
+>>>>> That's a very good point; and Joey added `\[[!date "<TMPL_VAR time>"]]`,
+>>>>> which does the same thing as your hypothetical `\[[!displaytime]]`,
+>>>>> almost 5 years ago. Branch replaced by `smcv/ready/edittemplate2`
+>>>>> which drops `formatted_time` and `html_time`, and adds a suggestion
+>>>>> to use `\[[!date]]`. --[[smcv]]
> ikiwiki only allows a very limited set of characters raw in page names,
> this is done as a deny-by-default security thing. All other characters
-> need to be encoded in __code__ format, where "code" is the character
+> need to be encoded in `__code__` format, where "code" is the character
> number. This is normally done for you, but if you're adding a page
> manually, you need to handle it yourself. --[[Joey]]
>>>>>> What's your locale? I have both pl\_PL (ISO-8859-2) and pl\_PL.UTF-8,
>>>>>> but I use pl\_PL. Is it wrong? --[[Paweł|ptecza]]
+>>>>>>> IkiWiki assumes UTF-8 throughout, so escaped filename characters
+>>>>>>> should be `__x____y____z__` where x, y, z are the bytes of the
+>>>>>>> UTF-8 encoding of the character. I don't know how to achieve that
+>>>>>>> from a non-UTF-8 locale. --[[smcv]]
+
>>>> Now, as to UTF7, in retrospect, using a standard encoding might be a
>>>> better idea than coming up with my own encoding for filenames. Can
>>>> you provide a pointer to a description to modified-UTF7? --[[Joey]]
>>>>> There is a Perl [Unicode::IMAPUtf7](http://search.cpan.org/~fabpot/Unicode-IMAPUtf7-2.01/lib/Unicode/IMAPUtf7.pm)
>>>>> module at the CPAN, but probably it hasn't been debianized yet :( --[[Paweł|ptecza]]
+> Note: [libencode-imaputf7-perl][1] has made it into debian.
+>
+>> "IMAP UTF-7" uses & as an escape character, which seems like a recipe
+>> for shell injection vulnerabilities... so I would not recommend it
+>> for this particular use. --[[smcv]]
+
+> I would value some clarification, in the ikiwiki setup file I have
+>
+> wiki_file_chars: -[:alnum:][\p{Arabic}()]+/.:_
+>
+> Ikiwiki doesn't seem to produce any errors on the commandline for this, but
+> when I attempt to create a new post with Arabic characters from the web I get the following error :
+>
+> Error: Cannot decode string with wide characters at /usr/lib/x86_64-linux-gnu/perl/5.20/Encode.pm line 215.
+>
+> Should the modified regexp not be sufficient?
+> Ikiwiki 3.20140815.
+> --[[mhameed]]
+
+>> This seems like a bug: in principle non-ASCII in `wiki_file_chars` should work,
+>> in practice it does not. I would suggest either using the default
+>> `wiki_file_chars`, or digging into the code to find what is wrong.
+>> Solving this sort of bug usually requires having a clear picture of
+>> which "strings" are bytestrings, and which "strings" are Unicode. --[[smcv]]
+
+>>> mhameed confirmed on IRC that anarcat's [[patch]] from
+>>> [[bugs/garbled_non-ascii_characters_in_body_in_web_interface]] fixes this.
+>>> --[[smcv]]
+
[[wishlist]]
+[1]: https://packages.debian.org/search?suite=all&section=all&arch=any&searchon=names&keywords=libencode-imaputf7-perl