People seem to expect to be able to enter www.foo.com and get away with it.
The resulting my.wiki/www.foo.com link was not ideal.
To fix it, use URI::Heuristic to expand such things into a real url. It
even looks up hostnames in the DNS if necessary.
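
For illustration, here is a minimal sketch of what URI::Heuristic does to that kind of input (the sample strings are made up for the example):

#!/usr/bin/perl
use strict;
use warnings;
use URI::Heuristic qw(uf_uristr);

# Hypothetical things a commenter might type into the url field.
for my $input ("www.foo.com", "foo.com/bar", "http://example.org/") {
	# uf_uristr() guesses the missing scheme and returns a fully
	# qualified url string; complete urls pass through unchanged.
	print "$input -> ", uf_uristr($input), "\n";
}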
			$pagestate{$page}{meta}{author} = $params{claimedauthor};
		}
-		if (defined $params{url} and safeurl($params{url})) {
-			$pagestate{$page}{meta}{authorurl} = $params{url};
+		if (defined $params{url}) {
+			my $url=$params{url};
+
+			eval q{use URI::Heuristic};
+			if (! $@) {
+				$url=URI::Heuristic::uf_uristr($url);
+			}
+
+			if (safeurl($url)) {
+				$pagestate{$page}{meta}{authorurl} = $url;
+			}
		}
	}
	else {
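
A side note on the eval q{use URI::Heuristic} idiom in the patch: it loads the module at runtime and falls back gracefully when it is not installed, rather than making it a hard dependency. A standalone sketch of the same pattern (the input string is hypothetical):

#!/usr/bin/perl
use strict;
use warnings;

my $url = "www.foo.com";	# hypothetical commenter input

# Try to load URI::Heuristic at runtime; $@ is set if the load fails.
eval q{use URI::Heuristic};
if (! $@) {
	# Module available: expand the partial url.
	$url = URI::Heuristic::uf_uristr($url);
}
# Otherwise the url is left as typed, and safeurl() still vets it.

print "$url\n";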
* Add deprecation warning for GlobLists, which will stop working in 3.0.
* camelcase: Add camelcase_ignore setting.
* googlecalendar: Add runtime deprecation warning.
+ * comments: Deal with users entering unqualified or partial urls.
-- Joey Hess <joeyh@debian.org> Mon, 22 Dec 2008 19:02:16 -0500