> One way to do this would be to introduce variables into the pagespec, along with
> universal and/or existential [[!wikipedia Quantification]]. That looks quite complex.
>
+>> I thought about this briefly, and got about that far.. glad you got
+>> further. :-) --[[Joey]]
+
+>> Or, one [[!taglink could_also_refer|pagespec_in_DL_style]] to the language of [[!wikipedia description logics]]: their formulas actually define classes of objects through quantified relations to other classes. --Ivan Z.
+>
> Another option would be to go with a more functional syntax. The concept here would
> be to allow a pagespec to appear in a 'pagespec function' anywhere a page can. e.g.
> I could pass a pagespec to `link()` and that would return true if there is a link to any
> ends, and that isn't a regular language (we can't use regular expression matching for
> easy parsing).
>
+>> Also, it may cause ambiguities with page names that contain parens
+>> (though some such ambiguities already exist with the pagespec syntax).
+>
> One simplification of that would be to introduce some pagespec [[shortcuts]]. We could
> then allow pagespec functions to take either pages, or named pagespec shortcuts. The
> pagespec shortcuts would just be listed on a special page, like current [[shortcuts]].
> Does that seem like a reasonable first approach?
>
> -- [[Will]]
+
+>> Having a separate page for the shortcuts feels unwieldy.. perhaps
+>> instead the shortcut could be defined earlier in the scope of the same
+>> pagespec that uses it?
+>>
+>> Example: `define(~bugs, bugs/* and !*/Discussion) and define(~openbugs, ~bugs and !link(done)) and ~openbugs and !link(~openbugs)`
+
+>>> That could work. Parens are only ever nested one deep in that grammar, so it is regular and the current parsing would be OK.
+
+>> Note that I made the "~" explicit in my example, though it needn't be required, so it could be left out. In the case of ambiguity between
+>> a definition and a page name, the definition would win.
+
+>>> That was my initial thought too :), but when implementing it I decided that requiring the ~ made things easier. I'll probably require the ~ for the first pass at least.
+
+>> So, equivalent example: `define(bugs, bugs/* and !*/Discussion) and define(openbugs, bugs and !link(done)) and openbugs and !link(openbugs)`
+>>
+
+>> Re recursion, it is avoided.. but building a pagespec that is O(N^X), where N is the
+>> number of pages in the wiki and X the nesting depth of named specs, is not avoided. Probably need to add DOS prevention.
+>> --[[Joey]]
+
+>>> If you memoize the outcomes of the named pagespecs you can make it O(N.X), no?
+>>> -- [[Will]]
+
+>>>> Yeah, guess that'd work. :-)
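+
+A minimal sketch of the memoization idea above (the helper and cache names
+are hypothetical, not part of the patch below): remember each named spec's
+verdict per page, so repeated references cost one evaluation per
+(spec, page) pair rather than re-running the spec every time.
+
+    # hypothetical cache: $cache{$specname}{$page} = result of the spec
+    my %named_spec_cache;
+
+    sub match_named_spec_cached {
+        my ($specname, $page, $sub) = @_;    # $sub: compiled pagespec closure
+        $named_spec_cache{$specname}{$page} = $sub->($page)
+            unless exists $named_spec_cache{$specname}{$page};
+        return $named_spec_cache{$specname}{$page};
+    }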
+
+> <a id="another_kind_of_links" />One quick further thought. All the above discussion assumes that 'dependency' is the
+> same as 'links to', which is not really true. For example, you'd like to be able to say
+> "This bug does not depend upon [ [ link to other bug ] ]" and not have a dependency.
+> Without having different types of links, I don't see how this would be possible.
+>
+> -- [[Will]]
+
+>> I saw that this issue is addressed by the work on [[structured page data#another_kind_of_links]]. --Ivan Z.
+
+Okie - I've had a quick attempt at this. Initial patch attached. This one doesn't quite work.
+And there is still a lot of debugging stuff in there.
+
+At the moment I've added a new preprocessor plugin, `definepagespec`, which is like a
+shortcut for pagespecs. To reference a named pagespec, use `~` like this:
+
+ [ [!definepagespec name="bugs" spec="bugs/* and !*/Discussion"]]
+ [ [!definepagespec name="openbugs" spec="~bugs and !link(done)"]]
+ [ [!definepagespec name="readybugs" spec="~openbugs and !link(~openbugs)"]]
+
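+How such a plugin might hook in, as a rough sketch rather than the attached
+patch (the package name and the %named_specs storage are assumptions for
+illustration): a preprocess hook that records each name/spec pair and expands
+to nothing on the page.
+
+    #!/usr/bin/perl
+    package IkiWiki::Plugin::definepagespec;
+
+    use warnings;
+    use strict;
+    use IkiWiki;
+
+    my %named_specs;    # name => pagespec text, filled as pages are preprocessed
+
+    sub import {
+        hook(type => "preprocess", id => "definepagespec", call => \&preprocess);
+    }
+
+    sub preprocess {
+        my %params=@_;
+        error("definepagespec needs name and spec parameters")
+            unless exists $params{name} && exists $params{spec};
+        $named_specs{$params{name}} = $params{spec};
+        return "";    # the directive produces no output on the page
+    }
+
+    1
+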
+At the moment the problem is in `match_link()` when we're trying to find a sub-page that
+matches the appropriate page spec. There is no good list of pages available to iterate over.
+
+ foreach my $nextpage (keys %IkiWiki::pagesources)
+
+does not give me a good list of pages. I found the same thing when I was working on
+this todo [[todo/Add_a_plugin_to_list_available_pre-processor_commands]].
+
+> I'm not sure why iterating over `%pagesources` wouldn't work here, it's the same method
+> used by anything that needs to match a pagespec against all pages..? --[[Joey]]
+
+>> My unchecked hypothesis is that %pagesources is created after the refresh hook.
+>> I've also been concerned about how globally defined pagespec shortcuts would interact with
+>> the page dependency system. Your idea of internally defined shortcuts should fix that. -- [[Will]]
+
+>>> You're correct, the refresh hook is run very early, before pagesources
+>>> is populated. (It will be partially populated on a refresh, but will
+>>> not be updated to reflect new pages.) Agree that internally defined
+>>> seems the way to go. --[[Joey]]
+
+Immediately below is a patch which seems to basically work. Lots of debugging code is still there
+and it needs a cleanup, but I thought it worth posting at this point. (I was having problems
+with old style glob lists, so I just switched them off for the moment.)
+
+The following three inlines work for me with this patch:
+
+ Bugs:
+
+ [ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and ~bugs" archive="yes"]]
+
+ OpenBugs:
+
+ [ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and define(~openbugs,~bugs and !link(done)) and ~openbugs" archive="yes"]]
+
+ ReadyBugs:
+
+ [ [!inline pages="define(~bugs, bugs/* and ! */Discussion) and define(~openbugs,~bugs and !link(done)) and define(~readybugs,~openbugs and !link(~openbugs)) and ~readybugs" archive="yes"]]
+
+> Nice! Could the specfuncsref be passed in %params? I'd like to avoid
+> needing to change the prototype of every pagespec function, since several
+> plugins define them too. --[[Joey]]
+
+>> Maybe - it needs more thought. I also considered it when I was going through changing all those plugins :).
+>> My concern was that `%params` can contain other user-defined parameters,
+>> e.g. `link(target, otherparameter)`, and that means that the specFuncs could be clobbered by a user (or other
+>> weird security hole). I thought it better to separate it, but I didn't think about it too hard. I might move it to
+>> the first parameter rather than the second. Ikiwiki is my first real perl hacking and I'm still discovering
+>> good ways to write things in perl.
+>>
+>>>> `%params` contains the parameters passed to `pagespec_match`, not
+>>>> user-supplied parameters. The user-supplied parameter to a function
+>>>> like `match_glob()` or `match_link()` is passed in the second positional parameter. --[[Joey]]
+
+>>>>> OK. That seems reasonable then. The only problem is that my Perl-fu is not strong enough to make it
+>>>>> work. I really have to wonder what substance was influencing the designers of Perl...
+>>>>> I can't figure out how to use the %params. And I'm pissed off enough with Perl that I'm not going
+>>>>> to try and figure it out any more. There are two patches below now. The first one uses an extra
+>>>>> argument and works. The second one tries to use %params and doesn't - take your pick :-). -- [[Will]]
+
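+For reference, here is a sketch of the calling convention being discussed
+(`match_example` is a made-up name; the shape is the same one the patch below
+uses for the real `match_*` functions): each function receives the page, then
+the user-supplied argument from the pagespec, then trailing key/value pairs,
+which land in `%params`.
+
+    sub match_example ($$;@) {
+        my $page=shift;     # page being tested against the pagespec
+        my $arg=shift;      # user-supplied parameter, e.g. a glob or link target
+        my %params=@_;      # e.g. location => ..., plus specFuncs => ... with this patch
+        # ... return an IkiWiki::SuccessReason or IkiWiki::FailReason object
+    }
+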
+>> What do you think is best to do about `is_globlist()`? At the moment it only treats a spec as a pagespec (rather than an
+>> old-style globlist) if the 'second word', as delimited by a space and ignoring parens, is 'and' or 'or'. This doesn't hold in the above example pagespecs (so I just hard wired it to 0 to test my patch).
+>> My thought was just to search for 'and' or 'or' as words anywhere in the pagespec. Thoughts?
+
+>>> Dunno, we could just finish deprecating it. Or change the regexp to
+>>> skip over spaces in parens. (`/[^\s]+\s+([^)]+)/`) --[[Joey]]
+
+>>>> I think I have a working regexp now.
+
+>> Oh, one more thing. In pagespec_translate (now pagespec_makeperl), there is a part of the regular expression for `# any other text`.
+>> This contained `()`, which has no effect. I replaced that with `\(\)`, but that is a change in the definition of pagespecs unrelated to the
+>> rest of this patch. In a related change, commands were not able to contain `)` in their parameters. I've extended that so they cannot
+>> contain `(` or `)`. -- [[Will]]
+
+>>> `[^\s()]+` is a character class matching all characters not spaces or
+>>> parens. Since the previous terminals in the regexp consume most
+>>> occurrences of an open paren or close paren, it's unlikely for one to
+>>> get through to that part of the regexp. For example, "foo()" will be
+>>> matched by the command matcher; "(foo)" will be matched by the open
+>>> paren literal terminal. "foo(" and "foo)" can get through to the
+>>> end, and would be matched as a page name, if it didn't exclude parens.
+>>>
+>>> So why exclude them? Well, consider "foo and(bar and baz)". We don't
+>>> want it to match "and(" as a page name!
+>>>
+>>> Escaping the parens in the character class actually changes nothing; the
+>>> changed character class still matches all characters not spaces or
+>>> parens. (Try it!).
+>>>
+>>> Re commands containing '(', I don't really see any reason not to
+>>> allow that, unless it breaks something. --[[Joey]]
+
+>>>> Oh, I didn't realise you didn't need to escape parens inside []. All the rest
+>>>> I understood. I have stopped commands from containing parens because
+>>>> once you allow that then you might have an extra level of depth in the parsing
+>>>> of define() statements. -- [[Will]]
+
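+The claim that escaping parens inside a character class changes nothing is
+easy to check with a few throwaway lines of Perl (not part of the patch):
+
+    # both character classes accept and reject exactly the same strings
+    for my $s ("foo", "foo(", "(foo)", "a b") {
+        my $plain   = $s =~ /^[^\s()]+$/   ? 1 : 0;
+        my $escaped = $s =~ /^[^\s\(\)]+$/ ? 1 : 0;
+        print "$s: plain=$plain escaped=$escaped\n";    # columns always agree
+    }
+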
+>>> Updated patch. Moved the specFuncsRef to the front of the arg list. Still haven't thought through the security implications of
+>>> having it in `%params`. I've also removed all the debugging `print` statements. And I've updated the `is_globlist()` function.
+>>> I think this is ready for people other than me to have a play. It is not well enough tested to commit just yet.
+>>> -- [[Will]]
+
+I've lost track of the indent level, so I'm going back to not indented - I think this is a working [[patch]] taking into
+account all comments above (which doesn't mean it is above reproach :) ). --[[Will]]
+
+> Very belated code review of last version of the patch:
+>
+> * `is_globlist` is no longer needed
+
+>> Good :)
+
+> * I don't understand why the pagespec match regexp is changed
+> from having flags `igx` to `ixgs`. Don't see why you
+> want `.` to match `\n` in it, and don't see any `.` in the regexp
+> anyway?
+
+>> Because you have to define all the named pagespecs in the pagespec, you sometimes end up with very long pagespecs. I found it useful to split them over multiple lines. That didn't work at one point and I added the 's' to make it work. I may have further altered the regex since then to make the 's' redundant. Remove it and see if multi-line pagespecs still work. :)
+
+>>> Well, I can tell you that multi-line pagespecs are supported w/o
+>>> your patch .. I use them all the time. The reason I find your
+>>> use of `/s` unlikely is because without it `\s` already matches
+>>> a newline. Only if you want to treat a newline as non-whitespace
+>>> is `/s` typically necessary. --[[Joey]]
+
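+The point about `\s` is also easy to confirm with a throwaway test: `\s`
+matches a newline with or without `/s`; the `/s` modifier only changes what
+`.` matches.
+
+    print "a\nb" =~ /a\sb/ ? "match\n" : "no match\n";    # match: \s includes newline
+    print "a\nb" =~ /a.b/  ? "match\n" : "no match\n";    # no match without /s
+    print "a\nb" =~ /a.b/s ? "match\n" : "no match\n";    # /s lets . match the newline
+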
+> * Some changes of `@_` to `%params` in `pagespec_makeperl` do not
+> make sense to me. I don't see where \%params is defined and populated,
+> except with `\$params{specFunc}`.
+
+>> I'm not a perl hacker. This was a mighty battle for me to get going.
+>> There is probably some battlefield carnage from my early struggles
+>> learning perl left here. Part of this is that @_ / @params already
+>> existed as a way of passing in extra parameters. I didn't want to
+>> pollute that top level namespace - just add my own parameter (a hash)
+>> which contained the data I needed.
+
+>>> I think I understand how the various `%params`
+>>> (there's not just one) work in your code now, but it's really a mess.
+>>> Explaining it in words would take pages.. It could be fixed by,
+>>> in `pagespec_makeperl` something like:
+>>>
+>>> my %specFuncs;
+>>> push @_, specFuncs => \%specFuncs;
+>>>
+>>> With that you have the hash locally available for populating
+>>> inside `pagespec_makeperl`, and when the `match_*` functions
+>>> are called the same hash data will be available inside their
+>>> `@_` or `%params`. No need to change how the functions are called
+>>> or do any of the other hacks.
+>>>
+>>> Currently, specFuncs is populated by building up code
+>>> that recursively calls `pagespec_makeperl`, and is then
+>>> evaluated when the pagespec gets evaluated. My suggested
+>>> change to `%params` will break that, but that had to change
+>>> anyway.
+>>>
+>>> It probably has a security hole, and is certainly inviting
+>>> one, since the pagespec definition is matched by a loose regexp (`.*`)
+>>> and then subject to string interpolation before being evaluated
+>>> inside perl code. I recently changed ikiwiki to never interpolate
+>>> user-supplied strings when translating pagespecs, and that
+>>> needs to happen here too. The obvious way, it seems to me,
+>>> is to not generate perl code, but just directly run perl code that
+>>> populates specFuncs.
+
+> * Seems that the only reason `match_glob` has to check for `~` is
+> because when a named spec appears in a pagespec, it is translated
+> to `match_glob("~foo")`. If, instead, `pagespec_makeperl` checked
+> for named specs, it could convert them into `check_named_spec("foo")`
+> and avoid that ugliness.
+
+>> Yeah - I wanted to make named specs syntactically different on my first pass. You are right in that this could be made a fallback - named specs always override pagenames.
+
+> * The changes to `match_link` seem either unnecessary or incomplete.
+> Shouldn't it check for named specs and call
+> `check_named_spec_existential`?
+
+>> An earlier version did. Then I realised it wasn't actually needed in that case - match_link() already included a loop that was like a type of existential matching. Each time through the loop it would
+>>> call match_glob(). match_glob() in turn will handle the named spec. I tested this version briefly and it seemed to work. I remember looking at this again later and wondering if I had misunderstood
+>> some of the logic in match_link(), which might mean there are cases where you would need an explicit call to check_named_spec_existential() - I never checked it properly after having that thought.
+
+>>> In the common case, `match_link` does not call `match_glob`,
+>>> because the link target it is being asked to check for is a single
+>>> page name, not a glob.
+
+> * Generally, the need to modify `match_*` functions so that they
+> check for and handle named pagespecs seems suboptimal, if
+> only because there might be others people may want to use named
+> pagespecs with. It would be possible to move this check
+> to `pagespec_makeperl`, by having it check if the parameter
+> passed to a pagespec function looked like a named pagespec.
+> The only issue is that some pagespec functions take a parameter
+> that is not a page name at all, and it could be weird
+> if such a parameter were accidentally interpreted as a named
+> pagespec. (But, that seems unlikely to happen.)
+
+>> Possibly. I'm not sure which I prefer between the current solution and that one. Each has advantages and disadvantages.
+>> It really isn't much code for the match functions to add a call to check_named_spec_existential().
+
+>>> But if a plugin adds its own match function, it has
+>>> to explicitly call that code to support named pagespecs.
+
+> * I need to check if your trick to avoid infinite recursion
+> works if there are two named specs that recursively
+> call one another. I suspect it does, but will test this
+> myself..
+
+>> It worked for me. :)
+
+> * I also need to verify if memoizing the named pagespecs has
+> really guarded against very expensive pagespecs DOSing the wiki..
+
+> --[[Joey]]
+
+>> There is one issue that I've been thinking about that I haven't raised anywhere (or checked myself), and that is how this all interacts with page dependencies.
+>> Firstly, I'm not sure anymore that the `pagespec_merge` function will continue to work in all cases.
+
+>>> The problem I can see there is that if two pagespecs
+>>> get merged and both use `~foo` but define it differently,
+>>> then the second definition might be used at a point when
+>>> it shouldn't (but I haven't verified that really happens).
+>>> That could certainly be a show-stopper. --[[Joey]]
+
+>> Secondly, it seems that there are two types of dependency, and ikiwiki
+>> currently only handles one of them. The first type is "Rebuild this
+>> page when any of these other pages changes" - ikiwiki handles this.
+>> The second type is "rebuild this page when the set of pages referred to by
+>> this pagespec changes" - ikiwiki doesn't seem to handle this. I
+>> suspect that named pagespecs would make that second type of dependency
+>> more important. I'll try to come up with a good example. -- [[Will]]
+
+>>> Hrm, I was going to build an example of this with backlinks, but it
+>>> looks like that is handled as a special case at the moment (line 458 of
+>>> render.pm). I'll see if I can break
+>>> things another way. Fixing this properly would allow removal of that special case. -- [[Will]]
+
+>>>> I can't quite understand the distinction you're trying to draw
+>>>> between the two types of dependencies. Backlinks are a very special
+>>>> case though and I'll be surprised if they fit well into pagespecs.
+>>>> --[[Joey]]
+
+----
+
+ diff --git a/IkiWiki.pm b/IkiWiki.pm
+ index 4e4da11..8b3cdfe 100644
+ --- a/IkiWiki.pm
+ +++ b/IkiWiki.pm
+ @@ -1550,7 +1550,16 @@ sub globlist_to_pagespec ($) {
+
+ sub is_globlist ($) {
+ my $s=shift;
+ - return ( $s =~ /[^\s]+\s+([^\s]+)/ && $1 ne "and" && $1 ne "or" );
+ + return ! ($s =~ /
+ + (^\s*
+ + [^\s(]+ # single item
+ + (\( # possibly with parens after it
+ + ([^)]* # with stuff inside those parens
+ + (\([^)]*\))*)* # maybe even nested parens
+ + \))?\s*$
+ + ) |
+ + (\s and \s) | (\s or \s) # or we find 'and' or 'or' somewhere
+ + /xs);
+ }
+
+ sub safequote ($) {
+ @@ -1631,7 +1640,7 @@ sub pagespec_merge ($$) {
+ return "($a) or ($b)";
+ }
+
+ -sub pagespec_translate ($) {
+ +sub pagespec_makeperl ($) {
+ my $spec=shift;
+
+ # Support for old-style GlobLists.
+ @@ -1650,12 +1659,14 @@ sub pagespec_translate ($) {
+ |
+ \) # )
+ |
+ - \w+\([^\)]*\) # command(params)
+ + define\(\s*~\w+\s*,((\([^()]*\)) | ([^()]+))+\) # define(~specName, spec) - spec can contain parens 1 deep
+ + |
+ + \w+\([^()]*\) # command(params) - params cannot contain parens
+ |
+ [^\s()]+ # any other text
+ )
+ \s* # ignore whitespace
+ - }igx) {
+ + }igxs) {
+ my $word=$1;
+ if (lc $word eq 'and') {
+ $code.=' &&';
+ @@ -1666,16 +1677,23 @@ sub pagespec_translate ($) {
+ elsif ($word eq "(" || $word eq ")" || $word eq "!") {
+ $code.=' '.$word;
+ }
+ - elsif ($word =~ /^(\w+)\((.*)\)$/) {
+ + elsif ($word =~ /^define\(\s*~(\w+)\s*,(.*)\)$/s) {
+ + $code .= " (\$params{specFuncs}->{$1}="; # (exists \$params{specFuncs}) &&
+ + $code .= "memoize(";
+ + $code .= &pagespec_makeperl($2);
+ + $code .= ")";
+ + $code .= ") ";
+ + }
+ + elsif ($word =~ /^(\w+)\((.*)\)$/s) {
+ if (exists $IkiWiki::PageSpec::{"match_$1"}) {
+ - $code.="IkiWiki::PageSpec::match_$1(\$page, ".safequote($2).", \@_)";
+ + $code.="IkiWiki::PageSpec::match_$1(\$page, ".safequote($2).", \%params)";
+ }
+ else {
+ $code.=' 0';
+ }
+ }
+ else {
+ - $code.=" IkiWiki::PageSpec::match_glob(\$page, ".safequote($word).", \@_)";
+ + $code.=" IkiWiki::PageSpec::match_glob(\$page, ".safequote($word).", \%params)";
+ }
+ }
+
+ @@ -1683,8 +1701,18 @@ sub pagespec_translate ($) {
+ $code=0;
+ }
+
+ + return 'sub { my $page=shift; my %params = @_; '.$code.' }';
+ +}
+ +
+ +sub pagespec_translate ($) {
+ + my $spec=shift;
+ +
+ + my $code = pagespec_makeperl($spec);
+ +
+ + # print STDERR "Spec '$spec' generated code '$code'\n";
+ +
+ no warnings;
+ - return eval 'sub { my $page=shift; '.$code.' }';
+ + return eval $code;
+ }
+
+ sub pagespec_match ($$;@) {
+ @@ -1699,7 +1727,7 @@ sub pagespec_match ($$;@) {
+
+ my $sub=pagespec_translate($spec);
+ return IkiWiki::FailReason->new("syntax error in pagespec \"$spec\"") if $@;
+ - return $sub->($page, @params);
+ + return $sub->($page, @params, specFuncs => {});
+ }
+
+ sub pagespec_valid ($) {
+ @@ -1748,11 +1776,78 @@ sub new {
+
+ package IkiWiki::PageSpec;
+
+ +sub check_named_spec($$;@) {
+ + my $page=shift;
+ + my $specName=shift;
+ + my %params=@_;
+ +
+ + error("Unable to find specFuncs in params to check_named_spec()!") unless exists $params{specFuncs};
+ +
+ + my $specFuncsRef=$params{specFuncs};
+ +
+ + return IkiWiki::FailReason->new("Named page spec '$specName' is not valid")
+ + unless (substr($specName, 0, 1) eq '~');
+ +
+ + $specName = substr($specName, 1);
+ +
+ + if (exists $specFuncsRef->{$specName}) {
+ + # remove the named spec from the spec refs
+ + # when we recurse to avoid infinite recursion
+ + my $sub = $specFuncsRef->{$specName};
+ + delete $specFuncsRef->{$specName};
+ + my $result = $sub->($page, %params);
+ + $specFuncsRef->{$specName} = $sub;
+ + return $result;
+ + } else {
+ + return IkiWiki::FailReason->new("Page spec '$specName' does not exist");
+ + }
+ +}
+ +
+ +sub check_named_spec_existential($$$;@) {
+ + my $page=shift;
+ + my $specName=shift;
+ + my $funcref=shift;
+ + my %params=@_;
+ +
+ + error("Unable to find specFuncs in params to check_named_spec_existential()!") unless exists $params{specFuncs};
+ + my $specFuncsRef=$params{specFuncs};
+ +
+ + return IkiWiki::FailReason->new("Named page spec '$specName' is not valid")
+ + unless (substr($specName, 0, 1) eq '~');
+ + $specName = substr($specName, 1);
+ +
+ + if (exists $specFuncsRef->{$specName}) {
+ + # remove the named spec from the spec refs
+ + # when we recurse to avoid infinite recursion
+ + my $sub = $specFuncsRef->{$specName};
+ + delete $specFuncsRef->{$specName};
+ +
+ + foreach my $nextpage (keys %IkiWiki::pagesources) {
+ + if ($sub->($nextpage, %params)) {
+ + my $tempResult = $funcref->($page, $nextpage, %params);
+ + if ($tempResult) {
+ + $specFuncsRef->{$specName} = $sub;
+ + return $tempResult;
+ + }
+ + }
+ + }
+ +
+ + $specFuncsRef->{$specName} = $sub;
+ + return IkiWiki::FailReason->new("No page in spec '$specName' was successfully matched");
+ + } else {
+ + return IkiWiki::FailReason->new("Named page spec '$specName' does not exist");
+ + }
+ +}
+ +
+ sub match_glob ($$;@) {
+ my $page=shift;
+ my $glob=shift;
+ my %params=@_;
+
+ + if (substr($glob, 0, 1) eq '~') {
+ + return check_named_spec($page, $glob, %params);
+ + }
+ +
+ my $from=exists $params{location} ? $params{location} : '';
+
+ # relative matching
+ @@ -1782,11 +1877,12 @@ sub match_internal ($$;@) {
+
+ sub match_link ($$;@) {
+ my $page=shift;
+ - my $link=lc(shift);
+ + my $fulllink=shift;
+ my %params=@_;
+ + my $link=lc($fulllink);
+
+ my $from=exists $params{location} ? $params{location} : '';
+ -
+ +
+ # relative matching
+ if ($link =~ m!^\.! && defined $from) {
+ $from=~s#/?[^/]+$##;
+ @@ -1804,19 +1900,32 @@ sub match_link ($$;@) {
+ }
+ else {
+ return IkiWiki::SuccessReason->new("$page links to page $p matching $link")
+ - if match_glob($p, $link, %params);
+ + if match_glob($p, $fulllink, %params);
+ }
+ }
+ return IkiWiki::FailReason->new("$page does not link to $link");
+ }
+
+ sub match_backlink ($$;@) {
+ - return match_link($_[1], $_[0], @_);
+ + my $page=shift;
+ + my $backlink=shift;
+ + my @params=@_;
+ +
+ + if (substr($backlink, 0, 1) eq '~') {
+ + return check_named_spec_existential($page, $backlink, \&match_backlink, @params);
+ + }
+ +
+ + return match_link($backlink, $page, @params);
+ }
+
+ sub match_created_before ($$;@) {
+ my $page=shift;
+ my $testpage=shift;
+ + my @params=@_;
+ +
+ + if (substr($testpage, 0, 1) eq '~') {
+ + return check_named_spec_existential($page, $testpage, \&match_created_before, @params);
+ + }
+
+ if (exists $IkiWiki::pagectime{$testpage}) {
+ if ($IkiWiki::pagectime{$page} < $IkiWiki::pagectime{$testpage}) {
+ @@ -1834,6 +1943,11 @@ sub match_created_before ($$;@) {
+ sub match_created_after ($$;@) {
+ my $page=shift;
+ my $testpage=shift;
+ + my @params=@_;
+ +
+ + if (substr($testpage, 0, 1) eq '~') {
+ + return check_named_spec_existential($page, $testpage, \&match_created_after, @params);
+ + }
+
+ if (exists $IkiWiki::pagectime{$testpage}) {
+ if ($IkiWiki::pagectime{$page} > $IkiWiki::pagectime{$testpage}) {