X-Git-Url: http://git.vanrenterghem.biz/git.ikiwiki.info.git/blobdiff_plain/48f63526a2d00661e73e4d049d02a3a910733cae..651cdd4b2a85f4e5f9d298a7eea7d0e6d94442b1:/doc/todo/optimisations.mdwn?ds=inline

diff --git a/doc/todo/optimisations.mdwn b/doc/todo/optimisations.mdwn
index 4cf0907f5..b8c4fa0da 100644
--- a/doc/todo/optimisations.mdwn
+++ b/doc/todo/optimisations.mdwn
@@ -1,16 +1,15 @@
-* Render each changed page only once. Currently pages are rendered up to 4
-  times in worst case (8 times if there's an rss feed).
+Ikiwiki has already been optimised a lot, however..
 
-  The issue is that rendering a page is used to gather info like the links
-  on the page that can effect rendering other pages. So it needs a
-  multi-pass system. But rendering the whole page in each pass is rather
-  obscene.
+* Look at splitting up CGI.pm. But note that too much splitting can slow
+  perl down.
 
-* Don't render blog archive pages unless a page is added/removed. Just
-  changing a page doesn't affect the archives as they show only the title.
+  > It's split enough, or possibly more than enough, now. :-)
 
-* Look at breaking the relatively rarely used blogging stuff out of
-  Render.pm, into its own module.
+* The backlinks calculation code is still O(N^2) on the number of pages.
+  If backlinks info were stored in the index file, it would go down to
+  constant time for iterative builds, though still N^2 for rebuilds.
 
-* Look at splitting up CGI.pm. But note that too much splitting can slow
-  perl down.
+  > Seems to be O(Num Pages * Num Links in Page), or effectively O(N)
+  > pages for most wikis.
+
+[[done]]
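
For illustration, here is a minimal sketch of the backlinks pass discussed in the diff above: one loop over every page and every link on it, which costs O(total links) — effectively O(N) pages for most wikis — rather than O(N^2) in the number of pages. This is written in Python for brevity (ikiwiki itself is Perl), and the `links` structure and `calculate_backlinks` name are hypothetical stand-ins, not ikiwiki's real data structures.

```python
from collections import defaultdict

def calculate_backlinks(links):
    """links: dict mapping a page name to the list of pages it links to
    (a stand-in for the per-page link info ikiwiki keeps in its index)."""
    backlinks = defaultdict(set)
    for page, targets in links.items():   # one pass over every page...
        for target in targets:            # ...and every link on that page
            if target != page:
                backlinks[target].add(page)
    return backlinks

# Example: three pages with a handful of links between them.
links = {
    "index": ["blog", "todo/optimisations"],
    "blog": ["index"],
    "todo/optimisations": ["index"],
}
# "index" is linked to from "blog" and "todo/optimisations";
# "blog" and "todo/optimisations" are each linked to from "index".
print(dict(calculate_backlinks(links)))
```

If the resulting backlinks map were saved alongside the index, an iterative build would only need to update the entries for pages whose links changed, which is the constant-time behaviour the item above is after; a full rebuild still has to walk every page's links once.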