X-Git-Url: http://git.vanrenterghem.biz/git.ikiwiki.info.git/blobdiff_plain/d57d2ecca722cb7c43d2b14ed31f38c3e204079c..e9f018b34007410ca08a5ff09d3360779d75ac05:/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
diff --git a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
index 8a2261543..4a7163eae 100644
--- a/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
+++ b/doc/tips/convert_mediawiki_to_ikiwiki/discussion.mdwn
@@ -6,6 +6,10 @@
----
+I wrote a script that downloads the latest revisions of all the pages of a mediawiki site. In short, it does a good part of the work required for the migration: it downloads the goods (i.e. the latest version of every page, automatically) and commits the resulting structure. There are still quite a few pieces missing for an actual complete conversion to ikiwiki, but it's a pretty good start. It only talks to mediawiki over HTTP, so no special access is necessary. The downside of that is that it will not attempt to download every revision, for performance reasons. The code is here: http://anarcat.ath.cx/software/mediawikigitdump.git/ See the header of the file for more details and todos. -- [[users/Anarcat]] 2010-10-15
+
+----
+
The u32 page is excellent, but I wonder if documenting the procedure here
would be worthwhile. Who knows, the remote site might disappear. But also
there are some variations on the approach that might be useful:
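A minimal sketch of the approach described above (not the actual mediawikigitdump script, whose details live at the URL given): fetch the latest revision of every page over HTTP through MediaWiki's api.php, write each page out as a file, and commit the result with git. The wiki URL, output directory and file extension are placeholders, and the continuation handling assumes a reasonably recent MediaWiki API; it needs the Python `requests` library.

    import os
    import subprocess
    import requests

    API = "http://wiki.example.org/w/api.php"   # placeholder wiki URL
    OUT = "wiki-export"                         # placeholder output directory

    def all_pages():
        """Yield every page title, following API continuation."""
        params = {"action": "query", "list": "allpages",
                  "aplimit": "500", "format": "json"}
        while True:
            data = requests.get(API, params=params).json()
            for page in data["query"]["allpages"]:
                yield page["title"]
            if "continue" not in data:
                break
            params.update(data["continue"])

    def latest_text(title):
        """Fetch only the current wikitext of one page."""
        data = requests.get(API, params={
            "action": "query", "prop": "revisions", "rvprop": "content",
            "titles": title, "format": "json"}).json()
        page = next(iter(data["query"]["pages"].values()))
        return page["revisions"][0]["*"]

    os.makedirs(OUT, exist_ok=True)
    for title in all_pages():
        # One file per page; converting the markup to ikiwiki is a separate step.
        path = os.path.join(OUT, title.replace("/", "_") + ".wiki")
        with open(path, "w", encoding="utf-8") as f:
            f.write(latest_text(title))

    subprocess.run(["git", "init"], cwd=OUT, check=True)
    subprocess.run(["git", "add", "."], cwd=OUT, check=True)
    subprocess.run(["git", "commit", "-m", "import latest revisions"], cwd=OUT, check=True)

Converting the downloaded wikitext into ikiwiki's markdown is one of the pieces the comment above notes is still missing.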
@@ -659,4 +663,7 @@ page.--[[users/Chadius]]
> were fixed to use the right extension. --[[Joey]]
>> Here's another I found while browsing around starting from the link you gave, Joey
->>
+>>
+>> As I don't run mediawiki anymore but still have my xz/gzip-compressed XML dumps,
+>> it's certainly easier for me to do it this way; also, a file or a set of files is easier to lug
+>> around on some medium than a full mysqld or postgres master and the relevant databases.
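For the dump-based route mentioned in the last comment, a rough sketch (assuming Python and a pages dump produced by Special:Export or dumpBackup.php; the dump filename is a placeholder) of streaming page titles and wikitext straight out of an xz- or gzip-compressed XML dump, with no mysqld or postgres needed:

    import gzip
    import lzma
    import xml.etree.ElementTree as ET

    DUMP = "wiki-pages.xml.xz"   # placeholder dump filename

    def open_dump(path):
        """Open an xz-, gzip- or un-compressed dump transparently."""
        if path.endswith(".xz"):
            return lzma.open(path, "rb")
        if path.endswith(".gz"):
            return gzip.open(path, "rb")
        return open(path, "rb")

    def pages(path):
        """Yield (title, wikitext) per <page>, streaming the whole dump."""
        title, text = None, None
        with open_dump(path) as f:
            for _, elem in ET.iterparse(f, events=("end",)):
                tag = elem.tag.rsplit("}", 1)[-1]   # drop the export XML namespace
                if tag == "title":
                    title = elem.text
                elif tag == "text":
                    # For a full-history dump this keeps the last revision listed.
                    text = elem.text or ""
                elif tag == "page":
                    yield title, text
                    title, text = None, None
                    elem.clear()   # keep memory bounded on large dumps

    for title, text in pages(DUMP):
        print(title, len(text))

Each yielded (title, text) pair can then be fed into whatever wikitext-to-ikiwiki conversion the rest of the migration uses.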