[[Fill in details about WubWikit, in particular, where to get the application, the differences between WubWikit and the traditional [wikit], etc.]]

[Wiki News] - announcements and such

old, obsolete problem list for [this wikit]

----
[MHo]: With the old wiki(t.kit), it was very simple to grab a copy of the starkit, define a simple CGI wrapper, and drive it through [Tclhttpd]. I suppose the new WubWikit only works with [Wub] - as the name says ;-) So WubWikit never runs as a standalone CGI process, is that right?

[CMcC]: You can get the code for wikit here: [http://code.google.com/p/wikitcl/source] and for Wub here: [http://code.google.com/p/wub/source]. You'll need subversion to grab it. [wiki database for offline use] has "wubwikit.kit", which is all the pieces in one file.

[MHo]: Will there be ''binaries'' available at some point in the future?

[LV]: Anyone have a ''recipe'' for bringing up and running this latest generation of the wikit? Has anyone created a starkit with the necessary code in it?

[MHo]: Is a downloadable copy of the wiki pages themselves available, such as ''wikit.gz''?

[LV]: If you mean a downloadable copy of the wiki pages, I don't think this is available yet. If you mean a downloadable copy of the executing code, see CMcC's comments.

[MHo]: Yes, I mean a copy of the database with the wiki pages, e.g. to make it available offline or to keep a backup for restoration in case of vandalism...

[CMcC]: There's an interesting problem here: the wiki db is currently 95 MB, and clearly it would be a bad thing to allow too-frequent downloads (that's part of what brought the old wiki down - some clown doing multiple uncontrolled downloads - denial of service). I'm trying to work out what a good policy might be to balance the utility of providing the download while bounding the cost. Here are some policy questions to which I have no good answers:
* What if everyone on the wiki wanted a daily db backup? That's about 90 GB/day, which won't work. What about hourly?
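To put rough numbers on the policy questions above, here is a back-of-the-envelope sketch. The user count of about 1000 is an assumption inferred from the "about 90 GB/day" figure, not a measured value:

```python
# Back-of-the-envelope cost of serving full-database backups.
# Assumption (not measured): ~1000 users each pull the whole 95 MB db.

DB_SIZE_MB = 95    # current size of the wiki db, per the discussion above
USERS = 1000       # assumed number of users wanting their own backup

def daily_transfer_gb(downloads_per_user_per_day: float) -> float:
    """Total outbound transfer in GB/day for a given backup frequency."""
    total_mb = DB_SIZE_MB * USERS * downloads_per_user_per_day
    return total_mb / 1024

daily = daily_transfer_gb(1)    # everyone takes a daily backup
hourly = daily_transfer_gb(24)  # everyone takes an hourly backup

print(f"daily backups:  {daily:.0f} GB/day")   # ~93 GB/day, matching the ~90 cited
print(f"hourly backups: {hourly:.0f} GB/day")  # ~2227 GB/day
```

The hourly case makes the point starkly: frequency multiplies the cost linearly, which is why deltas or mirrors are attractive.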
* What if we put a limit on the number of downloads per day? How do we stop one of the daily (hourly?) backup crowd from grabbing one of the slots without requiring login?
* Should we require actual login for db downloads?
* Could/should we require everyone who downloads the db to provide it for others to further download? How would we do that? -- ''[Lars H]: This is exactly the kind of problem [BitTorrent] aims to solve, is it not?''
* How often should a given user be permitted to download a copy of the db? If they are doing it frequently, they're wasting enormous amounts of bandwidth.
* Should there be some kind of db delta download? Something RSS-like based on history: ''give me all the pages which have changed since ''

[LV]: The primary issue here is a limited resource - your bandwidth. So why not upload the data, compressed, to Google, and let people download it from there? That way, the bandwidth issue is no longer yours - you upload to Google once every N period (whatever you can afford) and let people get it from Google. As for the db delta - I don't have a need for that; a simple compressed .tkd file is all I need.

30jun07 [jcw] - The quick solution is simply to delegate: get one or more mirrors in place with a reasonably recent copy (I'd vote for daily). Then you don't have to deal with it. As for deltas: rsync is very effective for this type of data (either via ssh or as an rsync server). If you want to go fancier, check out wikit's "-update" command (written before the history got added to the db, btw).

[KJN] 2007-11-23: I would occasionally download a copy of the database if I was going somewhere without fast internet access - it was very useful to have this facility. A daily, or even weekly, database dump to Google would be much appreciated. Page histories could be omitted. BitTorrent would also be a good solution.

[jdc] 7-dec-2007: I created a [SourceForge] project to download the wiki db: [wiki database for offline use]

[RZ] 11-dec-2008: Great work so far.
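The RSS-like delta idea discussed above ("give me all the pages which have changed since ...") can be sketched in a few lines. This is a hypothetical illustration only: the page index and its timestamps are invented for the example, and a real implementation would read them from the wiki's .tkd database rather than a hard-coded dict:

```python
from datetime import datetime

# Hypothetical page index: page name -> last-modified time.
# Invented data for illustration; not the real wiki's contents.
page_index = {
    "WubWikit": datetime(2008, 12, 11),
    "Wub": datetime(2007, 6, 30),
    "wikit": datetime(2007, 11, 23),
}

def changed_since(index: dict, since: datetime) -> list:
    """Return the names of pages modified after `since` (the delta)."""
    return sorted(name for name, mtime in index.items() if mtime > since)

# "give me all the pages which have changed since 2007-07-01"
print(changed_since(page_index, datetime(2007, 7, 1)))
# → ['WubWikit', 'wikit']
```

Serving only this delta (and the changed pages' raw text) instead of the full 95 MB database is what keeps the bandwidth cost bounded.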
But when I try to use wubwikit with my own *.tkd files, the wikit categories still appear on the left side. Is there a way to use my own ones? Is it OK to use the command-line argument "cmdport -1" to suppress starting of the maintenance socket? This is needed to run more than one instance of the program.

----
[LV]: Over on comp.lang.tcl, some details about this version of the wiki are being discussed. In particular, there are particular URLs which cause specialty functionality to occur. (In the forms below, `<N>` stands for a page number and `<V>`/`<D>` for version numbers.)
* a URL of the form `http://wiki.tcl.tk/<N>` returns the html for the page: http://wiki.tcl.tk/18028
* a URL of the form `http://wiki.tcl.tk/<N>.txt` returns the raw data for the page: http://wiki.tcl.tk/18028.txt
* a URL of the form `http://wiki.tcl.tk/<N>.code` returns all ''code blocks'' on the page. Code blocks are displayed separated by `### ############################################################` tags. A code block is just one or more lines starting with a space, or a block delimited by `======` lines: http://wiki.tcl.tk/15312.code
* a URL of the form `http://wiki.tcl.tk/<N>.str` returns the [wubwiki stream data] for the page: http://wiki.tcl.tk/18028.str
* a URL of the form `http://wiki.tcl.tk/_edit/<N>` invokes the web text widget with the wiki raw page loaded: http://wiki.tcl.tk/_edit/18028
* a URL of the form `http://wiki.tcl.tk/_history/<N>` will show the history of that page: http://wiki.tcl.tk/_history/18028
* a URL of the form `http://wiki.tcl.tk/_ref/<N>` will show all pages referring to that page: http://wiki.tcl.tk/_ref/18028
* a URL of the form `http://wiki.tcl.tk/_revision/<N>?V=<V>` returns the html for a specific version of the page: http://wiki.tcl.tk/_revision/18028?V=47
* a URL of the form `http://wiki.tcl.tk/_revision/<N>.txt?V=<V>` returns the raw data for a specific version of the page: http://wiki.tcl.tk/_revision/18028.txt?V=47
* a URL of the form `http://wiki.tcl.tk/_revision/<N>.str?V=<V>` returns the [wubwiki stream data] for a specific version of the page: http://wiki.tcl.tk/_revision/18028.str?V=47
* a URL of the form `http://wiki.tcl.tk/_revision/<N>?V=<V>&A=1` returns the html of a specific version of the page, annotated with detailed info about who made changes and when the changes were made (the raw and [wubwiki stream data] for the annotated version are similarly available): http://wiki.tcl.tk/_revision/18028?V=40&A=1
* a URL of the form `http://wiki.tcl.tk/_diff/<N>?V=<V>&D=<D>` returns the html of the '''line''' differences between the two versions (the raw and [wubwiki stream data] for the difference are similarly available): http://wiki.tcl.tk/_diff/18028?V=42&D=43
* a URL of the form `http://wiki.tcl.tk/_diff/<N>?V=<V>&D=<D>&W=1` returns the html of the '''word''' differences between the two versions (the raw and [wubwiki stream data] for the difference are similarly available): http://wiki.tcl.tk/_diff/18028?V=50&D=51&W=1
----
* [_save]
* [_edit]
* [_login]
* [_rev]
* [_cleared]
* [_search]
----
!!!!!! %|[Category Wikit]|% !!!!!!
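The special URL forms described on this page lend themselves to a small helper for scripted access. A minimal sketch (the function names here are this example's own, not part of WubWikit; it only builds the URL strings shown above):

```python
BASE = "http://wiki.tcl.tk"

def page_url(n: int, fmt: str = "") -> str:
    """URL for page n; fmt may be '', '.txt', '.code' or '.str'."""
    return f"{BASE}/{n}{fmt}"

def revision_url(n: int, v: int, fmt: str = "", annotated: bool = False) -> str:
    """URL for version v of page n, optionally annotated (A=1)."""
    suffix = "&A=1" if annotated else ""
    return f"{BASE}/_revision/{n}{fmt}?V={v}{suffix}"

def diff_url(n: int, v: int, d: int, words: bool = False) -> str:
    """URL for the line diff (or word diff, with W=1) between versions v and d."""
    suffix = "&W=1" if words else ""
    return f"{BASE}/_diff/{n}?V={v}&D={d}{suffix}"

print(page_url(18028, ".txt"))        # http://wiki.tcl.tk/18028.txt
print(revision_url(18028, 47))        # http://wiki.tcl.tk/_revision/18028?V=47
print(diff_url(18028, 50, 51, True))  # http://wiki.tcl.tk/_diff/18028?V=50&D=51&W=1
```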