We’re playing in a Pathfinder campaign: a friend of ours is running our group through the Jade Regent adventure path. We play about once a month.
I use PCGen to keep track of our characters — the D&D 3.5 character building and advancement rules are inconsistent and confusing (and Pathfinder could be considered D&D 3.75), and the program walks me through all the necessary steps. Jade Regent has been added to the rule sets of PCGen, so all the special rules and settings are available.

We’re currently level 4. Last session, we got a bonus skill point for a Craft skill. Of course you can note that down on your character sheet, but when you gain a level, you only get the number of skill points the rules prescribe — excluding the bonus skill point. That’s a bit of a problem, and I want to keep using PCGen for the administration of the characters, because I’m too lazy to learn the confusing character advancement rules.

Digging through the documentation, I found out how to configure the Jade Regent ruleset to add a skill point when we advance to level 5. I added the following line (or rule — the Dutch word ‘regel’ can mean both) to the file jade_regent_players_guide.pcc:

BONUS:SKILLPOINTS|NUMBER|1|PRELEVEL:MIN=5,MAX=5

(Which means: give a single bonus skill point at level 5.)

I’m not sure that my regular readers have any use for this information, but it’s here mostly for future reference.

Drivel Test Post

This is a test post with my own patched version of Drivel. If all goes well (and after a few test-runs and debugging I’m fairly confident that it will work this time), this post will have three tags: two existing ones (‘programming’ and ‘open source’) and one new (‘drivel’).

Edit: Total success!

Edit the second: Trying to add a tag (‘software’) through the ‘edit recent entries’-feature.

Edit the third: OK, now it ought to work. And if not… well, there’s always tomorrow!

On the recommendation of other people, I have recently come across two utilities that have made my life immensely easier. Depending on your computing needs, they might be of interest to you too.

First off: FreeNX. Up to now, I used VNC to control the Ubuntu desktop of Sootball (when it’s turned on) from Calcifer. However, VNC is pretty slow to update the screen. A colleague of mine demonstrated using FreeNX to log in to the Ubuntu machine of his parents over a modest ADSL line — and it was screaming fast, much faster than the VNC speeds I get over my LAN.
NX is a very efficient protocol, and it tunnels over SSH. Using FreeNX, I can use Sootball as if I’m directly logged on to it. Of course, that speeds up things tremendously. There’s also an NX client for Windows machines, so if you use Windows as your main workhorse but need to do stuff on a Linux machine, FreeNX is your best choice.
Unfortunately, there’s no NX server software for Windows — I’d love to use it for taking over Jiji. Alas, in that case, I’m stuck using VNC.

Secondly, SoapUI, a webservices testing tool. I’ve used SoapUI 1.something to test calling a webservice — pretty cool stuff if you’re writing a webservice and want to see what happens if you do an actual call. But last week I upgraded to the latest version (2.5), because the software I was working on also had to call a webservice.
Using the WSDL as input, SoapUI can generate editable requests that you can fire off at the webservice. But using that same WSDL, the newest version of SoapUI can also generate a mock webservice! You can define one or more response messages, and using either scripting or XPath queries, you can make the response sent back dependent on the request received.
This is a great tool if you’re on either side of a webservice, and only have a WSDL to go on.

As you all know (or maybe you don’t, but then you must be new here), I use BitTorrent to download the ungodly amounts of anime we watch. A site like BakaUpdates to keep track of new releases and a BitTorrent client with queues, slots, ratios and some other nice features are all I need.

When I still ran WinXP, I used uTorrent. It’s closed source, and was acquired by BitTorrent, Inc, which is in bed with the MPAA — I used the last version before the merger, just in case.

On Linux, I tried a few clients. The built-in client in Ubuntu is not suited for my purposes, and the others were quite clunky interface-wise. In the end, I settled for KTorrent, but that one crapped out on me a lot of times. Still, it was as close as one could get to uTorrent on Linux, so I made do.

Until a few weeks ago, when I discovered Deluge. It’s an open source client, with packages available for many Linux distros and even Windows. It supports many features that were unique to uTorrent, has the same type of interface… it is, in short, perfect.

Well, there is a small thing left: when you close Deluge, it doesn’t save your upload ratios — so when you next start Deluge, all ratios are reset to 0%. Apparently this bug has been fixed in the source, so it is only a matter of time until an updated package is built with that fix.

Sic transit gloria mundi

I remember the time when Peter Norton was a world-class programmer. He made all the must-have tools that made it so much easier to squeeze the last bit of productivity or computing power out of your PC.
The packaging of his products always featured a photo of him: a man with glasses and salt-and-pepper colored hair, wearing a shirt and tie, but no jacket. The sleeves of the shirt were always slightly rolled up. Peter always had his arms crossed and looked sternly at the camera, as if he had just been plucked away from behind his desk where he was writing the next kick-ass utility. As if he was annoyed that these silly marketing people took him out of his flow just because they wanted a picture to use on the box.

Those days are long past. These days, Peter collects modern art, and he kicks ass doing that too.
His place as a programmer has been taken by lesser gods. Today, I downloaded the Norton Removal Tool for my dad. He had installed a trial version of Norton Internet Security 2006, which completely borked up his system. He couldn’t connect to the net anymore, so I had to download this thing for him — obviously packaging uninstall programs with your software is outdated.

Microsoft is quite unpopular with the online public, for some reason. It is supposed to be the ‘Evil Empire’, and Bill Gates routinely juggles with babies over open pits of fire.

Many a time, people charge that it is a monopoly, and that it uses its clout in the IT-world to bend the market to its will. Sure, Microsoft has deep pockets which enable it to gain a market share in many markets — the console market comes to mind: with every Xbox sold, Microsoft loses a bit of money. Sure, Microsoft bundled their browser with their operating system, thereby cutting into Netscape’s profits.

But think of it this way: if all Microsoft software was such utter crap, why is everyone still using it?
Why are there Xboxes being sold? Because the games on that platform appeal to a certain audience. Sure, the price is right — but if the games did not appeal to you, you wouldn’t buy the console, right? In this market, Microsoft exhibits an above-minimum level of competency and succeeds in getting a share of that market.
Why did people stop downloading the Netscape browser when Internet Explorer came bundled with Windows? If it had been such a piece of crap, wouldn’t people have kept using Netscape? But here again, Microsoft produced a product that was ‘good enough’ for most people.

Why do people use Winamp, if they could also use the Windows Media Player? Because Winamp has several unique features that make it an attractive alternative to the bundled media player. Winamp manages to create a market for itself by being innovative.
In contrast, there is RealPlayer, which no one downloads anymore because everything it does can be done with WMP. Why is that? Well, frankly, RealPlayer doesn’t do anything that WMP can’t — and it comes bundled with lots of ads to boot!

Often, people harp on the supposed insecurity of Microsoft products. Sure, some parts of the system contain security holes that are large enough to drive a truck through (or several metric tons of spyware), but let’s be frank: that is the case with any sufficiently large and complex piece of software. The absolute number of security flaws is not so interesting — what counts is the way the vendor deals with those flaws that are found.
With WinXPSP2, I think they’ve toughened up their systems pretty well: built-in firewall, automatic updates… What’s not to like?

With everything that gets added to Windows (be it browsers, media players or security software), vendors who operated on the after-sales market by selling such tools to Windows users will find their markets shrinking. If Microsoft manages to create a product that is good enough, why bother buying a bundle from Norton? Basically, it is evolution in progress in the business space: if your competitors come out with a product that is better or more convenient, you’re hosed — better come up with something new!

And yet Real, Adobe and Symantec are complaining to the EU about the Windows Media Player, the PDF rendering features and the security console in Windows Vista. Face it, guys: you’ve lost the race. If your products are not innovative or discernible enough, you’re not going to sell many of them anymore once a product comes along that is good enough and more convenient.

Is this Microsoft abusing their monopoly? Really? Somehow I’m not convinced.

When I worked at Semergy, we were building a kick-ass knowledge management application. It would blow all others out of the water — and yet we didn’t make a single sale. Why? Because Verity’s search engine was ‘good enough’ for most outfits — so why should they spend the money on a custom-built solution when something out of the box from the market leader would also suffice?
I can’t recall ever filing a motion with the EU to protect us from the market manipulations of monopolistic vendor Verity.

As I posted earlier, the MACH F is having trouble playing H.264-encoded video content (often in an MKV or MP4 container). XviD works like a charm though — no problems with that.
I have been told that you can achieve higher video quality with H.264 at the same file size, and while that is all fine and dandy, the added overhead of separate audio and subtitle streams makes the file actually larger than the same video encoded with XviD/MP3 and hard-coded subtitles.

Sure, being able to switch off the subtitles is great — except that I don’t understand Japanese well enough to do without the subtitles. I need those subtitles.
Switching the audio stream is great too — if I wanted to watch a dubbed DVD rip, which I don’t. And even with only a single audio stream, the file is bloated anyway!
Oh, and I can’t even play the content anyway, so thanks but no thanks anyway.

However, sometimes you just don’t have a choice if you want to catch those episodes of your favourite fansub. So what to do? Well, re-encode the video to use XviD/MP3, of course! Surely that is a worthy use of those hyperthreaded gigahertzes available to me!

Except that this one file has had me stumped for weeks. No matter what I did, I could not get the audio and video to sync up. The audio was perfectly in sync with the subtitles, but the video always ran a bit ahead of the audio. I calculated what the correct framerate should be, but that didn’t work either.

Today I found out that the MKV container supports variable framerates. WTF is up with that — what is the use of variable framerates in a format that works with keyframes and delta frames anyway?
Apparently someone thought it was a good idea to introduce variable framerates — and while AviSynth produced 23.976 frames per second, the skipped frames were served before their time had come, which resulted in the video running faster than the audio.
I had to upgrade AviSynth to version 2.5.6 to be able to use the ‘convertfps’ parameter of DirectShowSource, which inserts frames to get a constant framerate.
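For future reference, the fix boils down to a one-line AviSynth script along these lines (the filename is a placeholder; the fps value matches the 23.976 mentioned above):

```avisynth
# Decode the MKV via DirectShow. With convertfps=true (AviSynth 2.5.6+),
# DirectShowSource duplicates frames where the variable-framerate source
# skipped them, delivering a constant 23.976 fps so video and audio stay in sync.
DirectShowSource("episode.mkv", fps=23.976, convertfps=true)
```

Feed the resulting script to the XviD/MP3 encoder as usual.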

And now it works. Which is cool, because now I can convert those unplayable files into something manageable. And the re-encoded AVI file is 136MB smaller than the original MKV file to boot!

Wiki

Friday, we had the kick-off for a new project. Nothing out of the ordinary, except that, instead of the usual waterfall development method, we’ll be using Scrum. It fits the process of this project much better.

Of course, we get lots of Web 2.0 hype along with the project. Maybe it’s just me, but I don’t see this whole ‘wave’ thing happening. I see linear progression from the start until now — stuff gets invented, other people add to that stuff, and it goes on and on and on. It’s not like we suddenly make completely different websites than we used to — we just use a few new techniques in addition to what we already did.
In my view, Web 2.0 doesn’t exist. It’s more like Web 0.999999 — just like TeX’s version number is slowly converging to pi, and Metafont’s to e. There will probably never be a LaTeX 3.0 either, because current development simply builds on the existing base rather than rebuilding from scratch.

However, since this is a Web 2.0 project, I proposed we use a wiki for requirements management. There will be multiple ‘owners’ of specific subsets of functionality. Right now, it’s a Word document that, at any given time, is being edited by multiple people. Even with something like SharePoint, you will get versioning conflicts, even when people have been editing separate sections of the document.
A great way to sidestep this problem is to use a wiki. Each piece of functionality gets its own wiki page, everybody always has the latest version available, and collaboration is a lot easier.

Me and my big mouth. My proposal was accepted, and now I have to install a wiki and worry about security and all that sort of stuff…
I use WAMPServer at home to build the webshop, so I already had a WAMP-stack available to experiment with. It runs on Windows XP Pro, which is exactly what it will have to run on in the early stages of the project (just a machine tucked away somewhere, not even an official development server or something like that, because we have to be flexible).
So this morning I downloaded MediaWiki, the software Wikipedia runs on. Fortunately, it uses PHP and MySQL, and after unzipping a file and a few mouse clicks, I had the software installed. Another round of reading through the documentation allowed me to close the wiki off for non-logged-in users, and to enable file uploads.
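For my own future reference: the locking-down part amounts to a few switches in LocalSettings.php, roughly like this (sketch only — check the MediaWiki manual for the version you’re running):

```php
<?php
// Anonymous users (the '*' group) may neither read nor edit,
// so only logged-in users can see the wiki at all.
$wgGroupPermissions['*']['read'] = false;
$wgGroupPermissions['*']['edit'] = false;
// Don't let visitors register their own accounts.
$wgGroupPermissions['*']['createaccount'] = false;

// Allow file uploads (attachments, diagrams, screenshots).
$wgEnableUploads = true;
```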

Our CMS is a servlet, so I also researched how to connect Apache to Tomcat under Windows. I hope I have enough time on Tuesday morning to install all that stuff…
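The usual way to do this is the mod_jk connector, which forwards requests for the servlet’s URL space to Tomcat over AJP. A minimal sketch (the /cms path and file locations are placeholders for whatever the CMS actually uses):

```apache
# httpd.conf — load mod_jk and hand /cms/* off to Tomcat
LoadModule jk_module modules/mod_jk.so
JkWorkersFile conf/workers.properties
JkLogFile    logs/mod_jk.log
JkMount      /cms/* ajp13_worker

# conf/workers.properties — one worker talking AJP to the local Tomcat:
#   worker.list=ajp13_worker
#   worker.ajp13_worker.type=ajp13
#   worker.ajp13_worker.host=localhost
#   worker.ajp13_worker.port=8009
```

Tomcat listens for AJP on port 8009 by default, so on a single machine this is mostly a matter of dropping mod_jk.so in place and restarting Apache.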

I like tinkering with stuff like that. Especially if other people already did the hard work for me and all it takes is a bit of tweaking of configuration files!