Fairness

Michael and Greg haven’t written their blogs yet, but they whine at me about updating mine! Is that fair? I don’t think so.

Teamwork and Cooperation

There’s apparently a joint Cornell University and University of Virginia project called Fedora, and they’re having a spat with Red Hat over Red Hat’s Fedora project. The universities’ press release contains this gem:

The Cornell and Virginia teams have taken a number of steps to try to work with Red Hat regarding use of the name Fedora. At this date, Red Hat has refused our request to withdraw its trademark applications and reverse its claims of usage restrictions on the name. Cornell University and the University of Virginia are now considering various legal options in response to Red Hat’s actions.

Yes, that’s right: you’re “working with” someone when, in response to them not immediately doing what you want, you start bitching and moaning in public and try to sue. Nice, guys.

Bonus!

Well, what do you know? Keeping all your receipts and going through them actually pays off. I’d been thinking recently that I really should get around to getting a copy of Torch the Moon, the latest Whitlams CD. For “recently”, read “since September” or so; as it happens, I started thinking this more or less a year to the day after the album’s release. Being, well, indolent, I still hadn’t gotten around to doing anything about it until now. But today I finally started going through all my old receipts, and lo and behold, what do I find but a receipt from HMV for Placebo’s Sleeping with Ghosts and none other than Torch the Moon. Apparently I’d already bought it, in March. The Placebo CD has been in my mp3 (well, ogg) folder for months. The Whitlams CD, though, has evidently been lost in my desk since I bought it. Well, we can remedy that!

I Want

One feature I’d really like for my blog is a micro web.archive.org that just caches the pages I link to (along with any graphics, frames, embedded junk, stylesheets and whatever else they might contain that affects how they display), so that I can have blosxom automagically redirect the link to my cached copy if and when the link goes stale.

Only problem is there doesn’t seem to be any software around that can spider just a single page and all the gumph that’s on it, but not anything it links to. Worse, that’s a Hard Problem, requiring a real HTML parser. Oh well.

UPDATE 2003/11/14:

So Clinton pointed me at wget’s -p option, which does what I want. How cool! A modicum of futzing around is required but this is actually doable. Sweet.
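For the record, the sort of invocation that seems to do the trick (everything beyond -p here is just my guess at what the futzing involves):

# -p/--page-requisites fetches the page plus the images, stylesheets
# and other junk needed to display it, without following normal links;
# -k rewrites links in the saved copy to point at the local files,
# -H lets the requisites live on other hosts, -P picks the cache dir.
wget -p -k -H -P cache/entry42 'http://example.com/some/page.html'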

Objective Curley

White Glenn linked to a Veterans Day story about the leadership shown by Captain Harry (Zan) Hornbuckle. It’s an impressive story: a group of soldiers are thrown together, given very little time to coalesce as a team, told to secure an area, come under heavy attack (300 against 80), and end up saving the day. There’s even a point where hope seems lost, but is suddenly recovered by the arrival of reinforcements.

But hey, you don’t need me to tell you that, you can go read it. What I wanted to say was in response to the first sentence of that article:

Reader and Rocket Man colleague John Beukema directs our attention to a page-one story by Jonathan Eig in today’s Wall Street Journal, “Why you’ve heard of Jessica Lynch, not Zan Hornbuckle.”

But reading that story, I started getting a sense of déjà vu — that I had actually heard it before. Turns out I was right: the story had already circulated around the Internet; eg Sgt Stryker linked to the StrategyPage description on June 10th. Advantage: blogosphere.

UPDATE 2003/11/15:

Here’s a better rendition of the WSJ story.

Cross-strapping!

So, I’ve actually done something vaguely productive with arch now: namely, putting debootstrap into it and hacking on cross-strapping support. Cross-strapping is basically inspired by the needs of the Hurd project, and more particularly the crosshurd package. In order to install a Debian GNU/Hurd system you have to boot up Linux, do partitioning and a partial install, then reboot into that partial install and finish it off. You can’t do the partitioning in Hurd, and you can’t get a functional Hurd system from Linux, so you have to do it in two stages.

So naturally, that’s how debootstrap works now. The first stage can thus be executed on any Unixy platform, and just needs basic things like a POSIX sh, sed, wget and access to a Debian archive; and while the second stage does have to be run on the target host, it doesn’t require anything other than what the first stage has already set up. In fact, the procedure for completing the second stage install is just:

chroot $TARGET debootstrap/debootstrap

In theory this should be useful for a bunch of things other than the Hurd, such as setting up Linux chroots on BSD systems, installing Debian onto embedded systems, and possibly doing cool emulation stuff with things like qemu.

The only real problem with it at the moment is that it isn’t able to create the correct devices for the target system (it creates the devices for the host system instead). Hopefully that can be fixed once Bdale finishes porting the Red Hat makedev to Debian.

A snapshot tarball is available here. If you want to play, you can do a first-stage-only install with:

debootstrap --arch ${TARGET_ARCH} --foreign sarge ${TARGET_DIR} ${MIRROR}

The --arch flag is required for everything interesting; the --foreign flag tells debootstrap to do the first stage only.
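So the whole cross-strap, modulo the device problem above, looks something like this (hurd-i386 being just an example architecture):

# First stage: runs on any Unixy host with sh, sed and wget.
debootstrap --arch hurd-i386 --foreign sarge $TARGET $MIRROR

# Reboot into (or, for same-arch targets, chroot into) the partial
# install, then run the second stage using only what's already there:
chroot $TARGET debootstrap/debootstrap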

Larry McVoy Advocates Arch

Admittedly, that’s not what he’s trying to do when he says:

People mumble about arch until they go use it for a while and realize it is about 3-5 years behind BK.

Here’s what he was saying about BitKeeper four years ago, in 1999:

The problem with most systems is that they don’t scale. They all work great for 1-5 developers. It doesn’t matter which one you choose. However, they all tend to fall apart when you have 1000 developers. Since we have experience in source management, having designed and implemented most of Sun’s TeamWare source management system, we are quite familiar with the scaling problem and feel that we can provide a better, more scalable and more reliable answer. We did it before, and this one is better.

Without taking away from anything Larry was meaning to say, that still seems like pretty high praise for arch, to me.

Open Source Betting Market

Some more thoughts on this topic.

Ecash is actually something of a distraction in the description; there’s no particular need for people to be able to do anonymous transactions, or to transact without talking to a central market, so you can do this just as effectively with ordinary market accounts. In that case, it makes more sense to call the “bids” contracts: initially you buy two contracts for a dollar: one pays you a dollar at a certain time if something hasn’t happened by then; the other pays you a dollar when it does happen, as long as it happens before that time. We’ll call these “lose” and “win” contracts, where “winning” is when the feature gets implemented.
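The payoff rule, as a sketch in Python (the names are mine; there’s no actual implementation here):

def payout(implemented, wins_held, loses_held):
    # Each win+lose pair costs $1, and exactly one of the pair pays
    # $1 at the deadline, so holding matched pairs is risk-free.
    if implemented:           # feature done before the deadline
        return 1.00 * wins_held
    else:                     # deadline passed, feature not done
        return 1.00 * loses_held

# 500 pairs cost $500 and pay $500 back whichever way it goes:
assert payout(True, 500, 500) == payout(False, 500, 500) == 500.0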

So far, we’ve really only dealt with half the problem of financing free software: ensuring users get a fair deal for their money. That’s important, because the money has to come from somewhere; but it’s only useful if the money goes somewhere too. If people put up their money and don’t get anything back but their money, they’re sensibly going to stop wasting their time and go do something useful. And so far we’re requiring software developers to have a fair chunk of capital up front that they can invest in themselves. That’s nice and all, but most free software hackers aren’t all that cashed up (which is the problem we’re trying to solve here, after all!) so it’s not terribly realistic.

One of the questions that most annoyed me about the Wall Street Performer Protocol, and that still annoys me here, is the issue of giving the winning contracts to people. If you charge for them, then you necessarily preclude all the poor starving hackers from being able to participate; but if you don’t charge for them, the poor starving hackers will have an incentive to just sell them immediately for the going rate (5c each), so they can eat. Both of which kind-of defeat the purpose of this. One way of dealing with this is probably to give out options. Say the current market value of a win contract is 5c. Say you have a friend who thinks he can implement 50% of the feature in a couple of weeks, and that that should raise the market value to 40c. Say you’re willing to invest $500 in this feature. Then what you can do is buy 500 win+lose contracts, and offer your friend an option to buy as many as he likes in two weeks’ time for 7c each. If all goes to plan, the feature gets half implemented, he exercises his options by giving you $35, then sells the contracts at the market rate (for $200). If things don’t go to plan, you still either get your $500 back, or your feature implemented; and your friend doesn’t have to invest any of his savings or run any financial risk.
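The same scenario with the arithmetic made explicit (a Python sketch, numbers straight from the paragraph above):

pairs = 500              # win+lose pairs bought for $1 each: $500 in
strike = 0.07            # the option price you offer your friend
market_after = 0.40      # win-contract price once half-implemented

exercise_cost = pairs * strike          # $35 your friend pays you
sale_proceeds = pairs * market_after    # $200 from selling at market
print(sale_proceeds - exercise_cost)    # 165.0: his profit, no capital risked

# Your side: $35 in option money, plus 500 lose contracts that
# still pay $500 if the feature never happens at all.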

The key point here is that if the current value is 5c, an option to buy at 7c is (almost) worthless; so you’re not actually giving anything of value away. But if you give that option to someone who can increase the value of the contracts significantly, and they do, then you win (because you got 7c for something that was only worth 5c) and they win (because they get to buy something for 7c and sell it for 40c). And you both win again, because you get your feature half implemented, and they get to spend some time doing some fun coding. This further means that you don’t have to worry as much about only giving options to people you trust — since the option is worthless, they’re not going to be able to make any huge profits just by immediately reselling it. This at least gives you the possibility of dealing with people you don’t know, without them having to put up any capital in the first place.

There’s no conclusion here, just some thoughts.

Catching Up

Hrm, haven’t made a “release manager” blog post since August. Nice. Fortunately that’s not entirely representative of how slack I’ve been, though it’s closer than I’d like. Due to the aforementioned chaos I’ve been almost completely out of the loop for the past few weeks. On the upside, this has given Colin Walters and Steve Langasek a good chance to show off their mad skillz, which they’ve handled masterfully.

Greeting my return was news that the glibc maintainers had finalised most of their NPTL work, so that was ready to go into unstable (they’d been developing it in experimental). It seems to have been fairly successful; the only major bug that’s shown up was more or less expected: it’s been closed with the addition of a FAQ entry, though hopefully a patch from Red Hat will work around the problem automatically.

And as well, the KDE folks have decided that it’s all too much work and reorganised themselves. On the upside, KDE’s been having problems — it hasn’t been updated in testing since woody was released (which is to say the updates in unstable have never been bug-free enough to be put in the hands of users, although the blame for that has to be shared with the toolchain) — and getting some more effort put into it should help. On the downside, reorganisations tend to mean absolutely nothing happens for a while, and last I asked, the new KDE team seemed to be in a quandary about exactly how to set up their CVS archive. And then, of course, there’s the possibility of group tensions (which are hopefully being defused a bit). Oh well. On the upside, the problems holding KDE out of testing at the moment didn’t seem very severe, so I forced it in. Future updates will be blocked pending more glibc updates, so hopefully that’s timely, and gives them some good news while they’re getting their act together.

One of the things that we really need to start doing (and have needed to do for a while now) is some aggressive removals. In aid of that, and thanks to some hax0ring of britney, Colin, Steve and Joey all have the ability to schedule removals from testing on their own. Sweet.

Some Criticism of the Allen Report

So, since Lessig has commented on the Allen Report, I thought I might too. It mostly takes an interesting view of things, but there are two particular claims that stand out as being, well, delusional.

In section 4.1, starting on page 26, the report argues that the increased incentives provided by copyright extension are significant — that is, that they have a measurable effect. This is to counter the obvious economic argument that the extra years are so far in the future compared to when the work is created that, using standard discounting techniques, their present value is effectively zero (just as $1 invested now might be worth thousands in 100 years, thousands of dollars in a hundred years is worth just $1 now — and how many people are going to decide not to write a book just because they miss out on a whole dollar’s compensation?).
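(The discounting formula, for reference:

$$\mathrm{PV} = \frac{\mathrm{FV}}{(1+r)^{t}}, \qquad \text{e.g.}\quad \frac{\$1000}{(1.07)^{100}} \approx \$1.15$$

taking r = 7% as a plausible long-run rate of return; the exact rate doesn’t change the conclusion much.)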

The report makes its argument by establishing that the copyright industry is a tough one, and it’s difficult to be profitable, but “when a title is successful it can be quite profitable, and these profits subsidise losses from unsuccessful titles”, and thus “Copyright, and the increased copyright term, affects the situation by increasing the profitability of the successful titles”. This is true, but unfortunately the extension only increases the present value of the successful titles by a few dollars, which is insignificant — the record industry could save that much by taking the thousands it gave the Allen Consulting Group and investing them instead. Where there’s real money to be made is in increasing the protection for existing works whose copyright term is due to expire soon: since the effect is in the near future, the present value calculations don’t trivialise it.

This might, in fact, be a good thing. It might be useful for movie studios and publishers and record companies to get a bit more money they can spend on new bands, or better promotion, or fighting P2P, or whatever else. But what it isn’t is an incentive: the benefit comes solely as a result of ownership of existing copyright, and is not dependent on creating new works in future.

It’s reasonable to consider copyright term extension in two ways. As it applies to new works, there are possible added incentives for the creation of works, which might benefit society, and possible benefits from encouraging artists to be based in Australia rather than various alternatives. The best economic argument available (and present value calculations are widely accepted in financial circles) indicates these effects are insignificant, and there is no evidence to contradict it, which makes it seem pretty sound. As it applies to existing works, on the other hand, there are two sets of results. One is a windfall to existing owners, which explains why they care, at least, and there’s not really anything wrong with that; there’s also a corresponding loss to the public in general, although I’ve no idea how you’d go about calculating it. (Presumably it’s greater than the windfall to owners — if it weren’t, the public domain would be a net loss to society, which seems implausible — but how much greater?) The other is that making stuff into property encourages someone to take some care of it, which is usually better than having no one take any care of it at all. A third issue is consistency in the copyright terms of various works; but really, that’s of very little value given all the existing inconsistencies anyway (between works whose authors live to different ages, amongst countries, between works that entered the public domain before copyright terms got extended, etc).

The other utterly bizarre claim the report makes is that term extension will reduce rent seeking costs. The claim is:

Rent-seeking costs may include the resources expended by individuals and groups to lobby government for favourable regulation and specification of the property system. It is clear that significant resources have been expended by copyright owners globally in an effort to encourage legislators to extend existing copyright terms. Failure to extend the copyright term in Australia now will likely result in increased costs until it is extended to overseas levels. Thus, term extension would likely reduce future rent-seeking costs.

It’s dishonest on its face, since there will always be the exact same incentive to extend copyrights for works about to expire, and unless we remove the public domain entirely, various copyright owners will continue to lobby for extensions. The present value of an N-year extension for a work that’s earning X dollars per year, whose copyright expires in M years, and that was created O years ago, depends only on N, M and X, not on O — so if the extension is granted, and the work continues to earn X dollars per year, the value of lobbying for a further extension will be exactly the same N years after the N-year extension is granted. So unless record companies are irrational, they’ll spend the exact same amount on lobbying government then as now. You can even see this happening in the US now, as copyright owners lobby for further term extensions.
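Spelling out that present value (r being whatever discount rate you like):

$$\mathrm{PV} = \sum_{t=M+1}^{M+N} \frac{X}{(1+r)^{t}}$$

O appears nowhere in it; so once the extension is granted and the expiry date has crept back to M years off, the sum, and hence the rational lobbying budget, is exactly what it was before.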

But it’s dishonest in another way too: frankly, no one but the copyright owners cares about reducing the copyright owners’ rent-seeking costs. Rent seeking is bad when the government gives in to it: by definition, rent seeking is when groups try to persuade the government to give them a new income stream without providing anything valuable in return. That imposes a cost on society without giving any benefit back, which, obviously, sucks for society.

The best way to reduce rent seeking costs isn’t to cave more easily to the copyright industry (which will simply encourage them to spend the same amount or more on additional rent seeking) but rather to make it clear that such investments won’t have any return, and are thus wasted.

But in any event, reduced rent seeking costs as defined in the report are of no public interest whatsoever.

Open Source versus Capitalism

Martin notes that:

One fairly silly argument sometimes advanced against Linux is that by reducing towards zero the cost of getting a good operating system, it is somehow communist or anti-capitalist.

He’s right: people do make that argument, and it’s silly. It’s especially silly because people already do it for a profit, and even sillier when you realise that there isn’t even anything anti-capitalist about charity. And it’s even sillier when you note that Linux and open source have demonstrated that they have a viable business model: there’s plenty of open source being produced given existing incentives, even if those incentives aren’t insanely great.

What’s even sillier, though, is that there is a real problem here. The business model for developing free software isn’t a very good one. It’s either financed by charity (whether the author’s own, or users’, or some other group’s), or it’s parasitic on (sorry, symbiotic with) some other business model, like hardware sales or support or just in-house development. To be fair, that it works at all really is great.

But. The problem with this is that it’s just not reliable. It’s not reliable for the employees, who are at serious risk of being restructured away as not part of the “core business”, and it’s not reliable for users of the software who are willing to part with their cash to keep development going, but aren’t interested in the primary product these companies are selling.

The real issue here is providing a stand-alone business model for free software development. To backfill a little, Joel’s essay (if you didn’t already click the link above) talks about complementary products: where if one product is available, or cheaper, the other product is likely to sell more. He gives the example of dinners and babysitters:

And babysitters are a complement of dinner at fine restaurants. In a small town, when the local five star restaurant has a two-for-one Valentine’s day special, the local babysitters double their rates. (Actually, the nine-year-olds get roped into early service.)

The thought here is that if babysitters aren’t available, married couples aren’t going to have as many nice romantic dinners; whereas if babysitters are cheap, easy and reliable, couples are going to go out for dinner more often. It’s easy to imagine a restaurant putting two and two together and deciding to set up a crèche to encourage more couples to come along. Now imagine that that’s the only way to get a babysitter. That’s pretty much the situation we have with open source development. It’s bearable, but it’s not ideal.

By contrast, if we could set up a viable stand-alone business model for free software development, then we could do things like build a Microsoft-like campus filled with a hundred well paid free software developers, working full time on doing old stuff better and making new stuff possible. Given we can compete with Microsoft without those resources, having them should be a significant improvement; and considering the sort of energy you see at developers’ conferences, that’s not even completely theoretical.

A question is why consumers would want to give money to developers. Obviously not for the product itself, or for support or other add-ons: the former’s ruled out by open source licenses and competition, and the latter’s ruled out by assumption. The only thing that seems reasonable to “buy” is what the developers spend their time working on. And that’s hard to buy because it’s unpredictable — both because developers don’t tend to have consistent output, and because it’s hard to work out how much time and effort is needed to get a useful product out. That is, it’s hard to know what you’re actually getting, and hard to know what you should expect to get for your money. But hard’s not the same as impossible.

I’ve been interested in gambling lately, or, more descriptively, in using markets to predict things, such as the next Pope or Governor of California. I’ve been thinking that one way of handling such things is with ecash. Say you set up two currencies, A and B, and allow people to buy 1A and 1B for $1, and allow them to trade the two freely. You keep the dollars you collect until the end, at which point, if Arnold wins, you buy back the A currency for $1 a unit and let the B currency become worthless; if Bustamante wins, you buy back the B currency for $1 a unit and let the A currency become worthless. If you buy an A and a B, you know it’s worth paying a full dollar, because you’ll get a full dollar back at the end. In a well informed market with the opportunity to average out risks over a long period, the price of each currency should end up matching the probability that the corresponding candidate will win. It seems nice and simple, and doesn’t require any gambling on the house’s behalf — the market works the odds out naturally, and the house just has to charge for its overhead in managing the moneys, which can come either via transaction fees, or by charging slightly more for the initial A+B bets than it pays out for the eventual A/B payouts. Simple, honest and effective, as far as I can see.
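As a sketch (the class and names are mine; nothing like this actually exists), the house’s side is almost embarrassingly simple:

class TwoHorseMarket:
    """Sell matched A+B pairs for $1 each; at settlement the winning
    currency is redeemed for $1 a unit, the loser becomes worthless.
    The house never takes a position."""

    def __init__(self):
        self.pool = 0.0             # dollars held until settlement

    def buy_pair(self, n=1):
        self.pool += 1.00 * n       # $1 per A+B pair issued
        return {"A": n, "B": n}     # freely tradeable in the meantime

    def settle(self, winner, holding):
        # the winning currency pays $1 a unit; the other pays nothing
        return 1.00 * holding.get(winner, 0)

m = TwoHorseMarket()
mine = m.buy_pair(10)               # costs $10, worth $10 come what may
print(m.settle("A", mine))          # 10.0 if Arnold wins

Every pair issued contains exactly one winning unit, so the pool always covers the payouts exactly; the house’s only job is bookkeeping.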

Putting this together with the above thoughts on software development seems interesting too. One way of doing it would be to let people bet on a particular feature being implemented by a particular date. Fun and exciting, for sure. Let’s suppose you’re a user who’d really like that feature by that date — enough that you’d part with $1000 for it. Okay, so you buy 1000 W + 1000 L tickets for $1000. You keep the 1000 L tickets — that way, if your feature doesn’t get implemented, you’re no worse off. You can either give the 1000 W tickets to someone you know who can work on it, or sell them on the market — both have their drawbacks, but oh well. At least you win either way: you get your feature or your money back. Interestingly, there’s even some incentive to invest: if your feature doesn’t get implemented, you’ll make a profit on each initial $1 of whatever you managed to sell the W ticket for. Not sure if that could compete with other investment opportunities, but hey, anything that improves the accuracy of release schedules has got to be a win, right?

For developers, there are two incentives. Obviously, if you can implement the feature for $49,000 and can afford, say, $1,000 up front to buy 50,000 W tickets at 2% odds, you’re fine: spend the money, implement the feature, win. There’s even an incentive to half-implement a feature: if the odds are initially at 5%, and half-implementing the feature will get them up to 40%, you can make a 7x profit on your initial investment by buying low and selling high after you announce. It doesn’t even really discourage cooperation: if someone else beats you to the punch, you still get the winnings. Unfortunately you have to have some money up front to play, but that’s kind-of true of any business.
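Again in numbers (the same hypothetical figures as above):

tickets = 50_000
odds = 0.02                  # W tickets trading at 2c in the dollar
stake = tickets * odds       # $1,000 up front
payout = tickets * 1.00      # $50,000 when the feature ships
print(payout - stake)        # 49000.0: covers the $49,000 job

# The half-implementation play: buy at 5c, announce, sell at 40c.
buy, sell = 0.05, 0.40
print(sell / buy - 1)        # 7.0, i.e. the 7x profit on your stake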

There are more significant problems, though. One is working out whether a feature has actually been implemented or not — you probably don’t want whoever decides that to have too much money locked up in either W or L shares; imagine if everyone thought the feature was implemented, so L shares were selling for 1c, and imagine buying 1000 of them for $10, declaring the feature wasn’t implemented, and collecting $1000. Yick. At least we quite specifically don’t have to worry about insider trading (we encourage it, after all!), so it’s not all bad news.

The ideas here are very similar to those developed in Chris Rasch’s Wall Street Performer Protocol. The main functional difference is in the setting of a time-limit, and the possibility of getting your money back.

UPDATE 2003/11/04:

Martin followed up, drawing the analogy to Jim Bell’s Assassination Politics. Why should this sort of scheme work for open source and fail for assassinations? For one thing, there’s nothing questionable about writing open source software: it’s legal and it’s moral. Assassinations presumably aren’t. As such, you presumably need fewer incentives to promote open source than you do assassinations; and further, if assassination politics became successful due to this betting mechanism, you’d likely see the betting mechanism either banned outright or tightly regulated (removing anonymity, eg).

Also worth a quick link is the Rational Street Performer Protocol, which, rather than offering software outcomes, guarantees that other people will donate a certain amount for every cent you donate. The theory is that this should create a virtuous cycle of everyone donating more money.

Finally, a big hello to the Carnival of the Capitalists!

Oh! There’s more here.

Bugger

The magic smoke seems to have escaped from my laptop. Doh. God’s way of saying “hey, it’s time to review your backup policies”.

UPDATE 2003/11/03:

Well, I’ve had my laptop back for a while now. I’m not really convinced it’s as good as it used to be — it had a hardware crash for no apparent reason the day after I got it back (it just powered right down), which worried me enough to spend all afternoon and most of the evening making backups over the network from single user mode. Yuck. On the upside, nothing bad’s happened since.

Also on the upside was the warranty repair service. The only glitch was that Apple hadn’t put me down for the “AppleCare Protection Plan” I bought, which extends the warranty period from one year to three. Apparently the text on the warranty card that said “If you bought this direct from Apple, you don’t need to send in your registration card” actually meant “Apple in the United States”. As the woman who eventually answered the phone when I called the Apple freecall number said, “But how are you supposed to know that?” That all got sorted out within an hour, happily, and Nextbyte were able to get the part and give me back my laptop within the next few days, without charging me anything. Yay! (Double yay, considering I was quoted something over $700 just for the part that needed replacing. Ouch.)

On the downside, Apple came out with G4 iBooks the day after I put my laptop in for servicing. Hrm. Surely it’s blasphemous for me to ignore a Sign of that magnitude? Certainly, sometime in the next six or seven months I’m going to have to succumb.

UPDATE 2003/11/11:

Ah. Well, it seems to have become clear exactly what the continuing problem is: the battery isn’t working right. After a while, the connection between the battery and the laptop seems to just fail, and if you happen to shift enough that the AC adaptor slips out of its slot on the laptop, as it occasionally does, *bzzt*, byebye session, hello fsck. It’s not entirely predictable, but the battery works far less often than not. Guess it’s time for another visit to the magic repair shop.

Footnote URLs

So, one thing that annoys me a bit about writing blog entries is the way the horrible markup for links screws up the formatting of the entries as I see them in the editor. It makes a couple of words take up a whole line, screws up the word-wrapping, and makes things unreadable. Which sucks. So I wrote a cool new plugin that lets me instead write things like:

... So I wrote a <#1>cool new plugin</#1> that lets me instead write things like: ...

<!-- links:
#1: http://azure.humbug.org.au/~aj/blosxom/plugins/footnoteurl/
-->

Which seems much more pleasant. I had actually thought there was already a plugin that did this, but apparently not. I certainly can’t see anything like it now.

Yay Me

Hey, there’s my submission! How bizarre, though, that they seem to have printed out the (126kB) PDF that I sent them, then scanned it back in as a new (906kB) PDF. Arguably it makes sense to ensure that people looking at the website are seeing exactly what the reviewers are reading, but even so.

Currently there are 65 public submissions (if the scrollover doesn’t work, just look at the source). We might see about doing a summary of some of them later. Many of the comments are pretty predictable just based on the organisation doing the submitting. Booooring. Not overly surprising, though.

Economics is so interesting

Cypher was a fun movie: twisty and a little noir, with some suspense and revulsion and action and sexual tension and everything. The only problem with it was a strange mis-step in the script: a couple of scenes revolve around incredibly boring conference lectures, and while the writers at least weren’t so out of touch as to set one at a Linux conference, one of the topics was enthralling enough to make me wish they’d drawn the scene out just a little longer.

Ah, central banks and monetary policy, so thrilling.

Also intriguing is this article on experimental economics. This is the theory that was behind the “market in terror” that got shot down by knee-jerk ignoramuses not too long ago.

In one experiment they create minimarkets, with real cash rewards, that allow employees to bet on future sales and revenue. So far these internal futures markets have done a better job of forecasting than polls of top executives. The disruptive implications for the corporate pecking order, and not only at Hewlett-Packard, aren’t lost on anyone.

In other news, did you know that Governor Schwarzenegger has an economics degree and an honorary doctorate?