Tuesday 14 December 2010

The importance of Chrome OS to Google

People say the strangest things. For example, I read almost continually that industry analysts are questioning the need for another Google OS after Android: that the tablet and handheld are the way of the future, and that the [cloud|net][book|top]/whatever is yesterday's format, ready for burial in the graveyard of corporate computing along with Windows.

Yawn.

Do these people really write 1,000-page market reports on their iPhones? Or do they spend most of their time gallivanting about from one conference to another, occasionally firing off one-line "strategy" emails to their researchers and imagining that, like themselves, nobody really needs a keyboard?

Well, that's that off my chest. In what follows, I'll simply assume that no one is stupid enough to imagine that the keyboard-inclusive form factor is dead, at least not for production / work uses, as opposed to consumption / leisure ones.

So let's agree that Chrome OS has at least a future in the cloudtop niche. Why is that important to Google, and what to make of the comment that Android and Chrome OS may, over time, grow together?

I think I found the key in a throwaway sentence in Google revives ‘network computer’ with dual-OS assault on MS by one "Wireless Watch" on The Register:

Set-up, log-in and user interface are the same on all Chrome devices because everything is synced in the browser.

The problem with Android, everyone agrees, is that the platform is dreadfully fragmented, both on a hardware level and, more importantly for identity and familiarity, on a software / presentation level. Whatever the proposed importance of Chrome OS for Google in the past, its current, and likely growing, importance is that it will put all things Google front and centre of the user experience.

Yes, it's open source. So yes, although the Google-approved home page is likely to live somewhere on Google's servers (and they'd be mad not to make a special home page optimised for Chrome OS users), evil manufacturers are likely to work out how to substitute their own start page, all cluttered up with locally-installed apps touting their partners' wares. But unlike with Android, the user will be able to leave behind all that bought-and-paid-for "enrichment" guff with the click of a bookmark. And what's the betting that any Chrome OS hardware will come with access to a better-than-free version of Google Apps? Perhaps not quite up to enterprise level, but definitely better than the free version.

So if the importance of the OS is that it will emphasise the Google brand, what about the comment that Android and Chrome OS may grow together? Well, we are all rather more sophisticated nowadays with the notion of an OS's user interface being separate from the guts of the OS itself: anyone who's experimented with Gnome and KDE on Linux will be familiar with that idea. I don't know if the OS foundations of Chrome OS and Android are the same or not [it's now on my to-do list to find out], but considering that they will share many of the same constraints with regard to power management and security (and perhaps, in the future, automatic updating — what a rude awakening for the current crop of Android customisers that would be!), it might not be stretching things too far to see Android and Chrome OS as simply alternative user interfaces on a deeper OS stack.

Wednesday 15 September 2010

Pictures of the Socialistic Future

Reading 'Pictures of the Socialistic Future' by Eugen Richter, originally published in 1893. It imagines a future Germany under socialism and extrapolates its conditions and actions from the proclaimed policies of the socialists of his day.

Richter was a libertarian in the "Manchester liberal" tradition, in a Germany where totalitarian socialism was rapidly gaining intellectual traction. He gets quite a lot of things eerily right -- his socialist dystopia looks just like East Germany around 1960, complete with border guards shooting anybody trying to get away.

Read it for free here.

"Made me hate socialism all over again" -- Jeffrey Tucker, of the Ludwig von Mises Institute.

Thursday 2 September 2010

Theories of "altruistic" behaviour in eusocial insects

An article in Wired about a new spat over whether the theory of kin selection is justified, or necessary, sees E. O. Wilson doing battle once again against the kin selection brigade.

Something about this article set me thinking. I've never liked this idea of "altruism" in the eusocial insects, though I've never quite been able to say why. Tonight, though, something — perhaps something in the language of the article — suddenly struck me.

Perhaps it was this: "according to the idea of kin selection, workers without young more than compensate by sharing in the reproductive success of relatives, with whom they share genes". I don't see how sharing in the success of a relative with whom you share less than 100% of your genes can ever even equal having your own offspring, never mind improve upon it.

Then it struck me what I don't like about it: there's no mechanism for selection if you look at the workers, because the workers don't reproduce! If you look at a bee hive, only one individual reproduces, and that's the queen. So rather than asking how or why workers would "evolve" a particular behaviour (because how can they "evolve" anything, when they have no offspring?) we should be asking how can it come about that one individual can "hijack" the labour of her offspring?

So I'm not quite convinced that there even *is* an "altruistic" behaviour here that needs to be explained. Quite the opposite: the evolved behaviour that needs to be explained is how the queens coopted their offspring and prevented them from laying eggs of their own.

Notes

We already know of course that it's mostly done by pheromones. The point is, selection pressure must act on the queen bee, not on the workers.

We also know that it's not quite true that the queen always directs when and where new future queens are reared. Though it mostly happens that the queen will lay eggs in cells specially marked for queens, at a time of her choosing (she leaves them empty when she doesn't feel that the conditions are quite right), the workers retain a behaviour, used in emergencies such as the premature death of the queen, to feed royal jelly to any larvae and thereby turn them into queens. It's interesting though that the queens so formed are not as large or as vigorous as the normal ones.

It would be interesting to know if the queen can somehow differentiate between the sperm of the males that she has mated with (10 to 25 males, over 2 to 3 days), stored and kept viable in her body for up to three years. Some males might be selected for producing workers, others for producing queens. Just a thought.

Has anyone looked at human menopause in relation to kin selection? I'm not talking about looking at how grandmothers may improve the reproductive success of grandchildren; I know that sort of stuff has already been done. I'm interested in the *timing* of menopause. Now it's clear that menopause happens while the woman still has eggs in her ovaries. It's also known that the fertility of those eggs decreases as she ages. It strikes me that, for any particular woman, there's going to be a declining curve of "potential reproductive success" against age, based on her own egg fertility. There's also going to be a constant, horizontal line of potential reproductive success *for her* based on the reproductive success of her offspring. (OK, it's not going to be exactly a straight line, but as a first approximation...) Now at some point those lines are going to cross, and I think the kin selection people might be on to something if they could show that menopause happens precisely when those lines do cross, i.e. precisely when the mother benefits more from her *offspring's* offspring than she does from having more herself.
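Just to make that crossing-point idea concrete, here's a toy sketch of it in Python. Every number in it is invented purely for illustration (the real curves would have to come from actual fertility and demographic data), but it shows the kind of prediction the model would make:

    # Toy model of the menopause "crossing point" idea.
    # All the numbers below are invented, for illustration only.

    def own_fertility(age):
        # Hypothetical expected offspring per year from the mother's own eggs,
        # declining from a peak at age 20 down to zero at age 60.
        return max(0.0, 0.25 * (1 - (age - 20) / 40))

    # Hypothetical constant payoff per year from boosting the reproductive
    # success of existing offspring (grandchildren carry fewer of her genes,
    # hence the smaller number).
    KIN_BENEFIT = 0.06

    # The model predicts the switch-over at the first age where the constant
    # kin benefit exceeds the mother's own declining fertility.
    crossing = next(age for age in range(20, 61) if own_fertility(age) < KIN_BENEFIT)
    print("Toy model puts the switch-over at about age", crossing)   # prints 51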

Saturday 21 August 2010

Immortal cells, horizontal gene transfer and the tree of life

Had an interesting phone conversation with Aldo last night, talking about horizontal gene transfer amongst single-celled organisms (yes it also happens in multicellular organisms, but I guess it's not so important there unless the new material gets into the germ cells). He mentioned that the traditional tree-of-life style classification of the prokaryotes is a bit ... can't remember the exact word he used, let's say provisional ... when compared to its application to multicellular organisms. I replied that, with multicellular organisms, there's a clear sense in which offspring are not the same individuals as their parents, but there's no such distinction in the case of single-celled organisms. In a sense, a single-celled organism today is the same individual (or at least, one of many clones of it) as its parent from a billion years ago — a parent which we might not even classify as being in the same species as its "descendant"!

Thursday 19 August 2010

Interesting post on Chile's pension system

Gonzalo Lira posts an interesting article on how Chile's pension system benefits everyone:

  • By funneling pension contributions into the stock market it increases the amount of risk capital available to companies generally.
  • It keeps employers' hands off their workers' pensions: there's no company-wide fund to raid.
  • Similarly, there's no national fund for the state to pillage either.
  • It increases labour mobility because people can take their pensions with them when they change jobs.
  • Finally, it protects firms from crippling themselves by their own stupidity because they never have to make pension promises that they can't keep.

Wednesday 18 August 2010

Babies, trees, clones and living forever

Like most nerdy kids, my first introduction to sex was in a library. We're not talking bodies writhing among the books: this was all part of my master plan to acquire All Knowledge before I was thirteen, by reading every book in the local lending library. Amidst all the texts and illustrations, of egg cells, spermatozoa, wombs, pre-natal development, colourful pictures of codpieces throughout the ages and articles on sex-segregated communal living in the South Seas, one question gradually surfaced in my mind: why do we start off microscopic?

Years passed and I didn't acquire All Knowledge, but that question kept coming back to me, and I was reminded earlier today of the answer that I gradually arrived at, when I read an article on declining fertility in clone trees, as reported by the BBC. Trees can't live forever without sex, study shows, they say. The research they're quoting was published in the free, online journal PLoS Biology: Aging in a Long-Lived Clonal Tree. (Just to make things clear, these trees don't only reproduce clonally. Each tree may be either male or female, and if you put a male and a female near each other, the wind will do the pollinating and they'll reproduce sexually. BUT they can also reproduce by sending out lateral root suckers, and sometimes a single tree can take over an entire location in that way.)

The two things may not immediately seem to be related but they are, and the key is that it's to do with "getting around" the second law of thermodynamics (the one about entropy always increasing with time).

The idea is, the second law says that in a macroscopic object such as a tree or a person, order should always be breaking down. Now animals and plants have repair mechanisms to deal with some of the ways that their bodies break down, and even the cells of which we are made have, themselves, extremely complex and efficient ways of repairing their genetic material when it is damaged. But our bodies consist of billions of cells, and in some of them, inevitably, the repair process fails and they accumulate damage: mutations.

What do we know about mutations? Mostly, they stop things working. Often the things that stop working are critical to the survival of the cell, and the cells die. Sometimes the things that stop working are critical to the body's control over the cell in the environment of the body, and the cell doesn't die when it should, and that can cause cancer. So despite the fact that we owe all the variety of life in the world to mutations, the fact is that most mutations are harmful, and, from the individual's point of view (be it the multicellular being or the cell itself), we don't want them.

Let's put some numbers on this. Each human being inherits about 3 x 10^9 base pairs of DNA from each parent; that's 6 x 10^9 base pairs altogether. Now one estimate is that in humans and other mammals, uncorrected errors occur at the rate of about 1 in every 50 million (5 x 10^7) nucleotides. And note that that's uncorrected errors, true mutations that have survived the repair processes. That means that each new cell gets about 120 new mutations! The numbers for the Trembling Aspen (the species of tree that the researchers examined) will be slightly different given its different genome, but from our point of view, essentially similar: the basic point is that every cell is a mutant, no two cells are the same.
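For anyone who wants to check that back-of-the-envelope figure, here it is spelled out as a few lines of Python, using only the numbers quoted above:

    # New mutations per cell division, from the figures above.
    base_pairs = 6e9         # 3 x 10^9 base pairs inherited from each parent
    error_rate = 1 / 50e6    # roughly one uncorrected error per 50 million nucleotides copied
    print(base_pairs * error_rate)   # prints 120.0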

Now consider what happens when plants reproduce (if that's quite the right word) clonally. Cells forming a portion of a root produce a bud that starts growing upwards and becomes a new stem or trunk; or, depending on the species, cells forming a portion of a branch produce a bud that starts growing downwards and becomes a new root. In either case, those cells are from the general population and may have been replicated many times since the original seed, accumulating mutations with each division. It seems to follow that the new individual (trunk, let's say) may already have accumulated significant genetic damage by the time it's "born".

Things aren't looking good for the Aspen, then. And yet stands of Populus tremuloides can spread asexually for hundreds of thousands of years without noticeably degenerating. To some degree this is because of selection at the level of individual trunks or treelets ("ramets", in the parlance); there's also the phenomenon of rejuvenation familiar from coppicing in non-clonal trees, which doubtless applies here too. However, the researchers cleverly argued that while surviving ramets in an old clone might maintain genetic fitness thanks to selection pressure for things like, oh, producing bark, making chlorophyll and so on, there wouldn't be any selection pressure against mutations in sites to do with sexual reproduction. (They are basically saying that if every other tree for a mile around is another male, a clone of yourself, then it doesn't matter whether your flowers or pollen work properly or not, there ain't going to be any babies.)

So they looked for evidence of declining fertility amongst male aspen that had been reproducing clonally for a long time, and used that as a proxy for senescence generally. Unsurprisingly, they found it. It seems that, though an individual aspen and its clones may hang around for as long as a million years, they must still find a member of the opposite sex and produce seed before they eventually die.

So what's the connection with microscopic babies? It's just this: whereas in a macroscopic-sized clone some of its many cells will be viable and others won't, in a human baby at the stage of a fertilised ovum either that one cell is viable or it isn't. If the cell isn't viable, then the embryo will not come to term, but if the cell is viable then you have a guarantee that you've started a new cell line without significant accumulated damage. In effect, you've managed to filter out your damaged DNA.

Nothing comes for free though, and it's never really possible to defeat the second law of thermodynamics. In this context it's relevant to consider what might otherwise be a surprising fact: just how frequent miscarriages are even in the developed world. Hunting around the internet I read that one pregnancy in seven miscarries, and it's estimated that the true figure may be as high as one in four: the difference being due to miscarriages that happen before the mother is even aware that she is pregnant.

Today's superplant - the pumpkin

I've been thinking for a while about using the back garden to grow food. Recently, several people I know have also mentioned that they are thinking about doing some vegetable gardening or getting an allotment, so it may be that the idea is in the air (like some mental pollen looking for susceptible brains to pollinate!), perhaps impelled by how broke people seem to be feeling nowadays, what with the double-dip just round the corner...

ENN-EE-WAY, I'm looking ahead to what I should be planting next year. I want to maximise the utility I get from each type of plant (translation: I'm lazy), and I also like the idea of companion planting to increase productivity and decrease the amount of weeding necessary (see a theme developing here? actually it just appeals on the grounds of basic efficiency, honest), so I'm toying with some kind of variation on the Three Sisters technique.

Now the classical form of Three Sisters calls for growing maize, beans and squash. The maize grows alone until it reaches about 15 inches in height, then you plant beans and squash alternately between the maize plants. As the maize continues to grow, the beans grow up it (so, looks like you'll need a strong-stemmed variety of maize!) while the squash vine winds around below, providing ground cover and discouraging weeds. The beans also fix nitrogen, helping to assure fertility from year to year. (I suspect you could also plough in the maize stalks, and bits of the other plants, as a green fertiliser, but then you'd need to carefully rotate locations in order to avoid diseases overwintering in the soil. Maybe it would be best to just burn them or compost them.)

One problem is I'm not quite sure what I'd do with a lot of maize. I can't see myself producing ethanol in the garage, really, and I'm not sure if there is anywhere nearby that could grind it into meal or flour (though it might be worth looking, just in case there is). And would I really be up to putting the maize kernels through the nixtamalisation that's necessary to extract all the goodness? — How easy is it to buy relatively small quantities of calcium hydroxide? Then again, maybe I could get by just grinding up some eggshells (mostly calcium carbonate) in a pestle and mortar and adding that to the simmering kernels? And anyway, nixtamalisation is only really necessary if you have maize as the major part of your diet; otherwise you'll get the niacin, lysine and tryptophan you need simply by having a varied diet.

So maize is probably still quite a good possibility, maybe even leaning towards a popcorn variety. Another possibility for the tall-plant role is the occasional sunflower. Beans are, of course, beans: there's any number of varieties that I could try. But what shall I choose for the squash?

Well I'm currently leaning towards the humble pumpkin. If you look at Leaflet No. 12 - 1986 - Pumpkin, you'll see that it's easily grown — the report says an old rubbish heap is ideal! — and that you can eat almost every part of the plant: the fruit (of course), the leaves, the flowers (use the petals, avoid the centres of the flowers), the growing tips of the vine itself, and the seeds.

The pumpkin's vitamin and mineral content is also high, with the leaves being stellar sources of vitamin A and, especially, C. You can eat them with fats (e.g. cream, oil, fatty meat) in order to promote uptake of the vitamin A into the body.

Friday 9 July 2010

The chronicles of Desem, round 2

So yesterday's second big mistake, unreported at the time, was that I didn't take away half the dough before mixing in the new flour and putting it back. That's probably because I conflated two things: cutting off the dry rind from the outside, and managing the quantity of dough. Because my dough didn't have much in the way of a hard rind I mixed 100% of it with the new flour and put it back in its bed until lunchtime today. The result was that the dough ball had risen enough to break through the flour covering it in about six hours, rather than taking until this morning to do it.

Still and all, when I finally got to take the dough ball out of its bowl today, it had managed to create a bit of a rind all over, though the ball still showed structural weakness, with a distressing tendency to almost tear as I manhandled it out of the bowl. I'm not sure if that's par for the course or if I'm doing something wrong...

This time I had all my flour and water ready! Instead of the small, wooden bread-board that I was mixing stuff on before, I decided to use a round, lap-sized, white plastic tray—the sort of thing you put your meal on when eating in front of the TV :). About 20 inches across, this had plenty of room for putting one cup wholemeal flour directly on to it at one side for the ultimate in sticky-handed convenience later on, and a half-inch rim meant that I was going to be able to chase the last bits of flour around the tray without losing a lot of it over the edge. I also had a half cup of water in a measuring jug just next to the tray.

I'm not sure if I'm just clumsy, or if the lack of stiffness of the risen dough was a factor, but I think the pile of cut-off rind contained about 60% of the original ball, rather than the 50% I was expecting. The 40% from the interior was once again quite wet and sticky, but this time once I started mixing in the cupful of flour, it dried out quite quickly (today is the hottest day of the year, if that's a factor), and I ended up adding about half the water in the jug—so about 1/4 of a cup—to get a nice, stiff, and reasonably-sized ball of dough.

Learning #2. Cut the rind off if it forms one, but whether you end up with discarded dough or not, you still have to reduce the risen dough ball by about 50% in volume altogether before adding today's cup of flour and today's water: the dough ball that gets buried in the bowl of flour should be about the same size each time!

I've also had a go at making desem dosas from the cut-off rind. As my wok is a living masterpiece of old burger fat and bits of burned protein from a hundred different sources, I'll pass over the actual frying of the dosa itself in silence and simply comment that although after a good deal of stirring, and cutting and crushing especially recalcitrant bits of dough rind with a knife, I got something that resembled a batter, the dough shows no signs of "dissolving" as I'd been led to expect. Perhaps I just haven't waited long enough: Makiko says to let the batter stand for three hours at room temperature before making the dosas and, of course, I just couldn't wait.

Thursday 8 July 2010

The chronicles of Desem, round 1

After several days collecting tools and ingredients (nobody sells simple room thermometers any more! I ended up buying an enormous garden thermometer in town—at least it's easy to read with my poor vision), I started my first desem dough last Monday, at about noon.

The process is quite straightforward: mix two parts of wholemeal flour with about one part of filtered water and knead it into a ball of fairly stiff dough, and then (this is the fantastic bit) bury it for a couple of days in a bed of dry flour so that there's about three inches of flour all around and below and above the dough. The idea is that wild yeasts and symbiotic bacteria that are floating around in the air and on your hands and what-not will inoculate the dough and start concentrating themselves inside it. In practice, it felt a bit like planting a seed in some soil and waiting for it to germinate and grow.

The recipe calls for wholemeal flour both for the dough and for the bed of flour it hides in, but being on a budget I'd used Sainsbury's strong, stoneground, 100% Wholemeal Bread Flour for the dough and the simple white Plain Flour from their Basics range for the packing. To give you an idea of how much this saves, the wholemeal flour is 95 pence for a 1.5 kilo bag, whereas the plain flour is only 42 pence for the same amount.

Now this all has to go into some kind of container, and I used a huge salad bowl—the sort of thing that might contain a communal salad for the dinner table. I gave it a glass lid using the base of an old microwave cooking container; the result was a mite precarious but it meant that I could see what was going on without having to take the lid off. I put the bowl in a corner of the living room on an old audio stack that I never use any more. The coolest place inside the house, it was still right at the top of the recommended temperature range of 50°–65° Fahrenheit. However it's the height of summer here in England at the moment, and the house central heating has been off for weeks, so I guess that that temperature range is going to be achievable all year round.

I spent the next two days sneaking peeks at the bowl every few hours. Tuesday passed uneventfully, which was only to be expected, but by Wednesday noon I was fairly sure that something was different: the bed of flour seemed to be just slightly coming away from the side of the bowl in a couple of places. Not very exciting, but things got better very quickly. By late afternoon there were definite cracks in the flour at the centre of the bowl. By early evening the cracks were bigger, and the centre of the cracks was definitely raised slightly. By late evening I could just see brown dough peeping out between the cracks, and by the time I went to bed just after midnight, the top of the ball had pushed itself solidly out of the bed of white flour.

So today at noon I took the bowl out of the living room, put it on a work surface in the kitchen and took the lid off. The live dough was by now bulging monstrously out of the surrounding flour, and close up, and without the glass lid, it was possible to see the little bubbles of air (or, probably, carbon dioxide) that were causing it to rise. What I wasn't prepared for though, was the scent.

I'd speculated, of course, as to how it was going to smell. Possible candidates seemed to be: bready; doughy; yeasty (I thought this might be quite likely); musty (I thought it might smell musty if something had gone wrong perhaps). In fact it smelled absolutely delightful, but nothing like you would expect from dough. Imagine freshly laundered bed sheets that have been air dried on a line in the garden, now add a little bit of the smell of a banana, ripe but not too ripe, and finally add a fresh wind coming off a wheat field in the summer. Clean, fresh, and banana-fruity.

I also wasn't prepared for it being wet.

When I'd put the original dough mixture into the bed of flour, it had been a fairly stiff mix. Two (American-size) cups of flour to one cup of water to start off with, and then I'd added more flour until it stopped sticking to my fingers and the work surface. It was easy to knead and work, but definitely on the dry side. What came out of the bowl two days later was lighter of course, being full of air bubbles, but I'd expected it to lose water both to the yeast and bacteria growing inside it and perhaps to the flour surrounding it. On the contrary, it now felt as wet and sticky as it had when I'd originally made the 50%-hydrated 2:1 mixture, and it promptly started tearing as I lifted it out of the bowl. I can only speculate why this happened. Monday had been a very hot, dry day, but today is rather cooler and overcast, and the air feels damper. I'd almost guess that the dough had somehow extracted moisture from the air, but I can't see how that might have happened while it was buried in three inches of flour...

And this is where learning #1 makes its appearance. I now had a sticky, floppy dough which promptly stuck both to the work surface (a large wooden bread board that I'd specially cleaned and left to dry the previous evening) and to my hands, and yet the half cup of wholemeal flour that I was supposed to be adding to it was still in the bag in the cupboard, and the quarter cup of water was still in the filter jar. Disaster! I ended up taking some of the packing flour from the nearby bowl and using that instead, until the dough was dry enough to get it off my hands. All in all that was about a quarter cup of cheap, white flour, and then I washed and dried my hands and got another quarter cup of the wholemeal flour instead of the half a cup that I'd intended, and worked that in. I hope the admixture of a small amount of cheap, bleached flour won't spoil it. In the event, I didn't add any more water at all, as the dough with all the new flour added now felt about as stiff as it had when I'd originally put it into the bowl on Monday.

So learning #1 is this: I need to have the flour and water that I am going to add already laid out in containers nearby, so that I can get them when my hands are covered in flour and bits of wet dough.

Once the dough felt satisfactory I made it into a ball and re-buried it in the bowl of plain flour, making a bigger pit this time since the ball is now about twice the size of the original one. Hopefully this will continue to rise overnight, and increase the concentration of the yeasts and lactobacilli that give it its flavour and leavening power. I'm hoping that by tomorrow it will have got back on plan, and have developed an outer rind that I can cut off to expose the inner soft dough, before feeding that with more flour and water and re-burying it. Makiko (the author of the posts at justhungry.com that I linked to at the top) has a recipe for dosas made from the cut-away rind that I really want to try.

Friday 28 May 2010

Apple, Google, Microsoft: the tipping point is nigh

And so here we are in the middle of 2010 with the mass computing market at another tipping point. Everybody can feel it, the calm before the storm; the only question is, which way is it going to go?

On the day that Apple's "magical and revolutionary" iPad goes on sale in the rest of the world, pity poor Microsoft. After pushing the tablet format (in one incarnation or another) fruitlessly for well over a decade, Microsoft has finally lived to see the market stolen from it with an audacity that we haven't witnessed since, well, they themselves stole it from GO Corp. in the 1990s.

Microsoft now seems curiously inactive. Once trumpeted as the inevitable inheritor of the smartphone operating system mantle, it has become so wrapped up in its failing attempt to take search away from Google that it hasn't even noticed that it's almost completely irrelevant, its mind share zero outside its business stronghold, like an early-90s IBM.

Apple is riding high with, for the moment at least, a larger capitalisation than Microsoft. Apple is cool once more, its products aspirational — and not just to tech dweebs this time but, with the spread of computing power into every area of life, to snobs both young and old, jocks and cheerleaders, pop stars and presidents: cool to the cool. A design company as much as a tech company, and a fashion company as much as a design company, the Apple brand is the King of Cool right now.

I can't help feeling, though, that Apple is making exactly the same mistake, by tying together hardware and software, that it did with its personal computer business. Apple was first among equals in the invention of the personal computer market at the start of the last quarter of the 20th century, and in the late 70s and 80s its Apple II line was certainly the best known and the most successful of all personal computers. With the loudest and most inventive developer community, it was the product that every other product strove to be.

However, from the mid-1980s onwards Apple was no longer content to profit just from sales of hardware. Deciding that it wanted a cut from everything that touched its machines, Apple replaced the open-architecture Apple II with the pretty, pricey, and indubitably closed Macintosh line. Unfortunately, Apple forced its developer community into a technology change just as the Windows-compatible market was taking off. Hardware and software makers accustomed to developing to a well-known spec for a large market and being able to sell their products without anyone's say-so suddenly found themselves developing for a gated, curated platform with puny market share. Their costs soared and their profits collapsed, and the Apple ecosystem ceded ground to cheap-and-cheerful commodity PCs running Windows. As investment in Windows-compatible hardware and software exploded synergistically, Apple fell further and further behind, eventually being almost completely chased out of the market.

And then there's Google. Eager, even cocky, it has felt the electric potential in the air and concluded that there's now the kind of opportunity that only comes along once in a generation. The hand-held (in its broadest sense) platform has been a fragmented mix of offerings. The alpha player, Nokia, is losing ground to a new challenger, and its response looks dated and incoherent. The new challenger though is limited to a single manufacturer, with all the production and innovation bottlenecks that this implies. All the other entries are also-rans. The landscape is ripe for the same kind of consolidation that was achieved on the desktop in the 90s but Microsoft, the natural company to do this, is asleep at the wheel. Bing!, and you're dead.

Of course, even if Microsoft had come up with a must-have handheld OS, device manufacturers would still have been wary, given Microsoft's history of sucking the profitability out of hardware via licensing costs (though if it had achieved critical mass it would undoubtedly have picked them off one by one, with its divide-and-conquer strategy of old).

A carrier meditating on its product strategy.

Now in a world red in tooth and claw, a world of eagles and of sharks, carriers and handset manufacturers are the puffins. If there's one thing they really want, it's to differentiate their handsets from everybody else's. And if there's one thing that they really dread, it's the costs associated with actually being different. Oh how they quail at the thought of being the owner of a proprietary platform! ("Why, look what happened to Compaq, to Sun, and to p-p-p-Palm!") Odd though, since that's exactly how Apple makes all its money. They must undoubtedly perceive, correctly, that their risk-averse, extremely short-term, MBA-laden, engineering-lite corporate cultures render them completely incapable of the sustained effort required to "do an Apple" with any hope of success, so they substitute product churn, wacky names, and our old favourite, financial "innovation" for true excellence.

So Google spots the gap and brings out Android, its master stroke. Now the carriers and manufacturers can tick one box because it's Google that's paying for the development, not them (and it's not so bad for Google either because Android must be relatively cheap to develop, being based on Linux and Java). Android is also open source, so that's another couple of boxes that they can tick, this time labelled "Absence of lock-in" and "Low per-unit cost". Finally, thanks to Google's permissive attitude, they can realise their dream of inexpensive product differentiation via extensive customisation. Indeed, some of them customise their handsets so extensively that they are unable to upgrade them to the new versions of the operating system that Google is bringing out seemingly every couple of months. But that's no matter, just release a new handset (with a fabulous new name!) every quarter, and leave the suckers who bought the previous (now obsolete) product to cry into their two-year contracts. Sheesh, maybe "Google" is Lakota for "dances-with-fools".

The point for Google of course, is to avoid the creation of a new Microsoft (especially one that is also the old Microsoft) while relegating Apple to an expensive, upmarket, and hopefully investment-starved niche. Google doesn't have to profit from the unifying operating system per se in order to win, they just have to ensure that nobody else does. Then, by insisting that for an offering to be labelled as "Android" with all the consumer confidence that that instills, the product must use their web services, Google intend to rule, like some deity of old, from the clouds.

Best sentence of the day (so far)

"Yet the surface is where the movie stays, like an old submarine with dead batteries" (the WSJ reviewing Sex and the City 2).

Well, like an old something with dead batteries, anyway.

Thursday 20 May 2010

Google open-sources VP8, Flash will play it

See webmonkey and The Register for details.

The open sourcing aspect was long expected, but by tying in Adobe Google seems to me to have made a typically smart end-run around Apple (and Microsoft, but it's really Apple whose face we're loving to punch, isn't it guys?).

So Google will update YouTube and suddenly half the video on the web will be in the VP8 format, and Adobe will update Flash and suddenly 90% of all browsers will be able to view it. Since it's all open source, the rest of the web will follow in dribs and drabs.

Who then will need H.264? Only Apple and its walled garden of iPhones, iPads and iDontPlayVP8s I guess. That garden's looking a little droopier than it did a moment ago, and Apple could even end up having to pay publishers to supply content in H.264, ouch!

Monday 17 May 2010

Famous delusions of grandeur

Angela Merkel: "[It's] a battle of the politicians against the markets ... I am determined to win."

King Canute: "Sea, I command you to come no further! Waves, stop your rolling! Surf, stop your pounding! Do not dare touch my feet!"

Saturday 8 May 2010

Gordon Brown - a modest proposal

I see Gordon Brown is still causing trouble for the country, this time by squatting in number 10 and refusing to go.

If he needs a hint then we could ask Her Majesty to send in the yeomen from the Tower of London to winkle him out with their pikes. In fact if they then took him to the tower and put him in the stocks, I for one would pay good money to throw ordure at him. It could be his little contribution to help pay off the national debt that he did so much to increase.

How we used to laugh when we heard him, as Tony Blair's chancellor, talking about "prudent" "investment" over the business cycle. We knew it would all end in tears, and now it has.

Friday 30 April 2010

A common inconsistency

It always surprises me how many people think it is wrong to burden future generations with problems like exhausted natural resources, or a polluted biosphere or an atmosphere full of carbon dioxide, when so few people seem to care about leaving them with a mountain of debt.

Thursday 18 February 2010

Greece on the edge of the abyss

It's now looking as though Greece could be in default at any time. Those delicious-looking interest-rate swaps that they entered into with Goldman Sachs ["I'll just have one more!"] may be about to come back and bite them in the tail even more firmly than we thought.

Zero Hedge has an article revealing that one more credit rating downgrade either for Greece itself or (and even more easily done) for the legal vehicle which it used to execute the deal, will trigger an obligation to pony up collateral. Which Greece probably won't be able to do, and if it can't, it's going to default, and if it defaults there it's not going to be able to roll over any other debts and then it's going to default everywhere. Poof! Bye-bye Greece.

I can't say I'm very surprised. I visited Greece for two weeks about twenty years ago; one week in Athens for work, another tacked on the end to take a classical tour of the whole country. I went there with my head full of foolish and romantic notions about the birthplace of democracy, theatre, poetry and the arts, architecture, rosy-fingered dawn over the Parthenon, mount Olympus and the wine-dark sea. That had all been knocked out of me by the end of the third day.

Even then, two decades ago, it was impossible for someone recognisably a foreigner [i.e. me] to sit and eat at a table outside a restaurant in Athens without being continually bothered by shady-looking men coming up and asking if I wanted to go to a club and meet girls. And I mean continually. Every ten or twenty minutes. I also remember reading an English-language Greek newspaper and seeing a report of one of the national politicians praising membership of the EU on the grounds that Greece would be able to squeeze out every last drop of regional and structural grants and then spend it all on whatever they wanted.

All rather depressing, and now it seems we're at the far end of that arc. EU funds, and latterly membership of the Euro (and all lubricated by endemic greed and corruption) have done for Greece what the curse of oil has done elsewhere, and now it's time to pay the piper. I'd like to be glad at least that it probably means that Greek property will be cheap in a couple of years, but the truth is that any foreigner who buys property in Greece is likely to be seen as a piggy bank by local officials and politicians, there to be raided whenever the need for cash demands it.

Monday 15 February 2010

Maemo + Moblin = MeeGo

So Intel has finally thrown in the towel on creating its own mobile platform based on Linux. Disguised, naturally, as a triumphant victory, uniting its Moblin product with Nokia's Maemo so that together they may go onwards to storm the bastions of Android and iPhone. But everyone can see that MeeGo, the combined offering, will really be Maemo with a few extra bells and whistles. Most importantly from a developer viewpoint, the development API remains Maemo's Qt — which is absolutely right, there's nothing else as good around.

The name change is a shame though. "Maemo" feels vaguely Nordic with its echoes of Mimir, Aesir, Ymir and so on. MeeGo? Fresh out of the marketing department's paper shredder if you ask me. The best thing independent developers can do is ignore the name change and stick with calling it Maemo. Or, alternatively, I think I like the sound of "Haemogoblin".