Tag Archives: Apple

A Kuhnian look at the iPad

In case anyone has missed it: Apple announced their new tablet computer last week. They call it the iPad. The announcement has released a massive wave of comments of all kinds across the internet: some have called the device over-hyped and under-performing, others have hailed it as the device that will change mobile computing, and still others haven’t got the point at all (Stoke’s comparison is hilarious; he obviously doesn’t see what Apple is aiming for). But I don’t aim to go into detail on whether the iPad is a good or bad product, at least not before I’ve touched one myself. Instead, I will use two iPad-related articles as a starting point for a discussion of the paradigms of computer environments, in a Kuhnian manner.

The philosophy of Kuhn

Thomas Samuel Kuhn was a physicist best known for his work on the theory of science and the book “The Structure of Scientific Revolutions“. His main idea was that science periodically undergoes what he called paradigm shifts. A new paradigm introduces a new set of concepts and interpretations that can explain problems the previous paradigm(s) could not. The new paradigm can also bring a number of problems that scientists have never considered before. This paradigm view invalidates the commonly held belief that science progresses linearly and continuously. Kuhn was, for example, sceptical about science textbooks, which he thought distorted how science is actually carried out. He also argued that science can never be done in a completely objective way; findings will always be interpreted within the present paradigm, making our conclusions subjective.

The iPad

The iPad combines the touch interface of the iPhone and iPod Touch with a larger screen and more powerful hardware, which enables desktop-like applications to run on the tablet. While this is not revolutionary in itself, Dan Moren of Macworld argues that the philosophy of the iPad user interface is revolutionary for how we use computers in general. He concludes: “Regardless of how many people buy an iPad, it’s not hard to look forward a few years and imagine a world where more and more people are interacting with technology in this new way. Remember: even if it often seems to do just the opposite, the ultimate goal of technology has always been to make life easier.”

Steven F takes it one step further in his article reflecting on the iPad, arguing that “In the Old World, computers are general purpose, do-it-all machines (…) In the New World, computers are task-centric”, and concluding his introduction with the words: “Is the New World better than the Old World? Nothing’s ever simply black or white.” Steven then makes a very interesting argument about why task-centric, easy-to-use computers will slowly replace today’s multi-purpose devices.

Both of these articles are very much worth reading and bring up numerous important views on the future of computing; I highly recommend reading them thoroughly. Even though they both use the iPad as a starting point, Apple’s new gadget is not really their subject of study. Instead, what both Steven and Dan are reflecting on is the future of computer user interfaces, which is a much more important subject than the success of the iPad, or whether it is a good product or not.

The Old World Paradigm

To use Steven’s terminology, we have for the last 20-25 years resided in the Old World Paradigm of computing. This paradigm is what people usually think of as the normal graphical interface of a computer. Whether it is Windows, Mac OS X or any common Linux distribution, they are all used in essentially the same way – the same way as the original Macintosh, introduced in 1984. Not much has changed, really. Using my MacBook today is essentially the same as using my Mac Plus in 1990. The MacBook is just easier to carry with me, slightly faster, and (importantly) has access to the internet. But I interact with it using basically the same metaphors: windows, menus, desktop icons, a pointer, buttons and so on.

Before the Mac, most computing was done using command-line tools in a DOS or Unix environment. While this was convenient for many purposes, it was a huge abstraction to the new computer user, and it scared people off. (I still have access to the Unix shell on my MacBook through the Terminal application, which I use frequently in my bioinformatics work.) As computers became more common, this kind of abstraction was becoming a great wall, and a crisis in computer interface development started to surface. And according to Kuhn, a crisis feeds a new paradigm. In 1984, Apple started off this paradigm, using techniques partially taken from research made at the Xerox labs. Five to ten years later, most computer users were taking part in this graphical paradigm (using Windows 3.1, Windows 95, or Mac System 6 and 7).

The things that scared people about the command line were to a large extent solved by the point-and-click metaphors. However, a lot of people still find computers hard to use. Computers need virus cleaning and re-installation. They get bogged down by running too many applications, and leaving them on for too long results in memory fragmentation. Just keeping the computer running is a hard task for many people. This creates another distraction, further fuelled by the addition of extra buttons and extra functions directed at power users. Such extra features are just confusing for new computer users. And while Mac OS X and Linux are not as haunted by viruses and malware as Windows is, they are still very complex. Desktops quickly get cluttered by documents, and the screen is too small to handle the windows of five applications running simultaneously. With increasing computing power, a lot of extra functionality is added, which often just obscures the main task of the computer. A new crisis is emerging.

The New World Paradigm: An era of simplicity

Apple sees this crisis, and has a solution for it. They call it simplicity. For Apple, the most important thing is not that we have access to the latest technology to play around with its internals. Apple wants its average users to never have to worry about the internals of the device. It should just work. Steven nails it like this: “In the New World, computers are task-centric. We are reading email, browsing the web, playing a game, but not all at once. Applications are sandboxed, then moats dug around the sandboxes, and then barbed wire placed around the moats. As a direct result, New World computers do not need virus scanners, their batteries last longer, and they rarely crash, but their users have lost a degree of freedom. New World computers have unprecedented ease of use, and benefit from decades of research into human-computer interaction.” This means that we computer-savvy guys of the old paradigm will lose something. We lose our freedom to tinker. But this loss comes with great gains.

Kuhn would say that personal computing has reached a new crisis, which opens the door for a new paradigm. Apple is among the first companies to try to create a device that defines this paradigm, but they are not alone. Google’s Chrome OS aims at the same thing: to define the next paradigm in computing. And both Apple and Google are willing to bet that the kids born today, who never saw the Old World Paradigm of computing, will never miss it. They will never ask what happened to the file system metaphors, the window metaphors, and the multi-tasking of today’s computers, because they will never have seen them. Instead, they will ask how we could stand using the buggy and unstable computers we have today.

Apple redefined the smartphone three years ago. They definitely have the potential to redefine the experience of computing as well. Not that this new paradigm would mean no more Unix command-line tools, and not that the current desktop computers will immediately die out. But what we first think of as a computer in ten years might very well be a much more task-centric device than the laptops we use today. And even though this is a loss of freedom, it will surely be a great gain in usability. Until the next paradigm comes along…

Where are the Mac viruses?

Quite often I hear the explanation that Macs don’t get infected by viruses because Apple’s market share is so small that it wouldn’t be worth the time and effort to write a proper Mac OS X virus. This implies that once Mac OS X has reached a critical market share, there will be a sudden outbreak of hundreds of viruses. My simple question is this: how come there has (to my knowledge) been no actual virus affecting Mac OS X, while there have been a couple of viruses affecting Linux, despite its even smaller market share? Wikipedia lists the following Linux viruses:

  • Alaeda – Virus.Linux.Alaeda
  • Bad Bunny – Perl.Badbunny
  • Binom – Linux/Binom
  • Bliss
  • Brundle
  • Bukowski
  • Diesel – Virus.Linux.Diesel.962
  • Kagob a – Virus.Linux.Kagob.a
  • Kagob b – Virus.Linux.Kagob.b
  • MetaPHOR (also known as Simile)
  • Nuxbee – Virus.Linux.Nuxbee.1403
  • OSF.8759
  • Podloso – Linux.Podloso (The iPod virus)
  • Rike – Virus.Linux.Rike.1627
  • RST – Virus.Linux.RST.a
  • Satyr – Virus.Linux.Satyr.a
  • Staog
  • Vit – Virus.Linux.Vit.4096
  • Winter – Virus.Linux.Winter.341
  • Winux (also known as Lindose and PEElf)
  • Wit virus
  • ZipWorm – Virus.Linux.ZipWorm

Can someone, please, explain to me in a rational way how this list can be so long, despite Linux being such a terribly small platform? I suppose, as I do not know for certain myself, that most of these viruses are rather harmless, and that most would not work on modern Linux systems, as they probably exploit vulnerabilities that have been patched in later revisions of the OS. I am also aware that there have been proof-of-concept viruses for the Mac that utilized vulnerabilities that have since been fixed. Some of the viruses in the list above may be similar proof-of-concept examples for Linux.

Personally, I think Mac OS X and Linux match up quite well when it comes to virus security, and that this has nothing to do with the size of the platform, but everything to do with the UNIX/UNIX-like foundation underneath. In both cases, the worst threat is the users themselves, who often allow malicious code to run without knowing what they are doing. This is a big threat to any computer platform, regardless of the security measures taken by programmers. As long as the user can install new software, this will remain a potential threat (even though sandboxing and securely signing applications can decrease the risk of malware infection).

That being said, Mac OS X is incredibly easy to hack once you have access to the computer. This is a problem, and Apple really should be busy fixing it. But please aim your guns at the right issues. Mac viruses are not a real threat at the moment, just as Linux viruses are not really a big threat to Ubuntu users. That a Mac can be hacked to gain root access in a minute – that is a problem, and it has everything to do with OS architecture. However, making the Mac market share smaller will not solve this problem, nor will it get worse as the platform expands. If we’re in luck, though, Apple may acknowledge the problem as its user base grows, and address it before it is too late.

The problem with Linux

XKCD sums up the big problem with Linux, and every other open source/free project driven by enthusiasts. You tend to solve the cool things first (in a nerdy way – like supporting 4096 processor cores), or the required things (once again, required in the enthusiast world), and there is no real driving force to solve the problems that regular consumers care about. Thus, things like Flash support, graphics software, and games take years despite the huge open source programmer community working on Linux distributions.

This illustrates well why Linux has never taken off, despite being free, while Mac OS X is steadily eating into Windows’ market share. The core of the situation is that Apple is a company that would fail miserably if it did not listen to its consumers. Apple’s manners often upset consumers (like me), but even more often they manage to leverage ideas before everyone else, or in a better way than most other tech companies. Or simply at the right time. The iMac, the iPod, the iPhone and the portable Macs all prove that this strategy works. Apple TV, on the other hand, has not taken off, probably because it does not have a big enough audience (which Apple acknowledged from the beginning, calling the whole project a “hobby”).

Open source is a great idea, and should be practiced in many situations. But free is not always the same as great, and a business strategy may be just what is needed to create what consumers want. The open source community lacks such a strategy, instead delivering what its members themselves need at the moment. Thus, Linux will never take off on its own. However, initiatives based on Linux and targeted specifically at consumers, like Google’s Chrome OS, have a great chance of challenging both Mac OS X and Windows, because they are free and backed by a huge company (Google) that makes its profits on something else. This situation is somewhat similar to Apple’s with Mac OS X, where Apple makes OS X great in order to sell more computers, which is where its real revenue comes in. This is probably the business model of the future: selling one thing cheaply to have consumers buy something else. Drops have already realized this for knitting, giving patterns away for free in the hope that consumers buy their (relatively cheap) yarn. But that is another story I might go into another time. Thanks, XKCD, for summing it up!

Why is Palm so interested in iTunes?

I am puzzled in more than one way about the conflict arising around Apple and Palm regarding Palm’s “hack” to gain access to iTunes and Apple’s willingness to fight back. What puzzles me most, however, is why Palm is interested in iTunes-support at all.

Palm’s call
Several things are odd in this dispute between the two tech companies. First, instead of writing its own sync client for the Pre, which would have been incredibly simple to do, Palm insisted on using a flaw in iTunes’ device recognition to identify the Pre as an iPod. The only real advantage of this, as far as I can see, would be to spare iTunes users from installing an additional application. Maybe it could also encourage old iPod users to buy a Pre instead of an iPhone. However, the obvious disadvantage was that it would certainly upset Apple. Also, support for this hack could not be guaranteed to last, so advertising it as a feature, which Palm did, was not exactly a neat idea.

Apple’s response
However, I did not expect Apple to respond the way they did. Only a short time after the Pre’s introduction, they released an iTunes update that patched the hole Palm had used to make the Pre appear as an iPod. I don’t really see the rational argument for this from Apple’s side. It was a hack, Apple had clearly stated that they could not ensure compatibility with future iTunes versions, and it didn’t really lose the company any money. Instead, it tied another product to iTunes, allowing Apple to further advertise how nicely the Mac-iPod-iPhone-OS X ecosystem works. Until Palm became a major competitor in the mobile phone market, I saw nothing to worry about. Besides, in chess it is always better to keep a threat than to play it.

Palm’s move
However, Apple pulled the plug on the Pre in the new iTunes update, and Palm’s response surprised me even more. Very quickly, Palm released the 1.1 update of the Pre system software, once again allowing it to sync with iTunes. This time, Palm was using a spoofed vendor ID, which is part of the USB specification. The use of these IDs is specifically regulated in the specification, and using another company’s vendor ID is strictly prohibited. But Palm did not stop there. Instead, they accused Apple of breaking the rules of the standard by not allowing just any device to sync with iTunes. While this might breach the open intentions of the USB specification, I am no expert on the rules and cannot judge the issue. Apple is certainly not alone in checking vendor IDs before letting software communicate with hardware; the same is true for some music mixing software that can only be used with specific mixers. Palm is thus aiming for big fish (or big trouble) here.
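To make the vendor-ID trick concrete: every USB device reports an 18-byte device descriptor, with the 16-bit idVendor field stored little-endian at byte offset 8; Apple’s assigned vendor ID is 0x05AC. The sketch below shows, in Python, the kind of gate a host application could put on that field, and why a spoofed ID sails straight through it. The function names, the product ID, and the descriptor itself are purely illustrative – this is not iTunes’ actual code.

```python
import struct

APPLE_VID = 0x05AC  # Apple's USB-IF assigned vendor ID

def vendor_id(device_descriptor: bytes) -> int:
    """Extract idVendor from a standard 18-byte USB device descriptor.

    Per the USB 2.0 specification, idVendor is a little-endian
    16-bit field at byte offset 8 of the device descriptor.
    """
    (vid,) = struct.unpack_from("<H", device_descriptor, 8)
    return vid

def looks_like_apple_device(device_descriptor: bytes) -> bool:
    # The kind of gate at issue: only talk to devices reporting
    # Apple's vendor ID. A device spoofing 0x05AC passes this check,
    # which is exactly why the vendor-ID gate alone proves nothing
    # about what the device really is.
    return vendor_id(device_descriptor) == APPLE_VID

# Hypothetical 18-byte descriptor for a device claiming Apple's VID.
# Fields: bLength, bDescriptorType, bcdUSB, bDeviceClass, bDeviceSubClass,
# bDeviceProtocol, bMaxPacketSize0, idVendor, idProduct, bcdDevice,
# iManufacturer, iProduct, iSerialNumber, bNumConfigurations.
spoofed = struct.pack("<BBHBBBBHHHBBBB",
                      18, 1, 0x0200, 0, 0, 0, 64,
                      APPLE_VID, 0x1261, 0x0100, 1, 2, 3, 1)
print(looks_like_apple_device(spoofed))  # True
```

The point of the sketch is that the check is trivially cheap for the host to make and trivially cheap for a device to defeat, which is why the dispute ended up being about the USB-IF’s rules on who may report which vendor ID rather than about any technical barrier.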

MacRumors article

But… why?
No matter who is right and who is wrong, the big question remains unanswered: why is Palm doing this? What makes them so desperately interested in iTunes compatibility that they not only break the USB standard themselves, but also accuse Apple of doing the same? In what way is it worth fighting another big tech company over something that could be sorted out by some licensing, or by Palm writing their own software?

The only reason I can think of is publicity. Palm is in desperate need of traction for its products. The Pre is two years behind the iPhone, it does not really run native apps like the iPhone does, it will have a hard time attracting developers with such a small installed user base, it is on the wrong network so it needs to be modified to be sold in Europe, etc. etc. etc. Thus, Palm really needs something that attracts attention. And they need the Pre to look cool. So, why not attack another big tech company, accusing them of locking users into an iTunes-iPhone relationship? Why not pretend to be the rule-breaking rebel who stands up for the freedom of USB peripherals? Why not do it in a way where clever “hacks” are advertised as features? Perhaps this publicity can generate some hype around the Pre, which in many ways is the same phone Apple released two years ago, with the same small user base as the iPhone had back then, and still using web-based apps instead of a native set of APIs. It is certainly some creative thinking from a company that desperately needs a hit product.

Further reading
For further reading on the Palm Pre – Apple issue, I recommend Daniel Eran Dilger’s series on the subject:

Palm Pre: The Emperor’s New Phone
The Imagined War between Apple and Palm: Pre vs. iPhone
Why Apple’s Tim Cook Did Not Threaten Palm Pre
Why Apple is killing the Pre via iTunes
The Palm Pre/iPhone Multitasking Myth