Tag Archives: Mac OS X

A Kuhnian look at the iPad

In case anyone has missed it: Apple announced their new tablet computer last week. They call it the iPad. This has released a massive wave of comments of all kinds across the internet, where some have been calling the device over-hyped and under-performing, others have hailed it as the device that will change mobile computing, and still others haven’t gotten the point at all (Stoke’s comparison is hilarious; he obviously doesn’t see what Apple is aiming for). But I don’t aim to go into detail on whether the iPad is a good or bad product, at least not before I’ve touched one myself. Instead, I will use two iPad-related articles as a starting point for a discussion of the paradigms of computer environments, in a Kuhnian manner.

The philosophy of Kuhn

Thomas Samuel Kuhn was a physicist best known for his work on the theory of science and the book “The Structure of Scientific Revolutions”. His main idea was that science periodically undergoes what he called paradigm shifts. A new paradigm introduces a new set of concepts and interpretations that can explain problems left unresolved by the previous paradigm(s). The new paradigm can also bring a number of problems that scientists have never considered before. This paradigm view challenges the commonly held notion that science progresses linearly and continuously. Kuhn was, for example, sceptical about science textbooks, which he thought distorted how science was actually carried out. He also argued that science can never be done in a completely objective way; instead, findings will always be interpreted within the present paradigm, making our conclusions subjective.

The iPad

The iPad combines the touch interface of the iPhone and iPod Touch with a larger screen and more powerful hardware, which enables desktop-like applications to run on the tablet. While this is not revolutionary in itself, Dan Moren of Macworld argues that the philosophy of the iPad user interface is revolutionary for how we use computers in general. He concludes: “Regardless of how many people buy an iPad, it’s not hard to look forward a few years and imagine a world where more and more people are interacting with technology in this new way. Remember: even if it often seems to do just the opposite, the ultimate goal of technology has always been to make life easier.”

Steven F takes it one step further in his article reflecting on the iPad, and argues that “In the Old World, computers are general purpose, do-it-all machines (…) In the New World, computers are task-centric”, and concludes his introduction with the words: “Is the New World better than the Old World? Nothing’s ever simply black or white.” Steven then makes a very interesting argument about why task-centric, easy-to-use computers will slowly replace today’s multi-purpose devices.

Both of these articles are very much worth reading and bring up numerous important views on the future of computing; I highly recommend reading them thoroughly. Even though they both use the iPad as a starting point, Apple’s new gadget is not really their subject of study. Instead, what both Steven and Dan are reflecting on is the future of computer user interfaces, which is a much more important subject than the success of the iPad, or whether it is a good product or not.

The Old World Paradigm

To use Steven’s terminology, we have for the last 20–25 years resided in the Old World Paradigm of computing. This paradigm is what people usually think of as the normal graphical interface of a computer. It may be Windows, Mac OS X or any common Linux distribution – they are all essentially used in more or less exactly the same way: the same way as the original Macintosh introduced in 1984. Not much has really changed. Using my MacBook today is essentially the same as using my Mac Plus in 1990. It’s just easier to carry with me, slightly faster, and (importantly) has access to the internet. But I interact with it using basically the same metaphors: windows, menus, desktop icons, a pointer, buttons etc.

Before the Mac, most computing was done using command-line tools in a DOS or Unix environment. While this was convenient for many purposes, it was a huge abstraction for the new computer user, and scared people off. I still have access to the Unix shell on my MacBook, through the Terminal application, which I use frequently in my bioinformatics work. As computers became more common, this kind of abstraction became a great wall, and a crisis in computer interface development started to surface. And according to Kuhn, a crisis feeds… a new paradigm. In 1984, Apple set off this paradigm, using techniques partially taken from research done at Xerox’s labs. Five to ten years later, most computer users were taking part in this graphical paradigm (using Windows 3.1, Windows 95, or Mac System 6 and 7).

The things that scared people away from the command line were to a large extent solved by the point-and-click metaphors. However, a lot of people still find computers hard to use. Computers need virus cleaning and re-installation. They get bogged down by running too many applications, and keeping them on for too long results in memory fragmentation. Just keeping the computer running is a hard task for many people. This creates another barrier, further fuelled by the addition of extra buttons and extra functions directed at power users. Such extra features are just confusing for new computer users. And while Mac OS X and Linux are not as plagued by viruses and malware as Windows is, they are still very complex. Desktops quickly get cluttered with documents, and the screen is too small to handle the windows of five applications running simultaneously. With increasing computing power, a lot of extra functionality is added, which often just obscures the main task of the computer. A new crisis is emerging.

The New World Paradigm: An era of simplicity

Apple sees this crisis, and also has a solution for it. They call it simplicity. For Apple, the most important thing is not that we have access to the latest technology to play around with its internals. Apple wants its average users to never worry about the internals of the device. It should just work. Steven nails it like this: “In the New World, computers are task-centric. We are reading email, browsing the web, playing a game, but not all at once. Applications are sandboxed, then moats dug around the sandboxes, and then barbed wire placed around the moats. As a direct result, New World computers do not need virus scanners, their batteries last longer, and they rarely crash, but their users have lost a degree of freedom. New World computers have unprecedented ease of use, and benefit from decades of research into human-computer interaction.” This means that we computer-savvy guys of the old paradigm will lose something. We lose our freedom to tinker. But this loss comes with great gains.

Kuhn would say that personal computing has reached a new crisis, which opens up for a new paradigm. Apple is among the first companies to try to create a device that defines this paradigm, but they are not alone. Google’s Chrome OS aims at the same thing – to define the next paradigm in computing. And both Apple and Google are willing to bet that the kids born today, who never saw the Old World Paradigm of computing, will never miss it. They will never ask what happened to the file system metaphors, the window metaphors, and the multi-tasking of today’s computers, because they will never have seen them. Instead, they will ask how we could stand using the buggy and unstable computers we have today.

Apple redefined the smartphone three years ago. They definitely have the potential to redefine the experience of computing. Not that this new paradigm would mean no more Unix command-line tools, nor that the current desktop computers will immediately die out. But what we first think of as a computer in ten years might very likely be a much more task-centric device than the laptops we use today. And even though this is a loss of freedom, it will surely be a great gain in usability. Until the next paradigm comes along…


Where are the Mac viruses?

Quite often I hear the explanation that Macs don’t get infected by viruses because Apple’s market share is so small that it wouldn’t be worth the time and effort to write a proper Mac OS X virus. This implies that once Mac OS X reaches a critical market share, there will be a sudden outbreak of hundreds of viruses. My simple question is this: how come there has (to my knowledge) been no actual virus affecting Mac OS X, while there have been a couple of viruses affecting Linux, despite its even smaller market share? Wikipedia lists the following Linux viruses:

  • Alaeda – Virus.Linux.Alaeda
  • Bad Bunny – Perl.Badbunny
  • Binom – Linux/Binom
  • Bliss
  • Brundle
  • Bukowski
  • Diesel – Virus.Linux.Diesel.962
  • Kagob a – Virus.Linux.Kagob.a
  • Kagob b – Virus.Linux.Kagob.b
  • MetaPHOR (also known as Simile)
  • Nuxbee – Virus.Linux.Nuxbee.1403
  • OSF.8759
  • Podloso – Linux.Podloso (The iPod virus)
  • Rike – Virus.Linux.Rike.1627
  • RST – Virus.Linux.RST.a
  • Satyr – Virus.Linux.Satyr.a
  • Staog
  • Vit – Virus.Linux.Vit.4096
  • Winter – Virus.Linux.Winter.341
  • Winux (also known as Lindose and PEElf)
  • Wit virus
  • ZipWorm – Virus.Linux.ZipWorm

Can someone, please, explain to me in a rational way how this list can be so long, despite Linux being such a terribly small platform? I suppose, as I do not know for certain myself, that most of these viruses are rather harmless, and that most wouldn’t work on modern Linux systems, as they probably exploit vulnerabilities that have been patched in later revisions of the OS. I am also aware that there have been proof-of-concept viruses for the Mac that exploited vulnerabilities which have later been fixed. Some of the viruses in the list above may be similar proof-of-concept examples for Linux.

Personally, I think OS X and Linux match up quite well when it comes to virus security, and that this has nothing to do with the size of the platform, but everything to do with the UNIX/UNIX-like foundation underneath. In both cases, the worst threat is the users themselves, who often allow malicious code to run without knowing what they are doing. This is a big threat to any computer platform, regardless of the security measures taken by programmers. As long as the user can install new software, this will be a potential threat (even though sandboxing and securely signing applications can decrease the risk of malware infection).
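Part of what that shared UNIX foundation buys you is the classic permission model: code run by an ordinary user simply cannot write to system locations. A minimal sketch in a POSIX shell, assuming a GNU/Linux system (`stat -c` is GNU coreutils syntax; macOS uses `stat -f` instead, and the file name here is just an example):

```shell
# With the common default umask of 022, a newly created file gets
# mode 644: the owner may write, everyone else may only read.
umask 022
touch demo.txt
stat -c '%a %U' demo.txt   # prints the octal mode and the owner, e.g. "644 alice"

# Before the kernel will execute a downloaded script directly, the
# user has to opt in explicitly by setting the execute bit:
chmod u+x demo.txt
stat -c '%a' demo.txt      # now 744

rm demo.txt
```

The point is that malware arriving as an unprivileged user's file starts out unable to touch anything owned by root; it needs the user to actively grant it more than that, which is why the user remains the weakest link.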

That being said, Mac OS X is incredibly easy to hack once you have access to the computer. This is a problem, and Apple really should be busy fixing it. But please aim your guns at the right issues. Mac viruses are not a real threat at the moment, just as Linux viruses are not really a big threat to Ubuntu users. That a Mac can be hacked to gain root access in a minute – that is a problem, and it has everything to do with OS architecture. However, making the Mac market share smaller will not solve this problem, nor will it get worse as the platform expands. If we’re in luck, though, Apple may acknowledge the problem as its user base grows, and address it before it is too late.

The problem with Linux

XKCD sums up the big problem with Linux, and every other open source/free project driven by enthusiasts: you tend to solve the cool things (in a nerdy way – like supporting 4096 processor cores) or the required things (once again, in the enthusiast’s world) first, and there is no real driving force behind solving the problems that regular consumers want solved. Thus, things like Flash support, graphics software, games etc. take years, despite the huge open source programmer community working on Linux distributions.

This illustrates well why Linux has never taken off, despite being free, while Mac OS X is steadily eating into Windows’ market share. The core of the situation is that Apple is a company that would fail miserably if it weren’t listening to its consumers. Many times, Apple’s manners upset consumers (like me), but even more often they manage to leverage ideas before everyone else, or in a better way than most other tech companies – or simply at the right time. The iMac, the iPod, the iPhone and the portable Macs all prove that this strategy works. On the other hand, Apple TV has not taken off, probably because it does not have a big enough audience (which Apple acknowledged from the beginning, calling the whole project a “hobby”).

Open source is a great idea, and should be practiced in many situations. But free is not always the same as great, and a business strategy may be just what is needed to create what consumers want. The open source community lacks such a strategy, instead delivering what its members themselves need at the moment. Thus, Linux will never take off on its own. However, initiatives based on Linux and targeted specifically at consumers, like Google’s Chrome OS, have a great chance of challenging both Mac OS X and Windows, because they are free and supported by a huge company (Google) that makes its profits on something else. This situation is somewhat similar to Apple’s with Mac OS X, where Apple makes OS X great in order to sell more computers, which is where its real revenue comes from. This is probably the business model of the future: selling one thing cheaply to have consumers buy something else. Drops have already realized this for knitting, giving patterns away for free in the hope that consumers buy their (relatively cheap) yarn. But that is another story I might go into another time. Thanks, XKCD, for summing it up!