Leauki's Writings
Some thoughts on the economics of personal computers
Published on June 19, 2005 by Leauki
Had Intel succeeded in establishing Itanium on the desktop, as they originally planned, Itanium chips would probably be cheaper than they are now.

Of course, once the chip was established as the new standard, as x86 was before it, the price would have gone up, since users need the chip as much as they need the operating system. The result would have been a monopoly on standard microchips compatible with the Intel Itanium, just as there is a monopoly on standard operating systems that are fully compatible with Microsoft Windows. Patents and copyright guarantee the monopoly, just as intended.


Microsoft, of course, knew about that danger. It is in Microsoft's (and every operating system vendor's) best interest for processors to be commodities (aka cheap and interchangeable). Thus Microsoft made a few moves to make sure Intel and HP could not succeed (while still supporting the Itanium chip as promised).

Microsoft knew about this danger in the late 90s and acted accordingly:

1. Keep everybody 32-bit until the arrival of a commodity 64-bit architecture.

2. Support Apple with Mac OS applications to keep other processor architectures around.

3. Move Xbox to PowerPC.

4. Support the commodity 64-bit architecture when it appears.


Intel also knew this and were thus working to make operating systems the commodity instead:

1. Start and support a Linux port to Itanium.

2. Try to convince software vendors to port their operating systems to Itanium (including AIX).

3. Get HP to port HP-UX and VMS to Itanium.

4. Get Microsoft to offer Windows for Itanium.


The perfect situation for Microsoft is many competing vendors for every product that complements Windows: application software, games, utilities, hardware including graphics cards, microprocessors, etc.

Thus Intel and AMD (and Cyrix and Transmeta etc.) producing what is understood to be a commodity chip is exactly what Microsoft want. But Intel alone offering the new standard chip without competition would be as bad for Microsoft as Microsoft's own monopoly (again, in the market for fully Windows-compatible operating systems) is for everybody else.

Note that this "everybody else" means makers of computer components (including software), not customers. Customers already pay as much as they are willing to pay for a computer, and this price point is not affected by exactly which component of the computer is a commodity and which is not. An Itanium-based computer with Windows would, in the long run, cost as much as a Windows computer using a commodity CPU, namely as much as customers will pay for the entire solution. The only difference is what percentage of that money Microsoft and Intel (or other chip makers) receive.

If the processor is a commodity, Microsoft's percentage will be higher. If the processor is not a commodity (say an Itanium standard which only Intel can supply products for), Microsoft's percentage will be lower by exactly the amount Intel have gained.

Customers directly profit from commodity components only when every single required component is a commodity. If the operating system as well as all the hardware is a commodity, prices for the entire system will fall, because none of the suppliers has a monopoly on anything and thus none of them is in a position to earn the difference between the cost of production (the price of a commodity) and what customers are willing to pay (which can be a lot more). The amount paid in excess of what commodity components would cost is what economists refer to as "rent". In this case the rent is a "monopoly rent", since it exists due to monopoly power. This is similar to the reason why an apartment in a nice neighborhood costs more than the same apartment in a bad neighborhood, in case you have ever wondered.

There are degrees between the two extremes. For example the operating system could be replaced by something that is not exactly the same thing but sufficiently like it. This would bring the price of the operating system down, i.e. commoditize it a bit. Linux has that effect. OS/2 did. Novell DOS did.

Sometimes "no operating system at all" can replace Windows: the customers in question don't need it and don't replace it with anything. In that case the operating system becomes a commodity, and I think one might notice that in these cases a computer with Windows and a computer without Windows cost the same, because the price point is not affected by the exchange.


Think "Corn Flakes with Milk".

If both Corn Flakes and Milk are commodities, "Corn Flakes with Milk" will cost as much as it costs to produce them, say the amount C+M, with "C" being the cost to produce Corn Flakes and "M" being the cost to produce Milk.

If both Corn Flakes and Milk are produced by only one source, "Corn Flakes with Milk" will cost X, with "X" being the maximum amount customers are willing to pay for "Corn Flakes with Milk". This is the "price point" referred to above, and in extreme cases (if there is absolutely no other food at all) it will be infinite. [0]

If Corn Flakes are a commodity and Milk is produced by only one source, the product "Corn Flakes with Milk" will also cost X, with one of the Corn Flakes makers making C and the Milk maker making X-C (which is more than M).

If Corn Flakes are produced by only one source and Milk is a commodity, the product "Corn Flakes with Milk" will again cost X, with one of the Milk makers making M and the Corn Flakes maker making X-M (which is more than C).

If Corn Flakes are produced by only one source and Milk is a commodity and Corn Flakes can be replaced by something that is not quite the same as Corn Flakes but a somewhat acceptable replacement, say Rice Flakes, which are also produced by only one source, the product "Rice Flakes with Milk" will sell for Y, for which we know that Y is less than X. One of the Milk makers will make M and the Rice Flakes maker will make Y-M, with M being, as above, the cost of producing the Milk.

And if a customer only wants Milk and doesn't care about Corn Flakes, even if only one source produces Corn Flakes, the customer will pay for "Corn Flakes with Milk" the same amount he would pay for just Milk, whatever that costs. Thus, for this customer, Corn Flakes would be part of the commodity "some product I don't need", which is, of course, the ultimate commodity: its price, zero, is below even the cost of production.
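To make the arithmetic concrete, here is a minimal sketch in Python. The specific numbers (C=1, M=2, X=10, Y=8) are invented purely for illustration; only the relationships between them matter.

# Toy model of the "Corn Flakes with Milk" pricing above (invented numbers).
C = 1.0    # cost to produce Corn Flakes (their price as a commodity)
M = 2.0    # cost to produce Milk (its price as a commodity)
X = 10.0   # the most customers will pay for "Corn Flakes with Milk"
Y = 8.0    # the most customers will pay for "Rice Flakes with Milk" (Y < X)

print("both commodities:     bundle sells for", C + M)
print("Milk monopoly:        bundle sells for", X, "- Milk maker earns", X - C)
print("Corn Flakes monopoly: bundle sells for", X, "- Flakes maker earns", X - M)
print("with substitute:      bundle price capped at", Y,
      "- Flakes maker earns at most", Y - M)

In every monopoly case the bundle sells for the customers' price point; the only thing that changes is who captures the difference between that price point and the production costs.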


Thus Intel's Itanium chip would have cost less than a whole PC costs, but its price would have been a greater percentage of the price of a PC than the price of Intel's chips is today.

And Microsoft certainly did not want that.

Thus Itanium now competes against SPARC and POWER in the server market where CPUs are expensive and operating systems tend to be commodities.

And that is, simply put, why Mac OS was ported to Intel x86 but not to Intel's IA-64 architecture.

Microsoft did not want it to happen. And Apple must see it the same way.


[0] This is not quite true, or at least it is possibly not quite true. The monopolist might not be able to sell anything if the price is too high because all customers wait for the price to fall and nobody wants to be the first buyer. The price will eventually fall because the monopolist can make a greater profit by selling to more customers. In order to sell to more customers the monopolist has to offer the product for a price more customers can afford. Customers can know that and thus don't want to be the first buyer. And the cycle repeats.
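As a toy illustration of that cycle (again with invented numbers): suppose five customers are willing to pay 10, 8, 6, 4 and 2 for a product that costs 1 to make. Stepping the price down over time earns the monopolist more than holding it at the top, which is exactly why customers have an incentive to wait.

# Toy illustration of the footnote: lowering the price step by step
# reaches more customers and earns more than holding the price high.
willingness_to_pay = [10, 8, 6, 4, 2]   # one value per customer
unit_cost = 1

def profit_holding(price):
    # profit if the price stays at "price" forever
    buyers = sum(1 for w in willingness_to_pay if w >= price)
    return buyers * (price - unit_cost)

print("hold the price at 10:", profit_holding(10))    # 1 buyer, profit 9

# Step the price down: 10 first, then 6, then 2. Each step picks up the
# customers whose willingness to pay lies between the old and the new price.
stepped = (1 * (10 - unit_cost)    # the customer willing to pay 10
           + 2 * (6 - unit_cost)   # the customers willing to pay 8 and 6
           + 2 * (2 - unit_cost))  # the customers willing to pay 4 and 2
print("step the price down:", stepped)                # profit 21

# Because the price will fall, nobody wants to be the first buyer at 10,
# and the cycle described above repeats.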


Comments (Page 3)
on Jun 22, 2005
You have to look at what Apple tries to achieve with their hardware, then look at what AMD and Intel provide...

1. Apple wants an established business partner that can produce large quantities reliably. Intel has proven this time and again; they are the dominant brand in the market. AMD has not had a chance to really prove itself here.

2. Apple has been trying for a while to get their laptops faster. The PowerPC chips used in the latest desktops (G5) were taking far too long to shrink down, get cooler and consume less power. Despite what YOU may feel about laptops, most people want a laptop that doesn't drink the battery and doesn't scorch their lap (the latter is a common complaint about Powerbooks). AMD is known for speed at the expense of power and cooling efficiency. Intel has a line of chips specifically made for laptops that meets all the major needs and delivers solid speed.

3. DRM! Apple needs DRM tech on the chip to limit what hardware OS X can be installed on. Intel is including this on their newer chips. As far as I know, AMD has no such tech.

4. Performance is relative to the market. AMD chips may score higher on lab benchmarks, and may perform better in high-end games or other power-user applications, but the market Apple is targeting with the Intel Macs doesn't do much of anything that's all that taxing. Macs aren't gamer rigs, and if you're a serious multimedia person, you will stick with the PowerPC lines anyway. The average user is not going to notice the speed difference between a 3.0GHz Pentium and an AMD 3200+.

5. Cost per chip is irrelevant to Apple. They have always charged way more than the equipment costs. They've admitted that they put massive profit margins on all hardware sales. So now they'll have a chip that costs a LOT less than the PowerPC, but more than an AMD... I bet the price won't change much, and if it does it'll go down a smidge. The end user wouldn't see the benefits of the small price difference between Intel and AMD (and the difference when buying in bulk is much smaller than when the average consumer buys).

AMD is the gamer/power user's chip. They're great chips too, don't get me wrong, but they don't bring much value to the table for Apple. There was pretty much NOTHING they could offer that Intel couldn't.
on Jun 22, 2005
3. DRM! Apple needs DRM tech on the chip to limit what hardware OS X can be installed on. Intel is including this on their newer chips. As far as I know, AMD has no such tech.


DRM is an evil little monster. From what I've read it prevents the copying of copyrighted material, and in the long run cracks down on P2P programs. It's a very good thing AMD isn't offering DRM in their chips. Someone please correct me if I'm wrong.
on Jun 23, 2005
From unconfirmed sources I read that AMD would, too. I'm not sure the DRM Intel is implementing in its chips purports to enforce copyrights. At least not yet.
on Jun 23, 2005
Going way back to the first comment.
The Intel chips that will be used in the new Macs will not be x86 based.


Why would Intel make a brand new chip for Apple when it has one that will work just fine?

AMD's advantage is that its 64-bit chips were built from scratch with 64-bit and dual core in mind, whereas Intel has quickly added bits on to get 64-bit and dual core. But how long will it be before they build a brand new chip designed with 64-bit and dual core in mind, based on the mobile chip?



Darwin is OS X without the pretty interface, and Darwin runs on x86 systems.


OS X is a lot more than just a pretty interface; otherwise why would people buy Macs when they could just install Darwin x86 on any PC, add KDE or GNOME, and then just buy Apple software, saving themselves £1000 or so? OS X includes Aqua (the pretty GUI), plus Cocoa and Carbon and more, upon which the software runs; that is what makes OS X different from Darwin.
on Jun 23, 2005
Darwin is the guts of OS X. Any version of MacOS can be broken down into two equally important parts... the innards of the system, and the user interface layer.

Aqua is part of the UI. So is Expose etc... The UI layer on a Mac is very complex and involved.

Plus, as much as people like KDE/GNOME etc... they don't even come close to the visual refinements, appearance and simple ease-of-use you find on the Mac. That is why you pay more for the Mac, because it's UNIX with a UI that is actually usable.
on Jun 23, 2005
But how long will it be before they build a brand new chip designed with 64-bit and dual core in mind, based on the mobile chip?


Not sure what you mean. Intel will ship a revision of Yonah (part of the Centrino 3 platform, code-named Napa) about the time Windows code-named Longhorn ships. That's gonna be my next laptop. The processor will be dual-core, SSE3, 2MB shared L2 cache, 667 MHz FSB...
on Jul 02, 2005
Some thoughts about Mac OS Intel and Mac OS PowerPC:

http://www.netneurotic.net/mac/intel/index.html

on Jul 06, 2005
The main reason Apple decided to move away from PowerPC chips is because they were not getting faster as quickly as Apple wanted them to go. The x86 architecture turned out to be better for Apple's speed desires. The reason they picked Intel is because AMD has had a history of supply problems.
on Jul 09, 2005
Darwin is the guts of OS X. Any version of MacOS can be broken down into two equally important parts... the innards of the system, and the user interface layer.

Aqua is part of the UI. So is Expose etc... The UI layer on a Mac is very complex and involved.

Plus, as much as people like KDE/GNOME etc... they don't even come close to the visual refinements, appearance and simple ease-of-use you find on the Mac. That is why you pay more for the Mac, because it's UNIX with a UI that is actually usable.


That is a fairly bold (and by bold I mean biased) thing to say.

To say that the GNOME and KDE DEs are not usable while the Mac OS X one is? That is one of the funniest things I have heard in a while. To push statements like that as if they were fact makes it even funnier.

I really don't want to get into a war or anything, but I will tell you that I think GNOME/KDE BLOW AWAY Mac OS X in terms of simplicity, customizability, and appearance. Plus I get things done faster on them too.

No, the Mac is not the "elite of elites" in the computing world. They are just like everyone else, including Windows and the various *nix flavours. They have their disadvantages and advantages and cater to a specific group of people.

Heck, Mac OS X barely compares to some of the WMs that are out today (not including Enlightenment, which crushes Mac OS X in terms of visual appearance).

Oh, and I think the REAL main reason that Apple switched to the Intel x86 architecture is that they have been planning this thing for 5 years. Switching architectures is REAL hard. Doing it secretly, and developing software which can improve compatibility between 2 hardware architectures, is INSANELY hard and time-consuming. 5 years ago, Intel and x86 were their only viable option. Remember that this is more of a business move than a techie one.

Acceptance of 64-bit, and even of AMD, is just starting to take off now. Given the chance, Apple may have considered the Athlon 64 or Athlon XP (or Athlon Mac, whatever), or maybe even Itanium, or heck, a brand new architecture. But it looks like Apple wanted to make the switch, and they wanted to make the switch fast, secretly, and as seamlessly as possible.
on Jul 09, 2005
Darwin is OS X without the pretty interface,



Just no.

Read up about OS architecture before running your mouth.
on Jul 09, 2005
AMD vs Intel.

That's always a fun topic.

First things first, though. The potential and capabilities that 64-bit chips offer over their 32-bit counterparts are enormous. It is very obvious that there is a shift going on. Of course there will be naysayers to this shift, but these are the same people who doubted previous large-scale shifts like this. We went from the PII to the PIII to the P4. Welcome to the realm of technology... things change. The things that 64-bit is capable of are amazing.

It's fairly obvious that AMD is pushing 64-bit, while Intel seems to be sticking with 32-bit, shrugging off its Itanium line and introducing some revolutionary technology like Hyper-Threading.

Basically, if you want power, the PPC kind of power, then you want AMD. They get the job done, and are high-performance, highly stable, very high-quality chips. They are the quickest mainstream chips out there, and very cheap.

However, there are concerns like this one:

About AMD vs Intel: if heat was no problem, I'd probably go for an AMD Athlon 64 that offers better value and better overall performance. But did you ever hear about the Centrino platform for laptops, in which the Pentium-M offers better performance, battery life, efficiency and more performance-per-watt (less heat for higher performance) than their AMD counterparts, Athlon-M or Turion. It's a known thing!
If this chip was clocked higher and used on a desktop motherboard, AMD's hogs would be so poor it'd take six of'em to make a shadow.


Of course heat is an issue. And it's nice to know that people aren't only concerned about speed. However, on my desktop I have an AMD chip. Why? Because I know I can cool down my desktop, and I know that the performance increase from AMD far outweighs the heat given off and the watt usage that would be saved by using Pentium-based chips. And I also look to the future. Have any of you guys played the 64-bit version of Far Cry, or used the 64-bit version of Windows? I love it for what it can do. My memory usage dropped tenfold, and I can just do more. It's a great feeling.

AMD chips are just better in the scheme of things.

Now for laptops, it's a big issue. It's important to remember what they are used for. We don't need the AMD kind of power. We need something weaker. Currently I have a Mac laptop, PPC of course =P, and it suits my needs just fine. Obviously sticking in a 3.2 GHz chip with HT will make it super hot (comparable to AMD 3200 heat). Getting that Centrino M technology is probably the biggest thing, though.

So while AMD is better in the scheme of things, people just don't look for the thing that's "best" but rather the thing that's "best for their situation". And with that Apple methodology they can't put AMD 64s in desktops and Centrino M in laptops. They want that close software-hardware relationship. As long as Apple does that, though, they will get bitten in the ass, because hardware architectures evolve.