I guess he had some kind of 'Total Recall' memory wipe in mind.
June 10, 2012 6:51 PM
I've lost track of the many reasons that have been given for the [Apple OS X] switch to Intel, but this I know for sure: no one has ever reported that, for 18 months, Project Marklar existed only because a self-demoted engineer wanted his son Max to be able to live closer to Max's grandparents.
posted by Blazecock Pileon at 6:53 PM on June 10, 2012
I know Jobs was very fond of Sony. When the post mentioned Jobs going to Sony Japan, it made me wonder if there was an effort to have Sony be an official licensee of OS X for their Vaio line. Though usually mentioned incidentally, I've read multiple times in the tech press that Sony has been uncomfortable with their Microsoft relationship.
posted by bionic.junkie at 7:05 PM on June 10, 2012
One of the comments on the story is from a Japanese journalist who interviewed someone high up at Sony about this; their recollection was that, yes, Steve Jobs approached them with a Vaio running OS X, but that they were too busy taking the line international to consider it at the time.
In hindsight, I'm happy about that because if Sony had become the Apple hardware maker, then Apple would have been tarred with all the stupid shit Sony went on to pull. In the long run, Apple was much better off making the Intel switch themselves.
posted by fatbird at 7:14 PM on June 10, 2012 [12 favorites]
That's a neat story.
At another level, things like this don't surprise me much any more. I've been in companies where final site selections for the HQ building - and 100s of people - were based on being close enough to the CxO's house to be able to go home for lunch. Glad to see it worked out for someone without a corner office.
posted by jquinby at 7:21 PM on June 10, 2012 [1 favorite]
That is a very cool story, thanks.
posted by arcticseal at 7:32 PM on June 10, 2012
It makes one wonder if there's a skunkworks lab making OS X for ARM. (Doubtful still PPC)
posted by pipian at 8:02 PM on June 10, 2012
It makes one wonder if there's a skunkworks lab making OS X for ARM. (Doubtful still PPC)

Uh...
posted by delmoi at 8:08 PM on June 10, 2012 [6 favorites]
I'm not sure I believe all of that. Saying it will only take two hours to get a pre-alpha port of an OS to run on some arbitrary laptop... that's a little too risky, especially if Steve Jobs needs to have it done tomorrow morning.
posted by qxntpqbbbqxl at 8:08 PM on June 10, 2012 [1 favorite]
Sorry, delmoi, I meant rather a full-fledged form of what's commonly known as Mac OS X.
posted by pipian at 8:14 PM on June 10, 2012
Sorry, delmoi, I meant rather a full-fledged form of what's commonly known as Mac OS X.
My understanding is that Apple has made ARM-based versions of the MacBook Air, which would run OS X (or a hybrid that switches between iOS and OS X), but the performance just isn't there. ARM is good for some things, but Intel is better for others, and I remember reading that Intel will probably catch up to ARM on the mobile side before ARM catches up to Intel on the desktop side.
posted by furtive at 8:20 PM on June 10, 2012
I'm not sure I believe all of that. Saying it will only take two hours to get a pre-alpha port of an OS to run on some arbitrary laptop... that's a little too risky, especially if Steve Jobs needs to have it done tomorrow morning.
I think they just wanted to see it boot. While a lot of work would have to go into getting it to do anything, at the point where you see "Welcome to Mac" it isn't as if much has happened. The x86 instruction sets are the x86 instruction sets; assuming the thing was running in the least hardware-taxing (i.e., most arbitrary-hardware-compatible) configuration, it's certainly not impossible.
posted by axiom at 8:20 PM on June 10, 2012
Darwin (the low-level OS pieces of Mac OS X without the graphics) never stopped running on Intel. NeXTSTEP ran on Motorola 68k, HP, Sun, and Intel hardware at one point, and its descendant Rhapsody ran on PowerPC.
The Marklar project got all of Mac OS X running on Intel. One of the brilliant parts was combining technology from Transitive to create an on-the-fly translator of PowerPC executables called "Rosetta". This let you run your PowerPC applications on an Intel machine and made the transition much less painful.
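To make "on-the-fly translator" concrete, here is a toy sketch in C of the simplest form of the idea: an interpreter that decodes two PowerPC instructions (addi and add) from big-endian instruction words and executes them on a little-endian host. This is purely illustrative, not Apple's or Transitive's code; the real Rosetta translated whole blocks of guest code into cached native code rather than stepping one instruction at a time.

```c
/* Toy interpreter for two PowerPC instructions on a little-endian
   host; purely illustrative, not Apple's or Transitive's code. */
#include <stdint.h>
#include <stdio.h>

static uint32_t gpr[32];   /* guest general-purpose registers */

/* PowerPC instruction words are big-endian; swap on a LE host. */
static uint32_t fetch_be32(const uint8_t *p) {
    return (uint32_t)p[0] << 24 | (uint32_t)p[1] << 16 |
           (uint32_t)p[2] << 8  | (uint32_t)p[3];
}

static void step(const uint8_t *code, size_t *pc) {
    uint32_t insn = fetch_be32(code + *pc);
    uint32_t opcd = insn >> 26;                        /* primary opcode */
    uint32_t rt = (insn >> 21) & 31, ra = (insn >> 16) & 31;

    if (opcd == 14) {                                  /* addi rt,ra,simm */
        int32_t simm = (int16_t)(insn & 0xffff);
        gpr[rt] = (ra ? gpr[ra] : 0) + (uint32_t)simm;
    } else if (opcd == 31 && ((insn >> 1) & 0x3ff) == 266) {  /* add */
        uint32_t rb = (insn >> 11) & 31;
        gpr[rt] = gpr[ra] + gpr[rb];
    }
    *pc += 4;
}

int main(void) {
    /* addi r3,0,40 ; addi r4,0,2 ; add r5,r3,r4 (big-endian bytes) */
    const uint8_t code[] = { 0x38, 0x60, 0x00, 0x28,
                             0x38, 0x80, 0x00, 0x02,
                             0x7C, 0xA3, 0x22, 0x14 };
    size_t pc = 0;
    while (pc < sizeof code) step(code, &pc);
    printf("r5 = %u\n", gpr[5]);   /* prints r5 = 42 */
    return 0;
}
```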
posted by blob at 8:31 PM on June 10, 2012 [1 favorite]
I think they just wanted to see it boot. While a lot of work would have to go into getting it to do anything, at the point where you see "Welcome to Mac" it isn't as if much has happened. The x86 instruction sets are the x86 instruction sets; assuming the thing was running in the least hardware-taxing (i.e., most arbitrary-hardware-compatible) configuration, it's certainly not impossible.
By the time you see "Welcome to Mac", the OS has successfully navigated not only the CPU instruction set, but also the graphics and disk I/O subsystems. There are all kinds of little details that can go wrong, and Sony is notorious for weird proprietary hardware. It's certainly possible that they installed Marklar and it booted all the way successfully, but it's not the kind of thing I would want to bet on if I were that engineer.
posted by qxntpqbbbqxl at 8:45 PM on June 10, 2012
floam, verbose mode just means you see all the geeky behind-the-scenes messages when booting. It is different from single-user mode. Eventually, when booting in verbose mode, the WindowServer starts up and draws a grey background, and the messages stop being drawn to the screen. In single-user mode you have a limited number of commands you can run, with no networking.
You hold down command-v at boot to see verbose mode, and command-s to boot into single-user mode. From single-user mode you can type "exit" to complete booting. You can do this today on a modern Macintosh.
posted by blob at 9:17 PM on June 10, 2012 [1 favorite]
...Sony is notorious for weird proprietary hardware
Yeah, this wouldn't be the first time that renegade Apple engineers used a Sony product behind Steve Jobs' back.
It makes one wonder if there's a skunkworks lab making OS X for ARM. (Doubtful still PPC)
There is a really interesting video of Steve Jobs speaking at WWDC 97, when he had just been brought back to Apple from NeXT. He wasn't CEO yet; I think he was VP of Software Development. I transcribed one bit:
Q: What do you think Apple should do with Newton?
A: Huh.. You had to ask that... [sighs, long pause] I'm in the minority. And what I think doesn't really matter about this. I think that most companies can't be successful with one stack of system software. Rarely can they manage two. And we, I believe, are going to be successful managing two over the next few years with Mac OS, and Rhapsody which is a superset of that. I cannot imagine being successful trying to manage three...
Well that's basically what Jobs did, by combining iOS and Mac OS X into one unified stack. The rest of that speech, following this remark, is astonishing. In hindsight, you can see him describing everything he would do at Apple for the rest of his life.
In any case, Marklar was not the big deal everyone makes it out to be. The core of Mac OS X was officially released as open source and ran on Intel from the very beginning, even before Mac OS X was out of beta. It was called Darwin. It's still the core of Mac OS X. All they had to do was port the proprietary Apple stuff on top of that stack; the heavy lifting was already done.
posted by charlie don't surf at 9:53 PM on June 10, 2012
Oops, I forgot the link to Jobs' WWDC 97 keynote. Here it is.
posted by charlie don't surf at 9:54 PM on June 10, 2012
Check out iOS running on Darwin on the Linux kernel.
posted by Ad hominem at 10:28 PM on June 10, 2012 [2 favorites]
I worked on a high-profile app at Apple when Marklar was just a rumor even inside Apple. I was one of the four engineers working on VideoConference.framework, the videoconferencing core of iChat. Our app was ported to Intel without the knowledge of the engineers on our team, a pretty impressive feat.
I actually inserted a deliberate bug into iChat just to catch the Marklar team, because I was curious about the rumors. It was a big-endian dependency deep in the image resizing code, in the non-AltiVec code path. If you ran it on a little-endian machine (like Intel), it would compile and run just fine, but there would be very noticeable banding artifacts in the video.
I don't remember the exact wording in the bug report, but they were pretty upfront about what they were doing. It was basically, "hey, video artifacts on little-endian machines". That confirmed that they were testing Intel builds of even the consumer-facing apps. So not just Darwin stuff, or server software, but flagship apps like iChat.
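For the curious, the shape of that kind of bug is easy to sketch (a hypothetical reconstruction in C, not the actual iChat code): reaching into a packed pixel through a byte pointer bakes the CPU's byte order into the result, so the code is correct on PowerPC and silently wrong on Intel.

```c
/* Hypothetical reconstruction of an endian-dependent pixel bug,
   not the actual iChat code. The pixel format here is packed
   0xAARRGGBB in a 32-bit word. */
#include <stdint.h>
#include <stdio.h>

static uint8_t red_buggy(uint32_t argb) {
    /* On big-endian PowerPC the red byte of 0xAARRGGBB sits at
       offset 1 in memory, so this "works" there. On little-endian
       Intel it silently reads the green byte instead: no compile
       error, no crash, just wrong colors (e.g., banding) on screen. */
    return ((const uint8_t *)&argb)[1];
}

static uint8_t red_portable(uint32_t argb) {
    /* Shifts operate on the value, not the memory layout, so this
       is correct on either endianness. */
    return (uint8_t)((argb >> 16) & 0xff);
}

int main(void) {
    uint32_t pixel = 0xFFAA3311;   /* A=FF R=AA G=33 B=11 */
    printf("buggy: %02x  portable: %02x\n",
           red_buggy(pixel), red_portable(pixel));
    /* big-endian host prints:    buggy: aa  portable: aa
       little-endian host prints: buggy: 33  portable: aa */
    return 0;
}
```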
posted by ryanrs at 10:33 PM on June 10, 2012 [83 favorites]
ryanrs-- BRAVO
posted by effugas at 3:00 AM on June 11, 2012 [1 favorite]
I'm not sure I believe all of that. Saying it will only take two hours to get a pre-alpha port of an OS to run on some arbitrary laptop... that's a little too risky, especially if Steve Jobs needs to have it done tomorrow morning.

Not really. PC hardware is all pretty much the same. People who write Linux distros don't need to test for every single laptop. Windows certainly didn't target every different machine. If it works, it works. The only question would be whether they had some unusual hardware that didn't have any Windows drivers. And even then, it would just mean that that hardware wouldn't work. So long as it wasn't the keyboard or touchpad, it should be fine.
Sorry, delmoi, I meant rather a full-fledged form of what's commonly known as Mac OS X.

The core is the same. iOS just has a different UI. Getting the rest of the UI to compile and run probably wouldn't take much work; it's unlikely there is any CPU-specific code in there (and I'm almost certain they got the whole thing working, then swapped out the UI).
Also, in terms of graphics it would have been even less of a problem: Apple was already using PC graphics hardware in the Mac. And they were using the same disk I/O systems as well. Why would it be a problem? Computers are not that different from each other; at that point the Macintosh was basically the same, hardware-wise, as a PC, it just had a PPC chip and a different BIOS. If you're coding in C/C++ (or Objective-C, I imagine) most of your code should port. It's only if you're doing "weird" stuff that the CPU instruction set becomes an issue, and that shouldn't be happening in the UI code (other than for optimization, for which you should have a fallback).
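That "optimization with a fallback" pattern looks roughly like this (a hypothetical sketch, not Apple's code): a portable C loop that is correct on any CPU, plus a vectorized path that is only compiled in where the instruction set supports it. On PowerPC the fast path would have been AltiVec; the sketch uses SSE2 since that is the Intel-side equivalent.

```c
/* Hypothetical sketch of "optimized path plus portable fallback",
   not Apple's code: saturating add of two byte buffers, the kind
   of thing video code might do when compositing. */
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Portable fallback: correct on any CPU the compiler targets. */
static void add_saturate_u8(uint8_t *dst, const uint8_t *src, size_t n) {
    for (size_t i = 0; i < n; i++) {
        unsigned s = dst[i] + src[i];
        dst[i] = (uint8_t)(s > 255 ? 255 : s);
    }
}

#if defined(__SSE2__)
#include <emmintrin.h>
/* Same operation, 16 bytes at a time; only compiled for x86 with SSE2.
   An AltiVec version would play this role on PowerPC. */
static void add_saturate_u8_fast(uint8_t *dst, const uint8_t *src, size_t n) {
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m128i a = _mm_loadu_si128((const __m128i *)(dst + i));
        __m128i b = _mm_loadu_si128((const __m128i *)(src + i));
        _mm_storeu_si128((__m128i *)(dst + i), _mm_adds_epu8(a, b));
    }
    add_saturate_u8(dst + i, src + i, n - i);   /* leftover tail */
}
#endif

int main(void) {
    uint8_t a[20] = { 250, 10, 100 }, b[20] = { 10, 10, 200 };
#if defined(__SSE2__)
    add_saturate_u8_fast(a, b, sizeof a);
#else
    add_saturate_u8(a, b, sizeof a);
#endif
    printf("%u %u %u\n", a[0], a[1], a[2]);   /* prints 255 20 255 */
    return 0;
}
```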
posted by delmoi at 7:39 AM on June 11, 2012
Any machine can emulate the software parts of any other machine - if you're not picky about how fast it runs. The non-CPU hardware (and its often-masked quirks) is the tricky part.
posted by Twang at 3:18 PM on June 11, 2012
ARM is good for some things, but Intel is better for others, and I remember reading that Intel will probably catch up to ARM on the mobile side before ARM catches up to Intel on the desktop side
The former has already happened. Intel make an x86 chipset (and entire phone design) that's competitive (performance and power consumption) with top-end ARM chipsets.
posted by grahamparks at 5:04 PM on June 11, 2012
Any machine can emulate the software parts of any other machine - if you're not picky about how fast it runs.
No, that is not true at all. That is a common misconception called The Church-Turing Fallacy.
This is a misinterpretation of the Church-Turing Thesis. The correct interpretation: a Turing Computable function can be computed on a Turing Machine. Notice I said function, not algorithm or program. Notice I did not say it can be computed on any Turing Machine.
Out of the set of all possible software programs, the set of programs that can be emulated perfectly on different computers is actually quite small.
This is an extremely subtle and arcane issue of computer science, so I'm not surprised it has become a common misconception. The labeling of Turing Machines as General Purpose Computers has fed this misconception. Modern computers can be used generally for many purposes, but they are not General Purpose Computers. Modern computer applications are not Turing Computable functions.
posted by charlie don't surf at 9:15 PM on June 11, 2012 [1 favorite]
Charlie, where did you get all of that from? The Church-Turing Fallacy is only relevant to the computational theory of mind, and the reason it's a fallacy is not to do with the fact that there are Turing machines that cannot simulate other Turing machines, but (as per the article you linked) the fact that the Church-Turing thesis is invoked to establish that brains evaluate Turing-computable functions. For that, either it would need to be simultaneously proven that the brain is performing computations in the first place, or (on the assumption that all physical systems can be interpreted as evaluating Turing-computable functions) it would need to be proven that there is something special about the type of computation that brains do, since brains are then no longer distinguishable from any other physical system by the fact that they are computational devices.
Anyway, the way you're capitalizing "General Purpose Computer" makes it sound like it's a term of art, but if so it's one I've never heard of. Are you talking about a Universal Turing machine, which is defined as a Turing machine that can simulate any other Turing machine? It is theoretically true that digital computers are not Universal Turing machines, since they don't have infinite storage and can therefore not simulate any Turing machine that would require more than the available storage to simulate. Therefore, digital computers are not Turing complete in the most rigorous sense. However, basically every modern computer would be Turing complete if it were given infinite storage.

Anyway, practical computation is better modeled by linear bounded automata, and although it is true that for every linear bounded automaton a there is another linear bounded automaton b that cannot be simulated by a, it is also the case that there is a linear bounded automaton c that can simulate b. So, since every possible program that does not require infinite storage can be emulated by some linear bounded automaton (yes, computer applications are Turing-computable, since any Universal Turing machine could replicate them -- otherwise applications could not be run on computers!), it follows that each can be emulated by an infinite number of other linear bounded automata. It then follows that any machine can be emulated by another machine.

You are right that the answer you were responding to is technically wrong, but only because some machines are unable to emulate other machines by virtue of limited storage, which is a pretty minor point.
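To make the simulation argument concrete, here is a toy table-driven Turing machine stepper (a minimal sketch in C, with a made-up one-state machine): "emulating" a machine is nothing more than looking transitions up in its table, which is why storage, not architecture, is the binding constraint.

```c
/* Minimal sketch: a table-driven Turing machine stepper. The
   machine below (one state, made up for illustration) flips every
   bit on its tape and halts at the first blank '_'. */
#include <stdio.h>

/* One transition: symbol to write, head move, next state (-1 = halt). */
typedef struct { char write; int move, next; } Rule;

static int sym_index(char c) { return c == '0' ? 0 : c == '1' ? 1 : 2; }

int main(void) {
    static const Rule delta[1][3] = {
        { { '1', +1, 0 },     /* read '0': write '1', move right */
          { '0', +1, 0 },     /* read '1': write '0', move right */
          { '_',  0, -1 } },  /* read '_': halt */
    };
    char tape[] = "1011_";
    int head = 0, state = 0;

    while (state >= 0) {
        const Rule *r = &delta[state][sym_index(tape[head])];
        tape[head] = r->write;
        head += r->move;
        state = r->next;
    }
    printf("%s\n", tape);   /* prints 0100_ */
    return 0;
}
```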
posted by invitapriore at 3:31 PM on June 19, 2012
Yes, that fallacy comes up most often in computational mind theories, but I first came to it through strictly computer based theories. It's been so long, I'll have to see if I can dig up the papers where I learned about it.
I learned the usage of "General Purpose Computer" as a term of art, equivalent to a Turing machine, not a Universal Turing machine. I will have to do a little refresher study to see if there are some refinements of language here that I have glossed over. In any case, I don't want to complicate this issue, which you and I basically understand and agree on. I mostly come to the point I made in this thread from discussions with anti-DRM people, who believe that any computer system can be emulated under a hypervisor, which could "intercept" DRM and remove it. These people truly do not understand Church-Turing, and violate it with the fallacy in its most obvious applications. But then, I don't think there are more than a handful of people who can follow Turing through his papers, and none who understand him fully. I'm sure not even close.
But let's not belabor the point, which is that people extrapolated from Church-Turing a misbelief that any computer can emulate any other computer, given enough time and memory. Sorry, it is obvious that even with infinite memory and time, an AIM-65 single-board computer with a 20-character LED display cannot emulate my Mac mini with a 30-inch 24-bit LCD display.
posted by charlie don't surf at 6:40 PM on June 19, 2012
Sorry, it is obvious that even with infinite memory and time, an AIM-65 single-board computer with a 20-character LED display cannot emulate my Mac mini with a 30-inch 24-bit LCD display.
Why are you bringing up the display? That's not what UTMs are about; the question for C-T is: can the AIM-65 single-board computer compute the same class of functions that your Mac mini can, given arbitrarily large storage? Or more rigorously: given a function your mini can perform, whose input and output can be represented as a series of symbols, can the AIM-65 be configured to compute the same function? The answer to that is yes (well, I don't know anything specific about the AIM-65 architecture, so maybe there's some esoteric shortcoming of its instruction set that prevents that, but I really doubt it).
An aside about displays, though: Insofar as you can look at the series of frames computed by your mini to output to the display as equivalent to a series of symbols on a tape, the AIM can be made to output the same string of symbols given the same inputs (and a lot of storage). Just not as fast.
posted by axiom at 8:38 PM on June 19, 2012
Well yes, I'm deliberately taking this out of UTM territory. Arbitrary examples of computers don't map well onto each other. That's the fallacy I was refuting with those DRM advocates, they believe that any computer with sufficiently fast speed and memory can emulate any other computer. That's their "hypervisor" hypothesis, that a sufficiently fast computer can emulate the other computer from a hypervisor, giving us a window into the emulated machine as a subset of the capacity of the superior machine. That is absolutely not what Church-Turing was about, but that's what they think it implies.
posted by charlie don't surf at 9:14 PM on June 19, 2012
This thread has been archived and is closed to new comments