The Texas Instruments TMX 1795: the first, forgotten microprocessor
May 11, 2015 7:48 AM

In the late 1960s and early 1970s, the technology and market were emerging to set the stage for production of monolithic, single-chip CPUs. In 1969, a terminal equipment manufacturer met with Intel to design a processor that was smaller and would generate less heat than the dozens of TTL chips they were using. The resulting design was the 8008, well known as the predecessor to the x86 line of processors that are ubiquitous in desktop PCs today. Less well known, though, is that Texas Instruments came up with a competing design and, due to development delays at Intel, beat them to production by about nine months.
"The story starts with the Datapoint 2200, a "programmable terminal" sized to fit on a desktop. While originally sold as a terminal, the Datapoint 2200 was really a minicomputer that could be programmed in BASIC or PL/B. Some people consider the Datapoint 2200 the first personal computer as it came out years before systems such as the Apple II or even the Altair. "
Also included in the article is background on the state of the industry at the time, and information on other designs that are commonly cited as the "first microprocessor".
posted by ArgentCorvid (17 comments total) 21 users marked this as a favorite
 
Dang, I did not know this, and I teach first year computer architecture! Time to update my slides.
posted by bouvin at 9:28 AM on May 11, 2015 [5 favorites]


Yes... I knew about most of the other designs milling (ho ho) around at the start of the 70s, but not the TI part. I wonder if we'll ever know how much information was shared between Intel and TI on the Datapoint design - I can quite see that both implementations of the Datapoint spec could have the same interrupt handling bug even though they were independently implemented...

I see that Ted Hoff says that he got some design inspiration from a computer at Stanford that Plessey, "a British tractor company", had donated.

Which is odd and interesting. Plessey wasn't a tractor company; it was mostly a defence contractor and telecoms specialist. It did produce a range of computers for UK air defence and, later, for telephone exchange applications, but information online is scarce. One reference says that for a while, Plessey was a world leader in computer design but because this was mostly classified work there was no way to turn that advantage into long-term benefit. (On the other hand, my experience of the UK defence industry is that it was perfectly capable of mucking up anything it touched without outside help.)

Plessey worked on the Atlas computer (early 60s, most powerful computer in the world at the time), and I know about the XL2 (germanium) and XL9 (silicon). But I don't know about the XL series' architecture, nor whether it was one of those at Stanford that gave Ted Hoff ideas for the 4004. It'd be great to do a family tree of processor concepts.
posted by Devonian at 9:37 AM on May 11, 2015 [3 favorites]


What a fantastic article. Ken Shirriff has been writing some really excellent stuff. His database of SMS cards from 1960s IBM mainframes is fascinating. Also his description of Bitcoin Script is one of the best things I've ever read describing the nitty-gritty of what's actually on the Bitcoin blockchain. He also has done a lot of really excellent Unix systems work, particularly in the 90s at Sun. This old resume gives a quick overview of some of his background.

I think Shirriff is right in categorizing it as an inevitable technology: "the microprocessor wasn't anything to specifically invent, but just something that happened when MOS technology improvements and a marketing need made it worthwhile to build a single chip processor". Half of why this "first microprocessor" history is so fraught is its frightfully expensive journey through the insane US patent system.
In 1990, seemingly out of nowhere, Gilbert Hyatt received a very general patent (4942516) covering a computer with ROM and storage on a single chip. Hyatt had filed a patent on his computer in 1969, and due to multiple continuations, he didn't receive the patent until 1990.
A 21-year-old submarine patent. Thankfully in the meantime actual inventors were inventing actual things. I don't know the story about how '516 ended up getting "lost", anyone know?
posted by Nelson at 9:42 AM on May 11, 2015 [1 favorite]


But even taking that into account, Texas Instruments didn't seem to put much effort into the layout, which Mazor calls "pretty sloppy techniques" and "throwing some blocks together".[9] While the 4004 and especially the 8008 are densely packed, the TMX 1795 chip has copious unused and wasted space.

It has ever been thus - you have to optimize if you want to run at the bleeding edge.
posted by GuyZero at 10:22 AM on May 11, 2015


"...had the same architecture as the 8008"
When I read that, I was suspicious, but since they were both designed around the Datapoint, it makes sense.

But then I went off into SMS card world. That's a pretty big undertaking; there were a lot of card types. I found my favorite one, the DJL. Sorry there wasn't a picture, but it looked like a double-wide version of the AEN.
One of the few logic cards I could sometimes fix (replace the fuse) rather than just replace.
posted by MtDewd at 10:48 AM on May 11, 2015 [1 favorite]


Nelson: "I don't know the story about how '516 ended up getting "lost", anyone know?"

The Federal Circuit upheld a decision against Hyatt in an interference, which is basically an action to determine who invented something first. In the same opinion, the court upheld the decision that the winner in the interference was "not entitled to a patent on this subject matter."
posted by exogenous at 11:35 AM on May 11, 2015 [1 favorite]


Devonian: Plessey wasn't a tractor company

Just like Ball isn't just a packaging company. I've seen their name on jars for canning (and more recently beverage cans) forever. I didn't realize that they were also a top 100 defense contractor.
posted by ArgentCorvid at 11:39 AM on May 11, 2015 [1 favorite]


My first job as a software engineer was programming the 8008-1 built into a digital tape drive.

Ye Gods, that was a long time ago.
posted by Chocolate Pickle at 11:52 AM on May 11, 2015 [2 favorites]


My friend did a bunch of work on the Intel 4004 35th Anniversary Project. He says, via text message:
I think it is silly to debate who "invented" the first microprocessor. The key move that Intel made was that they sold the 4004 to all comers instead of offering an exclusive to a single customer. I had heard that TI was working on custom calculator chips around the same time. I hadn't heard they built an 8008 clone. "Second sourcing" was extremely common back then. The 8008 was a commercial flop for some of the reasons mentioned in the article. Even the 8080 was just the heart of a single-board computer; it was not a single-chip computer. That came much later.
posted by alms at 12:08 PM on May 11, 2015


Plessey wasn't a tractor company

Footnote: They didn't build tractors, but their Dynamics group produced hydraulic pumps and related equipment that was used not only by the military, but also many tractor manufacturers. There's not much left of the company these days, though.
posted by effbot at 12:23 PM on May 11, 2015


Well... Plessey's not quite dead yet, and still has a semiconductor fab plant.

In Devon.

Natch.
posted by Devonian at 3:41 PM on May 11, 2015


I'll never forget my first microprocessor: a Cyrix 486SLC. It ran Doom, but only after hitting F5. *tear*
posted by turbid dahlia at 3:45 PM on May 11, 2015


pretty sloppy techniques

TI is a weird company. They have a history of making really nice stuff, then handing it off to some other part of themselves that does its best to make sure it can't possibly work properly.

Their TI 99/4 and /4A personal computers were built around their TMS9900 processor, an interesting and reasonably capable 16-bit design. But in a move prefiguring the way Android hobbles the interesting and reasonably capable ARM architecture it almost always runs on (by making its entire userland run on top of the rather crappy Dalvik virtual CPU), TI totally crippled the TMS9900 in the 99/4: they connected only 256 bytes of RAM to it directly (the CPU itself was capable of addressing 64K) and made it execute nothing but an interpreter for a virtual machine called GPL (Graphics Programming Language).

The hardware interface to the bulk of the RAM in the machine required the GPL interpreter to write a RAM address out to a port in the video display controller, then read or write the data through another port*. This applied to all memory accesses, including GPL instruction fetch.
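
In case the mechanism isn't clear, here's a rough C sketch of that two-step access. The port addresses follow the 99/4A memory map for its TMS9918-family VDP, but treat the details as illustrative rather than a cycle-accurate reference:

```c
#include <stdint.h>

/* Illustrative sketch of TMS9918-style VDP RAM access.
 * Port addresses as memory-mapped on the TI-99/4A. */
#define VDP_RDATA  (*(volatile uint8_t *)0x8800)  /* read VRAM data   */
#define VDP_WDATA  (*(volatile uint8_t *)0x8C00)  /* write VRAM data  */
#define VDP_WADDR  (*(volatile uint8_t *)0x8C02)  /* set VRAM address */

static void vdp_set_addr(uint16_t addr, int writing)
{
    VDP_WADDR = addr & 0xFF;                        /* low byte first */
    VDP_WADDR = ((addr >> 8) & 0x3F) | (writing ? 0x40 : 0x00);
}

/* Every access to the bulk of the machine's RAM pays this toll,
 * including the GPL interpreter's own instruction fetches. */
static uint8_t vram_read(uint16_t addr)
{
    vdp_set_addr(addr, 0);
    return VDP_RDATA;
}

static void vram_write(uint16_t addr, uint8_t value)
{
    vdp_set_addr(addr, 1);
    VDP_WDATA = value;
}
```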

Then they used GPL to implement the BASIC interpreter. So anything you wrote in BASIC on your 99/4A was interpreted by an interpreted interpreter, making TI BASIC dog slow: about a tenth of the speed of the rather older Applesoft BASIC.
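
To make the stacking concrete, here's a toy cost model in C. The opcodes and the ops-per-step numbers are invented for illustration; the point is just that the costs multiply:

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Toy model of double interpretation. Not real GPL or TI BASIC:
 * one BASIC statement costs (GPL ops per statement) times
 * (native ops per GPL op) native operations, before counting
 * the VDP-port toll on every GPL instruction fetch. */

enum { GPL_NOP, GPL_LOAD, GPL_ADD, GPL_HALT };

static long native_ops;  /* native "instructions" executed so far */

/* Level 1: native TMS9900 code interpreting GPL bytecode. */
static void run_gpl(const uint8_t *prog)
{
    for (size_t pc = 0; prog[pc] != GPL_HALT; pc++) {
        native_ops += 20;       /* assume ~20 native ops per GPL op */
        switch (prog[pc]) {
        case GPL_LOAD: /* ... decode and execute ... */ break;
        case GPL_ADD:  /* ... */                        break;
        default:                                        break;
        }
    }
}

/* Level 2: the BASIC interpreter is itself GPL bytecode, so each
 * BASIC statement dispatches to a GPL routine like this one. */
static void run_basic_statement(void)
{
    static const uint8_t stmt_handler[] = {
        GPL_LOAD, GPL_ADD, GPL_NOP, GPL_NOP, GPL_HALT  /* dozens, really */
    };
    run_gpl(stmt_handler);
}

int main(void)
{
    run_basic_statement();
    printf("native ops for one BASIC statement: %ld\n", native_ops);
    return 0;
}
```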

*This technique also turns up in Nintendo's NES and SNES consoles, which is one of the reasons they are such a pain in the arse to write performant code for.
posted by flabdablet at 9:13 PM on May 11, 2015 [1 favorite]


Early virtualisation - a la TI GPL, or p-code - is fascinating, because it reveals tectonic faultlines between engineering, marketing and what users want. The TMS9900, at least in the TI 99/4A, was such a disaster - but there are always reasons...

If you're an engineer and are suddenly given a blank sheet of paper and a magic wand, which is what happens when a brand-new market opens up on the back of a brand-new technology, then you go to town. You can see for a thousand miles. You have the chance to Do Things Right, which means finally being able to make things properly, not bodged and compromised like last time, using the philosophies and methodologies you were taught, or that match your vision for a satisfying, capable and above all clean construction. You're building for the ages. So you design in future-proofing. You want to have one overall architecture that can work across hardware generations (software compatibility!) and from home computers to mighty mainframes (market segments!). You want robustness, flexibility and modularity. You want something you can hold up before your maker when you are called to answer for your life and say "There! Look! Is it not perfect?". All this needs not an architecture but An Architecture, with functional layers and pathways forward and options in all the right places and... well, it may take some time to do right and the untrained eye may not grok its elegance from the outside straight away, but as we go into the future, boy, will they thank you.

Marketing sniffs the potential here. One ring to bind them all, etc. A clear vision that covers everything, that's future-proofed, that encompasses the latest thinking, that can be sold as an efficient, far-sighted investment. Hell, this defines the future. And it's all about the future in this game.

Users want something that does a job now, does it cheaper and faster, that's shinier the moment you plug it in, something that makes them look good to their peers or their bosses, whatever. That's what they'll actually buy, given the chance. (Which they may not get - corporate computing acquisition is dangerously decoupled from what users want and need... but that's another story.)

So, while the technology creators are busy building a visionary future, some cruddy little outfit that doesn't have the time for all that nonsense has thrown together a box that is indeed cheap and cheerful and will be obsolescent across the stack, from the CPU to the applications, by the time the ink is dry on the brochure. It is nothing but brilliant compromise. It sells, though. And for some damn reason users actually prefer it to the Grand Vision V 1.0 (not quite available yet) - just because it's cheaper, faster and does the damn job. So corners are cut to get Grand Vision 1.0 out in some competitive form ASAP, and a crippled, malformed monster with echoes of a beautiful soul is thrust out there to die.

Meanwhile, the other crippled, malformed monster - that the cruddy little outfit didn't expect to live past a season - has actually changed the environment around it enough to not only survive but start to mutate into new niches.

Rinse and repeat, in CPU architecture, in OSs, in frameworks...

Lots of us have been lucky enough (or cursed enough) to live through this at first hand. I was at the launch of ARM and the launch of Java - the first as an engineer working in Cambridge, the second as a hack covering PC technology, neatly ten years apart - and the path over the decades that led from those events to where I'm at now, holding an Android phone that cost next to nothing but encompasses the Whole Bloody World, is technologically, culturally and personally one of wonder.

Reading stories like this one, of the stillborn TMX 1795, is like being a biologist simultaneously working on the human genome and the Cambrian Explosion, at the point that some miracle has provided the fossilised DNA of a barely-suspected exotic forebear, and seeing a bit more of the big picture behind the fractured miracles of the present.

I love this stuff.
posted by Devonian at 2:07 AM on May 12, 2015 [8 favorites]


I was at the launch of ARM

It's taken a couple of decades, but that particularly pleasing little monotreme has finally evolved to the point where it looks like it's beginning to press Intel's lipstick-saturated pigs for living space, and I for one say huzzah for that.

Pity about the Transputer. The world was just not ready for high speed serial links.
posted by flabdablet at 1:57 PM on May 12, 2015 [4 favorites]


ArgentCorvid, are you familiar with the MP944 Central Air Data Computer (CADC)?
posted by NortonDC at 5:35 AM on May 13, 2015


I've heard of it, yes. It's also covered in the article: "I don't consider this a microprocessor since the control, arithmetic, and storage are split across four separate chips in each functional unit.[23] Not only is there no CPU chip, there's not even a general-purpose ALU chip." ... "Even if you define a microprocessor as including a multi-chip processor, Viatron beat the CADC by a few months. While the CADC processor is very interesting, I don't see any way that it can be considered the first microprocessor."

The overarching theme that I got out of the article is that the development of the microprocessor as a chip was pretty much inevitable at that point in time.
posted by ArgentCorvid at 6:50 AM on May 13, 2015

