WI no Apple: how long are home computers delayed?

Hendryk

Steve Jobs' passing today raises some obvious counterfactual questions, one of which is the impact on the development of home computers if Apple had never been founded.

So, let us assume that in 1974, Steve Jobs, during his spiritual pilgrimage to India, has a fatal accident while riding a bus with defective brakes. Steve Wozniak keeps his day job at Hewlett-Packard and the Apple company is never founded. It's probable that the idea of the computer as a domestic item would eventually catch on, but how much longer would it take, and where might the idea come from if not Apple?
 

NothingNow

Probably a few years, but with Atari, Commodore, Acorn, or someone we've never heard of leading the way. Standards would probably be even more troublesome than in OTL.
 
It won't be delayed at all. The Apple wasn't the first home computer, and there were plenty of firms (far too many, really) ready to fill any gap in the market.
 
IBM is already working on a business PC, and a GUI was an idea waiting to happen. Apple was truly inspiring but not critical to PC development. So we don't get a few computers overpriced for their function, and HP keeps a good engineer. Gaming has driven PC development for a long time and will continue to do so for even longer. Apple has almost no game presence and doesn't drive development, so as soon as people want better games, development goes into the overdrive it's in now. Keep in mind I can remember playing games on a PC 30+ years ago. It might have only been Ultima 1, but we gamed.
 
Commodore/Amiga could take over as the professional graphics workstation. IIRC, Babylon 5's space SFX were done on Amigas.
 
The early home PC market reminded me of the early automobile marketplace. Many manufacturers with different features and designs. Some innovative and some just trying to make a buck.

Apple helped to produce a comprehensive idea of what a home PC should or could be, but I think this just saved a couple of years development. Sooner or later, someone else would have stepped up with a similar design and we'd be where we are today.

Now I think they really killed it with the iPhone, iPad, etc. :D
 
You have to remember how Apple got started in the first place.

Woz built a machine. He worked for HP. HP had first dibs on it. HP passed. Jobs and Woz get a small business loan, the ball gets rolling.

Woz may not have been able to get HP to see it for what it was, but that doesn't mean another company, like Atari or TI or Tandy or Commodore or whoever would pass on it so quickly.

I always liked the idea of a team of Woz, Al Alcorn, George McCleod and Jay Miner building an 8 bit for the home market at Atari.

McCleod and Miner's chips in a Woz and Alcorn designed machine?

World. Beater.

Woz loves games too, so a Woz and Alcorn console wouldn't be out of the question either.

Now, if only there was a way to get them to use a better CPU. (I'm a fan of the Motorola 6800 series myself...)
 
An Apple / Amiga hybrid would be pretty cool.. you'd have to wax Apple's OS though.. early Apples were not Macintoshes.. Miner's dreams were basically what the Amiga turned out to be.. at least in the first few models.. But if you could have a Macintosh / Amiga in the late 70s / early 80s instead of '84-'85, that would be a game changer.

I grew up with the Amiga.. Macs annoyed me because they were so slow and couldn't multitask.. The Amiga, for its faults.. with the help of the blitter and the custom chips, just rocked.. if you could combine the two in some manner that served both markets.. i.e. the geek/home market and the educational/business market.. it might just have knocked the IBM PC out of the game.. stunted Microsoft... and other glorious things :)
 
Now, if only there was a way to get them to use a better CPU. (I'm a fan of the Motorola 6800 series myself...)

Four problems with that line of reasoning...

1. The Motorola 6800 series chips were only cross-compatible with each other if one was psychic and could write a compiler (in 1974!) that used only the 44% of instructions the chip shared with the 6803 (introduced in 1976, used only in bargain models of the Tandy CoCo, various U.K. and Hong Kong clones, and a few French business computers I could only find reference to on Old-Computers.com), the 6805 (introduced in 1978), and the 6809 (introduced in 1979). What's worse, each chip in the series had unique instructions with very useful functions that weren't followed up on in later chips (including vector math in the 6800, which is why it was used as a graphics co-processor in Atari arcade hardware, and accumulate in the 6805). By contrast, the 65C02, 65C802, and 65816 were completely backward compatible with the 6502.

2. Aside from the 6809, all chips in the series executed one instruction every four clock cycles. All MOS Technology CPUs and their derivatives executed one instruction every clock cycle.

3. The 68000 series had a completely different architecture from the 6800 series, with completely different 8-bit instructions alongside the 16- and 32-bit ones. Backward compatibility could only be assured by either (a) putting a 6800 series chip on the motherboard as a co-processor (and taking that into account in the OS, which isn't completely foolish in the early-to-mid Eighties if you plan to use it as, say, a sound CPU) or (b) emulation, which would require a 68000 with a 70 MHz clock speed (won't happen; way too hot for the 1980s), a 68020 at 35 MHz, or a 68030 or higher at 17 MHz. Hitachi, however, did produce subsequent backward-compatible chips: first the 6309, then the H8 and SuperH series of RISC chips.

4. Motorola insisted on even higher margins than Intel does now.
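The emulation-speed figure in point 3 can be sanity-checked with a back-of-envelope calculation. The cycle counts and interpreter overhead below are rough assumptions for illustration, not datasheet values:

```python
# Back-of-envelope check of the "70 MHz 68000 to emulate a 6800" claim.
# Assumed numbers (illustrative only):
#   - target: a 1 MHz 6800 averaging ~4 clock cycles per instruction
#   - host: a 68000 averaging ~8 clock cycles per instruction
#   - a simple interpretive emulator spending ~35 host instructions
#     per emulated instruction (fetch, decode, dispatch, flag updates)

TARGET_CLOCK_HZ = 1_000_000      # assumed 6800 clock
TARGET_CYCLES_PER_INSN = 4       # assumed 6800 average
HOST_CYCLES_PER_INSN = 8         # assumed 68000 average
EMULATOR_OVERHEAD = 35           # assumed host instructions per emulated one

def required_host_clock_hz():
    """Host clock needed to emulate the target at full speed."""
    emulated_insn_per_sec = TARGET_CLOCK_HZ / TARGET_CYCLES_PER_INSN
    host_insn_per_sec = emulated_insn_per_sec * EMULATOR_OVERHEAD
    return host_insn_per_sec * HOST_CYCLES_PER_INSN

print(f"{required_host_clock_hz() / 1e6:.0f} MHz")  # prints "70 MHz"
```

Under these assumptions the arithmetic lands exactly on the 70 MHz figure quoted above; a faster-per-clock host (like the 68020 or 68030) would need proportionally less clock, which matches the 35 MHz and 17 MHz numbers.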
 
4: Motorola insisted on even higher margins than Intel does now.


Ahh, but to be seen as a true innovator.. true cutting edge.. real-world high performance.. margins are one thing.. if it works and it rocks, it will sell.. maybe not in crazy numbers... but it will sell.. and prices do drop... and then those become more affordable to the masses...

I hated Apple IIs; had to use them at school.. between that and IBM PCs I wanted to puke.. then along came Atari and Commodore to the rescue.. the C64 and 128 were game changers.. followed up with the Amiga (which I used for 10 years) and the Falcon series of Ataris (which were cool, but the Amiga had it all over them).. they were the cutting edge in home computing...
 