A world without the microchip?

NomadicSky

Geoffrey W.A. Dummer could have been the father of the modern world: he unsuccessfully attempted to build the integrated circuit in 1956.

So he wasn't a success, and it's possible there could have been even more failed attempts.

After years of constant failure, he gives up.

How would the world of 2009 look without microchips?

Today, their world is at a late-'60s level of technology.

Color TV is new; stations are converting to color. You can still use your black-and-white TV, but you'll need a color set to truly see the broadcasts.

What would global society look like in a world without Yahoo, Google, or Facebook, a world where Encarta exists only in the form of paper World Books sitting on a shelf in your local library?

There are no cell phones and no texting; there are pen and paper. Friends are made through real social interaction, not MySpace.

And yes, we wouldn't be sitting here this morning typing away on a plastic computer that can sit in your lap.
 
If your sole POD is the idea that the integrated circuit or microchip never gets developed, colour television is still going to be around. The NTSC colour standard was first formulated in 1953, for example, with early broadcasts in 1954. (I think there seem to have been colour broadcasts in the US going even further back.) The SECAM standard in use in parts of Europe was apparently patented in 1956. The microchip, however, was first properly implemented, IOTL, by Jack Kilby in 1958 (and independently by Robert Noyce, who was involved in founding Fairchild Semiconductor and Intel).

(And apart from Jack Kilby, which I knew already, most of that came off Wikipedia.)

It would be interesting to see how technology would develop without the microchip - I am sure it would not stay static at a mid-to-late-'60s level.
 
The trouble is that killing the IC was going to take more than just one POD. The idea was in the air back in the late '50s; if Dummer had not put multiple transistors on a single wafer, somebody else would have. And if that person hadn't, then somebody else would have. There were probably 50-60 people looking into the IC "problem" during that time period.

The US military really, really wanted ICs for their missile and radio programs; they were going to keep throwing money at the problem until somebody solved it.

The only real change would be the name of the person who is the father of the IC.


If it had been delayed by 10 years it might have had an impact, but only in that tubes were getting smaller, cheaper, and less power-hungry. You might see more combined discrete-transistor/tube designs, and you might get "Tube-IC" designs with multiple switching tubes. But I highly doubt it.
 

NomadicSky

I was thinking of a world where McCarthy becomes president in 1956 and is reelected in 1960 as one of the major PODs. In this timeline technology is still developed, but it is slow to get out to the public.
 
Joe McCarthy died in 1957 at the age of 48, and his life could have been prolonged if another POD stopped his alcoholism.

As for color TV, it would have worked quite well with tubes and transistors. You would have to fine-tune each station every time you changed the channel, and adjust the color level and hue for each station. If you were lucky enough to have a remote control in 1962, it was the Zenith Space Command device that was connected by a cord to the TV.

IC technology did not become visible to the public until the first pocket calculators were introduced in 1972. Early microwave ovens, found in lunch rooms in the early seventies but not in homes until the late seventies or eighties, ran on mechanical timers and temperature probes.

For the IC to be delayed to the point that we are still living with 1970 technology, it would take several PODs that make missile technology unnecessary in the sixties and beyond. If that sounds unlikely today, remember that those of us who were around in 1970 could only plan our lives as if the technology of the time would continue indefinitely.

If you needed to do a research paper, you went to the library and copied notes out of books, including the source information for footnotes. If the library was well equipped, it might have a photocopy machine that produced copies for 10 cents each. Since 10 cents was today's equivalent of 50 or 60 cents, you didn't make very many copies. Most likely, the copies were on thermal paper with its poor contrast and "greasy" feeling, since the xerographic toner system used by more modern copiers was still covered by Xerox's patents.

No microchips is an exercise in describing what it was like to be in school (or at work) in 1970.
 

Oh my, this brings back memories! Yes, lots of handwritten notes, with specific pages copied if you really needed to. Typewriters rather than computers. Research meant going to the library for 10-12 hours at a shot and hoping the books you needed were not in use.

Also, I don't think it would have stayed at that level of tech for very long anyway. Even with delayed ICs, there were people looking at ways of packing more computing power into less space as early as the early-to-mid '60s, with multi-function tubes and smaller discrete transistors.

Unless your POD is more like a delay of the transistor itself, somebody was going to try to put more than one transistor in a single package. And even then, people were looking to make tubes smaller and more efficient at the same time. So while people who have never worked with tubes think of the giant 2+"-tall monsters they show in old movies, there were "mini" tubes, 1/2" or less tall, that did not produce as much heat and were much more reliable. And the old tube companies were working on better and more efficient models, so I think there would have been some advances in computer tech even without ICs and transistors.
 
Technology reaches plateaus from time to time. Color TVs reached a plateau from the early eighties, once they had adopted remote controls and automated color-tuning systems (yes, with the IC), until after 2000. Even today, personal computers are reaching a plateau, as five-year-old models are still functional, unlike in the nineties, when machines became obsolete much faster.

We can not forecast tomorrow's technology, just as Bill Gates could not forecast the need for more than 640 kb of computer memory in 1980. Ever since the early twentieth century, each innovation came with the response "what will they think of next?"
 
Technology reaches plateaus from time to time. Color TVs reached a plateau from the early eighties, once they had adopted remote controls and automated color-tuning systems (yes, with the IC), until after 2000. Even today, personal computers are reaching a plateau, as five-year-old models are still functional, unlike in the nineties, when machines became obsolete much faster.

Um, no, not true. I had much more success running new software on 5-year-old hardware in the late '90s than I do today. If anything, the rate of change has been dramatically increasing. The computers I sold in the early '80s were close enough analogs in performance, operating system, and functionality that they were still recognizable and usable 10 years later, in the early '90s. Now the delta in performance and ability is dramatic between the new machine on my desk at work, the 1-year-old machine my son built, and the 3-year-old machine on my desk at home. And that is not even taking into account other things that are computerized - hell, my MP3 player has more computer power than the first PC I built by 2 orders of magnitude, and 5 orders of magnitude more memory, even counting both kinds of memory.

Yes, technology reaches plateaus, and maybe color TVs are an example, but computers? Nope, no sign of them plateauing yet.

Not to sound old but..."you damn kids don't know how well you got it!" :D

We can not forecast tomorrow's technology, just as Bill Gates could not forecast the need for more than 640 kb of computer memory in 1980. Ever since the early twentieth century, each innovation came with the response "what will they think of next?"

Ha, 640 kb - try 8 kb or 64 kb. Frankly, 640 kb seemed huge. It is easy to forget, now that we throw around TB drives and GB of memory: one of my early jobs had access to a "supercomputer" with 1 MB of main memory, 1 GB of disk space, and 10 GB of "offline" disk space that was really a tape array. It was not just Microsoft; nobody projected where this computer/information revolution would go.
 
Um, no, not true. I had much more success running new software on 5-year-old hardware in the late '90s than I do today. If anything, the rate of change has been dramatically increasing.

It all depends on the type of software and hardware you are using. In the business world, Microsoft Office and e-mail applications are the staples. Since 2000, I have experienced much more fluency between users than in earlier years.

Twenty years ago, my first computer was a Macintosh SE with an 8 MHz processor and an impressive 1 MB of RAM. I used MS Word 3.0 and Excel 2.3. Transferring documents to the DOS platform was a pain, and it remained so well into the nineties. That computer became obsolete in six years. Today, my six-year-old model with a 1 GHz processor and 1 GB of RAM shows no sign of slowing down for the standard business documents I use.

Granted, much entertainment software is very demanding, and some computer companies use games to evaluate the speed of their systems. But the basic formats - .doc, .xls, .jpg, .txt, etc. - have become established standards. A document created in 2001 is readable today and will likely remain so for years to come. Plateau may have been a poor choice of words, but compared to the mismatched formats of earlier years, it is so much easier to exchange files today. Most of all, it is a relief not to have to buy a new computer every three or four years.
 
It all depends on the type of software and hardware you are using. In the business world, Microsoft Office and e-mail applications are the staples. Since 2000, I have experienced much more fluency between users than in earlier years.


Oh sure, at that level they have "stagnated", but I would call that standardization - take railroads, for example. At the start, rails were laid to many different gauges, even within a single country. (Call that the OS/software situation in PCs in the 1970s.) Then countries standardized internally, sometimes on the same gauge as other countries (call that the 1980s), and then there were large networks of rail that were all standard gauge (1990s+).

But at the same time there was major change and innovation in rail engines and cars, until it reached diesel-electric and stopped there for many years.

I am not saying that is a perfect (or even a good :D) analogy, but it is the one I could think of off the top of my head!

Tom.
 

NomadicSky

Yes, technology reaches plateaus, and maybe color TVs are an example, but computers? Nope, no sign of them plateauing yet.

Not to sound old but..."you damn kids don't know how well you got it!" :D

I disagree.

Sure, sometimes computers made life somewhat easier; for writing papers, spell check and online or CD-ROM-style info like Encarta really helped.

But it sucks in other ways: being online added a way for people to be even more antisocial, there are more accidents due to cell phones, and more rudeness...

No, I think from a human aspect a world without the microchip might not be as advanced or "global", but it would be better in other ways.
 
Didn't President Lincoln sign the bill that subsidized railroads that built to a standard gauge?

All technology eventually reaches multiple plateaus, where periods of rapid change are followed by periods of slow change. The evolution of the automobile is a perfect example, as new features became standard.

Look at the typewriter. The skeletal-looking mechanical device of the late nineteenth century evolved into the heavy business machine of the 1920s. There they stayed until electric typewriters became the norm. From the sixties to the eighties, the electric machines were in every office, with only style changes and occasional "frills" like proportional spacing. Then came the word processor and the "Olivetti factor" that doomed the typewriter industry.

We might call the simple Microsoft business computer applications a "stagnation," but they do their jobs well. Actually, computers could have remained at the level of the Intel 80286 and the Motorola 68000 and the typewriter and adding machine would still have been replaced by the desktop computer. In other words, they constituted a critical "step" in the evolution of the office.

In those years, offices shared large data files consisting of catalog-sized printouts of lists, part numbers, inventories. With the server, the data went on line and the catalogs disappeared.

Each of those developments represents its own "plateau." A Word document might now contain an animated cartoon, but that does not change the fact that it reached a critical stage of functionality more than a decade (or two) ago.
 
Didn't President Lincoln sign the bill that subsidized railroads that built to a standard gauge?

Well, he might have, but it was still another 30 years before all of the US rail network was on standard gauge. And as late as the 1940s the Eastern European rail system was on a different gauge.
 
All technology eventually reaches multiple plateaus, where periods of rapid change are followed by periods of slow change. The evolution of the automobile is a perfect example, as new features became standard.

I agree.

Look at the typewriter. The skeletal-looking mechanical device of the late nineteenth century evolved into the heavy business machine of the 1920s. There they stayed until electric typewriters became the norm. From the sixties to the eighties, the electric machines were in every office, with only style changes and occasional "frills" like proportional spacing. Then came the word processor and the "Olivetti factor" that doomed the typewriter industry.

I somewhat agree; there were periods of major change and there were periods of standardization or stagnation, but overall, once a typewriter reached the "you can type on it" stage, the changes were frills or making it cheaper.

We might call the simple Microsoft business computer applications a "stagnation," but they do their jobs well. Actually, computers could have remained at the level of the Intel 80286 and the Motorola 68000 and the typewriter and adding machine would still have been replaced by the desktop computer. In other words, they constituted a critical "step" in the evolution of the office.

Yes, they could have replaced those two pieces of equipment. However, that covers about 10% of what people actually use computers for today - even in business. In the 1980s you could have dropped someone from any time back to the early 18th century into an office and they would have basically understood what was going on. People wrote things down on paper (even if they typed it, it was obvious what was happening), people sent memos; they could communicate faster, and the phone would have been a reach, but they were basically doing the same thing.

Today, that is no longer true. We sit in front of glass screens moving small plastic things and typing, but no paper comes out, and we point at things and they move on the screen. Then the factory builds something. And fast - everything happens fast. The thing to remember is that computers are shrinking so fast and changing capabilities so fast that for the most part we can only keep track of small parts of it. My example of the MP3 player a few posts back is a perfect illustration: that would not have been possible for the highest-end computer 40 years ago, nor for a top-of-the-line workstation-class machine in the early '90s, and now I carry it around in my pocket for less than $200. Maybe, and I am not sure I even buy this, but maybe desktop computers have plateaued; computers in general, no way. They are still getting smaller/faster/more capable quicker than we can react.

In those years, offices shared large data files consisting of catalog-sized printouts of lists, part numbers, inventories. With the server, the data went on line and the catalogs disappeared.

Each of those developments represents its own "plateau." A Word document might now contain an animated cartoon, but that does not change the fact that it reached a critical stage of functionality more than a decade (or two) ago.

Except that it was not possible to "embed" anything in a document two decades ago, let alone easily send it to someone half a world away whom you only know "virtually". The computer power just was not there, let alone at so many people's fingertips.
 
I agree.
Except that it was not possible to "embed" anything in a document two decades ago, let alone easily send it to someone half a world away whom you only know "virtually". The computer power just was not there, let alone at so many people's fingertips.

Maybe not "embed", but faxes and Telex could send documents around the world wherever there were phone lines. Furthermore, modern offices tend to use even more paper than in the '80s, because every second email has to be printed out for some technophobe or archived for legal purposes.

As for knowing people virtually: when I was a child in the early '80s I had several "pen pals" around the world, with whom I exchanged these mythical missives called letters, written on actual paper and sent through real mail. :rolleyes::rolleyes:
 
Today ...... we sit in front of glass screens moving small plastic things and typing, but no paper comes out, and we point at things and they move on the screen. Then the factory builds something. And fast - everything happens fast. The thing to remember is that computers are shrinking so fast and changing capabilities so fast that for the most part we can only keep track of small parts of it.

Yes, and the trend will continue to the point that the Blackberry, iPhone and Microsoft phone will have the characteristics of laptop computers of a few years ago. But many work stations, as described above, have already reached the point where one operator can do the work of five machinists.

It will get better. Surgeons will do more and more sophisticated work through Band-Aid sized incisions. The real world fact is, though, that many individuals only use a fraction of the computing power that is available in the hardware and software they already own.

Sure, sometimes computers made life somewhat easier; for writing papers, spell check and online or CD-ROM-style info like Encarta really helped.

But it sucks in other ways: being online added a way for people to be even more antisocial, there are more accidents due to cell phones, and more rudeness...

No, I think from a human aspect a world without the microchip might not be as advanced or "global", but it would be better in other ways.

This point makes me think of an aspect of cell phone communication that strikes me as a regression, rather than an innovation: texting. I do not use that feature, mainly because (in my opinion) it takes longer to key in words than to leave a voice message. Because texts can be forwarded, broadcast and bounced around, you have a more antisocial (if not rude) potential.

Texting might appear to be the antithesis of the communicating innovation that epitomized the future in the sixties: the picture telephone. In fact, I believe the # sign on the telephone keypad was originally intended to designate a picture line, as such a phone number might look like #221-2345. With web cameras and photo cell phones so widely available, it is interesting that users spend more time trying to exchange stolen music than conducting video conversations.

Go back 25 years and what was the most popular Christmas toy: the Cabbage Patch Kid. At a time when microchips were appearing in all kinds of toys, this stuffed toy had no more technological innovation than the fabrics and dyes from which it was made. While some jump into the newest technology ASAP, others prefer to wait.
 
Maybe not "embed", but faxes and Telex could send documents around the world wherever there were phone lines. Furthermore, modern offices tend to use even more paper than in the '80s, because every second email has to be printed out for some technophobe or archived for legal purposes.

As for knowing people virtually: when I was a child in the early '80s I had several "pen pals" around the world, with whom I exchanged these mythical missives called letters, written on actual paper and sent through real mail. :rolleyes::rolleyes:


Well, yes, but the claim was that computers are stagnating; my point was that the things we now do regularly and simply via a computer network were once only possible (if at all) with great difficulty and expense.

And that difficulty and expense is going down due to the advances in computers.
 
Yes, and the trend will continue to the point that the Blackberry, iPhone and Microsoft phone will have the characteristics of laptop computers of a few years ago. But many work stations, as described above, have already reached the point where one operator can do the work of five machinists.

It will get better. Surgeons will do more and more sophisticated work through Band-Aid sized incisions. The real world fact is, though, that many individuals only use a fraction of the computing power that is available in the hardware and software they already own.

Ah, but I only said the capabilities are going up, not that everyone is using them. :D

Or more to the point using them for useful things - heck 90+% of the bandwidth of the Internet is either spam or porn. :eek:

But that does not change my point that computer tech has not plateaued yet.

This point makes me think of an aspect of cell phone communication that strikes me as a regression, rather than an innovation: texting. I do not use that feature, mainly because (in my opinion) it takes longer to key in words than to leave a voice message. Because texts can be forwarded, broadcast and bounced around, you have a more antisocial (if not rude) potential.

Texting might appear to be the antithesis of the communicating innovation that epitomized the future in the sixties: the picture telephone. In fact, I believe the # sign on the telephone keypad was originally intended to designate a picture line, as such a phone number might look like #221-2345. With web cameras and photo cell phones so widely available, it is interesting that users spend more time trying to exchange stolen music than conducting video conversations.

I agree completely on this - I just don't see what the point is. If I need to talk to someone I will call or email. Texting just does not do it for me - maybe it's the fact that I am a touch typist; I just can't see using a micro-useless phone keyboard when I have a full-sized laptop keyboard.

Go back 25 years and what was the most popular Christmas toy: the Cabbage Patch Kid. At a time when microchips were appearing in all kinds of toys, this stuffed toy had no more technological innovation than the fabrics and dyes from which it was made. While some jump into the newest technology ASAP, others prefer to wait.

Oh, that is absolutely true, and more to the point, my daughter was more interested in and had more fun with her charcoal, pastel, and pencil art kit this Christmas than with the computer games we got for the family. Just goes to show that sometimes the simple stuff is the best.
 
But that does not change my point that computer tech has not plateaued yet.

I agree that computer technology as a whole has not plateaued for the long term. However, its interface with business and the public has reached an intermediate plateau in terms of file formats, platform compatibilities, and the public's comfort zone for adapting based on need.
 
I agree that computer technology as a whole has not plateaued for the long term. However, its interface with business and the public has reached an intermediate plateau in terms of file formats, platform compatibilities, and the public's comfort zone for adapting based on need.

Okay - that I'll agree with. Yes, the current basic interface/file format has plateaued.
 