Posted by David McCall on 12/06/05 19:29
"Richard Crowley" <richard.7.crowley@intel.com> wrote in message
news:dn4moh$8n1$1@news01.intel.com...
> "Martin Heffels" wrote ...
>> "bmcswain" wrote:
>>
>>>It is SO remarkable where we have come from and where we are.
>>>I remember the day when IBM doubled (DOUBLED) memory for the 3084 - a
>>>monster mainframe - doubled the memory to 64 meg! WOW! HUGE! How
>>>will customers EVER use all that memory?
>>
>> I worked in an air-traffic-control center, where they ditched the
>> 370 mainframe 5 years ago. It only had 64MB to run all the apps, but they
>> were all written in nice and clean assembly, so there was still plenty of
>> space left in that RAM :)
>
> Well, if we're playing "can you top that?"....
>
> When I first came to Intel (1978), we were running "e-test"
> (electrical testing of individual transistors, etc while still on the
> wafer) with a test system (Lomac) run by an S-100 bus
> computer, a Z80 microcomputer with 64KB of RAM and
> two 240Kb 8-inch floppy drives. I wrote test software for
> the machine and within the 64KB RAM, we got...
> 1. CP/M operating system
> 2. BASIC interpreter (including Lomac's extensions)
> 3. Test code
> 4. Test parameter array
> 5. Data results array
>
> But e-test was two careers ago and I don't know what they
> are using now. Generic PCs, I'd suspect.
So, did your 64K of memory come in the form of 8 x 8K cards,
or 4 x 16K cards? I built a few S-100 systems in the 70s too.
Made me feel like a real pioneer :-)
David