From: Tim J. <te...@we...> - 2007-12-29 21:20:25
On Wed, 2007-12-26 at 16:19 -0500, David Essex wrote:
> I agree that small code may have been good practice back in the
> 60s and early 70s.
>
> However, assuming this part of the urban legend is true, these
> programs were replaced by structured programming methods.
> So memory was not an issue.
>
> Even if the programs were written back in the 60s, these
> programs could have been easily converted by the maintainers
> of these systems. Not to mention cheaper.
>
> As to why management chose to replace these programmers, who
> knows. Maybe it was just cheaper to replace them with lower-paid
> programmers, then use some excuse to justify their actions.
> Nothing new under the sun here ...
>
> That is the thing about urban legends, you never really know
> what to believe.
>
> Cheers

At that time a typical computer ran 100,000 instructions per second. I
remember in 1977 using a "supercomputer" capable of 1,000,000
instructions per second.

Program listings often had clock counts written next to each
instruction, so people could add up the counts and see the performance
impact of a given change; that is how much running times mattered.
Using an ALTER statement replaced an evaluation followed by a
conditional branch with a single unconditional branch, saving at least
one instruction each time through (see the sketch in the PS below).

Consider how much work you can get done on a budget of 1 hour of CPU
time on such a machine: 3,600 x 100,000 = 360,000,000 instructions, or
a few hundred milliseconds on your modern computer. That hour of
processing time would have cost about $3,000.

On such a computer, a build of GCC would have taken about 6 months (if
it had fitted in memory, which it would not have by a factor of a few
hundred).

Tim Josling
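
PS: For anyone who has not met ALTER, here is a minimal sketch of the
idiom. The paragraph names are my own invention, and it is classic
fixed-format COBOL; any compiler that still accepts ALTER (I believe
OpenCOBOL does, with a warning) should run it:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. ALTER-DEMO.
       PROCEDURE DIVISION.
       INIT-PARA.
           GO TO SWITCH-PARA.
      * A paragraph containing only a GO TO; ALTER rewrites its
      * target, so taking the branch costs one unconditional jump
      * instead of a test plus a conditional branch.
       SWITCH-PARA.
           GO TO FIRST-PASS.
       FIRST-PASS.
           DISPLAY "first pass".
      * Flip the switch: the next trip through SWITCH-PARA now
      * jumps to SECOND-PASS instead.
           ALTER SWITCH-PARA TO PROCEED TO SECOND-PASS.
           GO TO SWITCH-PARA.
       SECOND-PASS.
           DISPLAY "second pass".
           STOP RUN.

The saving looks trivial today, but at 100,000 instructions per second,
one instruction shaved off a branch executed millions of times was
worth having.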