<unit> A basic unit of computation, one period of a computer clock.

Each instruction takes a number of clock cycles. Often the computer can access its memory once on every clock cycle, and so one speaks also of "memory cycles".
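
Since a cycle is one clock period, a program's running time follows directly from how many cycles it consumes: time = (instructions x average cycles per instruction) / clock frequency. A minimal C sketch of this arithmetic, in which every figure (instruction count, cycles per instruction, clock rate) is an assumed illustrative value, not a measurement:

    #include <stdio.h>

    int main(void)
    {
        /* All figures below are assumptions chosen for illustration. */
        double instructions = 2e6;  /* instructions executed            */
        double cpi = 4.0;           /* average cycles per instruction   */
        double clock_hz = 1e6;      /* clock frequency: 1 MHz           */

        /* Total cycles consumed, and the wall-clock time they take. */
        double cycles = instructions * cpi;
        double seconds = cycles / clock_hz;

        printf("%.0f cycles at %.0f Hz -> %.1f seconds\n",
               cycles, clock_hz, seconds);
        return 0;
    }

With these assumed figures, 2,000,000 instructions at 4 cycles each on a 1 MHz clock come to 8,000,000 cycles, or 8 seconds.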

Every hacker wants more cycles (noted hacker Bill Gosper describes himself as a "cycle junkie"). There are only so many cycles per second, and when you are sharing a computer the cycles get divided up among the users. The more cycles the computer spends working on your program rather than someone else's, the faster your program will run. That's why every hacker wants more cycles: so he can spend less time waiting for the computer to respond.

The use of the term "cycle" for a computer clock period can probably be traced back to the rotation of a generator producing alternating current, though computers generally use a clock signal which is more like a square wave. Interestingly, the earliest mechanical calculators, e.g. Babbage's Difference Engine, really did have parts which rotated in true cycles.

[Jargon File]
(1997-09-30)