Clock speed refers to how fast the system clock drives the computer’s CPU (central processing unit, the chip that runs the computer), which determines how fast the system as a whole can process information internally. Clock speed is measured in megahertz; a speed of one megahertz (1 MHz) means the system clock is sending out one million electrical pulses per second. The higher the clock speed of a computer, the faster the computer can operate, assuming all other factors are equal. However, clock speed isn’t the only factor that determines your computer’s overall performance, or even how fast the microprocessor (another term for the CPU) gets things done. Two different microprocessors may run at the same clock speed and still take different amounts of time to finish a given job.
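To see how two chips at the same clock speed can take different amounts of time, remember that each instruction needs some number of clock ticks to complete, and that number varies from chip to chip. Here’s a minimal sketch in Python; the cycles-per-instruction figures are invented purely for illustration, not taken from any real processor:

```python
# Illustration: why equal clock speeds need not mean equal performance.
# Execution time = (instructions * cycles per instruction) / clock rate.
# The cycles-per-instruction numbers below are hypothetical.

def execution_time(instructions, cycles_per_instruction, clock_hz):
    """Seconds needed to run a program on a given processor."""
    return instructions * cycles_per_instruction / clock_hz

PROGRAM = 10_000_000           # a program of ten million instructions
CLOCK = 25_000_000             # both chips clocked at 25 MHz

chip_a = execution_time(PROGRAM, cycles_per_instruction=4, clock_hz=CLOCK)
chip_b = execution_time(PROGRAM, cycles_per_instruction=2, clock_hz=CLOCK)

print(f"Chip A: {chip_a:.2f} seconds")   # 1.60 s
print(f"Chip B: {chip_b:.2f} seconds")   # 0.80 s -- same clock, half the time
```

Same clock, same program, but the chip that finishes each instruction in fewer ticks gets the job done in half the time.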
In fact, different sections of the PC run at different speeds: the external memory and the system bus may be clocked at 100 MHz, for example, while the input/output channels run at yet another frequency.
Although each microprocessor is rated for a maximum clock speed, the actual clock speed of your system is determined by the system clock, not the microprocessor. The original IBM PC and its Intel 8088 CPU poked along at a torpid 4.77 MHz; my Mac IIcx, with its Motorola 68030 (pronounced “sixty-eight-oh-thirty” or just “oh-thirty” for short), runs at 16 MHz; and Steve’s 80486-based PC clone barrels along at 33 MHz.
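If you want a feel for what those numbers mean, you can turn each clock rate into the length of a single tick (one divided by the frequency). The machines and speeds below are the ones just mentioned; the arithmetic is the only thing this sketch adds:

```python
# Convert each machine's clock rate into the duration of one clock tick.
machines = {
    "IBM PC (Intel 8088)": 4.77e6,   # 4.77 MHz
    "Mac IIcx (68030)": 16e6,        # 16 MHz
    "80486 PC clone": 33e6,          # 33 MHz
}

for name, hertz in machines.items():
    nanoseconds_per_tick = 1 / hertz * 1e9
    print(f"{name}: about {nanoseconds_per_tick:.0f} ns per tick")
# IBM PC: ~210 ns, Mac IIcx: ~63 ns, 486 clone: ~30 ns
```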
The same microprocessor “model” is typically available in several versions, each capable of running at a different maximum clock speed. For instance, there are versions of the 68030 that run at 16 MHz, 25 MHz, 40 MHz, and 50 MHz, and of course they get faster all the time. The faster the processor can run, the more expensive it is, and that’s one reason computers with a faster clock speed cost more.