floating point operations per second (FLOPS)

Definition
Measure of a computer's (or microprocessor's) computational ability. It indicates how many arithmetic operations on floating-point numbers (numbers with fractional parts, such as 3.75) the computer or microprocessor can perform in one second. For personal computers performance is typically quoted in millions of FLOPS (megaFLOPS), for mainframe computers in billions of FLOPS (gigaFLOPS), and for supercomputers in trillions of FLOPS (teraFLOPS).
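The measure itself is simply the number of floating-point operations performed divided by the elapsed time in seconds. The following is a minimal sketch in Python (a hypothetical micro-benchmark, not a standard tool) showing how such a rate could be estimated by timing a fixed number of multiply-add operations; an interpreted loop like this greatly understates real hardware capability and only illustrates the calculation.

```python
import time

# Hypothetical micro-benchmark: time a known number of floating-point
# operations, then compute operations / seconds to get a FLOPS estimate.

N = 10_000_000          # number of multiply-add iterations
x = 1.000001

start = time.perf_counter()
acc = 0.0
for _ in range(N):
    acc = acc * x + 1.0  # one multiply + one add = 2 floating-point operations
elapsed = time.perf_counter() - start

flops = (2 * N) / elapsed  # total floating-point operations per second
print(f"Estimated rate: {flops / 1e6:.1f} megaFLOPS")
```

By the same arithmetic, 1 megaFLOPS = 10^6 FLOPS, 1 gigaFLOPS = 10^9 FLOPS, and 1 teraFLOPS = 10^12 FLOPS.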