Moore’s law: the observation first made by Intel cofounder Gordon Moore in 1965 that the number of transistors on a microchip, and with it computational processing power, would double roughly every year at little additional cost with each new generation; Moore revised the pace to roughly every two years in 1975.
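
To make the doubling rule concrete, the following short Python sketch projects transistor counts under a clean two-year doubling period. The starting point (the Intel 4004 of 1971, with about 2,300 transistors) is an illustrative assumption, not part of the definition above; real chips only roughly track this curve.

# Illustrative sketch of "doubling every two years".
# Assumed baseline: Intel 4004 (1971), ~2,300 transistors.
START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Projected transistor count for a given year under the two-year doubling rule."""
    elapsed = year - START_YEAR
    return START_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

for year in range(1971, 2022, 10):
    # Print the projection each decade; by 2021 the rule implies tens of billions of transistors.
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")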

