“Moore’s law” is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years. The observation is named after Gordon E. Moore, co-founder of the Intel Corporation, who first described the trend in a 1965 paper and formulated its current statement in 1975. His prediction has proven accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development. The period is often quoted as 18 months because of Intel executive David House, who predicted that chip performance would double every 18 months, a combination of the effect of more transistors and their being faster.

The capabilities of many digital electronic devices are strongly linked to Moore’s law: quality-adjusted microprocessor prices, memory capacity, sensors, and even the number and size of pixels in digital cameras are all improving at roughly exponential rates as well. This exponential improvement has dramatically enhanced the effect of digital electronics in nearly every segment of the world economy. Moore’s law thus describes a driving force of technological and social change, productivity, and economic growth in the late twentieth and early twenty-first centuries.

Although this trend has continued for more than half a century, “Moore’s law” should be considered an observation or conjecture, not a physical or natural law. Sources in 2005 expected it to continue until at least 2015 or 2020. The 2010 update to the International Technology Roadmap for Semiconductors, however, predicted that growth would slow at the end of 2013, after which transistor counts and densities would double only every three years.
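The two-year doubling can be written as N(t) = N₀ · 2^(t/2), where N₀ is a baseline transistor count and t is the number of years elapsed. A minimal sketch of this projection, using a hypothetical baseline count for illustration (the function name, baseline, and figures below are assumptions, not from the original text):

```python
def projected_transistors(initial_count, years, doubling_period=2.0):
    """Project a transistor count assuming it doubles every
    `doubling_period` years (Moore's two-year formulation)."""
    return initial_count * 2 ** (years / doubling_period)

# Hypothetical baseline of 2,300 transistors, projected 10 years out:
# 10 years is 5 doubling periods, so the count grows by a factor of 2**5 = 32.
print(projected_transistors(2300, 10))   # 73600.0

# David House's 18-month performance doubling corresponds to
# doubling_period=1.5 in the same formula.
print(projected_transistors(2300, 10, doubling_period=1.5))
```

Changing `doubling_period` to 3.0 models the slower three-year doubling that the 2010 ITRS update projected for after 2013.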