Vaclav Smil digs through the history of invention to focus on the failures — innovations that started out promising but ended disastrously, that were hailed as the next big thing but fell far short of the hype, or that have been overhyped for decades yet remain more science fiction than reality — and the lessons that can be learned from them.
The Notes
- Categories of Invention:
- Simple Handmade Items – stone tools, farm tools, tools for draft animals, furniture, pottery, etc.
- Machines – waterwheels, windmills, furnaces, sailing ships, etc.
- New Materials – bronze, aluminum, glass, cement, plastics, carbon-based composites, etc.
- New Methods of Production/Operation/Management – mass-scale manufacturing, information gathering, data processing, etc.
- “Innovation is perhaps best understood as the process of introducing, adopting, and mastering new materials, products, processes, and ideas. Accordingly, there could be plenty of invention without commensurate innovation.”
- Total US patents granted:
- 911 from 1800 to 1810
- 250,000 during 1890s
- 340,000 from 1900 to 1910
- 1,653,000 during 1990s
- The media often hypes up new inventions with exaggerated claims and fantasies, using labels like “revolutionary” or “transformative,” as if they offer immediate solutions to problems when they don’t. Most inventions start as a foundation for further gradual progress that may or may not become something bigger.
- “There are limits to everything, and invention and innovation cannot be exceptions.”
- “We should restrain our ever-present compulsion to forecast how new inventions will shape our future: retrospectives of such efforts show only very limited success and a preponderance of failures.”
- “Every solution of a complex problem, every helpful advance that eases or eliminates a specific harmful or undesirable impact, every innovation promising better performance, higher profits, or improved handling, or increased comfort or safety, has its obverse. Its reach and intensity range from predictable, tolerable, manageable (or simply time-limited) side effects to unforeseen yet potentially serious consequences that are not easy to deal with.”
- Leaded Gasoline
- Gottlieb Daimler and Wilhelm Maybach attached a water-cooled engine to a wooden coach in 1886.
- Karl Benz attached a single-cylinder engine to a 3-wheel chassis in 1886.
- Maybach designed the Mercedes 35 in 1901.
- Henry Ford sold the Model T in 1908.
- Charles Kettering designed an electric starter in 1911, eliminating the hand crank.
- Standard Oil introduced thermal cracking of crude oil, which increased gasoline yields.
- Knocking was a problem for all combustion engines: it could damage cylinder heads, piston rings, and pistons, reduced engine efficiency, and released pollutants such as nitrogen oxides.
- “Knocking is caused by spontaneous ignitions (small explosions, mini-detonations) taking place in the remaining gases before they are reached by the flame front initiated by sparking. Knocking creates high pressures (up to 18 MPa, or nearly up to 180 times the normal atmospheric level), and the resulting shock waves, traveling at speeds greater than sound, vibrate the combustion chamber walls and produce the telling sounds of a knocking, malfunctioning engine.”
- 3 options to eliminate knocking in the 1900s:
- Keep the compression ratio low — meant reduced engine efficiency and wasting fuel
- Develop smaller, more efficient engines that run on better fuel.
- Use additives to prevent uncontrolled ignition — allowed for lower-quality fuel to be used in more powerful engines with higher compression ratios. The easy solution at the time.
- It was known that engines didn’t knock with pure ethanol, and ethanol blends were tested in the US and Europe.
- Ethanol had 3 disadvantages:
- It was more expensive than gasoline.
- Supply could not meet the growing demand for motor fuel.
- Increasing supply would have required a significant share of crop production.
- Thomas Midgley headed the research into knocking at Charles Kettering’s Dayton Research Laboratories, beginning in 1916.
- Kettering was hired by GM in 1919.
- Kettering and Midgley found tetraethyl lead (TEL) reduced knocking in concentrations as low as 0.04% by volume.
- GM contracted DuPont and Standard Oil to produce TEL in 1922 and the new leaded fuel was available in 1923.
- Midgley and Kettering stated ethanol was the “fuel of the future” and that a 20% blend of ethanol and gasoline could be supplied by 9% of the country’s crop production.
- Yet, GM claimed: “So far as we know at the present time, tetraethyl lead is the only material available which can bring about these results.”
- Ethanol required the mass development of a new industry not controlled by GM.
- Ethanol production was impractical from a cost perspective.
- TEL was low cost – a penny’s worth of TEL per gallon of gasoline reduced knocking.
- GM claimed there were no health concerns, despite lead having been known for centuries as a toxic metal that caused health problems. The fuel was marketed as “ethyl gas” to avoid any mention of lead.
- By the early 1900s, lead was known to cause neurotoxic damage, with unborn children and infants the most vulnerable.
- The Bureau of Mines released its TEL investigation in 1924, claiming no public danger, on the same day the last of five workers died from acute exposure at a TEL processing plant (35 workers in total had experienced neurological symptoms).
- Another investigation in 1925, following 252 workers over 7 months, again found no danger to the public.
- The Surgeon General’s office set a voluntary standard of 3 grams of TEL per gallon of gasoline in 1927. It became the global standard.
- TEL was used in aviation fuel and enabled faster, more powerful engines in WWII.
- The Surgeon General’s office raised the maximum allowable TEL to 4.23 grams per gallon.
- The US used 2 trillion gallons of leaded gasoline between 1945 and 1975. It added about 4.7 million tons of lead to the environment via exhaust.
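A rough cross-check of the roughly 4.7 million tons of lead, sketched under stated assumptions: the 2 trillion gallons above, the two Surgeon General TEL limits (3 and 4.23 grams per gallon), and lead’s mass fraction in tetraethyl lead, Pb(C2H5)4 (the molar masses are standard values; nothing here comes from the book itself).

```python
# Rough cross-check of the "about 4.7 million tons of lead" figure.
# Assumed inputs: 2 trillion gallons of leaded gasoline (1945-1975) and
# the two Surgeon General TEL limits (3 and 4.23 grams per gallon).
GALLONS = 2e12

# Lead's mass fraction in tetraethyl lead, Pb(C2H5)4
M_PB = 207.2                                  # g/mol, lead
M_TEL = M_PB + 4 * (2 * 12.011 + 5 * 1.008)   # g/mol, TEL (~323.4)
pb_fraction = M_PB / M_TEL                    # ~0.64

for tel_g_per_gal in (3.0, 4.23):
    tons = GALLONS * tel_g_per_gal * pb_fraction / 1e6  # grams -> metric tons
    print(f"{tel_g_per_gal} g TEL/gal -> {tons / 1e6:.1f} million tons of lead")
```

The two limits bracket roughly 3.8 to 5.4 million tons, so the book’s ~4.7 million figure is consistent with an average TEL dose between the two standards.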
- By the 1940s, it was clear that lead exposure caused behavioral disorders, growth retardation, and intellectual impairment in children. By the 1970s, it was known that even low, prolonged exposure could cause these effects.
- The EPA mandated reduced auto emissions and the phased-out removal of lead in gasoline in 1973. The phase-out began in 1979.
- Lead paint was banned in the US in 1977.
- Unleaded gasoline held 3% of the market in 1970, 12% by 1975, 63% in 1985, and 95% in 1991.
- Average blood lead concentration in children dropped 80% from 1976 to 1994.
- “A recent study led by Anna Aizer has shown that even further reductions in lead from historically low levels have significant positive effects on children’s third-grade reading test scores: every unit decrease in average blood lead levels reduced the probability of a child being substantially below proficient in reading by about 3 percent.”
- “There are no worrisome adverse effects caused by burning a mixture of gasoline and ethanol, and crop-derived ethanol (in the US overwhelmingly from corn, in Brazil from sugar cane) became the leading antiknocking additive. The rise of US ethanol began in earnest in 2005.”
- DDT
- The ideal insecticide should:
- Act rapidly on as many insect species as possible.
- Be nontoxic to mammals and plants.
- Be nonirritating, nonodorous, stable, and affordable.
- No insecticides in the early 1900s met those requirements.
- Paul Hermann Müller, working for J.R. Geigy (a dye-making company) found a promising option in 1939. He synthesized dichlorodiphenyltrichloroethane (DDT).
- DDT was first synthesized in 1874 by Austrian chemist Othmar Zeidler, but no use was known for it at the time.
- Geigy produced DDT in 3% and 5% concentrations in 1942.
- US military began using it to fight malaria, typhus, and lice in WWII.
- US Army had more soldiers in hospital with malaria than casualties in Sicily in the summer of 1943.
- Testing began in Italy in August 1943. Malaria cases dropped by 80% by 1945.
- Used in Naples to battle typhus. 1.3 million people were dusted and typhus cases dropped to zero in two months.
- Paul Müller won the Nobel Prize in Medicine in 1948 for his discovery of DDT. DDT was praised by the National Academy of Sciences in 1970.
- A study by Patrick Buxton in 1945 found that large doses of DDT could cause liver changes and tremors, but no evidence of harm to people.
- DDT was found to be persistent: deposits would kill mosquitoes for weeks, and clothes dusted with DDT continued to kill lice after several washings.
- Derek Ratcliffe published findings of abnormally large numbers of broken eggs in Peregrine falcon nests in 1958.
- Illinois State Natural History Survey released a study that claimed DDT could be found in earthworms and affect robins at least one year later. 21 dying robins were found on campus between 1950 and 1952 with elevated levels of DDT.
- Rachel Carson drew on the findings of anti-DDT groups in her book Silent Spring.
- It was published in 1962 and turned into a TV presentation on CBS.
- The title referred to a letter written by a Hinsdale, Illinois resident after years of DDT spraying in the area: “The town is almost devoid of robins and starlings; chickadees have not been on my shelf for two years, and this year the cardinals are gone too; the nesting population in the neighborhood seems to consist of one dove pair and perhaps one catbird family. It is hard to explain to the children that the birds have been killed off… ‘Will they ever come back?’ they ask, and I do not have the answer.”
- The book was on the bestseller list for 86 weeks.
- Though nonfiction, it offered unsupported exaggerations of pesticide toxicity, including claims of children dying.
- It led to widespread support for banning DDT.
- Further investigations found it responsible for the decline in raptorial birds — peregrine falcons and bald eagles.
- EPA examiner Edmund Sweeney issued a report in 1972 concluding that DDT had essential uses and should not be banned. The EPA banned DDT six weeks later, citing concerns about:
- Concentration in land and marine organisms.
- Transference through the food chain.
- Persistence in soil for decades.
- Contamination of aquatic ecosystems.
- Lethality to beneficial insects.
- Toxicity to fish.
- Thinning bird eggshells and impairing reproduction.
- Possible carcinogenic harm.
- DDT was still used in parts of the US to fight typhus and plague-carrying fleas, weevils, and moths in the 1970s.
- “The persistence of DDT means that some bird populations have yet to revert to normal eggshell thickness: gains have been steady among Greenland’s peregrines for decades, but the return to pre-DDT normal may not take place until 2034.”
- DDT-resistant mosquitos showed up in the 1950s, thanks to natural selection and widescale DDT use.
- Stockholm Convention in 2004 outlawed 9 compounds and limited DDT to malaria control.
- The WHO claimed that DDT does not harm people or wildlife in 2004 and reiterated the claim in 2011.
- By 2000, over 50 species of mosquito were DDT-resistant. Malaria was endemic in 87 countries in 2019.
- “We know that acute exposures to DDT produce a variety of responses ranging from heightened excitability, tremors, dizziness, and seizures to sweating, headache, nausea, and vomiting. Chronic occupational exposures can lead to permanent behavioral changes ranging from diminished attention and the loss of synchrony between visual information and physical movement to a variety of neuropsychological and psychiatric symptoms.”
- “DDT’s indisputably positive role in eliminating malaria from many countries and reducing its burdens in others could have been even more positive had we not resorted to massive spraying of crops, which burdened the environment with a persistent pollutant and led to the widespread rise of DDT/DDE tolerance among targeted insects.”
- CFCs
- Ice was the only way to keep food and drinks cold before modern refrigeration was invented. The ice industry cut, transported, and stored ice to be sold in the 19th century.
- Jacob Perkins invented the mechanical refrigeration machine in 1834. His system was the foundation for industrial refrigeration.
- The first ice-making plant was built in Cleveland in 1855.
- The first meat-freezing plant was built in Sydney in 1861.
- Steam engines powered compressors until electricity offered a quieter, cleaner option.
- “The ideal refrigerant should be nonflammable, nontoxic, and nonreactive: if it gets spilled from a broken duct or from a malfunctioning compressor it should not ignite or asphyxiate or poison people or combine with other compounds it may encounter.”
- Ferdinand Carré invented a refrigeration cycle using ammonia (corrosive to skin, eyes, and lungs) in 1860, which is still used today in industrial systems.
- Alfred Mellowes designed his first Frigerator in 1915 but sold only a few. GM bought his company, Frigidaire. Charles Kettering, of GM’s Research Lab, looked for a better refrigerant.
- Kettering and Midgley (leaded gasoline inventors), introduced Freon, also known as dichlorodifluoromethane, in 1930. Freon was the first chlorofluorocarbon compound they synthesized.
- GM and DuPont created a joint stock company in 1930 to make Freon.
- GM sold 1 million refrigerators by 1929 and 2.25 million by 1932. 10% of US households owned a fridge in 1930, 60% by 1952, and 90% by 1960.
- “By the early 1970s all affluent countries had more refrigerators than color TVs, and more Americans were also benefiting from two important applications of the Perkins cycle: by 1970 about half of all households had air conditioning, and so had more than half of new cars.”
- “The concatenation of desirable CFC properties—stable, noncorrosive, nonflammable, nontoxic, and affordable—also made them the ideal choices for aerosol propellants (used in products from cosmetics to paints, and medical inhalers), the production of plastic insulants (including polyurethanes, phenolics, and extruded polystyrene), the cleaning of delicate electronic circuits, and the extraction of edible and aromatic oils.”
- CFC production grew from less than 550 tons in 1934, to 50,000 tons in 1950, 125,000 tons in 1960, and 815,522 tons at its peak in 1974. The US accounted for half that amount. About 10 million tons of CFCs were released into the atmosphere from the 1930s onward.
- James Lovelock designed a way to measure CFC levels in the atmosphere in 1970. He found average concentrations of 50 parts per trillion and concluded at the time that CFCs posed no hazard.
- Sherwood Rowland published a paper in 1974 that linked chlorine in CFCs to ozone destruction. He shared the Nobel Prize in Chemistry 21 years later.
- “Chlorine destroys ozone but then is released to start a new cycle of destruction, and a single atom of the gas can destroy on the order of 100,000 ozone molecules before it is eventually removed from the stratosphere by downward diffusion and reactions with methane. This was a highly worrisome hypothesis because the stratospheric ozone has been essential for the evolution of higher forms of life: without it, life on Earth would consist only of UV radiation–tolerant microbes and algae.”
- Rodolphe Zander reported the first evidence of CFCs in the stratosphere in 1975. Two global monitoring networks were set up, which showed a rise in concentration.
- US, Canada, Norway, and Sweden banned nonessential aerosols in 1978. Europe put a cap on CFC capacity in 1980.
- The Vienna Convention for the Protection of the Ozone Layer, of 43 nations, was convened in March 1985.
- Joseph Farman et al. published a paper in Nature on May 1, 1985, showing a reduction in the ozone in Antarctica.
- “The well-known longevity of CFCs—an atmospheric lifetime of between forty-six and sixty-one years for CFC-11 and between ninety-five and 132 years for CFC-12.”
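The quoted lifetimes can be read as e-folding time constants. A minimal sketch, assuming simple first-order decay (an idealization of the actual removal processes), shows why CFC-12 persists for generations after release:

```python
import math

# Fraction of an initial CFC release still airborne after t years, treating
# the quoted "atmospheric lifetime" as the e-folding time constant of
# first-order decay: N(t) = N0 * exp(-t / tau).
def remaining_fraction(t_years, lifetime_years):
    return math.exp(-t_years / lifetime_years)

# CFC-12, using the quoted 95-132 year lifetime range
for tau in (95, 132):
    frac = remaining_fraction(50, tau)
    print(f"lifetime {tau} y: {frac:.0%} still airborne after 50 years")
```

Even at the low end of the quoted range, well over half of a release is still in the atmosphere half a century later.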
- DuPont decided to end production of CFCs and assured that it could produce alternatives.
- The Montreal Protocol in 1987 cut production of CFCs by 50%. The London Amendments, added in 1990, phased out production entirely by 2000 in developed countries and 2010 in lower-income countries. The Copenhagen Amendments, in 1992, moved the phase-out date up to 1996.
- “When the Antarctic ozone measurements began in 1956, concentrations above the continent averaged about 300 Dobson units, and this level prevailed until the mid-1970s. The subsequent decline brought the concentrations to just above 100 Dobson units by 1995, and this was followed by stabilization and a slow recovery (rising minimum concentrations). The UN’s 2018 assessment concluded that the continent’s ozone layer was on the way to recovery and that pre-1980s levels might return by 2060.”
- In the worst-case scenario, had CFCs not been banned, simulations suggest 17% of the ozone would have been destroyed by 2020, and 67% by 2065.
- HCFCs
- Replaced CFCs.
- Had been known for decades.
- Also deplete the ozone layer, but at a fraction of the rate of CFCs.
- Also contribute to global warming.
- The signers of the Montreal Protocol agreed in 2007 to phase out the production of HCFCs by 2020, with low-income countries following by 2030.
- HFCs
- The next available substitute.
- Do not contain chlorine, so do not deplete ozone.
- Not controlled by the Montreal Protocol.
- Also contribute significantly to global warming.
- There are currently no better alternatives to HFCs that meet the ideal characteristics — nonflammable, nontoxic, and nonreactive — of a good refrigerant.
- “By 2020 there were some 1.8 billion air-conditioning units in operation, with more than half of them in just two countries, China and the US. But this is only a fraction of the potential total because among the nearly three billion people living in the world’s warmest climates, fewer than 10 percent have air conditioning, compared to 90 percent in the US or Japan.”
- “The threat posed by CFCs to stratospheric ozone was a truly unforeseeable failure of innovation.”
- Re leaded gasoline, CFCs, DDT – “The most encouraging lesson common to the history of these three notable failures has been our ability not only to come up with better alternatives but also to devise practical international arrangements to make the bans and substitutions effective (with some notable breaches) on a global scale.”
- “CFCs and DDT carry different, much more sobering but also expected lessons: human interventions in Earth’s environment often carry delayed, complex risks, so far removed from the initial concern and so far beyond the readily conceivable complications that only time and the accumulation of events will make us aware of those unexpected but highly consequential impacts.”
- James Clerk Maxwell’s development of his theory of electromagnetic waves from 1865 to 1873 led to wireless communications: radio, TV, mobile phones, the internet, and GPS.
- Julius Edgar Lilienfeld patented the first solid-state electronic device in 1925. His work ultimately led to the first transistor created by John Bardeen and Walter Brattain and then the junction transistor created by William Shockley.
- Not all innovations provide a foundation for further advancement. Some lead to insignificance or failure.
- Airships
- “In comparison to large (seating hundreds of passengers) yet sleek-looking modern jetliners, lighter-than-air (LTA) flying machines appear clumsy, outmoded, painfully slow, hopelessly inefficient, and incorrigibly weather dependent, and hence unfit for any mainstream use in modern aviation. But that most definitely was not the consensus opinion, expert or public, during the first four decades of the twentieth century.”
- Joseph-Michel and Jacques Étienne Montgolfier flew the first balloon, using hot air carrying a basket of small animals, on September 19, 1783, in front of the king.
- Jean Baptiste Marie Charles Meusnier developed a balloon with hand-cranked propellers in 1784 for control of movement.
- Jules Henri Giffard launched the first dirigible on September 24, 1852. It was powered by a steam engine and propeller. It managed a speed of 10 km/h for 27 kilometers.
- Charles Renard and Arthur Constantine Krebs made the first round trip (landing in the same place they took off) of a dirigible on August 9, 1884. It was powered by an electric motor.
- Friedrich Wölfert built the first airship powered by an internal combustion engine in 1897.
- Ferdinand Graf von Zeppelin built the first rigid dirigible with a suspended gondola in 1899. He piloted its first flight on July 2, 1900.
- The first passenger airline (via airship) was set up by Deutsche Luftschiffahrts-Aktiengesellschaft in November 1909. Over 1,500 people flew on 218 flights before WWI.
- Airships were praised as the next big thing in long-distance travel! Claims of crossing the Atlantic in hours were reported in newspapers.
- WWI ushered in potential military uses. Germany had 140 airships for reconnaissance and bombing during the war. The first airship bombing raid was on Liège on August 6, 1914. Britain was first attacked by airship on January 19, 1915.
- WWI peace treaty prevented Germany from building airships. Restrictions were eased in 1925.
- A British dirigible made the first round trip voyage from Scotland to New York in July 1919.
- Graf Zeppelin
- A German-made airship launched in 1928.
- It was the first to circumnavigate the planet. William Randolph Hearst helped finance the trip.
- It was grounded in June 1937 with 1.7 million kilometers of service, 13,000 passengers, 144 intercontinental trips, and 717 days aloft.
- The US controlled the world’s helium supply at the time, and the Helium Control Act of 1927 forbade export. Hydrogen was the primary lifting gas in non-US airships.
- Hindenburg
- Launched March 4, 1936.
- Was the largest airship at 245 meters long and 41 meters in diameter — 200,000 cubic meters in volume.
- Carried 70 passengers.
- Powered by 4 diesel engines with a speed of 122 km/h.
- It was used for Nazi propaganda flights on 17 intercontinental trips — 10 to the US and 7 to Brazil.
- Its first North Atlantic flight of 1937 left Frankfurt on May 3 and reached Lakehurst on May 6, where it caught fire while landing.
- It was the first commercial airship destroyed by fire with the disaster documented as it occurred: 5 news services filmed the fire and explosion.
- 35 of the 97 people on board died.
- WWII brought airships back into military use. The US was the only power to use a large number, used for minesweeping, reconnaissance, anti-sub patrols, and ship escorts. Only 1 was shot down.
- “Any realistic prospects for commercial airships on intercontinental routes ended even before World War II, and they did so not because of the Hindenburg catastrophe but because of advances in airplane propulsion.”
- Boeing’s B-314 Clipper, designed in 1936, could carry 68 passengers at a speed of 300 km/h. Commercial airplane speeds only increased from there; by 1958, Boeing’s 707 cruised at 897 km/h. That hasn’t stopped people from building airships for commercial use today.
- “Dreams of airships in other roles—above all as cargo carriers and platforms for scientific studies and for military reconnaissance—keep recurring and collapsing.”
- “Recent promoters of airships always note how the advances in materials, propulsion, and electronic controls could combine to produce a highly functional, very reliable, flexible, and economically acceptable (and also more sustainable) LTA cargo lift solution.”
- “All of these claims and plans have one thing in common: they pay little attention either to what any rapid expansion of LTA fleets would do to the supply of helium or what the actual revenue-earning time aloft might be.”
- Global helium estimates sit at 50 billion cubic meters. 40% of that is in the US, 20% in Qatar, and the rest in Russia, Algeria, and Canada. The US uses 40 million cubic meters per year primarily in MRIs, lifting gas, and laboratory applications.
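A back-of-envelope sketch of Smil’s helium point, using only the figures above plus one assumption (a Hindenburg-scale envelope of 200,000 cubic meters as the unit of airship lifting gas; real designs and top-up losses would differ):

```python
# Back-of-envelope: what airship fleets would mean for helium supply.
# Figures from the notes above; airship volume is an assumed Hindenburg-scale unit.
RESERVES_M3 = 50e9           # estimated global helium reserves
US_USE_M3_PER_YEAR = 40e6    # current annual US consumption
AIRSHIP_M3 = 200_000         # assumed envelope volume per large airship

years_at_us_rate = RESERVES_M3 / US_USE_M3_PER_YEAR
fills_per_year_of_us_use = US_USE_M3_PER_YEAR / AIRSHIP_M3

print(f"Reserves last {years_at_us_rate:.0f} years at the US consumption rate")
print(f"One year of US use = initial fill for {fills_per_year_of_us_use:.0f} large airships")
```

The reserves look large against today’s uses, but a single large airship swallows the equivalent of a meaningful slice of annual consumption, which is the supply question the promoters tend to skip.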
- Nuclear Fission
- Henri Becquerel discovered uranium’s radioactivity in 1896.
- Lithium was split into 2 helium nuclei in 1932.
- James Chadwick proved the existence of the neutron in 1932.
- Leo Szilard theorized in 1933 the possibility of a chain reaction driven by neutrons splitting an element. He believed beryllium, uranium, and thorium were possible candidates.
- Otto Hahn and Fritz Strassmann split uranium atoms in late 1938. Lise Meitner and Otto Frisch interpreted their results as nuclear fission in early 1939. WWII began 7 months later.
- The first nuclear sub was built in the US in 1954. It allowed subs to stay submerged for long periods.
- “Some Manhattan Project physicists considered the possibility of using nuclear reactors for electricity generation and concluded that it would be highly uneconomical. That sentiment prevailed even after the establishment of the US Atomic Energy Commission (AEC), and during the late 1940s and the early 1950s it was shared by America’s leading electricity-generating companies.”
- The Cold War (the need to beat the USSR) drove the construction of the first nuclear power station in the US. President Eisenhower pushed for it. The first nuclear power plant went online on December 18, 1957. 6 months after the Soviet plant, and 15 months after a British plant.
- The Price-Anderson Act in 1957 reduced the liability from catastrophic accidents of power companies to make nuclear power a less risky investment.
- Nuclear reactor orders increased in 1973 after the OPEC oil embargo. 42 new nuclear reactors were ordered by utilities.
- “During the late 1960s and early 1970s the AEC projected one thousand reactors operating in the United States by the year 2000, and in 1974 General Electric predicted that breeders would be commercially introduced by 1982, all fossil-fueled energy generation would be gone by 1990, and by the century’s end all but a tiny fraction of the electricity used in the US would come from breeder reactors.”
- Its downfall:
- The deceleration of electricity demand was the single biggest contributor to the cancellation of nuclear reactor orders.
- “During the 1920s electricity generation had almost exactly doubled, during the 1940s it grew nearly 2.2 times, between 1950 and 1960 it rose nearly 2.3 times, and between 1960 and 1970 it once again and almost exactly doubled; but during the 1970s it grew by a bit less than 50 percent, followed by about 33 percent growth during the 1980s, 25 percent growth during the 1990s, less than 9 percent gain during the first decade of the twenty-first century, and no growth at all between 2010 and 2019 (a comparison that includes COVID-19-affected 2020 results in a 3 percent decline).”
- Secondary causes were regulations on new plant construction, public distrust, the inability to make smaller reactors a reality, and construction delays and cost overruns on existing projects.
- The last two orders for nuclear plants in the US were in 1978 and 1979.
- The Washington Public Power Supply System planned to spend $2.5 billion on 2 nuclear plants in 1975. By 1982 the project was expected to cost $12 billion. Work was stopped and the utility defaulted in 1983; it was the largest municipal bond default in the US at the time.
- Global Nuclear Plants:
- 1970 – 132 reactors generating 173 terawatt-hours of power.
- 1988 – 416 reactors generating 1,727 terawatt-hours of power.
- 2020 – 443 reactors generating 2,500 terawatt-hours of power.
- France was the only European country to build a successful nuclear power program: nuclear supplied about 80% of its electricity, but its last plant was built in 1991.
- Only 10% of the world’s electricity generation was nuclear in 2021.
- “The most knowledgeable scientists, engineers, and utility managers were well aware of most of the challenges (from commercializing suboptimal reactor designs to indefensibly optimistic claims of competitive costs), inherent disadvantages, and less than appealing features of the new technique (radiation risks, fear of accidents, the need for lasting vigilance, security concerns, including terrorist attacks on nuclear stations and weapons proliferation). Commercial fission should have been developed more deliberately, more cautiously, and with much more attention given both to its public acceptance and to the eventual long-term storage of its radioactive wastes.”
- “When judged simply by its actual achievements, the post-1945 development of fission has been a “successful failure.” I began to use this contradictio in adjecto description before the end of the twentieth century, and the past two decades have only confirmed its accuracy. Though the grand promises of a new epoch ushered in by a brilliant high-tech solution failed, nuclear generation has been a partial (if very expensive) success.”
- Supersonic Flight
- The first powered flight occurred in Kitty Hawk, North Carolina on December 17, 1903.
- “Perhaps nothing illustrates better the subsequent speed of aviation advances than the fact that forty years after the first breakthrough, aircraft engineers were beginning to think seriously about designing a plane that would travel considerably faster than sound, shortening trips between Europe and North America to less than the time that elapses between breakfast and an early lunch.”
- Jet fighter planes were deployed in WWII in 1943 by Britain and Germany. The jets reached a max speed of 900 km/h versus the propeller fighter speed of 600 km/h.
- Mach:
- Is the ratio of object speed to the speed of sound.
- Sound travels faster at sea level than at higher elevations.
- Mach < 1 is subsonic.
- Mach = 1 is transonic.
- Mach 1 to 3 is supersonic.
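The Mach ratio can be made concrete with the standard speed-of-sound formula a = sqrt(γRT): sound speed depends on air temperature, which is why it is lower at cold cruise altitudes than at sea level. A sketch using standard-atmosphere temperatures and an assumed Concorde cruise speed of roughly 2,180 km/h (these inputs are textbook values, not from the notes):

```python
import math

# Mach number = object speed / local speed of sound. In air, the speed of
# sound is a = sqrt(gamma * R * T), so it falls with temperature (altitude).
GAMMA, R = 1.4, 287.05  # heat-capacity ratio and specific gas constant of air

def speed_of_sound(temp_k):
    return math.sqrt(GAMMA * R * temp_k)

def mach(speed_m_s, temp_k):
    return speed_m_s / speed_of_sound(temp_k)

print(f"speed of sound: {speed_of_sound(288.15):.0f} m/s at sea level (15 C), "
      f"{speed_of_sound(216.65):.0f} m/s at cruise altitude (-56.5 C)")
print(f"Concorde cruise (~2,180 km/h) at altitude: Mach {mach(2180 / 3.6, 216.65):.2f}")
```

This lands just above Mach 2 at altitude, comfortably under the Mach 2.2 airframe limit discussed below for the Concorde.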
- Chuck Yeager was the first to fly faster than the speed of sound in the X-1 on October 14, 1947.
- “In 1959 the annual report of the International Civil Aviation Organization acknowledged these developments and noted not only that ‘there is now general agreement amongst the potential manufacturers on the technical feasibility of producing a supersonic transport aircraft in the relatively near future—that is to say, by about 1965 to 1970’ but also that 1959 was ‘the year in which realization became general that such an aircraft not only was a practical possibility, but almost certainly would be the successor to the present jet transport.'”
- The Concorde:
- An agreement between Britain and France in 1962.
- 22 planes were produced between 1967 and 1979.
- Max speed was limited to Mach 2.2: below that, aluminum alloys sufficed; flights above it would have required titanium and special steels, adding to the cost.
- March 2, 1969 – first test flight.
- October 1, 1969 – reached Mach 1.
- November 4, 1970 – reached Mach 2.
- January 21, 1976 – the start of commercial operations.
- July 25, 2000 – runway debris punctured a tire during takeoff; fragments ruptured a fuel tank and everyone on board died in the crash. The plane was overloaded. All Concordes were grounded.
- October 23, 2003 – the last flight.
- It was never profitable.
- Soviet Tupolev Tu-144 – was a ripoff of the Concorde. Production ended in 1982 and it stopped flying in 1984.
- President Kennedy pushed for a US supersonic plane in 1963. The project was canceled a decade later due to poor efficiency, limited range, and high noise levels.
- Its downfall:
- Overcoming supersonic drag was the biggest problem. The drag coefficient peaks just above Mach 1, and minimizing drag means keeping the plane’s fuselage slender.
- Engines had to sustain Mach 2 economically. Higher speeds meant more structural mass to hold the plane together, making travel more costly than typical commercial flights. The Concorde burned 3x the fuel per passenger of a Boeing 747.
- Overcoming the environmental impact. A sonic boom has been equated to “the simultaneous takeoff of 50 jumbo jets.”
- Many supersonic planes have been announced with much hype and promise since the Concorde with nothing to show for it…yet.
- Hyperloop
- Elon Musk announced the idea of a Hyperloop — travel in a near-vacuum tube — on August 12, 2013. It was not a new idea.
- “The historical record shows that there is nothing new about any of these ideas, that the basic concept for the fifth mode of transportation has been around for more than two hundred years, and that during the intervening time various patents were filed, several detailed proposals were made, and some models and mock-ups of specific components were built. And yet not a single (near) vacuum- or low-pressure-tube, super-fast transportation project (be it for people or goods, or both) has been completed and put into operation.”
- George Medhurst in England pioneered the idea of traveling via tubes in 1810. He offered a more detailed concept in 1812 and revisited it in 1827.
- The London and Edinburgh Vacuum Tunnel Company:
- Formed in 1825.
- It was a joint stock company.
- Prospectus: “With a capital of Twenty Millions Sterling, divided into 200,000 shares, of £100 each, for the purpose of forming a Tunnel or Tube of metal between Edinburgh and London, to convey Goods and Passengers between these cities and the other towns through which it passes.”
- “The train would carry only goods because the tube would be just four feet (1.2 meters) in diameter, and passengers would be seated in railway carriages running on rails fastened to the tube’s top and coupled by strong magnets to the freight train inside the tube whose rapid progress would drag on the passenger train, covering nearly 800 kilometers in five minutes.”
- The London Mechanics’ Register ridiculed the idea; the technology to do anything that was claimed didn’t exist.
- National Pneumatic Railway Association conducted the first trial run in 1839 with a 50% vacuum and a speed of 48 km/h.
- South Devon Railway
- Attempted to install a vacuum on a 52-mile stretch of existing rail between Exeter and Plymouth.
- Work began in 1844.
- Work was stopped in 1847 after severe losses and malfunctions.
- “Short-lived (and short-distance) atmospheric railways ran between 1847 and 1860 near Paris, at London’s Crystal Palace in 1864 (just 550 meters), and under New York’s Broadway between 1870 and 1873 (a pneumatic subway track of a mere 95 meters).”
- The next great idea, proposed by Robert Goddard in 1904, was a magnetically levitated train inside a tube.
- Émile Bachelet got the US patent for a levitation apparatus in 1912.
- Robert Ballard Davy got a patent for a vacuum railway system in 1920.
- “None of these endeavors resulted in any practical results.”
- Robert Salter, of the Rand Corporation in 1972, came up with a “tubecraft” that rode on magnetic waves inside a tube.
- ” By 1978 Salter was suggesting that the ‘Planetran’ could be ‘extended to a worldwide network using under-ocean tunnels to connect continents’ and that it would be ‘safe, convenient, low-cost, efficient and non-polluting.’ What a perfect example of that common phenomenon of an inventor attached to his cherished project far beyond the boundaries of any critical appraisal!”
- The Hyperloop Alpha publication led to a repeat of history — new designs, hopes, promises, and new companies trying to make it a commercial reality. “None of the system’s often repeated advantages in comparison with high-speed rail—the absence of wheels (moving on air cushion or magnetically levitated), much faster operating speeds, significantly reduced energy use, lower construction costs—has been tested on even a single commercial project, and all such claims, until proven otherwise, remain in the category of wishful thinking.”
- Challenges:
- No materials currently exist that can safely and reliably withstand the tube’s vacuum pressures across large distances.
- Getting the public comfortable with traveling in claustrophobic pods at the speed of sound.
- Maintaining a near-perfect vacuum (an engineering first).
- Catastrophic decompression.
- “A steel tube on pylons would have to be engineered to maintain the thousandfold pressure difference between its inside and outside walls that threatens to crush it, and it would have to do so reliably along hundreds of kilometers of the track while also supporting the pressure generated by the rapidly moving pods and coping not just with overall thermal expansion along its course but with the differential thermal expansion between the tube’s top and bottom, an occurrence particularly significant in hot climates. With a common temperature variance of 50°C (−10 to +40°C), the system would require numerous expansion joints, each required also to maintain a near vacuum.”
- Underground tunnels would ease the pressure problem but would require an engineering first of their own: tunnels spanning thousands of kilometers. The world’s longest rail tunnel, the 57-kilometer Gotthard Base Tunnel, cost $10.5 billion and took 17 years to finish.
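The thermal-expansion problem in the quote above can be sanity-checked with simple arithmetic. This sketch uses the text’s 50°C swing; the steel expansion coefficient and the 600 km route length are illustrative assumptions, not figures from the book:

```python
# Unconstrained thermal expansion of a long steel tube: dL = alpha * L * dT.
ALPHA_STEEL = 1.2e-5      # 1/degC, typical for structural steel (assumption)
ROUTE_LENGTH_M = 600_000  # hypothetical 600 km route (assumption)
DELTA_T_C = 50            # the -10 to +40 degC swing cited in the text

expansion_m = ALPHA_STEEL * ROUTE_LENGTH_M * DELTA_T_C
print(f"Unconstrained expansion over the route: {expansion_m:.0f} m")
```

Hundreds of meters of movement would have to be absorbed by expansion joints, each of which must also hold a near vacuum — which is exactly why the quote singles out those joints.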
- Inventions Between 1867 and 1914:
- Internal combustion engines
- Electricity generation
- Electric Lights
- Electric Motors
- Low-cost steel production
- Telephones
- Plastics
- First electronic devices
- 1880s Alone (saw unprecedented innovation that became the foundation for the modern world):
- Bicycles
- Cash registers
- Vending machines
- Punch cards
- Adding machines
- Ballpoint pens
- Revolving doors
- Antiperspirants
- Coal-fired electric plant
- Hydroelectric plant
- Steam turbines
- Transformers
- Transmission (AC and DC)
- Meters
- Incandescent light bulbs
- Electric motors
- Elevators
- Streetcars
- First kitchen gadgets
- Motor cars
- Inflatable rubber tires
- Aluminum smelting
- Steel skyscrapers
- Wireless communication
- Nitrogen-Fixing Cereals
- A better understanding of nutritional requirements for healthy growth, faster population growth, and rising disposable incomes changed the dietary habits in the US. People could afford more plant-based foods and meat, eggs, and dairy products. The change in demand meant a larger portion of crop yields went toward animal feed.
- “By the end of the nineteenth century, growing feed for America’s horses and mules claimed about a fifth of the country’s abundant farmland. At the same time, average crop yields remained low (less than one ton per hectare for American and Russian wheat, no more than 1.5 t/ha even in the most productive European regions) even as the period of unprecedented farmland expansion (large-scale conversion of grasslands into cropland on North America’s Great Plains and the Canadian prairies, as well as in Russia, South America, and Australia) was coming to its end.”
- William Crookes, in 1898, offered a solution to the “deadly peril of not having enough to eat” by increasing crop fertilization with nitrogen. Existing fertilizers could not meet future demand; an alternative was needed.
- Fritz Haber synthesized ammonia in 1909.
- BASF began making ammonia in 1913. Production was diverted for explosives in WWI, used in fertilizer production again in 1918, diverted back to military uses in WWII, and large-scale fertilizer production went into effect after WWII.
- “By 1970 the global applications of synthetic nitrogen fertilizers were more than eight times the 1950 level. By the century’s end they had risen above 80 million tons a year, and recently they have been close to 120 million tons of nitrogen a year.
Their benefits are indisputable: I have calculated that no less than 40 percent of the global population receive their dietary protein (directly from crops and indirectly from animal foodstuffs) from harvests that got nitrogen from the Haber-Bosch synthesis of ammonia.”
- The downside of ammonia production:
- Almost half the nitrogen ends up in the environment, not the crops.
- Today that’s over 50 million tons (out of roughly 110 million tons produced) released into the environment each year.
- “Nitrogen leached into streams is transported into ponds and lakes, eventually reaching the shallow coastal ocean waters, where it supports the excessive growth of algae. When these algae die and sink to the bottom, their decomposition consumes dissolved oxygen and leaves the water anoxic, suffocating fish and marine invertebrates. These dead zones are now found in the Gulf of Mexico and along many European and East Asian shorelines.”
- Nitrogen oxide and nitrogen dioxide released during fertilization contribute to acid rain.
- Nitrous oxide, a byproduct of bacterial decomposition of nitrates, has roughly 300x the global warming potential of carbon dioxide.
- Heavy use of nitrogen fertilizers reduces soil’s natural fertility and biodiversity.
- In 1838, Jean-Baptiste Boussingault discovered that peas naturally add nitrogen to the soil.
- Hermann Hellriegel and Hermann Wilfarth theorized in 1888 that legumes had a symbiotic relationship with Rhizobium bacteria in their root nodules that accounted for nitrogen that did not exist in wheat, rice, barley, and oats.
- Experiments to make cereals behave like legumes and fix their own nitrogen began as early as 1917. Research ramped up in the 1970s.
- Norman Borlaug won the Nobel Peace Prize in 1970 for developing new, high-yield crops. His speech offered some “wishful science fiction” about the possibilities of nitrogen-fixing bacteria in cereals.
- “By the mid-1980s a major nitrogen fixation symposium ended by concluding that little of the experimental progress ‘has yet been applied in a practical sense to improve crop production.'”
- 3 Strategies to Bring Nitrogen Fixation to Cereals:
- Replicate the relationship between Rhizobium and legumes with cereals.
- “Nothing in biology makes sense except in the light of evolution.” – Dobzhansky’s Maxim
- “Evolution (more than 100 million years of diversification among higher plant species) has not endowed a single nutritionally important species outside of the Leguminosae family with the capacity for symbiotic Rhizobium-driven nitrogen fixation.”
- Enhance the bacteria present in the root zones of cereal plants.
- Using foliar sprays or treating seeds before planting might improve nitrogen production by bacteria at the roots but so far the results are inconclusive.
- Design new crops with nitrogen-fixing genes built into cereal plants (removing the need for the symbiotic relationship).
- Genetic engineering is difficult in this case specifically, and it also faces public trust issues.
- “As might be expected with modern media reporting, every news report of some notable research advance has been commonly seen as moving us ‘closer’ to the holy grail of nitrogen fixation in cereals—but ‘closer’ remains elusive.”
- Controlled Nuclear Fusion
- “The early Sun radiated nearly a third less energy than the star does now, about 4.5 billion years after its formation. Its ordinariness may be universally unremarkable but its energy production is astonishing: the Sun’s luminosity is about 3.8 × 10^26 watts (joules per second), while the world’s total primary energy consumption (all fuels and all hydro, nuclear, wind, and solar electricity) is about 1.8 × 10^13 watts, a difference of thirteen orders (tens of trillions).”
- “Reactions in the Sun’s core, proceeding under a pressure about 250 billion times more than at Earth’s surface, consume 4.3 million tons of matter every second and release 3.89 × 10^26 joules. This energy flux is rapidly converted into heat and transported outward, with every square meter of the Sun’s visible light-emitting layer radiating about 64 MW. Very little of that flux is absorbed before it reaches Earth’s orbit, and hence the power flux input available at the top of Earth’s atmosphere, the solar constant, is nearly 1,370 W/m^2. These explanations were needed because the quest for controlled nuclear fusion is attempting nothing less than to replicate the extreme circumstances that sustain the Sun’s enormous energy output and then use the resulting heat to generate electricity.”
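The “thirteen orders” of magnitude in the quote follow directly from the two power figures it gives:

```python
import math

SUN_LUMINOSITY_W = 3.8e26    # solar luminosity, from the text
WORLD_ENERGY_USE_W = 1.8e13  # world primary energy consumption, from the text

ratio = SUN_LUMINOSITY_W / WORLD_ENERGY_USE_W  # ~2.1e13, "tens of trillions"
orders_of_magnitude = math.log10(ratio)        # ~13.3
```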
- Conditions needed to make fusion possible:
- Maintain extraordinarily high temps.
- Sustained plasma density to raise the probability of nuclear collisions.
- Provide lasting plasma maintenance required for constant heat generation.
- “Unfortunately, the mass media have chronically misinterpreted all announcements of experimental controlled fusion advances in two ways. First, they have routinely labeled every gain as a breakthrough, and second and more important, they do not make it clear that these “breakthroughs” bring controlled nuclear fusion closer only in the sense of being a proven (rather than a theoretical) possibility, not to being an actual commercial operation widely deployable to generate heat and electricity.”
- International Thermonuclear Experimental Reactor (ITER):
- Is a joint effort between 35 countries.
- Construction started in France in 2010.
- Built to test the feasibility of energy generation and a foundation for later commercial designs.
- Operation was originally scheduled for 2016, postponed to 2020, and postponed again to 2025.
- Cost estimates started at €5 billion, ran €4 billion over budget by 2016, and ballooned to an estimated €65 billion by 2018.
- “ITER will not capture any outgoing heat to be used for electricity generation and will not attain a state of continuous fusion: it will generate pulsed net energy (Q > 1) only when the ratio is calculated by dividing the heat energy output by the energy used to heat the plasma (50 MW), not by the total electricity consumption of the facility. ITER’s total electrical power demand will be about 300 MW, mostly for the cryogenic plant needed to cool the superconductor magnets to −269°C and power them to produce a 15 megaamperes plasma current.”
- “If a fully functioning ITER were a real electricity-generating plant it would, with 40 percent efficiency, convert 500 thermal MW into 200 MW of electricity, for a net power loss of 100 MW. This means that any commercial fusion plant will have to operate with a substantially higher Q in order to produce electricity whose cost would be competitive with today’s, and the future’s, alternatives.”
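The power accounting in the two quotes above can be laid out explicitly. All figures come from the text; the 40 percent steam-cycle conversion efficiency is the assumption the second quote uses:

```python
FUSION_OUTPUT_MW = 500     # thermal fusion power (from the text)
PLASMA_HEATING_MW = 50     # heating power used in ITER's narrow Q definition
FACILITY_DEMAND_MW = 300   # ITER's total electrical demand (from the text)
CONVERSION_EFFICIENCY = 0.40  # assumed heat-to-electricity efficiency

q_plasma = FUSION_OUTPUT_MW / PLASMA_HEATING_MW             # Q = 10 by the narrow definition
electric_out_mw = CONVERSION_EFFICIENCY * FUSION_OUTPUT_MW  # 200 MW, if the heat were captured
net_mw = electric_out_mw - FACILITY_DEMAND_MW               # -100 MW: still a net consumer
```

The headline Q ignores the facility’s 300 MW draw; counting it, a fully functioning ITER would consume more electricity than it could generate.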
- “ITER is intended to be run for twenty years before construction is started on DEMO, a fusion demonstration power plant whose anticipated dates slip with ITER delays. The original 2021 ITER timeline had the DEMO operating in the early 2040s, but in 2017 it was announced that 2054 is an optimistic date.”
- “During the past seven decades the world has spent at least $60 billion (in 2020 monies) on developing controlled fusion, but it remains perhaps the most stubbornly receding fata morgana on record: always to be reached after yet another thirty years.”
- “The two key challenges are problems with containment and fuel assemblies and the large requirements for parasitic power. Neutron streams generated by deuterium-tritium fusion will damage the solid containment vessel by swelling, fracturing, and embrittlement, imperiling its integrity.”
- “This book has only modest goals: to remind us that success is only one of the outcomes of our ceaseless quest for invention; that failure can follow initial acceptance; that the bold dreams of market dominance may remain unrealized; and that even after generations of (sometimes intensifying) efforts, we may not be any closer to the commercial applications first envisaged decades ago. And what is true about the past is, despite recent claims to the contrary, likely to be repeated in the future.”
- Basic Lessons:
- Every major innovation brings concerns or undesirable consequences that are either immediately obvious or become known only well after the fact.
- Rushing to secure a commercial market share or producing the most cost-efficient product may not offer long-term success.
- It’s difficult to know the public acceptance, societal fit, or commercial success of an invention in its early stages of development.
- Skepticism is appropriate for the most challenging problems. Perseverance and unlimited financing are no guarantee of success.
- “Both the acknowledgments of reality and the willingness to learn, even modestly, from past failures and cautionary experience seem to find less and less acceptance in modern societies where masses of scientifically illiterate, and often surprisingly innumerate, citizens are exposed daily not just to overenthusiastically shared reports of potential breakthroughs but often to vastly exaggerated claims regarding new inventions. Worst of all, news media often serve up patently false promises as soon-to-come, fundamental, or, as the current parlance has it, ‘disruptive’ shifts that will ‘transform’ modern societies. Characterizing this state of affairs as living in a postfactual society is, unfortunately, not much of an exaggeration.”
- Breakthroughs Worthy of Skepticism
- Mars colonization
- Science fiction has been repeated as fact going back to 1950.
- Brain-computer interfaces (BCI)
- “Nearly four thousand news items on BCI published between 2010 and 2017. The verdict is clear: not only was the media reporting overwhelmingly favorable, it was heavily preoccupied with unrealistic speculations that tended to exaggerate greatly the potential of BCI (“the stuff of biblical miracles,” “prospective uses are endless”). Moreover, a quarter of all news reports made claims that were extreme and highly improbable (from “lying on a beach on the east coast of Brazil, controlling a robotic device roving on the surface of Mars” to “achieving immortality in a matter of decades”) while failing to address the inherent risks and ethical problems.”
- Self-Driving Cars
- “Forecasts of completely autonomous road vehicles were made repeatedly during the 2010s: completely self-driving cars were to be everywhere by 2020, allowing the operator to read or sleep during a commute in a personal vehicle. All internal combustion engines currently on the road were to be replaced by electric vehicles by 2025: this forecast was made and again widely reported as a nearly accomplished fact in 2017.”
- Only 2% of the 1.4 billion vehicles around the world in 2022 were electric (and 60% of all electricity generation came from coal and natural gas).
- Medical Research
- “In 2014 a study of nearly five hundred biomedical and health-related science press releases published in the British Medical Journal found that 40 percent of those announcements contained exaggerated advice, a third of them contained exaggerated causal claims, and nearly 60 percent of subsequent news stories based on such releases also contained such exaggerations. Far more remarkably, even completely unsubstantiated claims are now wholesaled as facts and, incredibly, are even approved for use by the very authorities whose duty it is to prevent such a turn of events.”
- AI
- Is often misunderstood as “intelligence” when it’s more about pattern recognition on a much more efficient scale.
- “What we have done, often quite effectively, is to deploy some fairly rudimentary analytical techniques to uncover patterns and pathways that are not so readily discernible by our senses but that can be captured, remembered, recalled, and acted upon by computers at scales and speeds unattainable by humans. That is how IBM’s Deep Blue beat Kasparov in chess; that is how a program trained on hundreds of thousands of actual x-ray images can discern a cancerous lesion in breast tissue.”
- “People are getting confused about the meaning of AI in discussions of technology trends—that there is some kind of intelligent thought in computers that is responsible for the progress and which is competing with humans. We don’t have that, but people are talking as if we do.” — Michael Jordan, AI researcher, UC Berkeley
- “Neural networks are not only brittle (good at specific tasks but deeply deficient in general intelligence, and hence easily overconfident or underconfident in their “judgment”) but biased (realities may be far more complex than the training algorithms), prone to catastrophic forgetting, poor in quantifying uncertainty, lacking common sense, and, perhaps most surprising, not so good at solving math problems, even those routinely mastered by high school competitors.”
- “The conclusion is obvious: our quest for AI is an enormously complex, multifaceted process whose progress must be measured across decades and generations and whose impressive achievements on some relatively easy tasks coexist with the much larger realm of intelligence that remains well beyond the capabilities of programmed machines.”
- Global Decarbonization
- “Half the technology needed to get zero emissions either doesn’t exist yet or is too expensive for much of the world to afford.” — Bill Gates, 2021
- “That global warming will get worse before it gets better is a foregone conclusion: even an instant (and totally theoretical) cessation of all greenhouse gas emissions could not bring an instant stabilization and decline of the average tropospheric temperature.”
- Fossil fuels supplied 87% of the world’s energy in 2000 and 83% in 2020, an annual reduction of 0.2 percentage points. Going from 83% to 0% in 30 years requires a cut of about 2.75 points per year, roughly 14x faster than the pace of the last two decades.
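A minimal sketch of the percentage-point arithmetic behind that comparison:

```python
fossil_share_2000 = 87.0  # % of world primary energy, from the text
fossil_share_2020 = 83.0
years_elapsed = 20        # 2000 -> 2020
years_remaining = 30      # 2020 -> 2050

past_rate = (fossil_share_2000 - fossil_share_2020) / years_elapsed  # 0.2 points/year
needed_rate = fossil_share_2020 / years_remaining                    # ~2.77 points/year
speedup = needed_rate / past_rate                                    # ~14x faster
```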
- A 40% cut in carbon dependence means more than 500 million tons of iron per year would have to be smelted using green hydrogen by 2030. No steel plants exist as of 2022 that use green hydrogen.
- A 40% cut in carbon dependence would mean 570 million new electric (or hydrogen/ammonia-fueled) vehicles made by 2030 or about 63 million new cars per year — more than the total global car production in 2019.
- A 40% cut in carbon dependence would mean 10,000 electric or hydrogen-fueled commercial aircraft by 2030. The technology for that doesn’t exist today.
- “The faster we move to adopt noncarbon energy-producing processes, the more we will have to rely on carbon-based production and transportation methods that cannot be replaced rapidly with noncarbon processes even if those were readily available—and in most cases they are not.”
- “These are the foremost generic lessons: basic (scientific and technical) understanding must precede specific applications (perhaps the most obvious but repeatedly ignored reality); critical variables may get worse before they get better; it is unwise to specify outcomes by dates; even near-term targets, no more than ten years away, will be missed; some very impressive advances will take place alongside barely changing realities; intra- and international differences (for a variety of reasons) will continue to be significant; initial cost estimates will escalate; and the gains may be partially negated by new developments, undermining the hard-won achievements.”
- “Unruly complexities and uncertain outcomes find no favor in the modern discourse, which swings between the collapsing civilizations and ever more enticing futures.”
- Exponential Growth in Innovation?
- Solid-State Electronics
- Exponential growth has only been found in innovations tied to electronics since the advent of the transistor in the 1940s, integrated circuits in the 1960s, and microprocessors in the 1970s. It was an exception, not the norm.
- Moore’s Law has warped our sense of what’s possible as it relates to the pace of innovation in areas outside electronics.
- “In August 1969, two years before the first microchip appeared, the Apollo 11 computer that guided the capsule to land on the Moon packed just 62 bytes of random access memory (RAM) per kilogram of its (at 32 kilograms clearly nonportable) mass. In 2022 an ordinary Dell laptop used to write this book had about 3.5 gigabytes of RAM per kilogram of its portable (about 2.2 kilograms) mass, or a 1.75 billion-fold gain in performance.”
- “Such stunning gains—so large that most readers would not have noticed had I written million or trillion instead of billion—taking place within such relatively short periods of time leave deep impressions and we notice them far more, and perceive them to be disproportionately more important, than the unchanging or marginally evolving fundamentals of our lives.
Moreover, these admirably rapid exponential gains are seen as harbingers or foundations of similarly impressive gains in other realms of reality.”
- Growth in processor performance has slowed from 52% per year (1986–2003) to 23% per year (2003–2011) to less than 4% per year (2015–2018).
- Every Sector outside of electronics:
- No exponential growth in food production, crop yields, energy efficiency gains, transportation speeds, large engineering projects, rate of drug discovery, and human longevity.
- “Annual gains in our food, materials, and energy production have been only a small fraction…resulting from very low rates of exponential growth, mostly on the order of 1–2 percent a year; the first rate increases the initial value only 1.65 times in fifty years, while exponential growth of 2 percent a year will have an outcome 2.7 times higher after half a century.”
- “Most of the world’s electricity is generated by large steam turbines whose efficiency got better by about 1.5 percent per year during the past hundred years. We keep making steel more efficiently, but the annual decline in energy use in the metal’s production averaged less than 2 percent during the past seventy years. And, as already noted (and setting aside the failed Concorde), the average speed of jet flight has not seen any increase since 1958.”
- “In 1900 the best battery (lead-acid) had an energy density of 25 watt-hours per kilogram; in 2022 the best lithium-ion batteries deployed on a large commercial scale (not the best experimental devices) had an energy density twelve times higher—and this gain corresponds to exponential growth of just 2 percent a year.”
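The “just 2 percent a year” in the battery quote is what a twelvefold gain compounds to over the 122 years from 1900 to 2022:

```python
gain = 12            # energy density improved 12x, from the text
years = 2022 - 1900  # 122 years

annual_rate = gain ** (1 / years) - 1  # ~0.021, i.e. about 2% per year
```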
- “Breakthrough patents in the furniture, textiles, and apparel industries, in transportation equipment, machinery manufacturing, metal manufacturing, wood, paper, and printing, and in construction all peaked before 1900. Mining and extraction, the coal and petroleum industries, mineral processing, electrical equipment production, and plastics and rubber products had their innovative waves and peaks before 1950, and the only industrial sectors with post-1970 peaks have been agriculture and food (the wave dominated by genetically modified organisms), medical equipment (from MRI and CT scanners to robotic surgical tools), and, of course, computers and electronic products.”
- “Exponential growth does not mean that every variable whose increase is described by it is growing rapidly. A linearly growing variable increases by the same amount during the same period of time, while an exponentially growing variable increases by the same rate during the same period, and if that rate is very low it will take a long time to see any substantial difference.”
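The quote’s distinction can be made concrete by compounding the rates mentioned earlier in the section; the 35% electronics-style rate below is an illustrative assumption, not a figure from the text:

```python
def growth_after(rate: float, years: int) -> float:
    """Multiple of the initial value after compounding at `rate` per year."""
    return (1 + rate) ** years

low = growth_after(0.01, 50)         # ~1.64x in fifty years, as the text states
high = growth_after(0.02, 50)        # ~2.7x
moore_like = growth_after(0.35, 50)  # millions-fold (illustrative rate)
```

Both 1% and 2% growth are technically exponential, yet after half a century they remain within a factor of three of the starting point, while the electronics-style rate has exploded.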
- “Reasoning with true believers—be they of religious or ideological persuasion or cornucopian techno-optimists—is not an option.”
- “In the grand scheme of things, improving what we know and making it universally available might bring more benefits to more people in a shorter period of time than focusing overly on invention and hoping that it will bring miraculous breakthroughs.”