Why Society Needs Astronomy and Cosmology Transcript
Date: Tuesday, 15 March 2016 - 6:00PM
Location: Museum of London

Why Society Needs Astronomy and Cosmology
Dr Roberto Trotta

"One day, Sir, you may tax it!"

In 1850 the Chancellor of the Exchequer, William Gladstone, reportedly visited Michael Faraday's laboratory at the Royal Institution of Great Britain. Faraday's reputation as one of the greatest scientists of his time was fully established: thanks to his experimental ingenuity and scientific mind, Faraday had made several ground-breaking discoveries in chemistry and physics, many of which would form the basis of virtually all modern technology. Faraday discovered the principle of electromagnetic induction (that a moving magnet generates an electric current and, vice versa, a time-varying current produces a force on a magnet), which underpins electric motors and electric generators; he discovered diamagnetism (an effect that can be used to levitate objects in a magnetic field with no power consumption); and he championed the dialogue between scientists and the public, presenting a series of Christmas Lectures at the Royal Institution whose tradition continues to this day.

But in 1850 Faraday's work with electricity and magnetism looked like little more than a curiosity to most of his contemporaries – and certainly to politicians, who could not begin to foresee the revolutionary implications of his discoveries. To the Chancellor's questions about the practical use of electricity, Faraday reportedly replied: "One day, Sir, you may tax it!"

Faraday's legacy

We do pay tax on electricity today – no fault of Faraday's, to be sure, but a testament to the ubiquitous nature of the technology that sprang from his research into the fundamental laws of nature. The next time we pay with a 20-pound note (for those of us who live in the UK), we might want to look more closely at the side opposite the one bearing the Queen's portrait. A bemused Faraday peeks out and, while delivering one of the Christmas Lectures, appears to be reminding us of where it all came from: curiosity-driven research and Faraday's thirst for understanding what appeared to be quirky, unexplainable phenomena.

While an extraordinarily accomplished experimentalist, Faraday had little formal education and no great mastery of mathematics. It would fall to James Clerk Maxwell to put Faraday's findings into a mathematical form that could be understood from first principles. This led to the four celebrated Maxwell Equations, published in the early 1860s, which to this day describe and govern all electric and magnetic phenomena; the equations themselves are written out at the end of this passage.

The development of the technology that would soon enter every household proceeded at an appropriately lightning pace: in 1876, Alexander Graham Bell transmits the first message via telephone; in 1880, Edison patents the incandescent light bulb; in 1888, Heinrich Rudolf Hertz demonstrates that radio waves can be transmitted through the air, paving the way for the wireless telegraph patented by Marconi in 1896. The first BBC broadcast from London would follow on 14 November 1922. In 1904, the German inventor Christian Hülsmeyer patents a device he called the "telemobiloscope". His invention did not take hold at the time, but it would become of great importance during World War II, when both Germany and the Allies made quick strides towards building a fully functioning radar.
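Returning briefly to Maxwell: the four celebrated equations, quoted here in their modern differential (vector) form rather than in Maxwell's original notation, read

\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
\]

In words: electric charges create electric fields, there are no magnetic charges, a changing magnetic field creates an electric field (Faraday's induction), and currents and changing electric fields create magnetic fields. Taken together, the last two imply electromagnetic waves travelling at the speed of light – the insight that unified electricity, magnetism and light.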
In the post-war period, the technology that electricity enabled mushroomed everywhere, changing every aspect of life and defining modernity: in the house (think of electrical and motor-powered kitchen appliances); in the air (airplanes relying on radar and radio telecommunication); throughout society (the rapid diffusion of TV after the war, and its impact on mass media and culture); and all the way to the Moon, with the space programme being inconceivable without electricity (Jules Verne notwithstanding!).

Little new left to discover?

At the end of the 19th century, the prevailing view seemed to be that little new was left for scientists to discover: Lord Kelvin famously quipped, around 1900, that "There is nothing new to be discovered in physics now. All that remains is more and more precise measurement." And how wrong he was! Very soon, what had appeared to be minor discrepancies in the solid edifice of physics turned out to be deep cracks that would send the entire building tumbling down.

The puzzling fact that the speed of light did not seem to change with the relative velocity of the observer and the source (as demonstrated by Michelson and Morley in 1887) was a nagging problem for the "aether theory". This theory postulated the existence of a medium through which light (i.e., electromagnetic waves) could propagate, for otherwise it seemed inexplicable how light could travel through empty space. But the failure to detect any change in the speed of light throughout the year (when the "aether wind" should change from June to December because of the motion of the Earth around the Sun) led to a crisis. That crisis was resolved in 1905, when Einstein formulated the Special Theory of Relativity. His revolutionary step was to stipulate that the speed of light is constant with respect to any observer. This simple postulate had far-reaching consequences: for example, time and space become intertwined, and different observers measure times and lengths to be different according to their relative state of motion.

At the same time, a major revolution was underway in our understanding of the microscopic world. Once again, it was ushered in by the stubborn resolve of some scientists to grasp the deep causes of what seemed to be fairly minor glitches in the behaviour of physical systems – phenomena that could not be understood with the physics of the time. Why, for example, were the charged particles called electrons emitted by a metal only in discrete bursts of energy when a beam of light was shone on it, and not in a continuous stream (the photoelectric effect, for whose explanation Einstein was awarded his only Nobel Prize, in 1921)? Or what was the distribution of energy emitted by a body at a certain temperature (a problem solved by Max Planck in 1900)? These fairly exotic phenomena led physicists to the idea that energy is not a continuously distributed quantity, but rather comes in discrete packets, called "quanta". It was the dawn of a radically new theory of the microscopic world: quantum mechanics, about which Niels Bohr, one of its founding fathers, had this to say: "Anyone who is not shocked by quantum theory has not understood it."
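Both revolutions can be summarised in a handful of formulae, quoted here in modern textbook notation. In special relativity, a clock moving at speed v relative to an observer is measured to run slow by the Lorentz factor:

\[
\Delta t = \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}},
\]

which is the precise sense in which measured time depends on the state of motion (an analogous contraction applies to lengths). In quantum theory, Planck and Einstein proposed that light carries energy in discrete packets of

\[
E = h\nu, \qquad \text{so that} \qquad K_{\max} = h\nu - \phi
\]

is the maximum kinetic energy of an electron released in the photoelectric effect, where h is Planck's constant, ν the frequency of the light and φ the work function of the metal – which is why no electrons are emitted below a threshold frequency, however intense the beam.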
Apart from giving us a very detailed description and understanding of the sub-atomic world, quantum mechanics is at the heart of countless applications: from atomic clocks to nuclear power (and, sadly, nuclear weapons), from lasers to medical scanners, from the semiconductors in computers and consumer electronics to the quantum computers of the near future, today's technology is inconceivable without it.

Knowledge from the heavens

The study of the heavens is perhaps the most striking example of research done first and foremost to satisfy our thirst for knowledge. It is at first sight hard to conceive of sciences further removed from everyday life and its concerns than astronomy and cosmology. After all, most of the physical phenomena of the cosmos are separated from us by almost unimaginably vast distances. They happen either over very long times (the lives of stars span billions of years) or in catastrophically short-lived, highly energetic events (the merging of two black holes in a fraction of a second, emitting gravitational waves like the ones recently recorded by the LIGO instruments). The Universe is anything but human in scale: too big, too old, too distant to be of any practical use to us.

Or is it? A closer look reveals that astronomy and cosmology have, many times, changed the way we understand our world: from Galileo discovering the four largest moons of Jupiter (1610), to Hubble establishing that the Universe is expanding (1929), to Penzias and Wilson discovering the relic radiation from the Big Bang itself (1964), observations of the cosmos have led to radical shifts in our understanding of our place in the Universe and our significance in it.

Big Science

Today, the study of the Universe is very different from what it was even in the 1960s, when Penzias and Wilson literally stumbled on the left-over heat from the Big Bang while experimenting with a microwave receiver as a way of establishing an early form of satellite telecommunication. The myth of the lonely astronomer spending long, cold nights at his telescope (yes – unfortunately, part of the myth remains today that scientists, and in particular astronomers, are always male, something that countless talented female colleagues are working hard to dispel) is deeply ingrained in popular culture. But those days are largely over.

This is not to say that so-called "amateur" astronomers (who are often better observers than the pros!) cannot make important contributions to the field today. In fact, the advent of relatively affordable CCD technology and the digitisation of observational astronomy mean that the cosmos is more than ever within reach of an ever-increasing number of members of the public. The study of transient phenomena such as supernova explosions has greatly benefitted from a dedicated community of non-professional astronomers. Citizen-science projects like GalaxyZoo or SETI@Home now give everybody a chance to step in and contribute to cutting-edge research from the comfort of their armchair: all that is needed is a computer and an Internet connection. However, by and large astronomy and cosmology are moving in the direction of becoming more and more Big Science (and hence big-money) projects.