
2018 Lightning Round – Physics Subfields

Experimental Condensed Matter Physics – Lucas Peeters

More is Different isn’t just the title of Phil Anderson’s classic essay on the limitations of reductionism in science - if condensed matter physics were ever in need of an official tagline, it would also be a prime candidate. Indeed, why do we even need a field dedicated to the behavior of the trillions of electrons in materials all around us? Didn’t particle physicists figure out all the fundamental properties of those particles a while ago? But more is indeed different. The zoo of emergent phenomena arising when we put myriads of such particles together - magnetism, superconductivity and multiferroicity, to name a few - requires its own language, its own concepts and its own tools. Our understanding of and control over such effects are not only a rewarding scientific enterprise in their own right, but have also spurred a host of technological breakthroughs which continue to transform our everyday lives. For example, having been at the root of our current, silicon-based computing architecture, condensed matter physics is now also one of the main contenders to deliver the first industrially useful quantum computing devices.

Experimental condensed matter physicists have several approaches at their disposal to advance our understanding. Materials growth is one of these. This field, with its often-exceptional sensitivity to a slew of control parameters, can appear like a dark art to many outsiders. Those who master it, though, can discover materials which combine effects in ways never seen before, and their controlled realization of such systems can provide invaluable insight. Not all effects are visible at all length scales though, so others specialize in the fabrication of micro- or even nanoscale devices, combining different materials and techniques to create minute structures which bridge the gap between the single-atom scale and macroscopic scales. Another line of research focuses on the development of new measurement techniques; the boundary of spatial and temporal resolution is always being pushed, and increased access to the charge and spin properties of electrons can shed new light on long-standing questions.

Since its debut about a century ago, condensed matter physics has dramatically advanced our understanding of electronic behavior in a wide range of contexts. If the past is any guide, the future will give us even more, and it’ll be even more different.

Computational Condensed Matter Physics – Yanbing Zhu (some sections from Harrison Ruiz)

What is it?

Theoretical/computational condensed matter physics involves studying condensed phases of matter. This usually refers to solid states of matter (anything you can touch) plus liquids. A big goal is characterizing materials and the phenomena they host - e.g. finding materials with certain properties, designing a new material from scratch, understanding a (quantum) phase transition, or studying how a material behaves under external influences.

Examples of properties studied are stability, conductivity, specific heat, and magnetic susceptibility. Often these properties are studied in exotic phases of matter such as superconductors, topological insulators, and spin liquids, where unusual and often purely quantum mechanical phenomena are observed in solids. These phases can be studied both in and out of equilibrium.

There’s a big interplay between physics, materials science, chemistry, and, of course, computational tools. The boundaries between these subfields become blurred: statistical mechanics, scattering behavior, analytical models, and theories of strongly correlated phenomena come from physics; considerations like the behavior of a material under strain or stress, dislocations, and grain boundaries come from materials science; and elemental properties, reaction properties, and bonding rules come from chemistry.

Why do we do it?

Research is driven both by fundamental science questions and by more practical purposes, and oftentimes both goals can be part of the same project. For instance, topological materials are interesting because their exotic behavior was only fairly recently understood, but they are also of interest for quantum computing since they promise protection against decoherence. Battery and solar materials, like new lithium-based electrolytes or hybrid organic-inorganic perovskites, are studied for their practical applications (computational/theory work is often used to guide experiments or explain behavior seen in the lab). The work also enables us to better understand topics like conductivity or to figure out which models and approximations work in certain limits.

What does a typical researcher do?

The daily work consists of making calculations using various models and possibly collaborating with experimentalists working on the same problem. The way the calculations are done can vary depending on the type of research. On one end, you have computational condensed matter physicists who focus on numerical calculations. This can involve using methods such as density functional theory (DFT), quantum Monte Carlo, molecular dynamics, or exact diagonalization. Running these types of calculations can involve either using libraries designed for ab initio simulations or writing code from scratch. Common programming languages used for this are C++ and Python. In many cases parallel computing techniques are used, as the problems are computationally intensive and even some of the simplest models can only be solved exactly for around 20 particles. Recently, machine learning tools have also become more popular.
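
To give a concrete flavor of the exact-diagonalization approach mentioned above, here is a minimal sketch (not any particular group's code) that builds the Hamiltonian of a small transverse-field Ising chain with NumPy and finds its ground-state energy; the model, couplings, and chain length are chosen purely for illustration.

```python
# Minimal exact-diagonalization sketch (illustrative only): ground state of a
# small transverse-field Ising chain, H = -J sum_i Z_i Z_{i+1} - h sum_i X_i.
import numpy as np

def spin_chain_hamiltonian(n_sites, J=1.0, h=0.5):
    """Build the 2^n x 2^n Hamiltonian as a dense matrix (fine for n <~ 12)."""
    I = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=float)
    Z = np.array([[1, 0], [0, -1]], dtype=float)

    def op_on_site(op, site):
        # Tensor product of identities with `op` acting on a single site.
        mats = [op if i == site else I for i in range(n_sites)]
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out

    H = np.zeros((2**n_sites, 2**n_sites))
    for i in range(n_sites - 1):                 # nearest-neighbour ZZ coupling
        H -= J * op_on_site(Z, i) @ op_on_site(Z, i + 1)
    for i in range(n_sites):                     # transverse field
        H -= h * op_on_site(X, i)
    return H

H = spin_chain_hamiltonian(n_sites=8)
energies = np.linalg.eigvalsh(H)   # matrix dimension grows as 2^n
print("ground-state energy:", energies[0])
```

The 2^n growth of the matrix dimension is exactly why exact methods top out at around 20 particles, and why approximate schemes like quantum Monte Carlo and DFT are needed beyond that.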

Other theorists may use analytical techniques to work on these problems. Some of the approaches they may use are mean-field theory, the renormalization group, and quantum field theory. There is not always a clear division between the computational physicist and the traditional theorist, so many researchers use a combination of numerical and analytical methods. Some experimentalists also use computational tools to compare with their experimental findings.
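
As a small taste of the analytical side, mean-field theory for the Ising ferromagnet reduces the many-body problem to a single self-consistency equation, m = tanh(beta J z m). The sketch below (with parameters chosen only for illustration) solves it by fixed-point iteration and shows the magnetization vanishing above the mean-field critical temperature.

```python
# Mean-field Ising model (illustrative sketch): solve m = tanh(beta * J * z * m)
# by fixed-point iteration and scan temperature through the transition.
import numpy as np

def mean_field_magnetization(T, J=1.0, z=4, tol=1e-10, max_iter=10_000):
    """Self-consistent magnetization at temperature T (units with k_B = 1)."""
    beta = 1.0 / T
    m = 1.0                       # start from the fully ordered state
    for _ in range(max_iter):
        m_new = np.tanh(beta * J * z * m)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

for T in [1.0, 2.0, 3.0, 3.9, 4.1, 5.0]:   # mean-field T_c = z * J = 4 here
    print(f"T = {T:4.1f}  ->  m = {mean_field_magnetization(T):.4f}")
```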

Experimental Particle Physics - Kelly Stifter

Particle physics is the study of the smallest pieces of matter and their interactions. The biggest success of the field is the Standard Model, which describes all known particles and the interactions between them (except gravity). Despite the success of the Standard Model, there are still many mysteries to be solved that are considered “beyond” the Standard Model, as well as some open questions within the Standard Model itself. On the experimental side, scientists build detectors in order to learn about these particles. When conducting these experiments, we often try to do one of two things. First, we search for new physics that is not included in the Standard Model. Second, we precisely measure, or look for deviations from, “known” physics.

We can break it down even further and specify three main ways that we conduct these searches or measurements. They are often referred to as “frontiers”, since in each category we push the limits of our current technology and understanding.

First is the energy frontier, which uses the world’s largest particle accelerator, the LHC, to recreate the universe as it was a billionth of a second after the Big Bang and make huge discoveries about the smallest pieces of the universe. The two main experiments on the LHC are CMS and ATLAS. Together, these two detectors discovered the Higgs boson in 2012. Now, in addition to precisely measuring the masses and couplings of all Standard Model particles, they are looking for more exotic new physics such as supersymmetry and extra dimensions.

Next is the intensity frontier, which investigates some of the rarest processes in nature, including unusual interactions of fundamental particles and subtle effects that require large data sets to observe and measure. Of particular note is the effort to measure neutrino properties, including their mass and the parameters associated with their oscillations. Other open questions include the matter-antimatter asymmetry, proton decay, the anomalous magnetic dipole moment of the muon, and many more.

Last but not least is the cosmic frontier, which strives to understand the structure, evolution, and contents of the universe. It turns out that only 5% of the energy in the universe is regular matter, and the rest is dark matter and dark energy. There are many experiments trying to understand the physical nature of dark matter, either by directly detecting dark matter particles that pass through a detector or by indirectly detecting specific signatures in cosmic rays coming from the universe. Cosmic rays can also be used to look for other new particles beyond the Standard Model. The very beginning of the universe was marked by a period of intense expansion known as inflation, and there are experiments that study the cosmic microwave background radiation in order to understand this phenomenon. A similar expansion is happening today and is driven by dark energy, which is being studied by large ground-based telescopes.

Everything I have mentioned so far is considered “high energy” particle physics, since it deals with particles and interactions up at the scale of billions or trillions of electron volts (GeV to TeV). Scientists also work on medium and low energy particle physics, which are concerned with the structure and stability of hadrons and atomic nuclei; the interactions in these subfields involve relatively lower energies, in the range of millions of electron volts. In this regime, the line between the subfields of particle physics and nuclear physics starts to blur. All of these experiments require lots of time, effort, and expertise.
It all starts with using the known laws of physics to design a particle detector to learn about the phenomenon you are interested in. Once a concept has been developed, it has to be refined and final design decisions must be made to optimize the scientific reach of the experiment - this includes mechanical engineering, electronics design, etc. Once the design has been finalized, it must be physically constructed, which requires great care and often specialized clean environments. Once it is built, it must be turned on, and of course it never works the first time. There is usually a period of commissioning before science operations can begin, in which you must learn every quirk of the detector. Once data-taking has begun, the data have to be analyzed, which requires software development, algorithm development, careful statistics, and a deep understanding of the underlying physics. Throughout this whole process, detector design and data analysis are bolstered through the use of very detailed simulations of the relevant physical processes and how they play out in the detector. Overall, particle physics is a broad and exciting field, with many new developments on the horizon!

Atomic, Molecular, and Optical Physics (AMO) - Dan Guo

Major questions:

How can we leverage our precise control over atoms and photons for:
1. New technologies
2. Studies and tests of fundamental physics
3. Improving metrology standards

Recent major achievements:

1. Optical computing (Ising machines with photons)
2. Simulating exotic states of matter (the quantum Hall effect on curved space, high-Tc superconductors with atoms in optical lattices)
3. Measuring the electron's electric dipole moment to rule out theory proposals beyond the Standard Model
4. Ultra-stable optical clocks for new frequency standards and even studies of many-body physics

What researchers do:

1. Study properties of atoms/molecules in different settings (BEC, optical lattice, optical cavity)
2. Improve optical control capabilities (from radio frequency all the way to optical frequency)

Astrophysics/Cosmology - Anna Ogorzalek

Astrophysics and cosmology are fields that aim at understanding the universe beyond our Solar System. The key questions we are currently trying to answer are:

1. How does the universe work on a fundamental physical level? How does matter behave under extreme astrophysical conditions, e.g. inside stars or around black holes? What forces drove the Big Bang? What causes the accelerated expansion of the universe?

2. How did the universe begin and evolve on all scales? How did the extremely hot and dense primordial quark-gluon plasma transform into the stars and galaxies that surround us today? How did the cosmic web form, with all of its voids, filaments, and clusters of galaxies? How do stars and their planetary systems form from the gas within galaxies?

3. Is there life on planets orbiting other stars? What are the properties of stars that possess planetary systems? How many planets are in the habitable zones of their stellar hosts? Which stars are the best candidates for harbouring life in their planetary systems and why?

You can help to answer the above questions by working on:

- Theoretical calculations and modelling: creating new models of astrophysical processes and the evolution of the universe, and testing them with the use of (often extremely large) computer simulations.

- Observations and data analysis: designing and executing (sometimes in person!) astronomical observations, and choosing the most appropriate statistical methods to confront the data with models and quantify our understanding of the physics involved (a toy sketch of such a model fit follows this list). Generally you will either focus on in-depth analysis of one or a few objects (e.g. a black hole, a galaxy, a planet), or on very large samples of objects coming from sky surveys, where you will often be developing machine learning or other data mining techniques.

- Instrumentation: developing new technologies for mirrors and detectors for photons of all wavelengths, as well as for cosmic rays/fundamental particles and gravitational waves, for both ground-based and space-based observatories.

- Laboratory astrophysics: studying extreme astrophysical environments by trying to reproduce them in a laboratory. These are typically atomic and plasma physics or heavy-ion experiments, since most of the baryonic matter in the universe is ionized.
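
As a toy illustration of "confronting data with models" from the data-analysis item above, the sketch below fits a simple power-law spectral model to mock flux measurements with SciPy. The model, data, and noise level are all invented for the example and are not tied to any particular survey or instrument.

```python
# Toy model fit (illustrative only): least-squares fit of a power-law spectrum
# to mock flux measurements with fractional errors.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def power_law(energy, norm, index):
    """Simple power-law spectral model: flux = norm * energy**(-index)."""
    return norm * energy ** (-index)

# Mock "observed" data: a power law with Gaussian noise added.
energy = np.logspace(0, 2, 30)                          # arbitrary energy grid
true_flux = power_law(energy, norm=10.0, index=1.7)
errors = 0.05 * true_flux                               # assumed 5% uncertainties
observed = true_flux + rng.normal(0.0, errors)

params, cov = curve_fit(power_law, energy, observed, sigma=errors, p0=[5.0, 1.5])
chi2 = np.sum(((observed - power_law(energy, *params)) / errors) ** 2)
print("best-fit norm, index:", params)
print("chi^2 / dof:", chi2 / (len(energy) - len(params)))
```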

Examples of major recent breakthroughs include:

- The beginning of gravitational wave astronomy, where for the first time we can study the physics of black hole and neutron star mergers, pushing the limits of our understanding of how gravity works.

- Massive cosmological simulations, such as the Illustris Project, which are starting to reproduce the universe that surrounds us: starting from the Big Bang, they are able to trace matter that ends up forming galaxies that look like our own Milky Way.

- Detection and characterization of many new extrasolar planetary systems, for example TRAPPIST-1, one of the largest systems discovered so far, with 7 planets, 3 of which lie within the habitable zone of their host star. We will learn much more about these alien worlds with the launch of future space missions within the next decade.

To learn more, start with astrobites.org: a truly excellent, undergraduate-oriented website summarizing state-of-the-art research in astrophysics and cosmology, with a ton of useful information about this particular career path.

Theoretical Particle Physics – Adam Scherlis

High energy physicists like to think of the world in terms of tiny things. The name of the game is to reduce physical reality to its most basic constituents, in search of the building blocks of the universe.

Key questions: What sorts of new physics are (1) theoretically well-motivated and (2) potentially detectable?

Some theoretical motivations for part (1), in no particular order:

- What is responsible for the smallness of the Higgs mass relative to the Planck scale? (maybe supersymmetry, or more exotic new physics, or something else)
- Why does quantum chromodynamics (QCD) conserve charge-parity symmetry to a high degree of accuracy, when it doesn’t need to? (maybe the axion)
- What is dark matter? (maybe WIMPs, maybe axions, maybe something else?)
- Why is the cosmological constant nonzero? Why is it tiny? Why is it positive? (maybe the multiverse??)
- How does gravity fit into a quantum theory? How does it relate to the other forces? (maybe supergravity, or string theory / M-theory)
- What is spacetime “made out of”? Does it have more dimensions? (maybe string theory / M-theory, maybe holography, maybe quantum information; these aren't mutually exclusive)

Some experimental ideas for part (2), in no particular order:

- Colliders (LHC, future colliders?)
- New types of high-precision detectors for dark matter or ultralight particles (NMR, atom interferometers, superconductors…)
- Cosmology (constraints or signals from the CMB, large-scale structure…)
- Astrophysics (constraints or signals from neutron stars, supernovae…)

What we do, in no particular order:

- Phenomenology: some quantum field theory (QFT) to predict observable signals of models, lots of Monte Carlo simulations and statistics to predict experimental sensitivity (a toy sketch follows this list)
- Model building: coming up with new physics models to explain something, then computing their observable consequences to make sure they aren’t ruled out (some QFT, astrophysics, cosmology, a bit of everything)
- “Small scale experiments”: proposing experiments that will be sensitive to interesting new physics models that are otherwise untested (a little of everything - quantum mechanics, thermodynamics, statistics, materials science, engineering…)
- “Theory”: making sure our theories are self-consistent, figuring out their mathematical structure, developing better ways to do calculations, and better ways to interpret or describe the theory (QFT, string theory, general relativity, quantum information, holography, mathematical physics)
- Numerics: developing tools to simulate particle physics on computers (lattice methods, QFT, numerical general relativity)
- Quantum information: studying information and computation in quantum systems. A young subfield, with surprising connections to quantum gravity. Very mathematical, but with applications to cutting-edge experiments in quantum computation. (QM, information theory, theory of computation)
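
To make the phenomenology item concrete, here is a toy sensitivity study of the kind alluded to above: it estimates how often a hypothetical signal on top of a known background would produce a greater-than-3-sigma excess in a simple counting experiment. All rates are invented for illustration and the significance formula is the crudest possible choice.

```python
# Toy sensitivity study (illustrative only): fraction of pseudo-experiments in
# which a hypothetical signal gives a > 3 sigma excess over a known background.
import numpy as np

rng = np.random.default_rng(42)

background_mean = 100.0      # expected background events (assumed known)
signal_mean = 35.0           # hypothetical new-physics signal events
n_pseudo_experiments = 100_000

# Pseudo-experiments with Poisson fluctuations of signal + background.
observed = rng.poisson(background_mean + signal_mean, size=n_pseudo_experiments)

# Crude significance: excess over background in units of sqrt(background).
significance = (observed - background_mean) / np.sqrt(background_mean)
discovery_fraction = np.mean(significance > 3.0)

print(f"fraction of pseudo-experiments with > 3 sigma excess: {discovery_fraction:.2f}")
```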

Day to day: A lot of reading research papers, brainstorming ideas with colleagues at the blackboard, and working out calculations either by hand or on a computer.

Experimental Biophysics - Athena Ierokomos

Biophysics is the application of physical principles to biological problems. This consists of breaking down biological questions into their simplest patterns. These questions are varied in subject matter, beginning with atomic structures of biological molecules, and spanning length scales up through cells, tissues, organisms, and environments. For example, at the smallest scale, scientists study how force and torque affect the structure and function of molecular machines, or other critical components of cellular life. Researchers can address questions on relationships between the structure and function of proteins at the molecular level by directly manipulating systems physically, with AFM, optical traps, or other methods of applying force. Biophysicists can study phase transitions in DNA structure and in the cytoplasm. These phase transitions can have profound implications for protein expression and protein quality control, and many diseases result from improper maintenance of these processes. For example, the structure of the DNA in the nucleus of cells affects gene expression, and gene expression can affect health and immunity. Other experimental biophysicists study how cells move in order to hunt down invading bacteria, to close wounds, or to migrate in developing embryos. These are complicated processes that involve sensing something to move towards and generating directed force to head in that direction. Another aspect of biophysics is understanding how organisms harness fluid dynamics in order to swim or fly. Biochemical and biological techniques make a wide array of biological questions available. With applications from soft matter to medicine, studies in biophysics are wide ranging.

Theoretical/Computational Biophysics - Quinn MacPherson

The field of theoretical biophysics is broad and quickly evolving. Rapidly improving experimental techniques in biology – most notably imaging and sequencing-based technologies – are showering theorists with vast amounts of increasingly quantitative data that require interpretation. In practice, it is difficult to say where to draw the lines between the fields of biology, chemistry, statistics, computer science, and physics when interpreting these data to understand the system at hand. For this reason, biophysics is quite interdisciplinary.

While there are macroscopic biophysical questions, the bulk of biophysics involves polymers, membranes, cells, etc., which span length scales from Angstroms, for the most detailed all-atom models, to microns, for models of cellular mechanics. Perhaps the problems currently employing the most biophysicists involve modeling the folding of biopolymers: proteins, RNA, and DNA. Given the sequence (as well as temperature, surroundings, history, chemical modifications, etc.) of these biopolymers, biophysicists work to predict the 3D conformations these polymers take and – even more difficult – engineer new sequences with intended structure and function. Tracking the motions of biomolecules with various forms of microscopy is another important task. Membrane structure and dynamics are also major players. Evolutionary processes and theoretical neuroscience also employ many physicists.

While there are some pencil-and-paper biophysicists, the field is dominated by computational work due to the complex and highly nonlinear systems involved as well as the nature of biological datasets. Computational biophysics often involves physical modeling of 3D objects subject to mechanical, chemical, and thermal forces. Common techniques include molecular dynamics, Brownian dynamics, and Monte Carlo methods (a minimal Brownian dynamics sketch follows below). Modeling time-dependent nonlinear feedback systems by integrating ODEs is also common. And, of course, there is a lot of data processing. As biological systems are often (always) messy, they require liberal use of approximations and simplifications.
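
As a minimal sketch of this kind of physical modeling, the code below runs overdamped Brownian dynamics for a bead-spring polymer in thermal noise. The chain length, spring constant, and units are arbitrary and chosen only to illustrate the method, not to model any specific biomolecule.

```python
# Minimal Brownian-dynamics sketch (illustrative only): a bead-spring polymer
# with harmonic bonds, integrated with the overdamped Langevin update
#   dx = (F / gamma) dt + sqrt(2 kT dt / gamma) * noise
import numpy as np

rng = np.random.default_rng(1)

n_beads, k_spring, gamma, kT, dt = 20, 10.0, 1.0, 1.0, 1e-3
steps = 10_000

# Start from a straight chain along x.
pos = np.zeros((n_beads, 3))
pos[:, 0] = np.arange(n_beads)

def bond_forces(pos):
    """Harmonic forces between neighbouring beads (zero rest length)."""
    forces = np.zeros_like(pos)
    bond = pos[1:] - pos[:-1]
    f = k_spring * bond                  # force pulling neighbours together
    forces[:-1] += f
    forces[1:] -= f
    return forces

for _ in range(steps):
    noise = rng.normal(size=pos.shape)
    pos += bond_forces(pos) / gamma * dt + np.sqrt(2 * kT * dt / gamma) * noise

print("end-to-end distance after relaxation:", np.linalg.norm(pos[-1] - pos[0]))
```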

Perhaps more important than the technical aspects is keeping up with the rapidly changing field and knowing what questions to ask and which results other biologists will care about.

High Energy Density Physics (Plasma Physics) – Anna Grassi and Franziska Treffert

Our High Energy Density Physics (HEDP) division consists of several research groups who investigate theoretical and experimental high energy density (HED) physics, astrophysics, and advanced diagnostic development.

What are the major questions in your subfield? The HEDP division investigates the properties of matter under extreme conditions. Our field of research ranges from warm dense matter (WDM), which is believed to exist at the cores of giant gas planets in our solar system, to cosmic-ray acceleration in solar flares and gamma-ray bursts. Many questions remain unsolved, particularly in the case of WDM, a state that is too hot to be accurately described by solid state physics but still too strongly coupled to apply models from plasma physics. Our group is working on building self-consistent models for warm dense matter states using X-ray diffraction and X-ray scattering measurements at LCLS and ultra-fast electron diffraction. Regarding high energy astrophysics, the extreme efficiency of particle acceleration and the production/amplification of strong magnetic turbulence are still matters of intense investigation.

What does a typical researcher do? The experimental team is responsible for designing and conducting experiments at different facilities all over the world. We have strong collaborations with world-class facilities such as LCLS at SLAC, NIF, and OMEGA in the US, and the European XFEL and Helmholtz-Zentrum Dresden-Rossendorf (HZDR) in Germany. The majority of our time is then dedicated to the analysis and interpretation of experimental data. The Theory and Simulation group investigates the processes that characterize extreme states of matter using analytical modeling and simulations performed on the largest existing supercomputers. We work closely with the experimental team in the design and interpretation of high-energy-density laser-plasma experiments, where these processes can be directly probed and connected, through appropriate scaling laws, with astrophysical and laboratory plasma models.

What are the important recent discoveries? Because of envisioned applications of hydrogen targets in fusion research, planetary science, proton sources, and fundamental physics, we have developed a novel cryogenic jet capability. These jets are well-suited for experiments at high repetition rate and have been successfully used in experiments for proton acceleration up to tens of MeV. This work aims at providing optimal high-quality proton beams for cancer therapy. One of our latest experiments, performed at the OMEGA and NIF lasers, provided insight into the microphysics of astrophysical shocks, where the magnetic field is amplified and cosmic rays are accelerated.

Accelerator Physics - James MacArthur

Accelerator physics is an applied physics field concerned with developing machines to accelerate charged particles. There are two goals that drive the field: developing much more compact particle accelerators, and using particle beams in novel ways. Modern particle accelerators are built with traditional microwave accelerating cavities with accelerating gradients below 100 MeV/m. This has pushed TeV colliders well past the kilometer scale. Physicists are actively investigating the use of plasmas and silicon wafer-based accelerators to exceed GeV/m gradients. These compact technologies are also promising for reducing costs in medical applications, like ion beam therapy, where the most effective treatment is often prohibitively expensive.

High energy accelerators were originally developed for particle physics experiments. A frustrating limitation on the highest achievable particle energy was synchrotron radiation -- light given off by a charged particle as it is accelerated in a circle. However, renewed interest in this radiation in the last decade has led accelerator physicists to optimize accelerators to produce extremely high brightness light pulses with laser-like properties. One such device is the free-electron laser, which produces sub-femtosecond x-ray pulses.

An attractive property of the field for many physicists is the diversity of work that a scientist performs. A physicist in the field often works on a project that includes some initial theoretical modeling and simulation work to validate the concept, and then hands-on work to build a proof-of-principle device.
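
To make the compactness argument concrete, here is a back-of-the-envelope sketch comparing the length needed to reach a given beam energy at a conventional ~100 MeV/m gradient versus an advanced ~10 GeV/m gradient. The numbers are illustrative order-of-magnitude choices, not a machine design.

```python
# Back-of-the-envelope accelerator length (illustrative only):
# length ~ final beam energy / accelerating gradient.
def accelerator_length_m(final_energy_eV, gradient_eV_per_m):
    return final_energy_eV / gradient_eV_per_m

target_energy = 1e12              # 1 TeV beam energy
conventional_gradient = 100e6     # ~100 MeV/m RF cavities
advanced_gradient = 10e9          # ~10 GeV/m plasma/wafer-based (aspirational)

print("conventional:", accelerator_length_m(target_energy, conventional_gradient) / 1e3, "km")
print("advanced:    ", accelerator_length_m(target_energy, advanced_gradient) / 1e3, "km")
```

The linear scaling is why a TeV-class machine at conventional gradients stretches to the ~10 km scale, while GeV/m-class gradients would bring the same energy down to hundreds of meters.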