Photo of an electron. First image of the orbital structure of the hydrogen atom

An atom (from the Greek for “indivisible”) is the smallest particle of a substance of microscopic size, the smallest part of a chemical element that carries its properties. The components of an atom - protons, neutrons, electrons - do not have these properties by themselves; they produce them only together. Atoms bound by chemical bonds form molecules. Scientists continue to study the atom, and although it is already quite well understood, they do not miss the opportunity to find something new - in particular, in the field of creating new materials and new atoms (extending the periodic table). 99.9% of the mass of an atom is concentrated in its nucleus.

PostScience debunks scientific myths and explains common misconceptions. We asked our experts to comment on popular ideas about the structure and properties of atoms.

Rutherford's model corresponds to modern ideas about the structure of the atom

This is true, but only partly. The planetary model of the atom, in which light electrons orbit a heavy nucleus like planets around the Sun, was proposed by Ernest Rutherford in 1911, after the nucleus itself was discovered in his laboratory. By bombarding a sheet of metal foil with alpha particles, the scientists found that the vast majority of particles passed through the foil much like light through glass. However, a small fraction of them - about one in 8,000 - bounced back toward the source. Rutherford explained these results by the fact that mass is not distributed evenly in matter but concentrated in “clumps” - atomic nuclei, which carry a positive charge that repels the positively charged alpha particles. Light, negatively charged electrons avoid “falling” onto the nucleus by orbiting it, so that the centrifugal force balances the electrostatic attraction.

It is said that after inventing this model, Rutherford exclaimed: “Now I know what an atom looks like!” However, inspiration soon gave way to the realization that the idea was flawed. Orbiting the nucleus, the electron creates alternating electric and magnetic fields around itself. These fields travel away at the speed of light in the form of an electromagnetic wave, and such a wave carries energy with it. It follows that an electron circling the nucleus would continuously lose energy and fall onto the nucleus within billionths of a second. (One may ask whether the same argument applies to the planets of the Solar System: why don’t they fall onto the Sun? Answer: gravitational waves, if they exist at all, are much weaker than electromagnetic waves, and the energy stored in planetary motion is vastly greater than that of an electron, so the “power reserve” of the planets is many orders of magnitude larger.)
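To put a number on those “billionths of a second”, here is a minimal back-of-the-envelope sketch (standard classical electrodynamics, added for illustration and not taken from the text): with energy lost to Larmor radiation, a classically orbiting electron spirals from the Bohr radius into the nucleus in a time t = a0^3 / (4 r_e^2 c), where r_e is the classical electron radius.

```python
# Hedged estimate: classical spiral-in time of an electron starting at the Bohr radius,
# assuming energy loss by Larmor radiation only (t = a0**3 / (4 * r_e**2 * c)).
a0 = 5.29e-11   # Bohr radius, m
r_e = 2.82e-15  # classical electron radius, m
c = 3.0e8       # speed of light, m/s

t_collapse = a0**3 / (4 * r_e**2 * c)
print(f"Classical collapse time ≈ {t_collapse:.1e} s")  # ≈ 1.6e-11 s, i.e. tens of picoseconds
```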

Rutherford instructed his collaborator, the young theorist Niels Bohr, to resolve the contradiction. After working for two years, Bohr found a partial solution. He postulated that among all possible orbits of an electron, there are those in which the electron can remain for a long time without emitting. An electron can move from one stationary orbit to another, while absorbing or emitting a quantum of an electromagnetic field with an energy equal to the difference in the energies of the two orbits. Using the initial principles of quantum physics, which had already been discovered by that time, Bohr was able to calculate the parameters of stationary orbits and, accordingly, the energies of radiation quanta corresponding to transitions. These energies had by that time been measured using spectroscopic methods, and Bohr's theoretical predictions coincided almost perfectly with the results of these measurements!
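That agreement is easy to illustrate numerically; the sketch below uses the textbook Bohr formula E_n = -13.6 eV / n² (a modern shorthand, not a reproduction of Bohr’s own calculation) to predict the wavelength of the red hydrogen line.

```python
# Bohr energy levels of hydrogen and the wavelength of the n=3 -> n=2 (H-alpha) transition.
RY_EV = 13.6        # hydrogen ground-state binding energy, eV
HC_EV_NM = 1239.84  # h*c expressed in eV*nm

def level(n):
    # Energy of the n-th stationary orbit in the Bohr model.
    return -RY_EV / n**2

delta_e = level(3) - level(2)    # energy of the emitted quantum, eV
wavelength = HC_EV_NM / delta_e  # nm
print(f"Predicted H-alpha wavelength: {wavelength:.1f} nm")
# ~656 nm, in excellent agreement with the measured 656.3 nm line
```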

Despite this triumphant result, Bohr's theory hardly brought clarity to the issue of atomic physics, because it was semi-empirical: while postulating the presence of stationary orbits, it did not explain their physical nature in any way. A thorough clarification of the issue required at least another two decades, during which quantum mechanics was developed as a systematic, integral physical theory.

Within the framework of this theory, the electron is subject to the uncertainty principle and is described not as a material point, like a planet, but by a wave function “smeared” over the whole orbit. At each moment of time it is in a superposition of states corresponding to all points of the orbit. Since the density of the charge (and mass) distribution in space, determined by the wave function, does not depend on time, no alternating electromagnetic field is created around the electron, and there is no energy loss.
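In symbols (standard quantum mechanics, added here for clarity): in a stationary state the time dependence of the wave function is a pure phase factor, so the probability density, and with it the charge distribution, does not change in time and nothing oscillates to radiate:

$$\psi(\mathbf{r},t) = \varphi(\mathbf{r})\, e^{-iEt/\hbar}, \qquad |\psi(\mathbf{r},t)|^{2} = |\varphi(\mathbf{r})|^{2}.$$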

Thus, the planetary model gives a true visual representation of what an atom looks like - Rutherford was right in his exclamation. However, it does not explain how the atom works: the structure is much more complex and deeper than what Rutherford modeled.

In conclusion, I note that the “myth” of the planetary model is at the very center of the intellectual drama that gave rise to a turning point in physics a hundred years ago and largely shaped this science in its modern form.

Alexander Lvovsky

PhD in Physics, professor at the Faculty of Physics at the University of Calgary, leader of the scientific group, member of the scientific council of the Russian Quantum Centre, editor of the scientific journal Optics Express

Individual atoms can be controlled

This is true. Of course you can, why not? You can control different parameters of an atom, and an atom has quite a lot of them: it has a position in space, speed, and there are also internal degrees of freedom. Internal degrees of freedom determine the magnetic and electrical properties of an atom, as well as its willingness to emit light or radio waves. Depending on the internal state of an atom, it can be more or less active in collisions and chemical reactions, change the properties of surrounding atoms, and its response to external fields depends on its internal state. In medicine, for example, they use so-called polarized gases to construct tomograms of the lungs - in such gases all atoms are in the same internal state, which allows them to “see” the volume they fill by their response.

It is not so difficult to control the speed of an atom or its position; it is much harder to single out exactly one atom to control. But this can be done too. One approach to such atom separation relies on laser cooling. For control it is always convenient to have a known initial position, and it is even better if the atom is not moving. Laser cooling achieves both: it localizes atoms in space and cools them, that is, reduces their speed to almost zero. The principle of laser cooling is the same as that of a jet aircraft, except that the aircraft ejects a stream of gas in order to accelerate, whereas the atom absorbs a stream of photons (light particles) and decelerates. Modern laser cooling techniques can cool millions of atoms to walking speeds and below.

Then various passive traps come into play, for example the dipole trap. If laser cooling uses a light field that the atom actively absorbs, then to hold the atom in a dipole trap the frequency of the light is chosen far from any absorption. Tightly focused laser light is able to polarize small particles and dust grains and pull them into the region of highest light intensity. The atom is no exception and is likewise drawn into the region of the strongest field. It turns out that if the light is focused as tightly as possible, such a trap can hold exactly one atom: if a second atom falls into the trap, it is pressed so tightly against the first that the two form a molecule and fall out of the trap. Such sharp focusing is not, however, the only way to isolate a single atom: one can exploit the interaction of an atom with a resonator; for charged atoms (ions), electric fields can be used to capture and hold exactly one ion; and so on. It is even possible to excite just one atom within a fairly small ensemble into a very highly excited, so-called Rydberg state. An atom, once excited into the Rydberg state, blocks the excitation of its neighbors into the same state and, if the volume containing the atoms is small enough, will be the only one.
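To get a feel for the numbers behind the laser-cooling step described above, here is a minimal sketch; rubidium-87 and its 780 nm cooling line are assumed as a typical example and are not mentioned in the text itself.

```python
# Photon recoil in laser cooling: how much one absorbed photon slows an atom,
# and roughly how many photons it takes to stop a room-temperature atom.
import math

h = 6.626e-34         # Planck constant, J*s
kB = 1.381e-23        # Boltzmann constant, J/K
m_rb = 87 * 1.66e-27  # mass of a Rb-87 atom, kg (assumed example atom)
lam = 780e-9          # wavelength of the Rb cooling transition, m (assumed)

v_recoil = h / (m_rb * lam)                 # velocity change per absorbed photon
v_thermal = math.sqrt(3 * kB * 300 / m_rb)  # typical speed at 300 K

print(f"recoil velocity  ≈ {v_recoil * 1000:.1f} mm/s")   # ~6 mm/s
print(f"thermal velocity ≈ {v_thermal:.0f} m/s")          # ~300 m/s
print(f"photons to stop  ≈ {v_thermal / v_recoil:.0f}")   # ~50,000 scattered photons
```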

One way or another, once an atom is captured, it can be controlled. The internal state can be changed by light and radio-frequency fields with the right frequencies and polarization of the electromagnetic wave. It is possible to transfer the atom into any predetermined state, be it a particular level or a superposition of levels; the only question is the availability of the required frequencies and the ability to produce sufficiently short and powerful control pulses. Recently it has become possible to control atoms even more effectively by keeping them near nanostructures, which allows one not only to “talk” to the atom more efficiently, but also to use the atom itself - more precisely, its internal states - to control the flow of light, and in the future, perhaps, for computational purposes.

Controlling the position of an atom held in a trap is a very simple task: just move the trap itself. In the case of a dipole trap this means moving the light beam, which can be done, for example, with the kind of movable mirrors used for laser shows. The atom can again be given speed in a reactive way, by forcing it to absorb light, and an ion can easily be accelerated by electric fields, just as was done in cathode-ray tubes. So today, in principle, anything can be done with an atom; it is just a matter of time and effort.

Alexey Akimov

The atom is indivisible

Partly true, partly not. Wikipedia gives us the following definition: “Atom (from ancient Greek ἄτομος - indivisible, uncut) is a particle of a substance of microscopic size and mass, the smallest part of a chemical element, which is the bearer of its properties. An atom consists of an atomic nucleus and electrons."

Nowadays any educated person pictures the atom in Rutherford's model, briefly captured by the last sentence of this generally accepted definition. It would seem that the answer to the question posed is obvious: the atom is a composite, complex object. However, the situation is not so clear-cut. The ancient philosophers meant by “atom” an elementary, indivisible particle of matter and were hardly concerned with the structure of the elements of the periodic table. In Rutherford's atom we do in fact find such a particle - the electron.

The electron, in accordance with modern concepts, fits into the so-called Standard Model as a point particle whose state is described by its position and velocity. It is important that these kinematic characteristics cannot be specified simultaneously, because of the Heisenberg uncertainty principle, but considering only one of them, for example the coordinate, one can determine it with arbitrarily high accuracy.

Is it then possible, using modern experimental technology, to try to localize the electron on a scale significantly smaller than the atomic size (~0.5 × 10⁻⁸ cm) and check its point-likeness? It turns out that if one tries to localize the electron at the scale of the so-called Compton wavelength - about 137 times smaller than the size of the hydrogen atom - the electron begins to interact with its own antiparticle and the system becomes unstable.
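The factor of 137 mentioned above is easy to check directly; the sketch below uses the reduced Compton wavelength ħ/(m_e c) and standard constants, which are not quoted in the article itself.

```python
# Ratio of the Bohr radius to the reduced Compton wavelength of the electron.
hbar = 1.0546e-34  # reduced Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
c = 2.998e8        # speed of light, m/s
a0 = 5.292e-11     # Bohr radius, m

lambda_compton = hbar / (m_e * c)  # reduced Compton wavelength ≈ 3.86e-13 m
print(f"Compton scale: {lambda_compton:.2e} m")
print(f"a0 / lambda_C ≈ {a0 / lambda_compton:.0f}")  # ≈ 137, the inverse fine-structure constant
```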

The point-like nature and indivisibility of the electron and of the other elementary particles of matter is a key element of the principle of short-range action in field theory and is present in all fundamental equations that describe nature. Thus, the ancient philosophers were not so far from the truth in supposing that indivisible particles of matter exist.

Dmitry Kupriyanov

Doctor of Physical and Mathematical Sciences, Professor of Physics, St. Petersburg State Polytechnic University, Head of the Department of Theoretical Physics, SPbSPU

The electron is a tiny ball

Science does not know this yet. Rutherford's planetary model of the atom assumed that electrons orbit the atomic nucleus like planets orbiting the Sun. It was natural to assume at the same time that electrons are solid spherical particles. Rutherford's classical model was internally contradictory: obviously, charged particles (electrons) moving with acceleration would lose energy to electromagnetic radiation and would ultimately fall onto the nuclei of atoms.

Niels Bohr proposed simply forbidding this process and imposing certain requirements on the radii of the orbits along which electrons move. Bohr's phenomenological model gave way to the quantum model of the atom developed by Heisenberg and to the quantum, but more visual, model proposed by Schrödinger. In Schrödinger's model electrons are no longer balls flying in orbits but standing waves that, like clouds, hover around the atomic nucleus. The shape of these “clouds” is described by the wave function introduced by Schrödinger.

The question immediately arose: what is the physical meaning of the wave function? The answer was proposed by Max Born: the squared modulus of the wave function is the probability (more precisely, the probability density) of finding the electron at a given point in space. And this is where the difficulties began. What does it mean to find an electron at a given point in space? Shouldn't Born's statement be understood as an admission that the electron is a small ball that flies along a certain trajectory and can be caught, with a certain probability, at a certain point of that trajectory?

This is precisely the point of view held by Schrödinger and by Albert Einstein, who joined him on this issue. They were opposed by the physicists of the Copenhagen school - Niels Bohr and Werner Heisenberg, who argued that between acts of measurement the electron simply does not exist, so it makes no sense to talk about the trajectory of its motion. The discussion between Bohr and Einstein about the interpretation of quantum mechanics went down in history. Bohr appeared to emerge the winner: he managed, although not very clearly, to refute all the paradoxes formulated by Einstein, and even the famous “Schrödinger's cat” paradox, formulated by Schrödinger in 1935. For several decades, most physicists agreed with Bohr that matter is not an objective reality given to us in sensations, as Karl Marx taught, but something that arises only at the moment of observation and does not exist without an observer. It is interesting that in Soviet times, philosophy departments in universities taught that such a point of view is subjective idealism, that is, a current that runs counter to the materialism of Marx, Engels, Lenin and Einstein. At the same time, in physics departments, students were taught that the concepts of the Copenhagen school were the only correct ones (perhaps because the most famous Soviet theoretical physicist, Lev Landau, belonged to this school).

At the moment, the opinions of physicists are divided. On the one hand, the Copenhagen interpretation of quantum mechanics continues to be popular. Attempts to test this interpretation experimentally (for example, the successful tests of the so-called Bell inequalities by the French physicist Alain Aspect) enjoy almost unanimous approval from the scientific community. On the other hand, theorists quite comfortably discuss alternative theories, such as the theory of parallel worlds. Returning to the electron, we can say that its chances of remaining a billiard ball are not very high, though they are not zero either. In the 1920s, it was precisely the billiard-ball model of Compton scattering that made it possible to prove that light consists of quanta - photons. In many problems related to important and useful devices (diodes, transistors), it is convenient to think of the electron as a billiard ball. The wave nature of the electron is important for describing subtler effects, such as the negative magnetoresistance of metals.

The philosophical question of whether a ball-electron exists between acts of measurement is not of great importance in ordinary life. However, this question continues to remain one of the most serious problems of modern physics.

Alexey Kavokin

Candidate of Physical and Mathematical Sciences, Professor at the University of Southampton, Head of the Quantum Polaritonics Group of the Russian Quantum Center, Scientific Director of the Mediterranean Institute of Fundamental Physics (Italy)

An atom can be completely destroyed

This is true. Breaking is not building. Anything can be destroyed, including an atom, to any degree of completeness. To a first approximation, an atom is a positively charged nucleus surrounded by negatively charged electrons. The first destructive action that can be performed on an atom is to tear off electrons from it. This can be done in different ways: you can focus powerful laser radiation on it, or you can irradiate it with fast electrons or other fast particles. An atom that has lost some of its electrons is called an ion. It is in this state that atoms are in the Sun, where temperatures are so high that it is practically impossible for atoms to retain their electrons in collisions.

The more electrons an atom has lost, the more difficult it is to remove the rest. Depending on its atomic number, an atom has more or fewer electrons. The hydrogen atom has only one electron, and it often loses it even under normal conditions; it is hydrogen that has lost its electron that determines the acidity (pH) of water. The helium atom has two electrons and, in the fully ionized state, is called an alpha particle - something we would sooner expect from a nuclear reactor than from ordinary water. Atoms containing many electrons require still more energy to strip them all, but nevertheless all the electrons can be removed from any atom.
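Well-known reference values make the trend concrete; a small illustrative table (standard handbook numbers, added here and not taken from the text above):

```python
# Successive ionization energies (eV) for the first few elements: each further electron
# is harder to remove, and the jump is dramatic once only inner electrons remain.
ionization_ev = {
    "H":  [13.6],
    "He": [24.6, 54.4],
    "Li": [5.4, 75.6, 122.5],
}

for element, energies in ionization_ev.items():
    total = sum(energies)
    print(f"{element}: steps {energies} eV, total to strip all electrons ≈ {total:.1f} eV")
```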

If all the electrons are torn off, the nucleus remains, but it too can be destroyed. The nucleus consists of protons and neutrons (hadrons, generally speaking), and although they are bound quite tightly, an incident particle of sufficiently high energy can break the nucleus apart. Heavy nuclei, which contain too many neutrons and protons, tend to fall apart on their own, releasing quite a lot of energy - nuclear power plants are based on this principle.

But even if you break up the nucleus and tear off all the electrons, the original particles remain: neutrons, protons, electrons. They, of course, can also be destroyed. This is exactly what the Large Hadron Collider does: it accelerates protons to enormous energies and destroys them completely in collisions. In these collisions many new particles are born, which the collider was built to study. The same can be done with electrons and any other particles.

The energy of a destroyed particle does not disappear; it is distributed among other particles, and if there are enough of them, it quickly becomes impossible to trace the original particle in the sea of new transformations. Everything can be destroyed; there are no exceptions.

Alexey Akimov

Candidate of Physical and Mathematical Sciences, head of the “Quantum Simulators” group of the Russian Quantum Center, teacher at MIPT, employee of the Lebedev Physical Institute, researcher at Harvard University

Trurl began to catch atoms, scraping electrons off them and kneading protons so fast that his fingers were a blur; he prepared a proton dough, laid out electrons around it and moved on to the next atom; not even five minutes had passed before he was holding a block of pure gold in his hands. He held it up to the creature’s muzzle, and she, having tried the block on her tooth and nodded her head, said:
- And indeed it is gold, but I can’t chase atoms like that. I’m too big.
- It’s okay, we’ll give you a special device! - Trurl reassured her.

Stanislaw Lem, Cyberiad

Is it possible, using a microscope, to see an atom, distinguish it from another atom, observe the destruction or formation of a chemical bond, and see how one molecule transforms into another? Yes, if it is not a simple microscope, but an atomic force one. And you don’t have to limit yourself to observation. We live in a time when the atomic force microscope is no longer just a window into the microworld. Today, the instrument can be used to move atoms, break chemical bonds, study the stretching limit of single molecules—and even study the human genome.

Letters made from xenon pixels

Looking at atoms wasn't always so easy. The history of the atomic force microscope began in 1979, when Gerd Karl Binnig and Heinrich Rohrer, working at the IBM Research Center in Zurich, began creating an instrument that would allow the study of surfaces at atomic resolution. To come up with such a device, the researchers decided to use the tunneling effect - the ability of electrons to overcome seemingly impenetrable barriers. The idea was to determine the position of atoms in the sample by measuring the strength of the tunneling current arising between the scanning probe and the surface under study.

Binnig and Rohrer succeeded, and they went down in history as the inventors of the scanning tunneling microscope (STM), and in 1986 they received the Nobel Prize in Physics. The scanning tunneling microscope has made a real revolution in physics and chemistry.

In 1990, Don Eigler and Erhard Schweitzer, working at the IBM Research Center in California, showed that STM can be used not only to observe atoms, but to manipulate them. Using a scanning tunneling microscope probe, they created perhaps the most popular image symbolizing the transition of chemists to working with individual atoms - they painted three letters on a nickel surface with 35 xenon atoms (Fig. 1).

Binnig did not rest on his laurels: in the year he received the Nobel Prize, together with Christoph Gerber and Calvin Quate, who also worked at the IBM Zurich Research Center, he began work on another device for studying the microworld, one free of the drawbacks inherent in STM. The problem is that a scanning tunneling microscope cannot study dielectric surfaces, only conductors and semiconductors, and to analyze the latter a significant vacuum has to be created between them and the microscope probe. Realizing that creating a new device was easier than upgrading the existing one, Binnig, Gerber and Quate invented the atomic force microscope, or AFM. Its principle of operation is radically different: to obtain information about the surface, one measures not the current that arises between the microscope probe and the sample but the attractive forces that arise between them, that is, weak non-chemical interactions - van der Waals forces.

The first working model of the AFM was comparatively simple. The researchers moved a diamond probe over the surface of the sample; the probe was attached to a flexible micromechanical sensor, a cantilever made of gold foil (an attractive force arises between the probe and the atoms, the cantilever bends in proportion to that force and deforms a piezoelectric element). The degree of bending of the cantilever was read out by piezoelectric sensors, in much the same way that the grooves and ridges of a vinyl record are converted into an audio recording. The design of the atomic force microscope allowed it to detect attractive forces down to 10⁻¹⁸ newtons. A year after building the working prototype, the researchers obtained an image of the topography of a graphite surface with a resolution of 2.5 angstroms.
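The cantilever is essentially a calibrated spring, so the measured force is simply Hooke's law, F = k·Δx; a minimal sketch with illustrative (assumed, not historical) numbers:

```python
# Force sensed by an AFM cantilever: F = k * deflection (Hooke's law).
k = 0.1             # assumed cantilever stiffness, N/m (typical for soft contact-mode levers)
deflection = 1e-10  # assumed smallest detectable deflection, m (0.1 nm)

force = k * deflection
print(f"Minimum detectable force ≈ {force:.1e} N")  # ≈ 1e-11 N = 10 pN with these assumptions
```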

Over the three decades that have passed since then, AFM has been used to study almost any chemical object - from the surface of a ceramic material to living cells and individual molecules, both in a static and dynamic state. Atomic force microscopy has become the workhorse of chemists and materials scientists, and the number of studies using this method is constantly growing (Fig. 2).

Over the years, researchers have selected conditions for both contact and non-contact study of objects using atomic force microscopy. The contact method is described above and is based on van der Waals interaction between the cantilever and the surface. When operating in non-contact mode, the piezo vibrator excites oscillations of the probe at a certain frequency (most often resonant). The force acting from the surface causes both the amplitude and phase of the probe's oscillations to change. Despite some disadvantages of the non-contact method (primarily sensitivity to external noise), it eliminates the influence of the probe on the object under study, and therefore is more interesting for chemists.

With probes, in pursuit of bonds

Atomic force microscopy became non-contact in 1998, thanks to the work of Binnig's student Franz Josef Giessibl. It was he who proposed using a quartz resonator of stable frequency as the cantilever. Eleven years later, researchers from the IBM laboratory in Zurich undertook another modification of non-contact AFM: the role of the sensing probe was played not by a sharp diamond crystal but by a single molecule of carbon monoxide. This made it possible to move to subatomic resolution, as demonstrated by Leo Gross from the Zurich division of IBM. In 2009, using AFM, he made visible not atoms but chemical bonds, obtaining a fairly clear and unambiguously readable “picture” of the pentacene molecule (Fig. 3; Science, 2009, 325, 5944, 1110–1114, doi: 10.1126/science.1176210).

Convinced that chemical bonds could be seen using AFM, Leo Gross decided to go further and use an atomic force microscope to measure bond lengths and orders - key parameters for understanding the chemical structure, and therefore the properties of substances.

Recall that differences in bond orders indicate different electron densities and different interatomic distances between two atoms (simply put, a double bond is shorter than a single bond). In ethane the carbon-carbon bond order is one, in ethylene it is two, and in the classical aromatic molecule benzene the carbon-carbon bond order is greater than one but less than two, and is considered to be 1.5.

Determining the bond order is much more difficult when moving from simple aromatic systems to planar or bulk polycondensed cyclic systems. Thus, the order of bonds in fullerenes, consisting of condensed five- and six-membered carbon rings, can take any value from one to two. The same uncertainty is theoretically inherent in polycyclic aromatic compounds.

In 2012, Leo Gross, together with Fabian Mohn, showed that an atomic force microscope with a non-contact metal probe modified with carbon monoxide can measure differences in the charge distribution of atoms and interatomic distances - that is, parameters associated with bond order (Science, 2012, 337, 6100, 1326–1329, doi: 10.1126/science.1225621).

To do this, they studied two types of chemical bonds in fullerene: the carbon-carbon bond shared by two six-membered rings of the C60 fullerene, and the carbon-carbon bond shared by a five-membered and a six-membered ring. The atomic force microscope showed that the fusion of six-membered rings produces a bond that is shorter and of higher order than the fusion of the C6 and C5 fragments. A study of chemical bonding in hexabenzocoronene, where six more C6 rings symmetrically surround the central C6 ring, confirmed the results of quantum-chemical modelling: the order of the C-C bonds of the central ring (letter i in Fig. 4) must be greater than that of the bonds connecting this ring to the peripheral rings (letter j in Fig. 4). Similar results were obtained for a more complex polycyclic aromatic hydrocarbon containing nine six-membered rings.

Bond orders and interatomic distances were, of course, of interest to organic chemists, but they mattered even more to those who study the theory of chemical bonding, predict reactivity and investigate the mechanisms of chemical reactions. However, both synthetic chemists and specialists in the structure elucidation of natural compounds were in for a surprise: it turned out that the atomic force microscope can be used to determine the structure of molecules in much the same way as NMR or IR spectroscopy. Moreover, it gives a clear answer to questions that those methods cannot handle.

From photography to cinema

In 2010, the same Leo Gross, together with Rainer Ebel, was able to unambiguously establish the structure of a natural compound, cephalandol A, isolated from the bacterium Dermacoccus abyssi (Nature Chemistry, 2010, 2, 821–825, doi: 10.1038/nchem.765). The composition of cephalandol A had previously been established by mass spectrometry, but analysis of the NMR spectra of this compound did not give a clear answer to the question of its structure: four variants were possible. Using an atomic force microscope, the researchers immediately eliminated two of the four structures and made the correct choice between the remaining two by comparing the AFM results with quantum-chemical modelling. The task turned out to be difficult: unlike pentacene, fullerene and the coronenes, cephalandol A contains not only carbon and hydrogen atoms; moreover, the molecule has no plane of symmetry (Fig. 5) - but this problem, too, was solved.

Further confirmation that the atomic force microscope can be used as an analytical tool came from the group of Óscar Custance, who at that time worked at the School of Engineering of Osaka University. He showed how to use AFM to distinguish atoms that differ from each other much less than carbon and hydrogen do (Nature, 2007, 446, 64–67, doi: 10.1038/nature05530). Custance examined the surface of an alloy consisting of silicon, tin and lead with a known content of each element. In numerous experiments he found that the force arising between the tip of the AFM probe and different atoms differs (Fig. 6): the strongest interaction was observed when probing silicon, and the weakest when probing lead.

It is assumed that in the future, the results of atomic force microscopy for recognizing individual atoms will be processed in the same way as NMR results - by comparing relative values. Since the exact composition of the sensor tip is difficult to control, the absolute value of the force between the sensor and various surface atoms depends on the experimental conditions and the brand of the device, but the ratio of these forces for any composition and shape of the sensor remains constant for each chemical element.
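A sketch of what such “relative” processing might look like in practice; the force values and calibration ratios below are purely hypothetical placeholders, not data from the Nature paper:

```python
# Identify surface atoms by the ratio of the measured tip-atom force to the force on a
# reference element (here silicon), a ratio that is expected to be tip-independent.
reference_ratios = {"Si": 1.00, "Sn": 0.77, "Pb": 0.59}  # hypothetical calibration ratios

def identify(force, force_on_si):
    ratio = force / force_on_si
    # Pick the element whose tabulated ratio is closest to the measured one.
    return min(reference_ratios, key=lambda el: abs(reference_ratios[el] - ratio))

force_on_si = 2.0  # nN, hypothetical force measured above a known silicon atom
print(identify(1.55, force_on_si))  # ratio 0.775 -> "Sn"
print(identify(1.20, force_on_si))  # ratio 0.60  -> "Pb"
```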

In 2013, the first examples of using AFM to obtain images of individual molecules before and after chemical reactions appeared: a “photoset” of reaction products and intermediates is created, which can then be edited into a kind of documentary film (Science, 2013, 340, 6139, 1434–1437, doi: 10.1126/science.1238187).

Felix Fischer and Michael Crommie from the University of California, Berkeley, deposited 1,2-bis[(2-ethynylphenyl)ethynyl]benzene onto a silver surface, imaged the molecules and then heated the surface to initiate cyclization. Half of the original molecules turned into polycyclic aromatic structures consisting of five fused six-membered rings and two five-membered rings. Another quarter of the molecules formed structures consisting of four six-membered rings connected through one four-membered ring, plus two five-membered rings (Fig. 7). The remaining products were oligomeric structures and, in minor quantities, polycyclic isomers.

These results surprised the researchers twice over. First, only two main products formed in the reaction. Second, their structures were unexpected. Fischer notes that chemical intuition and experience allowed them to draw dozens of possible reaction products, yet none of those corresponded to the compounds that actually formed on the surface. It is possible that the interaction of the starting compounds with the substrate facilitated these atypical chemical processes.

Naturally, after the first serious successes in the study of chemical bonds, some researchers decided to use AFM to observe weaker and less studied intermolecular interactions, in particular hydrogen bonding. However, work in this area is just beginning, and the results are contradictory. Thus, some publications report that atomic force microscopy made it possible to observe hydrogen bonding (Science, 2013, 342, 6158, 611–614, doi: 10.1126/science.1242603), others argue that these are just artifacts due to the design features of the device, and the experimental results need to be interpreted more carefully (Physical Review Letters, 2014, 113, 186102, doi: 10.1103/PhysRevLett.113.186102). Perhaps the final answer to the question of whether hydrogen and other intermolecular interactions can be observed using atomic force microscopy will be obtained already in this decade. To do this, it is necessary to increase the AFM resolution at least several times more and learn to obtain images without interference (Physical Review B, 2014, 90, 085421, doi: 10.1103/PhysRevB.90.085421).

Single molecule synthesis

In skillful hands, both STM and AFM transform from devices capable of studying matter into devices capable of purposefully changing the structure of matter. With the help of these devices, it has already been possible to obtain the “smallest chemical laboratories”, in which a substrate is used instead of a flask, and individual molecules are used instead of moles or millimoles of reacting substances.

For example, in 2016, an international team of scientists led by Takashi Kumagai used non-contact atomic force microscopy to switch a porphycene molecule from one form to another (Nature Chemistry, 2016, 8, 935–940, doi: 10.1038/nchem.2552). Porphycene can be considered a modification of porphyrin whose inner ring contains four nitrogen atoms and two hydrogen atoms. The vibrations of the AFM probe imparted enough energy to the porphycene molecule to move these hydrogens from one nitrogen atom to another, and the result was a “mirror image” of the molecule (Fig. 8).

The team led by the indefatigable Leo Gross also showed that the reaction of a single molecule can be initiated: they converted dibromoanthracene into a ten-membered cyclic diyne (Fig. 9; Nature Chemistry, 2015, 7, 623–628, doi: 10.1038/nchem.2300). Unlike Kumagai and colleagues, they used a scanning tunneling microscope to activate the molecule, and the result of the reaction was monitored with an atomic force microscope.

The combined use of a scanning tunneling microscope and an atomic force microscope has even made it possible to obtain a molecule that cannot be synthesized by classical techniques and methods (Nature Nanotechnology, 2017, 12, 308–311, doi: 10.1038/nnano.2016.305). This is triangulene, an unstable aromatic diradical whose existence was predicted six decades ago but all attempts at whose synthesis had failed (Fig. 10). Chemists from Niko Pavliček's group obtained the desired compound by removing two hydrogen atoms from its precursor with STM and confirming the result of the synthesis with AFM.

It is expected that the number of works devoted to the use of atomic force microscopy in organic chemistry will continue to grow. Currently, more and more scientists are trying to replicate on the surface reactions that are well known in “solution chemistry.” But perhaps synthetic chemists will begin to reproduce in solution the reactions that were originally carried out on the surface using AFM.

From nonliving to living

Cantilevers and probes of atomic force microscopes can be used not only for analytical studies or the synthesis of exotic molecules but also for solving applied problems. There are already known cases of using AFM in medicine, for example for the early diagnosis of cancer, and here the pioneer is the same Christoph Gerber who had a hand in developing the principle of atomic force microscopy and in creating the AFM.

Thus, Gerber was able to teach AFM to detect point mutations in ribonucleic acid in melanoma (on material obtained as a result of a biopsy). To do this, the gold cantilever of an atomic force microscope was modified with oligonucleotides that can enter into intermolecular interaction with RNA, and the strength of this interaction can also be measured due to the piezoelectric effect. The sensitivity of the AFM sensor is so high that they are already trying to use it to study the effectiveness of the popular genome editing method CRISPR-Cas9. Technologies created by different generations of researchers come together here.

To paraphrase a classic of one of the political theories, we can say that we already see the limitless possibilities and inexhaustibility of atomic force microscopy and are hardly able to imagine what lies ahead in connection with the further development of these technologies. But today, scanning tunneling microscopes and atomic force microscopes give us the opportunity to see and touch atoms. We can say that this is not only an extension of our eyes, allowing us to look into the microcosm of atoms and molecules, but also new eyes, new fingers, capable of touching and controlling this microcosm.

Physicists from the USA managed to capture individual atoms in photographs with record resolution, Day.Az reports with reference to Vesti.ru

Scientists from Cornell University in the USA have managed to capture individual atoms in images with record resolution - less than half an angstrom (0.39 Å). The previous record was roughly two and a half times coarser: 0.98 Å.

Powerful electron microscopes capable of seeing atoms have existed for half a century. Unlike optical microscopes, whose resolution is limited by the wavelength of visible light - larger than the diameter of an average atom - electron microscopes are limited mainly by the imperfections of their electron optics.

The role of lenses that focus and magnify the image is played in electron microscopes by magnetic fields. However, fluctuations and imperfections of the magnetic field distort the result, and removing these distortions requires additional corrective elements, which considerably complicate the design of the electron microscope.
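For scale, the electron wavelength itself is far below atomic dimensions; a hedged estimate with the standard relativistic de Broglie formula (the accelerating voltages are chosen as typical examples and are not quoted from the report):

```python
# De Broglie wavelength of electrons in a microscope, with the relativistic correction.
import math

h = 6.626e-34    # Planck constant, J*s
m_e = 9.109e-31  # electron mass, kg
e = 1.602e-19    # elementary charge, C
c = 2.998e8      # speed of light, m/s

def wavelength_pm(voltage_volts):
    eV = e * voltage_volts
    p = math.sqrt(2 * m_e * eV * (1 + eV / (2 * m_e * c**2)))  # relativistic momentum
    return h / p * 1e12  # picometres

for kv in (80, 300):
    print(f"{kv} kV electrons: wavelength ≈ {wavelength_pm(kv * 1000):.2f} pm")
# ~4.2 pm at 80 kV and ~2.0 pm at 300 kV: the practical limit is lens aberrations, not wavelength.
```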

Earlier, physicists at Cornell University had developed the Electron Microscope Pixel Array Detector (EMPAD), which replaces the complex system of lenses focusing the transmitted electrons with a single small matrix of 128×128 pixels sensitive to individual electrons. Each pixel records the angle at which an electron arrives; knowing it, the scientists use the technique of ptychography to reconstruct the characteristics of the electrons, including the coordinates of the point from which they were emitted.

Atoms in the highest resolution

David A. Muller et al. Nature, 2018.

In the summer of 2018, the physicists decided to push the image quality to a record resolution. They fixed a sheet of a 2D material, molybdenum disulfide MoS2, on a movable mount and fired electron beams at it, turning the mount at different angles to the electron source. Using EMPAD and ptychography, the scientists determined the distances between individual molybdenum atoms and obtained an image with a record resolution of 0.39 Å.

“We basically created the world's smallest ruler,” explains Sol Gruner, one of the authors of the experiment. In the resulting image it was possible to make out individual sulfur atoms and even to spot a place where one such atom is missing (indicated by the arrow).

Sulfur atoms in record resolution

A hydrogen atom, imaged through its electron clouds. And although modern physicists can, using accelerators, determine even the shape of the proton, the hydrogen atom will apparently remain the smallest object whose image it makes sense to call a photograph. Lenta.ru presents an overview of modern methods of photographing the microworld.

Strictly speaking, there is almost no ordinary photography left these days. The images that we habitually call photographs and that can be found, for example, in any photo report by Lenta.ru are in fact computer models. A light-sensitive matrix in a special device (which, by tradition, is still called a “camera”) determines the spatial distribution of light intensity in several spectral ranges, the control electronics store this data in digital form, and then another electronic circuit uses the data to drive the transistors of a liquid-crystal display. Film, paper, special solutions for processing them - all this has become exotic. And if we recall the literal meaning of the word, photography is “light painting”. So when we say that scientists managed to photograph an atom, this can be true only with a fair amount of convention.

More than half of all astronomical images have long been taken by infrared, ultraviolet and X-ray telescopes. Electron microscopes irradiate not with light, but with a beam of electrons, while atomic force microscopes even scan the relief of the sample with a needle. There are X-ray microscopes and magnetic resonance imaging scanners. All these devices give us accurate images of various objects, and despite the fact that, of course, there is no need to talk about “light painting” here, we will still allow ourselves to call such images photographs.

Experiments by physicists to determine the shape of the proton or the distribution of quarks inside particles will remain behind the scenes; our story will be limited to the scale of atoms.

Optics never gets old

As it turned out in the second half of the 20th century, optical microscopes still have room for improvement. A decisive moment in biological and medical research was the advent of fluorescent dyes and methods that allow the selective labeling of certain substances. This wasn't "just a new coat of paint," it was a real revolution.

Contrary to popular belief, fluorescence is not a glow in the dark (that afterglow is phosphorescence, a different kind of luminescence). It is the absorption of quanta of a certain energy (say, blue light) followed by the emission of quanta of lower energy and, accordingly, of different light (when blue is absorbed, green will be emitted). If you install a filter that passes only the quanta emitted by the dye and blocks the light that excites the fluorescence, you will see a dark background with bright spots of dye, and the dyes, in turn, can stain the sample extremely selectively.

For example, you can color the cytoskeleton of a nerve cell in red, the synapses in green, and the nucleus in blue. You can make a fluorescent label that will allow you to detect protein receptors on the membrane or molecules synthesized by the cell under certain conditions. The immunohistochemical staining method has revolutionized biological science. And when genetic engineers learned to make transgenic animals with fluorescent proteins, this method experienced a rebirth: for example, mice with neurons painted in different colors became a reality.
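Returning to the physics of “blue in, green out” from the passage above: the energy bookkeeping is just E = hc/λ, as the small sketch below shows (the specific wavelengths are representative values, not ones named in the text).

```python
# Photon energy for representative blue and green wavelengths: the emitted (green) quantum
# carries less energy than the absorbed (blue) one, which is the essence of fluorescence.
HC_EV_NM = 1239.84  # h*c expressed in eV*nm

for name, lam_nm in [("blue", 450), ("green", 520)]:
    print(f"{name} ({lam_nm} nm): {HC_EV_NM / lam_nm:.2f} eV")
# blue ≈ 2.76 eV, green ≈ 2.38 eV; the ~0.4 eV difference is dissipated inside the dye molecule.
```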

In addition, engineers came up with (and practiced) the method of so-called confocal microscopy. Its essence lies in the fact that the microscope focuses on a very thin layer, and a special diaphragm cuts off the illumination created by objects outside this layer. Such a microscope can sequentially scan a sample from top to bottom and obtain a stack of images, which is a ready-made basis for a three-dimensional model.

The use of lasers and sophisticated optical beam-control systems has solved the problem of dyes fading and of delicate biological samples drying out under bright light: the laser beam scans the sample only when it is needed for imaging. And in order not to waste time and effort examining a large specimen through an eyepiece with a narrow field of view, engineers proposed automatic scanning systems: you can place a slide with the sample on the stage of a modern microscope, and the device will independently take a large-scale panorama of the entire sample, focusing in the right places and then stitching many frames together.

Some microscopes can accommodate live mice, rats or at least small invertebrates. Others give only a modest magnification but are combined with an X-ray machine. To eliminate interference from vibrations, many are mounted on special tables weighing several tons, inside rooms with a carefully controlled microclimate. The cost of such systems exceeds that of many an electron microscope, and contests for the most beautiful image have long been a tradition. In addition, the improvement of optics continues: from searching for the best types of glass and selecting optimal lens combinations, engineers have moved on to new ways of focusing light.

We have specifically listed a number of technical details in order to show that progress in the field of biological research has long been associated with progress in other areas. If there were no computers that could automatically count the number of stained cells in several hundred photographs, supermicroscopes would be of little use. And without fluorescent dyes, all millions of cells would be indistinguishable from each other, so it would be almost impossible to monitor the formation of new ones or the death of old ones.

In essence, the first microscope was a clamp with a spherical lens attached to it. An analogue of such a microscope can be a simple playing card with a hole made in it and a drop of water. According to some reports, similar devices were used by gold miners in Kolyma already in the last century.

Beyond the diffraction limit

Optical microscopes have a fundamental limitation. Light waves cannot be used to reconstruct the shape of objects much smaller than the wavelength: you might as well try to examine the fine texture of a material with your hand in a thick welding glove.

The limitations created by diffraction have been partially overcome without violating the laws of physics. Two circumstances help optical microscopes dive under the diffraction barrier: the fact that in fluorescence the quanta are emitted by individual dye molecules (which can be quite far apart from each other), and the fact that, by superimposing light waves, it is possible to obtain a bright spot with a diameter smaller than the wavelength.

When superimposed on each other, light waves can cancel each other out, so the illumination of the sample is arranged so that the smallest possible area falls into the bright region. In combination with mathematical algorithms that can, for example, remove ghosting from the image, such directed illumination sharply improves imaging quality. It becomes possible, for example, to examine intracellular structures with an optical microscope and even (by combining the described method with confocal microscopy) to obtain three-dimensional images of them.

Electron microscope to electronic devices

In order to discover atoms and molecules, scientists did not have to look at them - molecular theory did not need to see the object. But microbiology became possible only after the invention of the microscope. Therefore, at first, microscopes were associated specifically with medicine and biology: physicists and chemists who studied significantly smaller objects made do with other means. When they wanted to look at the microworld, diffraction limitations became a serious problem, especially since the fluorescence microscopy methods described above were still unknown. And there is little sense in increasing the resolution from 500 to 100 nanometers if the object that needs to be examined is even smaller!

Knowing that electrons can behave both as a wave and as a particle, physicists from Germany created an electron lens in 1926. The idea behind it was very simple and understandable to any schoolchild: since the electromagnetic field deflects electrons, it can be used to change the shape of a beam of these particles, pulling them apart in different directions, or, conversely, to reduce the diameter of the beam. Five years later, in 1931, Ernst Ruska and Max Knoll built the world's first electron microscope. In the device, the sample was first illuminated by a beam of electrons, and then an electron lens expanded the beam that passed through before it fell on a special luminescent screen. The first microscope provided a magnification of only 400 times, but replacing light with electrons opened the way to photography with a magnification of hundreds of thousands of times: the designers only had to overcome a few technical obstacles.

The electron microscope made it possible to examine the structure of cells at a level of detail previously unattainable. But from such an image it is impossible to tell the age of the cells or whether particular proteins are present in them, and this is exactly the information scientists often need.

Electron microscopes now allow close-up photographs of viruses. Various modifications of the instrument make it possible not only to shine electrons through thin sections but also to examine samples in “reflected light” (in reflected electrons, of course). We will not describe all the variants of microscopes in detail, but we note that researchers have recently learned to reconstruct an image from a diffraction pattern.

Touch, not look

Another revolution came with a further departure from the principle of “shine light and look”. An atomic force microscope, like a scanning tunneling microscope, does not shine anything onto the surface of the sample at all. Instead, an exceptionally thin needle moves across the surface, literally bouncing over bumps the size of a single atom.

Without going into the details of all such methods, we note the main thing: the needle of a tunneling microscope can not only be moved along the surface but also used to rearrange atoms from place to place. This is how scientists create inscriptions, drawings and even cartoons in which a drawn boy plays with an atom; in other experiments, a real xenon atom has been dragged across a surface by the tip of a scanning tunneling microscope.

A tunneling microscope is so called because it exploits the tunneling current flowing through the needle: electrons cross the gap between the needle and the surface thanks to the tunneling effect predicted by quantum mechanics. This device requires a vacuum to operate.

An atomic force microscope (AFM) is much less demanding of its environment: it can (with a number of restrictions) operate without pumping out the air. In a certain sense, the AFM is the nanotechnological heir of the gramophone. A needle mounted on a thin, flexible bracket (the cantilever; the English word itself means “bracket”) moves along the surface without any voltage applied to it and follows the relief of the sample, just as a gramophone stylus follows the grooves of a record. The bending of the cantilever deflects a mirror mounted on it; the mirror deflects a laser beam, and this makes it possible to determine the shape of the sample under study very accurately. The main thing is to have a sufficiently precise system for moving the needle, and a supply of needles that must be perfectly sharp: the radius of curvature of their tips must not exceed one nanometer.

AFM allows you to see individual atoms and molecules, but, like a tunneling microscope, it does not allow you to look beneath the surface of a sample. In other words, scientists have to choose between being able to see atoms and being able to study the entire object. However, even for optical microscopes the insides of the samples being studied are not always accessible, because minerals or metals usually do not transmit light well. In addition, there are still difficulties with photographing atoms - these objects appear as simple balls, the shape of electron clouds is not visible in such images.

Synchrotron radiation, which is emitted when charged particles accelerated in accelerators are deflected or decelerated, makes it possible to study the fossilized remains of prehistoric animals. By rotating a sample under X-rays, three-dimensional tomograms can be obtained - this is how, for example, a brain was found inside the skull of fish that went extinct 300 million years ago. One can even do without rotation if, instead of only the transmitted radiation, the X-rays scattered by diffraction are recorded.

And this is not all that X-rays can do. Many materials fluoresce when irradiated with them, and the chemical composition of a substance can be determined from the character of that fluorescence: this is how scientists reconstruct the coloring of ancient artifacts, read the works of Archimedes erased in the Middle Ages, and determine the color of the feathers of long-extinct birds.

Atoms pose

Against the backdrop of all the possibilities offered by X-ray and optical fluorescence methods, a new way of photographing individual atoms no longer seems such a big breakthrough in science. The essence of the method that produced the images published this week is as follows: atoms are ionized, and the detached electrons are sent to a special detector. Each ionization event removes an electron from a particular position and gives one point in the “photograph”. Having accumulated several thousand such points, the scientists built up a picture of the most probable places to find an electron around the nucleus of an atom - and that, by definition, is the electron cloud.
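A toy model conveys the point-by-point idea (a sketch, not the actual photoionization experiment: here each “detection” is simply a random draw from the textbook probability density |ψ|² of the hydrogen ground state):

```python
# Build up a "photograph" of the hydrogen 1s electron cloud by accumulating many
# single-electron detections, each drawn from the Born-rule density |psi_1s|^2.
import numpy as np

a0 = 1.0  # Bohr radius in atomic units
rng = np.random.default_rng(0)

def sample_1s_points(n):
    # Radial distribution of the 1s state: P(r) ~ r**2 * exp(-2*r/a0),
    # which is a Gamma(shape=3, scale=a0/2) distribution, sampled directly.
    r = rng.gamma(shape=3.0, scale=a0 / 2.0, size=n)
    # The 1s state is spherically symmetric, so pick an isotropic direction.
    costheta = rng.uniform(-1.0, 1.0, size=n)
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
    sintheta = np.sqrt(1.0 - costheta**2)
    return r * sintheta * np.cos(phi), r * sintheta * np.sin(phi), r * costheta

# Accumulate 100,000 "detections" and project them onto a plane, as a detector would.
x, y, _ = sample_1s_points(100_000)
image, _, _ = np.histogram2d(x, y, bins=200, range=[[-5, 5], [-5, 5]])
print("Brightest pixel (counts):", image.max())  # the cloud is densest near the nucleus
```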

In conclusion: the ability to see individual atoms with their electron clouds is rather the icing on the cake of modern microscopy. It was important for scientists to study the structure of materials, cells and crystals, and the technology developed along the way made it possible to reach the hydrogen atom. Anything smaller is already the domain of specialists in elementary particle physics. And biologists, materials scientists and geologists still have room to improve microscopes, even at magnifications that look modest next to atomic scales. Neurophysiologists, for example, have long wanted a device capable of seeing individual cells inside a living brain, and the creators of Mars rovers would sell their souls for an electron microscope that could fit on board a spacecraft and work on Mars.


