Claude Shannon: the American engineer and why he is famous

Claude Elwood Shannon (April 30, 1916, Petoskey, Michigan, USA - February 24, 2001, Medford, Massachusetts, USA) was an American engineer, cryptanalyst and mathematician, widely considered the "father of the information age."

He is the founder of information theory, which underpins modern high-technology communication systems. He provided the fundamental concepts, ideas and mathematical formulations that now form the basis of communication technology. In 1948, in the article "A Mathematical Theory of Communication", he proposed the word "bit" for the smallest unit of information. A central feature of Shannon's theory is the concept of entropy: he demonstrated that the entropy he introduced is a measure of the uncertainty of the information in a transmitted message. His papers "A Mathematical Theory of Communication" and "Communication Theory of Secrecy Systems" are considered foundational for information theory and cryptography. Shannon was among the first to approach cryptography scientifically; he was the first to formulate its theoretical foundations and introduced many of its basic concepts. He also made key contributions to the theory of probabilistic circuits, game theory, automata theory and the theory of control systems, fields that together fall under the umbrella of cybernetics.
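Shannon's entropy can be stated concretely: for a source emitting symbols with probabilities p_i, the entropy is H = -Σ p_i log2(p_i), measured in bits. A minimal illustrative sketch (the coin example is ours, not from Shannon's paper):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.47 bits
```

The second value is below 1 because a 90/10 coin is easier to predict; a source that always emits the same symbol has entropy 0 and conveys no information at all.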

Biography

Childhood and youth

Claude Shannon was born on April 30, 1916 in Petoskey, Michigan, USA. His father, Claude Sr. (1862-1934), was a self-made businessman, lawyer and, for a time, a judge. His mother, Mabel Wolf Shannon (1890-1945), was a foreign-language teacher who later became principal of Gaylord High School. Shannon's father had a mathematical bent, but it was his grandfather, an inventor and farmer who devised a washing machine along with much other equipment useful in agriculture, who instilled in Shannon a love of science. Thomas Edison was a distant relative of the Shannons.

Claude spent the first sixteen years of his life in Gaylord, Michigan, where he graduated from Gaylord High School in 1932. In his youth he worked as a courier for Western Union. Young Claude was interested in designing mechanical and automatic devices: he built model airplanes and radio circuits, a radio-controlled boat and a telegraph line between a friend's house and his own. From time to time he repaired radios for a local department store.

Shannon, in his own words, was an apolitical person and an atheist.

University years

In 1932 Shannon enrolled at the University of Michigan, where one of his courses introduced him to the work of George Boole. In 1936 he graduated with a double major in mathematics and electrical engineering and moved to the Massachusetts Institute of Technology (MIT), where he worked as a research assistant. He served as operator of a mechanical computing device, an analog computer called the "differential analyzer", developed by his supervisor Vannevar Bush. Studying the complex, highly specialized electrical circuits of the differential analyzer, Shannon saw that Boole's concepts could be put to good use there. After a summer at Bell Telephone Laboratories in 1937, he wrote a paper based on his master's thesis of that year, "A Symbolic Analysis of Relay and Switching Circuits"; Frank Lauren Hitchcock supervised the thesis and provided useful criticism and advice. The paper was published in 1938 by the American Institute of Electrical Engineers (AIEE). In it Shannon showed that switching circuits could replace the electromechanical relay circuits then used to route telephone calls, and he extended the idea by showing that such circuits could solve every problem that Boolean algebra could solve. In the last chapter he presented prototypes of several circuits, including a 4-bit adder. For this paper Shannon received the Alfred Noble Prize of the American Institute of Electrical Engineers in 1940. The demonstrated ability to implement arbitrary logical computation in electrical circuits formed the basis of digital circuit design, and since digital circuits are the basis of modern computing, the results of this work rank among the most important scientific results of the twentieth century.
Howard Gardner of Harvard University called Shannon's work "perhaps the most important, as well as the most famous master's thesis of the century."
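The 4-bit adder from the thesis's last chapter illustrates the point: once AND, OR and XOR gates are available, binary arithmetic is just Boolean algebra. A minimal sketch in Python (the standard ripple-carry construction, not a reproduction of Shannon's exact circuit):

```python
def full_adder(a, b, carry_in):
    """One-bit full adder expressed purely with Boolean operations."""
    s = a ^ b ^ carry_in                          # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))    # carry to the next stage
    return s, carry_out

def add4(x, y):
    """Add two 4-bit numbers by chaining four full adders (ripple carry)."""
    carry = 0
    result = 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result, carry  # final carry is the overflow bit

print(add4(0b0101, 0b0011))  # (8, 0): 5 + 3 = 8, no overflow
```

Each `full_adder` call is a direct transcription of two small gate networks, which is exactly the correspondence between relay circuits and Boolean formulas that the thesis established.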

Anatoly Ushakov, Doctor of Technical Sciences, Professor, Department of Control Systems and Informatics, ITMO University

Many generations of technical specialists of the second half of the 20th century, even those quite far from automatic control theory and cybernetics, left their universities remembering for the rest of their lives the names attached to scientific and technical achievements: Lyapunov functions, Markov processes, the Nyquist frequency criterion, the Wiener process, the Kalman filter. Among such achievements, Shannon's theorems take pride of place. 2016 marks the hundredth anniversary of the birth of their author, the scientist and engineer Claude Shannon.

“Who owns the information, owns the world”

W. Churchill

Fig. 1. Claude Shannon (1916–2001)

Claude Elwood Shannon (Fig. 1) was born on April 30, 1916 in Petoskey, a town on the shores of Lake Michigan in Michigan (USA), to a lawyer and a foreign-language teacher. His older sister Katherine was keen on mathematics and eventually became a professor, while Shannon's father combined his legal work with amateur radio. A distant relative of the future engineer was the world-famous inventor Thomas Edison, holder of 1093 patents.

Shannon graduated from high school in 1932 at the age of sixteen, receiving additional education at home: his father bought him construction sets and amateur-radio kits and encouraged his son's technical creativity in every way, while his sister drew him into advanced mathematics. Shannon fell in love with both worlds, engineering and mathematics.

In 1932 Shannon entered the University of Michigan, graduating in 1936 with a bachelor's degree and a double major in mathematics and electrical engineering. During his studies he found in the university library two works by George Boole, "The Mathematical Analysis of Logic" and "The Calculus of Logic", written in 1847 and 1848 respectively. Shannon studied them carefully, and this, apparently, determined his future scientific interests.

After graduation, Claude Shannon took a job as a research assistant in the Electrical Engineering Laboratory at the Massachusetts Institute of Technology (MIT), where he worked on upgrading the differential analyzer, an analog "computer", of Vannevar Bush, vice president of MIT. From then on, Vannevar Bush was Claude Shannon's scientific mentor. While studying the complex, highly specialized relay and switching circuitry of the analyzer's control device, Shannon realized that George Boole's concepts could be put to good use in this area.

At the end of 1936 Shannon entered the master's program, and already in 1937 he wrote his thesis abstract and, on its basis, prepared the article "A Symbolic Analysis of Relay and Switching Circuits", published in 1938 by the American Institute of Electrical Engineers (AIEE). The work attracted the attention of the electrical engineering community, and in 1939 the American Society of Civil Engineers awarded Shannon the Alfred Noble Prize for it.

Before even defending his master's thesis, Shannon, on Bush's advice, decided to pursue a doctorate in mathematics at MIT concerning problems in genetics; according to Bush, genetics could be a fruitful area for applying Shannon's knowledge. Shannon's doctoral dissertation, entitled "An Algebra for Theoretical Genetics", was completed in the spring of 1940 and was devoted to the combinatorics of genes. Shannon received his doctorate in mathematics and at the same time defended his thesis "A Symbolic Analysis of Relay and Switching Circuits", becoming a master of electrical engineering.

Shannon's doctoral thesis did not win much support from geneticists and for this reason was never published at the time. The master's thesis, however, proved a breakthrough in switching and digital technology. Its last chapter gave many examples of the successful application of Shannon's logical calculus to the analysis and synthesis of specific relay and switching circuits: selector circuits, a lock with an electric combination, binary adders. All of them clearly demonstrated the scientific breakthrough Shannon had accomplished and the enormous practical benefit of the formalism of logical calculus. This is how digital logic was born.

Fig. 2. Claude Shannon at Bell Labs (mid-1940s)

In the spring of 1941, Claude Shannon joined the mathematics department of the Bell Laboratories research center (Fig. 2). A few words should be said about the atmosphere the 25-year-old found there, created by Harry Nyquist, Hendrik Bode, Ralph Hartley, John Tukey and other Bell Laboratories employees. All of them already had results in what would become information theory, which Shannon would eventually develop to the level of a major science.

By this time war was already underway in Europe, and Shannon conducted research generously funded by the US government. His work at Bell Laboratories was related to cryptography; it led him to the mathematical theory of cryptography and eventually allowed him to analyze ciphertexts by information-theoretic methods (Fig. 3).

In 1945, Shannon completed a large classified scientific report, "Communication Theory of Secrecy Systems".

Fig. 3. At the encryption machine

By this time Claude Shannon was ready to present the scientific community with new basic concepts in information theory, and in 1948 he published his landmark work "A Mathematical Theory of Communication". Shannon's mathematical theory of communication assumed a three-component structure: a source of information, a receiver of information and a "transport medium", the communication channel, characterized by its throughput and its ability to distort information during transmission. A range of problems arose: how to quantify information; how to package it efficiently; how to estimate the admissible rate at which a source may feed information into a channel of fixed bandwidth so that error-free transmission is guaranteed; and, finally, how to solve this last problem when there is interference in the channel. Claude Shannon gave humanity comprehensive answers to all these questions with his theorems.

It should be said that colleagues in the trade helped Shannon with terminology: the term for the minimum unit of information, the "bit", was proposed by John Tukey, and the term for the average amount of information per source symbol, "entropy", by John von Neumann. Claude Shannon presented his seminal work in the form of twenty-three theorems. Not all of them are of equal weight; some are auxiliary or address special cases of information theory and of transmission over discrete and continuous channels, but six theorems are conceptual and form the framework of the edifice of information theory that Claude Shannon created.

  1. The first of the six theorems concerns the quantitative assessment of the information produced by a source, within a stochastic framework, using entropy as the measure of its properties.
  2. The second theorem is devoted to the rational packing of the symbols generated by a source during primary encoding. It gave rise to the efficient-coding procedure and to the need for a "source encoder" in the structure of a transmission system.
  3. The third theorem concerns matching the flow of information from a source to the capacity of a communication channel in the absence of interference, which guarantees distortion-free transmission.
  4. The fourth theorem solves the same problem in the presence of interference in a binary channel, where noise can corrupt any code bit with some probability. It contains a transmission slow-down condition that guarantees a given probability of error-free delivery of the code message to the recipient, and it is the methodological basis of error-correcting coding, which in turn led to the "channel encoder" in the structure of transmission systems.
  5. The fifth theorem estimates the capacity of a continuous communication channel characterized by a certain frequency bandwidth and by given powers of the useful signal and of the interference in the channel. It defines the so-called Shannon limit.
  6. The last of the six, now called the Nyquist-Shannon-Kotelnikov sampling theorem, is devoted to the error-free reconstruction of a continuous signal from its time-discrete samples. It yields the requirement on the sampling interval, determined by the width of the signal's frequency spectrum, and the basis functions, known as sampling functions.
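The Shannon limit of the fifth theorem is usually written as C = B log2(1 + S/N), with C in bits per second, B the bandwidth in hertz and S/N the signal-to-noise power ratio. A small numeric illustration (the telephone-channel figures are a common textbook example, not from this article):

```python
import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# A telephone-grade channel: ~3 kHz bandwidth, 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)            # 30 dB corresponds to a power ratio of 1000
c = channel_capacity(3000, snr, 1)
print(round(c))                  # about 29902 bits per second
```

No coding scheme, however clever, can push error-free transmission past this bound; conversely, the fourth theorem guarantees that any rate below it is achievable with suitable coding.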

It should be said that at first many mathematicians around the world doubted the proofs of these theorems, but over time the scientific community became convinced of the correctness of all the postulates, finding mathematical confirmation for them. In the USSR, A. Ya. Khinchin and A. N. Kolmogorov devoted their efforts to this work.

In 1956 the famous Claude Shannon left Bell Laboratories, without severing ties with it, to become a full professor in two departments at the Massachusetts Institute of Technology: mathematics and electrical engineering.

Fig. 4. Shannon's labyrinth

Claude Shannon always had many interests entirely unrelated to his professional work. His outstanding engineering talent showed in the creation of all kinds of machines and mechanisms, including the mechanical mouse Theseus, which solved a maze (Fig. 4), a computer that operated on Roman numerals, and computers and programs for playing chess.

In 1966, at the age of 50, Claude Shannon retired from teaching and devoted himself almost entirely to his hobbies. He created a unicycle with two saddles, a folding knife with a hundred blades, robots that solved a Rubik's cube, and a robot that juggled balls. Shannon himself also continued to hone his juggling, bringing his count to four balls (Fig. 5). Witnesses of his younger years at Bell Laboratories recalled him riding a unicycle along the company's corridors while juggling.

Fig. 5. Claude Shannon the juggler

Unfortunately, Claude Shannon had no close contacts with Soviet scientists, but he did visit the USSR in 1965 at the invitation of the A. S. Popov Scientific and Technical Society of Radio Engineering, Electronics and Communications (NTORES). One of the initiators of the invitation was Mikhail Botvinnik, multiple world chess champion, Doctor of Technical Sciences and professor, who was also an electrical engineer interested in chess programming. He and Shannon had a lively discussion about computerizing the art of chess and concluded that it was very interesting for programming and unpromising for chess itself. After the discussion Shannon asked Botvinnik for a game; during it he even held a slight material advantage (a rook against a knight and a pawn), but he still lost on move 42.

During the last years of his life, Claude Shannon was seriously ill. He died in February 2001 in a Massachusetts nursing home from Alzheimer's disease at the age of 85.

Claude Shannon left a rich applied and philosophical legacy. He created a general theory of the devices of discrete automation and computing technology, and a technology for using the capacity of the channel medium effectively. All archivers used in the modern computer world rely on Shannon's efficient-coding theorem. His philosophical heritage rests on two ideas. First, the goal of any control should be to reduce entropy as a measure of uncertainty and disorder in the system's environment; control that does not do this is redundant, that is, unnecessary. Second, everything in this world is in some sense a "communication channel": a person, a team, an entire functional environment, an industry, a transport structure, a country as a whole. If technical, informational, humanitarian or governmental decisions are not matched to the capacity of the channel medium they are designed for, good results should not be expected.

Literature

  1. Shannon C. E. A Mathematical Theory of Communication // Bell System Technical Journal. July and Oct. 1948. Reprinted in: Claude Elwood Shannon. Collected Papers. N.Y., 1993. P. 8-111.
  2. Shannon C. E. Communication in the Presence of Noise // Proc. IRE. 1949. V. 37. No. 10.
  3. Shannon C. E. Communication Theory of Secrecy Systems // Bell System Technical Journal. Oct. 1949. Reprinted in: Claude Elwood Shannon. Collected Papers. N.Y., 1993. P. 112-195.
  4. Automata Studies / Ed. by C. E. Shannon and J. McCarthy; transl. from English. Moscow: Foreign Literature Publishing House, 1956.
  5. Fano R. M. Transmission of Information: A Statistical Theory of Communications. M.I.T. Press and John Wiley & Sons, New York, London, 1961.
  6. www.research.att.com/~njas/doc/ces5.html
  7. Kolmogorov A. N. Preface // Shannon C. Works on Information Theory and Cybernetics / Transl. from English, ed. by R. L. Dobrushin and O. B. Lupanov. Moscow, 1963.
  8. Levin V. I. C. E. Shannon and Modern Science // Bulletin of TSTU. 2008. Vol. 14. No. 3.
  9. Wiener N. I Am a Mathematician / Transl. from English. Moscow: Nauka, 1964.
  10. Khinchin A. Ya. On the Fundamental Theorems of Information Theory // Uspekhi Mat. Nauk. 1956. V. 11. No. 1 (67).
  11. Kolmogorov A. N. The Theory of Information Transmission // Session of the USSR Academy of Sciences on Scientific Problems of Production Automation, October 15-20, 1956. Plenary session. Moscow: USSR Academy of Sciences Publishing House, 1957.
  12. Kolmogorov A. N. Information Theory and the Theory of Algorithms. Moscow: Nauka, 1987.

Claude Elwood Shannon was born in Petoskey, Michigan on April 30, 1916. His father, a descendant of the early settlers of New Jersey, was a businessman, and his mother, the daughter of German emigrants, was a teacher and for a number of years a school principal in Gaylord.

Claude spent the first 16 years of his life in Gaylord, graduating from the local school in 1932 and showing an aptitude for mechanics. His favorite subjects at school were physics and mathematics, but at home he was busy building model airplanes, a radio-controlled boat and a telegraph line to a friend's house half a mile away, which made use of the barbed wire fencing off a local pasture. Claude earned the money for these pursuits by delivering newspapers and telegrams and by repairing radio equipment. The hero of his childhood was Edison, who, as he later learned, turned out to be a distant relative: both were descendants of John Ogden, one of the leaders of the early colonists. Claude's list of heroes also included many scientists, such as Newton, Darwin, Einstein and von Neumann.

In 1932 he entered the University of Michigan, following in the footsteps of his sister Katherine, who had just received a master's degree in mathematics there. In 1936 he earned a bachelor's degree in electrical engineering and mathematics, a parallel interest in the two fields that he retained throughout his career.

In 1936 he received a position as a laboratory assistant in the electrical engineering department at the Massachusetts Institute of Technology (the famous M.I.T.). The position allowed him to continue his studies while working only part time, and the work itself suited his abilities and interests perfectly: he operated Bush's differential analyzer, the most advanced computer of its time, capable of solving differential equations of up to the sixth order by analog means. His job was to translate equations into "mechanical terms" and to set up and run the machine for various initial conditions; at times the process required the collaboration of up to five people.

The electrical circuit controlling this computer, which included more than a hundred relays, was interesting in its own right, and while working with it Shannon became interested in the theory of such circuits. He had studied symbolic logic and Boolean algebra in mathematics courses at Michigan and realized that this was exactly the apparatus needed to describe such binary systems. He developed these ideas during 1937, first in New York at Bell Telephone Laboratories and then in his graduate work in Massachusetts. This work, the first he published, attracted considerable attention and in 1940 brought him the Alfred Noble Prize awarded on behalf of the American engineering societies.

In the summer of 1938 he did research work in Massachusetts, and in the fall he transferred from the electrical engineering department to the mathematics department, where he began work on a doctoral dissertation. His boss, Vannevar Bush, had just become president of the Carnegie Institution of Washington; one of its divisions, at Cold Spring Harbor, N.Y., worked on genetics, and Bush advised Shannon to tackle the problem of storing genetic information from an algebraic point of view. Shannon spent the summer of 1939 there, working with the geneticist Barbara Burks on a dissertation he called "An Algebra for Theoretical Genetics" (his thesis advisor at M.I.T. was the algebra professor Frank L. Hitchcock).

Around the same time, Shannon was developing ideas in computing and communication systems. In a letter to Bush dated February 16, 1939, he wrote about the relationship among time, bandwidth, noise and distortion in communication systems, and about building computing systems to perform symbolic mathematical operations.

In the spring of 1940 he finally defended both dissertations, receiving the degrees of master of electrical engineering and doctor of mathematics. He spent the summer at Bell Laboratories on further research into switching circuits, developing a new design method that could significantly reduce the number of relay contacts. The results were published in the paper "The Synthesis of Two-Terminal Switching Circuits".

He spent the 1940-1941 academic year at Princeton under the guidance of Hermann Weyl, beginning serious work on his ideas about information theory and efficient communication systems.

Thornton C. Fry, head of the mathematics department at Bell Laboratories, was then on a committee developing anti-aircraft fire-control systems as the country armed itself for the European war, and he invited Shannon to join the defense work. Returning to the Laboratories, Shannon joined a group developing devices for detecting enemy aircraft and missiles and for aiming anti-aircraft guns, a task made urgent by the creation of the V-1 and V-2 rockets in Germany. Without these guidance systems, England's wartime losses would have been significantly greater.

Shannon spent fifteen years at Bell Labs in excellent company: during this period its staff included many first-class scientists, such as John Pierce, famous for his work on satellite communications; Harry Nyquist, who did much for signal detection theory; Hendrik Bode, who worked on feedback; the creators of the transistor, Brattain, Bardeen and Shockley; George Stibitz, who built the first relay computer (1938); the outstanding engineer Barney Oliver; and others.

All these years Shannon worked in various fields, mainly in information theory, which began with his article "A Mathematical Theory of Communication". The article showed that any source of information, a telegraph key, a talking person, a television camera and so on, has an "information production rate" that can be measured in bits per second. Communication channels have a "capacity" measured in the same units, and information can be transmitted through a channel if and only if the capacity is not less than the rate at which information arrives.

This paper on communication theory is generally considered Shannon's most significant contribution to science.

Shannon's studies of information and noise found many different applications. In the paper "Communication Theory of Secrecy Systems", for example, he connected cryptography with the problem of transmitting information over a noisy channel, the role of the noise here being played by the key of the cryptosystem. This work later led to Shannon's appointment as a cryptography consultant to the US government.

Another problem, which he worked on together with E. F. Moore, was increasing the reliability of relay circuits by using a redundant number of elements, each of which is individually unreliable. This task, again, reduces to transmitting information over a noisy channel.

In addition, Shannon also applied these ideas to the problem of optimal investment strategy, in which the “noisy signal” is the stock market and its corresponding time series, and the problem is to maximize profits.

His 1950 paper "Programming a Computer for Playing Chess" was written in a lighter style. Computers at that time were slow and hard to program; many chess programs have been created since, but most of them still rest on the ideas of this work.

In 1965 Shannon was invited to the USSR for an engineering conference. There he had the opportunity to meet multiple world chess champion Mikhail Botvinnik, himself an electrical engineer interested in the problem of algorithmizing the game of chess. After a lengthy discussion Shannon asked the grandmaster for a game; unsurprisingly, he lost, on move 42.

Chess-program development continued, and in 1980 Shannon was the guest of honor at the International Computer Chess Championship in Linz, Austria, in which eleven machines from Sweden, Germany, the USSR, France, England, Canada and the USA took part (most of the machines remained in their home countries, connected remotely to Austria). The winner was Belle, developed at Bell Laboratories by Ken Thompson and Joe Condon; its level of play was close to that of a chess master.

Shannon loved to build amusing, and not necessarily useful, devices; in his house one could see, for example, a calculator that worked with Roman numerals, "turtles" that crawled on the floor avoiding obstacles, or a machine with two mechanical arms juggling three balls.

In the fifties he built the "Ultimate Machine", based on an idea of Marvin Minsky and described in Arthur C. Clarke's Voice Across the Sea. The machine looked like a box with a single switch; when the switch was flipped, the lid opened, a hand emerged, returned the switch to its original position and disappeared inside again.

In 1949 Shannon, then at Bell Labs, married Mary Elizabeth (Betty) Moore, a numerical analyst (a "computer", as the job was then called) in John Pierce's group. They settled near Mystic Lake in Winchester, Massachusetts.

Claude Elwood Shannon (April 30, 1916 - February 24, 2001) was an American mathematician, electrical engineer and cryptographer, known as the "father of information theory".

Shannon is known for laying the foundations of information theory in "A Mathematical Theory of Communication", published in 1948. At the age of 21, as a master's student at the Massachusetts Institute of Technology (MIT), he wrote a thesis showing that any logical or numerical relation can be realized electrically using Boolean algebra. During World War II, Shannon made major contributions to cryptanalysis for national defense, including his work on codebreaking and on the reliability of telecommunications.

In 1950 Shannon published a paper on computer chess entitled "Programming a Computer for Playing Chess". It describes how a machine can be programmed to play logic games such as chess. The so-called minimax procedure drives the computer's choice of move, based on an evaluation function for a given chess position. Shannon gave a rough example of such a function, in which the value of the black position is subtracted from that of the white one. Material was counted using the conventional piece values (1 point for a pawn, 3 for a knight or bishop, 5 for a rook, 9 for a queen). He also considered some positional factors, subtracting 0.5 point for each doubled, backward or isolated pawn and adding 0.1 point for each available legal move. Quoting the paper:

“The coefficients 0.5 and 0.1 are just a rough estimate by the writer. In addition, there are many other conditions that must be included. The formula is given for clarity only.”
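The evaluation just described can be sketched in a few lines. The piece values and the 0.5 and 0.1 coefficients are Shannon's; the input format (plain counts per side) is our own simplification, not his:

```python
# A sketch of the rough evaluation function from Shannon's 1950 paper.
PIECE_VALUES = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}

def evaluate(white, black):
    """Score a position from White's point of view.

    `white` and `black` are dicts with piece counts plus:
      'weak_pawns' - doubled, backward and isolated pawns combined
      'mobility'   - number of available legal moves
    """
    score = 0.0
    # Material balance: White's pieces count positively, Black's negatively.
    for piece, value in PIECE_VALUES.items():
        score += value * (white.get(piece, 0) - black.get(piece, 0))
    # Positional terms with Shannon's rough coefficients.
    score -= 0.5 * (white.get("weak_pawns", 0) - black.get("weak_pawns", 0))
    score += 0.1 * (white.get("mobility", 0) - black.get("mobility", 0))
    return score

# White is a pawn up but has two weak pawns and slightly less mobility.
w = {"pawn": 8, "knight": 2, "bishop": 2, "rook": 2, "queen": 1,
     "weak_pawns": 2, "mobility": 30}
b = {"pawn": 7, "knight": 2, "bishop": 2, "rook": 2, "queen": 1,
     "weak_pawns": 0, "mobility": 33}
print(round(evaluate(w, b), 2))  # -0.3: the positional factors outweigh the pawn
```

In the paper this score feeds the minimax search: the machine picks the move leading to the position with the best score, assuming the opponent replies with the move that is worst for it.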

In 1932, Shannon was enrolled at the University of Michigan, where in one of his courses he became acquainted with the work of George Boole. In 1936, Claude graduated from the University of Michigan with a double major in mathematics and electrical engineering and went to the Massachusetts Institute of Technology (MIT), where he worked as a research assistant. He performed operator duties on a mechanical computing device, an analog computer called a "differential analyzer", developed by his supervisor Vanevar Bush. By studying the complex, highly specialized electrical circuits of a differential analyzer, Shannon saw that Boole's concepts could be put to good use. After working the summer of 1937 at Bell Telephone Laboratories, he wrote a paper based on his master's thesis that year, "Symbolic Analysis of Relay and Switching Circuits." It should be noted that Frank Lauren Hitchcock supervised the master's thesis and provided useful criticism and advice. The article itself was published in 1938 in the publication of the American Institute of Electrical Engineers (AIEE). In this work, he showed that switching circuits could be used to replace the electromechanical relay circuits then used to route telephone calls. He then extended this concept by showing that these circuits could solve all the problems that Boolean algebra could solve. Also, in the last chapter, he presents the prototypes of several circuits, for example, a 4-bit adder. For this article, Shannon was awarded the Alfred Nobel Prize by the American Institute of Electrical Engineers in 1940. The proven ability to implement any logical calculations in electrical circuits formed the basis for the design of digital circuits. And digital circuits are, as we know, the basis of modern computing technology, thus, the results of his work are one of the most important scientific results of the twentieth century. 
Howard Gardner of Harvard University called Shannon's work "perhaps the most important, as well as the most famous master's thesis of the century."
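The thesis's central idea, that logical expressions can be realized as switching circuits, can be illustrated with a small sketch. The code below is not Shannon's original construction, only a modern illustration of the 4-bit adder mentioned above, built solely from the Boolean operations AND, OR, and XOR:

```python
# A sketch (not Shannon's original circuit) of a 4-bit ripple-carry adder
# composed only of the Boolean operations AND (&), OR (|), XOR (^),
# the same operations a relay or switching circuit can realize.

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """One full-adder cell: returns (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_4bit(x: int, y: int) -> int:
    """Add two 4-bit numbers by chaining four full-adder cells."""
    carry = 0
    result = 0
    for i in range(4):
        a = (x >> i) & 1          # i-th bit of x
        b = (y >> i) & 1          # i-th bit of y
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result & 0xF           # keep 4 bits; the final carry is discarded

print(add_4bit(0b0101, 0b0011))   # 5 + 3 = 8
```

Chaining identical cells in this way is exactly the kind of reduction from arithmetic to Boolean logic that made the thesis the foundation of digital circuit design.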

On Bush's advice, Shannon decided to pursue a doctorate in mathematics at MIT. Bush had been appointed president of the Carnegie Institution in Washington and invited Shannon to take part in work on genetics led by Barbara Burks; genetics, Bush believed, could be a worthy subject for Shannon's efforts. Shannon himself, after spending a summer in Woods Hole, Massachusetts, became interested in finding a mathematical basis for Mendel's laws of inheritance. His doctoral dissertation, "An Algebra for Theoretical Genetics," was completed in the spring of 1940. It was not published until 1993, when it appeared in Shannon's Collected Papers. The research could otherwise have become quite important, but most of its results were later obtained independently of him. In 1940 Shannon received his PhD in mathematics and his master's degree in electrical engineering; after that he did not return to research in biology.

Shannon was also interested in the application of mathematics to information systems such as communication systems. After another summer spent at Bell Labs, in 1940 Shannon became a research fellow at the Institute for Advanced Study in Princeton, New Jersey, for one academic year. There he worked under the famous mathematician Hermann Weyl and had the opportunity to discuss his ideas with influential scientists and mathematicians, including John von Neumann; he also had chance meetings with Albert Einstein and Kurt Gödel. Shannon moved freely among a variety of disciplines, and this ability may have contributed to the later development of his mathematical theory of information.

Awards and prizes

  • Alfred Noble Prize, AIEE (1940);
  • Morris Liebmann Memorial Prize, IRE (1949);
  • IEEE Medal of Honor (1966);
  • National Medal of Science (1966);
  • Harvey Prize (1972);
  • Kyoto Prize (1985).


In 1985, Claude Shannon and his wife Betty attended the International Symposium on Information Theory in Brighton. Shannon had not attended international conferences for a long time, and at first he was not even recognized. At the banquet he gave a short speech, juggled just three balls, and then signed hundreds and hundreds of autographs for the amazed scientists and engineers who stood in a long line, regarding the great scientist with reverence and comparing him with Sir Isaac Newton.

He developed the first industrial radio-controlled toy, produced in Japan in the 1950s. He also built a device that could solve a Rubik's Cube, a minicomputer for the board game Hex that always beat its opponent, and a mechanical mouse that could find its way out of a maze. He also realized the idea of the comic "Ultimate Machine."

Communication Theory of Secrecy Systems

Shannon's work "Communication Theory of Secrecy Systems" (written in 1945 as a classified report, declassified and published only in 1949) initiated extensive research in the theory of coding and information transmission and, by general agreement, gave cryptography the status of a science. Claude Shannon was the first to study cryptography with a scientific approach. In this paper he defined fundamental concepts of the theory of cryptography without which cryptography is no longer conceivable. An important merit of Shannon's is his study of perfectly secure systems and his proof of their existence, as well as of the existence of cryptographically strong ciphers and the conditions they require. Shannon also formulated the basic requirements for strong ciphers. He introduced the now-familiar concepts of diffusion and confusion, as well as methods for building cryptographically strong encryption systems from simple operations. This paper is the starting point for the study of cryptography as a science.
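The classic example of a perfectly secure system in Shannon's sense is the one-time pad: if the key is truly random, at least as long as the message, and never reused, the ciphertext reveals nothing about the plaintext. A minimal sketch (the variable names and structure are the writer's, not Shannon's):

```python
import secrets

def otp_encrypt(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte of data with the corresponding key byte.
    Perfect secrecy requires the key to be uniformly random, at least as
    long as the message, and used only once."""
    assert len(key) >= len(data), "key must cover the whole message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # fresh random key per message
ciphertext = otp_encrypt(message, key)

# Decryption is the same XOR with the same key:
assert otp_encrypt(ciphertext, key) == message
```

Because every possible plaintext of the same length is equally consistent with a given ciphertext under some key, an eavesdropper without the key learns nothing, which is exactly the property Shannon proved.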

The article "A Mathematical Theory of Communication"

  • The Nyquist–Shannon sampling theorem (in the Russian-language literature, Kotelnikov's theorem) concerns the unambiguous reconstruction of a signal from its discrete samples.
  • Shannon's source coding theorem (the noiseless coding theorem) sets the limit of maximal lossless data compression and gives operational meaning to Shannon entropy.
  • The Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a noisy channel of a given bandwidth.
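The entropy that the source coding theorem refers to is easy to compute from symbol frequencies. The following sketch (the writer's illustration, not code from the paper) estimates the entropy of a string in bits per symbol:

```python
import math
from collections import Counter

def shannon_entropy(data: str) -> float:
    """Empirical Shannon entropy H = -sum(p_i * log2(p_i)),
    in bits per symbol, from the symbol frequencies of data."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair two-symbol source carries exactly one bit per symbol:
print(shannon_entropy("0101010101"))   # 1.0
# A constant source carries no information (entropy 0),
# and no lossless code can compress below H bits per symbol on average.
```

By the source coding theorem, H is the threshold: compression below H bits per symbol on average is impossible without loss, and rates arbitrarily close to H are achievable.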

See also

  • Whittaker-Shannon interpolation formula

Literature

  • Shannon C. E. A Mathematical Theory of Communication // Bell System Technical Journal. — 1948. — Vol. 27. — P. 379–423, 623–656.
  • Shannon C. E. Communication in the Presence of Noise // Proc. Institute of Radio Engineers. — Jan. 1949. — Vol. 37. — No. 1. — P. 10–21.
  • Shannon C. E. Works on Information Theory and Cybernetics. — Moscow: Foreign Literature Publishing House, 1963. — 830 p.

