Definition of probability

When assessing the likelihood of a random event, it is very important to understand in advance whether the probability of the event we are interested in depends on how other events unfold.

In the classical scheme, where all outcomes are equally probable, we can already estimate the probability of an individual event of interest on its own, even when that event is a complex combination of several elementary outcomes. But what if several random events occur simultaneously or sequentially? How does this affect the probability of the event we are interested in?

If I roll a die several times hoping for a six and keep being unlucky, does that mean I should raise the stakes because, according to probability theory, I am about to get lucky? Alas, probability theory states nothing of the kind. Dice, cards and coins cannot remember what they showed us last time. It does not matter to them whether this is the first or the tenth time I am testing my luck today. Every time I repeat the roll, I know only one thing: this time, again, the probability of getting a six is one sixth. Of course, this does not mean that the number I need will never come up. It only means that my result after the first throw and after any other throw are independent events.

Events A and B are called independent if the occurrence of one of them does not affect the probability of the other. For example, the probability of hitting a target with the first of two guns does not depend on whether the target was hit by the other gun, so the events "the first gun hit the target" and "the second gun hit the target" are independent.

If two events A and B are independent, and the probability of each of them is known, then the probability of the simultaneous occurrence of both event A and event B (denoted AB) can be calculated using the following theorem.

Probability multiplication theorem for independent events

P(AB) = P(A)·P(B): the probability of the simultaneous occurrence of two independent events is equal to the product of the probabilities of these events.

Example. The probabilities of hitting the target when firing the first and second guns are p1 = 0.7 and p2 = 0.8, respectively. Find the probability that both guns hit the target in one salvo.

Solution: as we have already seen, events A (hit by the first gun) and B (hit by the second gun) are independent, so P(AB) = P(A)·P(B) = p1·p2 = 0.7·0.8 = 0.56.
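The multiplication rule for independent events is easy to check numerically. Below is a minimal Python sketch (the function name and the simulation are illustrative, not part of the original example) that computes the salvo probability and, as a sanity check, estimates it by simulation.

```python
import random

def prob_both_hit(p1, p2):
    """Probability that two independent shots both hit: P(AB) = P(A) * P(B)."""
    return p1 * p2

# Exact value for the example: p1 = 0.7, p2 = 0.8
print(prob_both_hit(0.7, 0.8))  # 0.56

# Simulation-based check (the result fluctuates around 0.56)
trials = 100_000
hits = sum(random.random() < 0.7 and random.random() < 0.8 for _ in range(trials))
print(hits / trials)
```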

Example. What happens to our estimates if the initial events are not independent? Let us change the previous example a little. Two shooters compete at a shooting range, and if one of them shoots accurately, the opponent begins to get nervous and his results worsen. How can this everyday situation be turned into a mathematical problem, and how can we outline ways to solve it? It is intuitively clear that we need to somehow separate the two possible developments, essentially creating two scenarios, two different problems. In the first case, if the opponent missed, the scenario is favorable for the nervous athlete and his accuracy will be higher. In the second case, if the opponent has taken his chance well, the probability of the second athlete hitting the target decreases.

To separate the possible scenarios (often called hypotheses) for how events may develop, we will often use a "probability tree" diagram. This diagram is similar in meaning to the decision tree you have probably already dealt with. Each branch represents a separate scenario, only now it carries its own value, a so-called conditional probability (q1, q2, 1 - q1, 1 - q2). This scheme is very convenient for analyzing sequential random events. It remains to clarify one more important question: where do the initial probability values come from?

How do we obtain probability estimates in real situations? After all, probability theory does not work only with coins and dice. Usually these estimates are taken from statistics, and when statistical information is not available, we conduct our own research. And often we have to begin not with collecting data, but with the question of what information we actually need.

Example. Suppose we need to estimate, for a city of one hundred thousand inhabitants, the market volume for a new product that is not an essential item, for example a balm for the care of colored hair. Consider the "probability tree" diagram: in this case we need to estimate, at least approximately, the probability value on each "branch".

So, our estimates of market capacity:

3) of them, only 10% use balms for colored hair,

4) of them, only 10% can muster the courage to try a new product,

5) 70% of them usually buy everything not from us, but from our competitors.




According to the law of multiplication of probabilities, we determine the probability of the event we are interested in, A = (a city resident buys this new balm from us): P(A) = 0.00045.

Let's multiply this probability value by the number of city residents. As a result, we have only 45 potential customers, and considering that one bottle of this product lasts for several months, the trade is not very lively.
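A minimal Python sketch of this funnel-style calculation is given below. The first branches of the tree come from the diagram, which is not reproduced in the text, so the snippet only illustrates the mechanics: conditional probabilities along one branch are multiplied together and the result is scaled by the population. The overall figure 0.00045 and the population of 100,000 are taken from the example; the function name is illustrative.

```python
def branch_estimate(branch_probs, population):
    """Multiply conditional probabilities along one branch of a probability tree
    and scale the result by the population size."""
    p = 1.0
    for q in branch_probs:
        p *= q
    return p, p * population

# Using the overall branch probability quoted in the example:
p, customers = branch_estimate([0.00045], 100_000)
print(customers)  # 45.0 potential customers

# The listed branches alone (10% use balms, 10% try a new product,
# 30% of those buy from us rather than from competitors):
print(branch_estimate([0.10, 0.10, 0.30], 100_000))
```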

And yet there is some benefit from our assessments.

Firstly, we can compare forecasts for different business ideas; they will have different "forks" in their diagrams, and, of course, the probability values will also be different.

Secondly, as we have already said, a random variable is not called random because it does not depend on anything at all; it is just that its exact value is not known in advance. We know that the average number of buyers can be increased (for example, by advertising the new product). So it makes sense to focus our efforts on those "forks" where the probability distribution does not suit us, that is, on the factors we are able to influence.

Let us look at one more quantitative example of research on purchasing behavior.

Example. On average, 10,000 people visit the food market per day. The probability that a market visitor enters the dairy products pavilion is 1/2.

It is known that this pavilion sells an average of 500 kg of various products per day.

Can we say that the average purchase in the pavilion weighs only 100 g? Discussion.




Of course not. It is clear that not everyone who entered the pavilion ended up buying something there. As shown in the diagram, to answer the question about the average weight of a purchase, we must find the probability that a person entering the pavilion will buy something there. If we do not have such data at our disposal but need it, we will have to obtain it ourselves by observing the pavilion's visitors for some time. Suppose our observations show that only a fifth of the visitors buy something. Once we have this estimate, the task becomes simple: of the 10,000 people who come to the market, 5,000 will enter the dairy products pavilion, and only 1,000 of them will make a purchase. The average purchase weight is then 500 kg / 1,000 = 500 grams. It is interesting to note that to build this chain of reasoning, the logic of conditional "branching" must be defined at each stage just as clearly as if we were working with a "concrete" situation rather than with probabilities.
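A short Python sketch of this calculation under the example's assumptions (10,000 visitors, 1/2 enter the pavilion, 1/5 of those buy, 500 kg sold per day; the function name is illustrative):

```python
def average_purchase_grams(visitors, p_enter, p_buy, total_sold_kg):
    """Average purchase weight = total weight sold / expected number of buyers."""
    buyers = visitors * p_enter * p_buy       # 10_000 * 0.5 * 0.2 = 1_000
    return total_sold_kg * 1000 / buyers      # convert kilograms to grams

print(average_purchase_grams(10_000, 0.5, 0.2, 500))  # 500.0 grams
```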

Self-test tasks

1. Consider an electrical circuit consisting of n series-connected elements, each of which operates independently of the others.




The probability p of failure of each element is known. Determine the probability of proper operation of the entire section of the circuit (event A).

2. The student knows 20 out of 25 exam questions. Find the probability that the student knows the three questions given to him by the examiner.

3. Production consists of four successive stages, each relying on equipment whose probabilities of failure over the next month are p1, p2, p3 and p4, respectively. Find the probability that there will be no production stoppages due to equipment failure during the month.

PROBABILITY as an ontological category reflects the degree of possibility of the emergence of any entity under given conditions. In contrast to the mathematical and logical interpretations of this concept, ontological probability does not bind itself to the requirement of quantitative expression. The meaning of probability is revealed in the context of understanding determinism and the nature of development in general.


PROBABILITY

a concept characterizing the quantitative measure of the possibility of a certain event occurring under certain conditions. In scientific cognition there are three interpretations of probability. The classical concept of probability, which arose from the mathematical analysis of gambling and was most fully developed by B. Pascal, J. Bernoulli and P. Laplace, regards probability as the ratio of the number of favorable cases to the total number of all equally possible cases. For example, when throwing a die with 6 faces, each face can be expected to come up with a probability of 1/6, since no face has an advantage over any other. Such symmetry of experimental outcomes is specially taken into account when organizing games, but is relatively rare in the study of objective events in science and practice. The classical interpretation gave way to the statistical concept of probability, which is based on actually observing the occurrence of a certain event over a long series of trials under precisely fixed conditions. Practice confirms that the more often an event occurs, the greater is the degree of objective possibility of its occurrence, i.e. its probability. The statistical interpretation of probability therefore rests on the concept of relative frequency, which can be determined experimentally. Probability as a theoretical concept never coincides with the empirically determined frequency; however, in many cases it differs little in practice from the relative frequency found as a result of prolonged observation. Many statisticians regard probability as a "double" of the relative frequency determined by the statistical study of the results of observations

or experiments. Less realistic was the definition of probability as the limit of the relative frequencies of mass events, or collectives, proposed by R. von Mises. As a further development of the frequency approach to probability, a dispositional, or propensity, interpretation of probability was put forward (K. Popper, J. Hacking, M. Bunge, T. Settle). According to this interpretation, probability characterizes the property of the generating conditions, for example of an experimental set-up, to produce a sequence of mass random events. It is precisely this disposition that gives rise to physical propensities, or predispositions, i.e. probabilities, which can be tested by means of relative frequencies.

The statistical interpretation of probability dominates scientific cognition because it reflects the specific nature of the regularities inherent in mass phenomena of a random character. In many physical, biological, economic, demographic and other social processes it is necessary to take into account the action of many random factors characterized by a stable frequency. Identifying these stable frequencies and assessing them quantitatively with the help of probability makes it possible to reveal the necessity that asserts itself through the cumulative action of many chance events. This is where the dialectic of the transformation of chance into necessity manifests itself (see F. Engels, in: K. Marx and F. Engels, Works, vol. 20, pp. 535-36).

Logical, or inductive, probability characterizes the relationship between the premises and the conclusion of non-demonstrative and, in particular, inductive reasoning. Unlike deduction, the premises of induction do not guarantee the truth of the conclusion but only make it more or less plausible. This plausibility, given precisely formulated premises, can sometimes be assessed by means of probability. The value of this probability is most often determined by comparative concepts (greater than, less than, or equal to), and sometimes numerically. The logical interpretation is often used to analyze inductive reasoning and to construct various systems of probabilistic logic (R. Carnap, R. Jeffrey). In the semantics of logical concepts, probability is often defined as the degree to which one statement is confirmed by others (for example, a hypothesis by its empirical data).

In connection with the development of theories of decision making and games, the so-called personalistic interpretation of probability has become widespread. Although probability here expresses the subject's degree of belief in the occurrence of a certain event, the probabilities themselves must be chosen so that the axioms of the probability calculus are satisfied. Therefore, probability in this interpretation expresses not so much subjective as reasonable belief. Consequently, decisions made on the basis of such probabilities will be rational, because they do not depend on the psychological characteristics and inclinations of the subject.

From the epistemological point of view, the difference between the statistical, logical and personalistic interpretations of probability is that while the first characterizes the objective properties and relations of mass phenomena of a random nature, the latter two analyze the features of the subjective, cognizing activity of a human being under conditions of uncertainty.

PROBABILITY

one of the most important concepts of science, characterizing a special systemic vision of the world, its structure, evolution and cognition. The specificity of the probabilistic view of the world is revealed by including among the basic concepts of existence the notions of randomness, independence and hierarchy (the idea of levels in the structure and determination of systems).

Ideas about probability originated in antiquity and referred to the characteristics of our knowledge: probabilistic knowledge was recognized as existing and as differing both from reliable knowledge and from false knowledge. The impact of the idea of probability on scientific thinking and on the development of cognition is directly related to the development of probability theory as a mathematical discipline. The mathematical doctrine of probability originated in the 17th century, when a core of concepts was developed that admit a quantitative (numerical) characterization and express the probabilistic idea.

Intensive application of probability to the development of cognition took place in the second half of the 19th and the first half of the 20th century. Probability entered the structure of such fundamental sciences of nature as classical statistical physics, genetics, quantum theory and cybernetics (information theory). Accordingly, probability personifies the stage in the development of science that is now defined as non-classical science. To reveal the novelty and the features of the probabilistic way of thinking, it is necessary to proceed from an analysis of the subject matter of probability theory and of the foundations of its numerous applications. Probability theory is usually defined as the mathematical discipline that studies the regularities of mass random phenomena under certain conditions. Randomness means that, within such a mass, the existence of each elementary phenomenon does not depend on and is not determined by the existence of the other phenomena. At the same time, the mass of phenomena itself has a stable structure and contains certain regularities. A mass phenomenon is divided fairly strictly into subsystems, and the relative number of elementary phenomena in each of the subsystems (the relative frequency) is very stable. This stability is compared with probability. A mass phenomenon as a whole is characterized by a probability distribution, that is, by specifying the subsystems and their corresponding probabilities. The language of probability theory is the language of probability distributions. Accordingly, probability theory is defined as the abstract science of operating with distributions.

Probability gave rise in science to ideas about statistical regularities and statistical systems. The latter are systems formed from independent or quasi-independent entities whose structure is characterized by probability distributions. But how is it possible to form systems from independent entities? It is usually assumed that, for systems with integral characteristics to form, sufficiently stable bonds must exist between their elements to cement the systems. The stability of statistical systems is provided by the presence of external conditions, the external environment, external rather than internal forces. The very definition of probability is always based on specifying the conditions for the formation of the initial mass phenomenon. Another important idea characterizing the probabilistic paradigm is the idea of hierarchy (subordination). This idea expresses the relation between the characteristics of individual elements and the integral characteristics of systems: the latter are, as it were, built on top of the former.

The importance of probabilistic methods in cognition lies in the fact that they make it possible to study and theoretically express the patterns of structure and behavior of objects and systems that have a hierarchical, “two-level” structure.

The analysis of the nature of probability is based on its frequency, statistical interpretation. At the same time, for a very long time an understanding of probability prevailed in science that was called logical, or inductive, probability. Logical probability is concerned with questions of the validity of a separate, individual judgment under certain conditions. Is it possible to assess the degree of confirmation (reliability, truth) of an inductive (hypothetical) conclusion in quantitative form? In the course of the development of probability theory such questions were repeatedly discussed, and people began to speak of degrees of confirmation of hypothetical conclusions. This measure of probability is determined by the information available to a given person, his experience, his views of the world and his psychological mindset. In all such cases the magnitude of the probability does not lend itself to strict measurement and lies practically outside the competence of probability theory as a consistent mathematical discipline.

The objective, frequency interpretation of probability established itself in science with considerable difficulty. Initially, the understanding of the nature of probability was strongly influenced by the philosophical and methodological views characteristic of classical science. Historically, the development of probabilistic methods in physics took place under the determining influence of the ideas of mechanics: statistical systems were interpreted as simply mechanical. Since the corresponding problems were not solved by the strict methods of mechanics, assertions arose that the recourse to probabilistic methods and statistical laws is the result of the incompleteness of our knowledge. In the history of classical statistical physics, numerous attempts were made to substantiate it on the basis of classical mechanics, but they all failed. The foundation of probability is that it expresses the structural features of a certain class of systems other than mechanical ones: the state of the elements of these systems is characterized by instability and by a special nature of interactions not reducible to mechanics.

The entry of probability into cognition leads to the rejection of the concept of rigid determinism and of the basic model of being and cognition developed in the process of the formation of classical science. The basic models represented by statistical theories are of a different, more general character: they include the ideas of randomness and independence. The idea of probability is connected with the disclosure of the internal dynamics of objects and systems, which cannot be fully determined by external conditions and circumstances.

The concept of a probabilistic vision of the world, based on the absolutization of the idea of independence (just as the earlier paradigm was based on rigid determination), has now revealed its limitations, which is felt most strongly in the transition of modern science to analytical methods for studying complex systems and to the physical and mathematical foundations of self-organization phenomena.


Probability - a number between 0 and 1 that reflects the chance that a random event will occur, where 0 means the complete absence of any chance that the event will occur, and 1 means that the event in question will definitely occur.

The probability of event E is a number from 0 to 1.
The sum of the probabilities of mutually exclusive events that together cover all possibilities is equal to 1.

Empirical probability - probability calculated as the relative frequency of an event in the past, derived from the analysis of historical data.

The probability of very rare events cannot be calculated empirically.

Subjective probability - probability based on a personal subjective assessment of an event without regard to historical data. Investors who make decisions to buy and sell shares often act on considerations of subjective probability.

Prior probability - the initial probability of an event, assessed before any additional information is taken into account.

Odds ("the chance is 1 in ...") express the likelihood that an event will occur in terms of its probability. The odds in favor of an event are expressed through its probability P as P/(1-P).

For example, if the probability of an event is 0.5, the odds of the event are 1 to 1, since 0.5/(1-0.5) = 1.

The odds that an event will not occur are calculated using the formula (1-P)/P.
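A small Python sketch of these conversions (the function names are illustrative):

```python
def odds_for(p):
    """Odds in favor of an event with probability p: p / (1 - p)."""
    return p / (1 - p)

def odds_against(p):
    """Odds against the event: (1 - p) / p."""
    return (1 - p) / p

print(odds_for(0.5))      # 1.0 -> odds of 1 to 1
print(odds_against(0.2))  # 4.0 -> odds of 4 to 1 against
```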

Inconsistent probability - for example, the price of company A's shares takes a possible event E into account with a probability of 85%, while the price of company B's shares reflects it with a probability of only 50%. This is called inconsistent probability. According to the Dutch book theorem, inconsistent probabilities create profit opportunities.

Unconditional probability is the answer to the question “What is the probability that the event will occur?”

Conditional probability is the answer to the question "What is the probability of event A given that event B has occurred?" It is denoted P(A|B).

Joint probability - the probability that events A and B will occur simultaneously. It is denoted P(AB).

P(A|B) = P(AB)/P(B) (1)

P(AB) = P(A|B)*P(B)
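A minimal Python sketch relating joint and conditional probability according to formula (1); the numeric values are assumed for illustration:

```python
def conditional(p_ab, p_b):
    """P(A|B) = P(AB) / P(B), defined when P(B) > 0."""
    return p_ab / p_b

def joint(p_a_given_b, p_b):
    """P(AB) = P(A|B) * P(B)."""
    return p_a_given_b * p_b

# Assumed values: P(AB) = 0.12, P(B) = 0.4
print(conditional(0.12, 0.4))  # 0.3
print(joint(0.3, 0.4))         # 0.12
```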

Rule for summing up probabilities:

The probability that either event A or event B will happen is

P (A or B) = P(A) + P(B) - P(AB) (2)

If events A and B are mutually exclusive, then

P (A or B) = P(A) + P(B)

Independent events - events A and B are independent if

P(A|B) = P(A), P(B|A) = P(B)

That is, in a sequence of such events the probability stays the same from one trial to the next, because the occurrence of one event does not change the probability of the other.
A coin toss is an example of such an event - the result of each subsequent toss does not depend on the result of the previous one.

Dependent events - events in which the probability of the occurrence of one depends on the occurrence of the other.

The rule for multiplying the probabilities of independent events:
If events A and B are independent, then

P(AB) = P(A) * P(B) (3)

Total probability rule:

P(A) = P(AS) + P(AS') = P(A|S)P(S) + P(A|S')P(S') (4)

S and S' are mutually exclusive events that together cover all possibilities (S' is the complement of S).

Expected value of a random variable is the probability-weighted average of the possible outcomes of the random variable. For a random variable X, the expected value is denoted E(X).

Suppose we have 5 values of mutually exclusive events, each with a certain probability (for example, the company's income was such-and-such an amount with such-and-such a probability). The expected value is the sum of all outcomes multiplied by their probabilities:
E(X) = x1·p1 + x2·p2 + x3·p3 + x4·p4 + x5·p5 (5)

The variance (dispersion) of a random variable is the expected value of the squared deviations of the random variable from its expected value:

σ² = E[(X - E(X))²] (6)

Conditional expected value is the expected value of a random variable X, provided that the event S has already occurred.
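A short Python sketch computing an expected value and a variance from a discrete distribution; the five outcome/probability pairs are made-up illustrative numbers:

```python
# Illustrative discrete distribution: five mutually exclusive outcomes.
outcomes      = [100, 200, 300, 400, 500]   # e.g. possible values of company income
probabilities = [0.1, 0.2, 0.4, 0.2, 0.1]   # must sum to 1

expected = sum(x * p for x, p in zip(outcomes, probabilities))   # E(X), formula (5)
variance = sum((x - expected) ** 2 * p                           # E[(X - E(X))^2], formula (6)
               for x, p in zip(outcomes, probabilities))

print(expected)  # 300.0
print(variance)  # 12000.0
```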

The probability of an event is the ratio of the number of elementary outcomes favorable to this event to the number of all equally possible outcomes of the experiment in which this event may occur. The probability of event A is denoted by P(A) (here P is the first letter of the French word probabilité). By definition
P(A) = m/n, (1.2.1)
where m is the number of elementary outcomes favorable to event A and n is the number of all equally possible elementary outcomes of the experiment, which form a complete group of events.
This definition of probability is called classical. It arose at the initial stage of the development of probability theory.

The probability of an event has the following properties:
1. The probability of a certain event is equal to one. Denoting a certain event by Ω, we have m = n for it, therefore
P(Ω) = 1. (1.2.2)
2. The probability of an impossible event is equal to zero. Denoting an impossible event by ∅, we have m = 0 for it, therefore
P(∅) = 0. (1.2.3)
3. The probability of a random event is expressed by a positive number less than one. Since for a random event the inequalities 0 < m < n, or 0 < m/n < 1, hold,
0 < P(A) < 1. (1.2.4)
4. The probability of any event satisfies the inequalities
0 ≤ P(A) ≤ 1. (1.2.5)
This follows from relations (1.2.2)-(1.2.4).

Example 1. An urn contains 10 balls of equal size and weight, of which 4 are red and 6 are blue. One ball is drawn from the urn. What is the probability that the drawn ball will be blue?

Solution. We denote the event "the drawn ball turned out to be blue" by the letter A. This trial has 10 equally possible elementary outcomes, of which 6 favor event A. In accordance with formula (1.2.1), we obtain P(A) = 6/10 = 0.6.

Example 2. All natural numbers from 1 to 30 are written on identical cards and placed in an urn. After thoroughly shuffling the cards, one card is removed from the urn. What is the probability that the number on the card taken is a multiple of 5?

Solution. Let us denote by A the event "the number on the card taken is a multiple of 5". In this trial there are 30 equally possible elementary outcomes, of which 6 favor event A (the numbers 5, 10, 15, 20, 25, 30). Hence, P(A) = 6/30 = 0.2.

Example 3. Two dice are tossed and the total number of points on the upper faces is calculated. Find the probability of event B, that the upper faces of the dice show a total of 9 points.

Solution. In this trial there are 6² = 36 equally possible elementary outcomes. Event B is favored by 4 outcomes: (3;6), (4;5), (5;4), (6;3), therefore P(B) = 4/36 = 1/9.

Example 4. A natural number not exceeding 10 is selected at random. What is the probability that this number is prime?

Solution. Let us denote by the letter C the event "the chosen number is prime". In this case n = 10 and m = 4 (the primes 2, 3, 5, 7). Therefore, the required probability is P(C) = 4/10 = 0.4.

Example 5. Two symmetrical coins are tossed. What is the probability that there are numbers on the top sides of both coins?

Solution. Let us denote by the letter D the event "there is a number on the upper side of each coin". In this trial there are 4 equally possible elementary outcomes: (G, G), (G, C), (C, G), (C, C). (The notation (G, C) means that the first coin shows the coat of arms and the second shows a number.) Event D is favored by one elementary outcome, (C, C). Since m = 1 and n = 4, P(D) = 1/4 = 0.25.

Example 6. What is the probability that a two-digit number chosen at random has the same digits?

Solution. Two-digit numbers are the numbers from 10 to 99; there are 90 such numbers in total. Nine of them have identical digits (the numbers 11, 22, 33, 44, 55, 66, 77, 88, 99). Since in this case m = 9 and n = 90,
P(A) = 9/90 = 0.1,
where A is the event "the number has identical digits".

Example 7. One letter is chosen at random from the letters of the word "differential". What is the probability that this letter is: a) a vowel, b) a consonant, c) the letter "h"?

Solution. The word "differential" has 12 letters, of which 5 are vowels and 7 are consonants; the letter "h" does not occur in this word. Let us denote the events: A - "a vowel is chosen", B - "a consonant is chosen", C - "the letter 'h' is chosen". The numbers of favorable elementary outcomes are m = 5 for event A, m = 7 for event B, and m = 0 for event C. Since n = 12,
P(A) = 5/12, P(B) = 7/12, and P(C) = 0.

Example 8. Two dice are tossed and the number of points on the upper face of each die is noted. Find the probability that both dice show the same number of points.

Solution. Let us denote this event by the letter A. Event A is favored by 6 elementary outcomes: (1;1), (2;2), (3;3), (4;4), (5;5), (6;6). The total number of equally possible elementary outcomes forming a complete group of events is n = 6² = 36. Hence, the required probability is P(A) = 6/36 = 1/6.

Example 9. A book has 300 pages. What is the probability that a page opened at random has a serial number that is a multiple of 5?

Solution. From the conditions of the problem it follows that the number of all equally possible elementary outcomes forming a complete group of events is n = 300. Of these, m = 60 favor the occurrence of the specified event. Indeed, a number that is a multiple of 5 has the form 5k, where k is a natural number, and 5k ≤ 300, whence k ≤ 60. Hence,
P(A) = 60/300 = 0.2,
where A is the event "the page has a serial number that is a multiple of 5".

Example 10. Two dice are tossed and the sum of points on the top faces is calculated. What is more likely - getting a total of 7 or 8?

Solution. Let us denote the events: A - "a total of 7 points is rolled", B - "a total of 8 points is rolled". Event A is favored by 6 elementary outcomes: (1;6), (2;5), (3;4), (4;3), (5;2), (6;1), and event B is favored by 5 outcomes: (2;6), (3;5), (4;4), (5;3), (6;2). The number of all equally possible elementary outcomes is n = 6² = 36. Hence, P(A) = 6/36 = 1/6 and P(B) = 5/36.

So, P(A)>P(B), that is, getting a total of 7 points is a more likely event than getting a total of 8 points.
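The counting in examples of this kind can be verified by enumerating all outcomes. A small Python sketch using itertools; the totals 7 and 8 reproduce Example 10:

```python
from itertools import product

def prob_of_sum(target):
    """Classical probability that two dice show the given total."""
    outcomes = list(product(range(1, 7), repeat=2))      # all 36 equally possible pairs
    favorable = [o for o in outcomes if sum(o) == target]
    return len(favorable) / len(outcomes)

print(prob_of_sum(7))  # 6/36 ≈ 0.1667
print(prob_of_sum(8))  # 5/36 ≈ 0.1389
```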

Tasks

1. A natural number not exceeding 30 is chosen at random. What is the probability that this number is a multiple of 3?
2. An urn contains a red and b blue balls, identical in size and weight. What is the probability that a ball drawn at random from this urn will be blue?
3. A number not exceeding 30 is chosen at random. What is the probability that this number is a divisor of 30?
4. An urn contains a blue and b red balls, identical in size and weight. One ball is taken from this urn and set aside. This ball turned out to be red. After this, another ball is drawn from the urn. Find the probability that the second ball is also red.
5. A natural number not exceeding 50 is chosen at random. What is the probability that this number is prime?
6. Three dice are tossed and the sum of points on the top faces is calculated. What is more likely - to get a total of 9 or 10 points?
7. Three dice are tossed and the sum of the points rolled is calculated. What is more likely - to get a total of 11 (event A) or 12 points (event B)?

Answers

1. 1/3. 2. b/(a+b). 3. 0.2. 4. (b-1)/(a+b-1). 5. 0.3. 6. p1 = 25/216 is the probability of getting a total of 9 points, p2 = 27/216 is the probability of getting a total of 10 points; p2 > p1. 7. P(A) = 27/216, P(B) = 25/216, P(A) > P(B).

Questions

1. What is called the probability of an event?
2. What is the probability of a certain event?
3. What is the probability of an impossible event?
4. What are the limits of the probability of a random event?
5. What are the limits of the probability of any event?
6. What definition of probability is called classical?

When a coin is tossed, you can say that it will land heads up with a probability of 1/2. Of course, this does not mean that if a coin is tossed 10 times it will necessarily land heads 5 times. If the coin is "fair" and is tossed many times, then heads will come up very close to half the time. Thus, there are two types of probability: experimental and theoretical.

Experimental and theoretical probability

If we toss a coin a large number of times - say 1,000 - and count the number of times heads comes up, we can determine the probability of heads. If heads comes up 503 times, we can calculate the probability of it landing heads:
503/1000, or 0.503.

This is an experimental determination of probability. Such a definition of probability comes from observation and study of data; it is quite common and very useful. Here, for example, are some probabilities that were determined experimentally:

1. The probability that a woman will develop breast cancer is 1/11.

2. If you kiss someone who has a cold, then the probability that you will also get a cold is 0.07.

3. A person who has just been released from prison has an 80% chance of returning to prison.

If we consider tossing a coin and take into account that it is just as likely to come up heads as tails, we can calculate the probability of getting heads: 1/2. This is a theoretical determination of probability. Here are some other probabilities that have been determined theoretically, using mathematics:

1. If there are 30 people in a room, the probability that two of them have the same birthday (excluding year) is 0.706.

2. During a trip, you meet someone, and during the conversation you discover that you have a mutual friend. Typical reaction: “This can’t be!” In fact, this phrase is not suitable, because the probability of such an event is quite high - just over 22%.

Thus, experimental probabilities are determined through observation and data collection, while theoretical probabilities are determined through mathematical reasoning. Examples of experimental and theoretical probabilities, such as those discussed above, and especially those we do not expect, lead us to the importance of studying probability. You may ask, "What is the true probability?" In fact, there is no such thing. Probabilities can be determined experimentally within certain limits. They may or may not coincide with the probabilities we obtain theoretically. There are situations in which it is much easier to determine one type of probability than the other. For example, the probability of catching a cold would have to be found experimentally rather than theoretically.

Calculation of experimental probabilities

Let us first consider the experimental determination of probabilities. The basic principle we use to calculate such probabilities is as follows.

Principle P (experimental)

If a situation or event E occurs m times in n observations of an experiment, then the experimental probability of the event is said to be P(E) = m/n.

Example 1 Sociological survey. An experimental study was conducted to determine the number of left-handed people, right-handed people, and people in whom both hands are equally developed. The results are shown in the graph.

a) Determine the probability that the person is right-handed.

b) Determine the probability that the person is left-handed.

c) Determine the probability that a person is equally fluent in both hands.

d) Most Professional Bowling Association tournaments are limited to 120 players. Based on the data from this experiment, how many players could be left-handed?

Solution

a) The number of people who are right-handed is 82, the number of left-handers is 17, and the number of those who are equally fluent in both hands is 1. Total observations - 100. Thus, the probability that a person is right-handed is P
P = 82/100, or 0.82, or 82%.

b) The probability that a person is left-handed is P, where
P = 17/100, or 0.17, or 17%.

c) The probability that a person is equally fluent in both hands is P, where
P = 1/100, or 0.01, or 1%.

d) There are 120 bowlers, and from (b) we can expect 17% of them to be left-handed. From this,
17% of 120 = 0.17 · 120 = 20.4,
that is, we can expect about 20 players to be left-handed.
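A small Python sketch of these calculations, using the survey counts from Example 1 (the variable names are illustrative):

```python
counts = {"right-handed": 82, "left-handed": 17, "ambidextrous": 1}
n = sum(counts.values())  # 100 observations in total

# Experimental probabilities P(E) = m / n
probs = {group: m / n for group, m in counts.items()}
print(probs)  # {'right-handed': 0.82, 'left-handed': 0.17, 'ambidextrous': 0.01}

# Expected number of left-handers among 120 bowlers
print(probs["left-handed"] * 120)  # 20.4, i.e. about 20 players
```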

Example 2 Quality control. It is very important for a manufacturer to keep the quality of its products at a high level. In fact, companies hire quality control inspectors to ensure this. The goal is to produce the minimum possible number of defective products. But since a company produces thousands of items every day, it cannot afford to test every item to determine whether it is defective. To find out what percentage of its products are defective, the company tests far fewer items.
The US Department of Agriculture requires that 80% of the seeds sold by growers germinate. To determine the quality of the seeds an agricultural company produces, 500 of the seeds it produced are planted. It was then counted that 417 of the seeds sprouted.

a) What is the probability that the seed will germinate?

b) Do the seeds meet government standards?

Solution a) We know that of the 500 seeds that were planted, 417 sprouted. The probability of a seed germinating is P, where
P = 417/500 = 0.834, or 83.4%.

b) Since the percentage of germinated seeds, 83.4%, exceeds the required 80%, the seeds meet the government standard.

Example 3 Television ratings. According to statistics, there are 105,500,000 households with televisions in the United States. Every week, information on program viewing is collected and processed. In one week, 7,815,000 households tuned in to the hit comedy series "Everybody Loves Raymond" on CBS and 8,302,000 households tuned in to the hit series "Law & Order" on NBC (Source: Nielsen Media Research). What is the probability that the TV in a given household was tuned to "Everybody Loves Raymond" during that week? To "Law & Order"?

Solution The probability that the TV in one household is tuned to "Everybody Loves Raymond" is P, and
P = 7,815,000/105,500,000 ≈ 0.074 ≈ 7.4%.
The chance that a household's TV was tuned to Law & Order is P, and
P = 8,302,000/105,500,000 ≈ 0.079 ≈ 7.9%.
These percentages are called ratings.

Theoretical probability

Suppose we are conducting an experiment, such as tossing a coin, throwing darts, drawing a card from a deck, or testing products for quality on an assembly line. Each possible result of such an experiment is called an outcome. The set of all possible outcomes is called the outcome space. An event is a set of outcomes, that is, a subset of the outcome space.

Example 4 Throwing darts. Suppose that in a dart throwing experiment, a dart hits a target. Find each of the following:

a) The outcomes

b) The outcome space

Solution
a) The outcomes are: hitting black (B), hitting red (R), and hitting white (W).

b) The outcome space is {hitting black, hitting red, hitting white}, which can be written simply as {B, R, W}.

Example 5 Throwing dice. A die is a cube with six sides, each with one to six dots on it.


Suppose we are throwing a die. Find
a) Outcomes
b) Outcome space

Solution
a) Outcomes: 1, 2, 3, 4, 5, 6.
b) Outcome space: {1, 2, 3, 4, 5, 6}.

We denote the probability that an event E occurs as P(E). For example, “the coin will land on heads” can be denoted by H. Then P(H) represents the probability that the coin will land on heads. When all outcomes of an experiment have the same probability of occurring, they are said to be equally likely. To see the differences between events that are equally likely and events that are not, consider the target shown below.

For target A, the events of hitting black, red and white are equally probable, since the black, red and white sectors are the same. However, for target B, the zones with these colors are not the same, that is, hitting them is not equally probable.

Principle P (Theoretical)

If an event E can occur in m ways out of n possible equally probable outcomes from the outcome space S, then the theoretical probability of the event, P(E), is
P(E) = m/n.

Example 6 What is the probability of getting a 3 when rolling a die?

Solution There are 6 equally probable outcomes on a die and only one way of rolling a 3. The probability P is therefore P(3) = 1/6.

Example 7 What is the probability of rolling an even number on a die?

Solution The event is the throwing of an even number. This can happen in 3 ways (if you roll a 2, 4 or 6). The number of equally probable outcomes is 6. Then the probability P(even) = 3/6, or 1/2.

We will use a number of examples involving a standard 52 card deck. This deck consists of the cards shown in the figure below.

Example 8 What is the probability of drawing an Ace from a well-shuffled deck of cards?

Solution There are 52 outcomes (the number of cards in the deck), they are equally likely (if the deck is well shuffled), and there are 4 ways to draw an Ace, so according to the P principle, the probability
P(draw an ace) = 4/52, or 1/13.

Example 9 Suppose we choose, without looking, one ball from a bag with 3 red balls and 4 green balls. What is the probability of choosing a red ball?

Solution There are 7 equally probable outcomes of drawing any ball, and since the number of ways to draw a red ball is 3, we get
P(red ball selection) = 3/7.

The following statements are results from Principle P.

Properties of Probability

a) If event E cannot happen, then P(E) = 0.
b) If event E is certain to happen then P(E) = 1.
c) The probability that event E will occur is a number from 0 to 1: 0 ≤ P(E) ≤ 1.

For example, in a coin toss, the event that the coin lands on its edge has probability zero, while the event that the coin lands either heads or tails has probability 1.

Example 10 Suppose that 2 cards are drawn from a 52-card deck. What is the probability that both of them are spades?

Solution The number n of ways to draw 2 cards from a well-shuffled deck of 52 cards is 52C2. Since 13 of the 52 cards are spades, the number of ways m to draw 2 spades is 13C2. Then,
P(drawing 2 spades) = m/n = 13C2 / 52C2 = 78/1326 = 1/17.

Example 11 Suppose 3 people are randomly selected from a group of 6 men and 4 women. What is the probability that 1 man and 2 women will be selected?

Solution The number of ways to select 3 people from a group of 10 people is 10C3. One man can be chosen in 6C1 ways, and 2 women can be chosen in 4C2 ways. According to the fundamental counting principle, the number of ways to choose 1 man and 2 women is 6C1 · 4C2. Then the probability that 1 man and 2 women will be selected is
P = 6C1 · 4C2 / 10C3 = 3/10.
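The combination counts in Examples 10 and 11 can be checked with Python's math.comb; a short sketch:

```python
from math import comb
from fractions import Fraction

# Example 10: both of 2 cards drawn from 52 are spades
print(Fraction(comb(13, 2), comb(52, 2)))               # 1/17

# Example 11: 1 man and 2 women chosen from 6 men and 4 women
print(Fraction(comb(6, 1) * comb(4, 2), comb(10, 3)))   # 3/10
```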

Example 12 Throwing dice. What is the probability of rolling a total of 8 on two dice?

Solution Each die has 6 possible outcomes, and the outcomes are paired, so there are 6 · 6, or 36, possible ways in which the numbers on the two dice can come up. (It helps if the dice are distinguishable, say one red and one blue; this makes it easier to visualize the result.)

The pairs of numbers that add up to 8 are shown in the figure below. There are 5 possible ways of obtaining a sum of 8, hence the probability is 5/36.


