CTU FEE Moodle
Probability and Statistics
B242 - Summer 2024/2025
This is a grouped Moodle course. It consists of several separate courses that share learning materials, assignments, tests, etc. Below is information about the individual courses that make up this Moodle course.
Probability and Statistics - B0B01PST
Main course
Credits | 7 |
Semesters | Winter |
Completion | Assessment + Examination |
Language of teaching | Czech |
Extent of teaching | 4P+2S |
Annotation
The course covers the fundamentals of probability and mathematical statistics. The introductory part focuses on classical probability, including conditional probability. The next part is devoted to the theory of random variables and their distributions, examples of the most important types of discrete and continuous distributions, numerical characteristics of random variables, their independence, and sums and transformations. Finally, this probabilistic background is applied to statistical methods for estimating distribution parameters and testing hypotheses.
Study targets
Students will become familiar with the basic probability models and statistical methods used in practice to analyze data arising from random events.
Course outlines
1. Random events, the definition of probability, basic probability spaces.
2. Conditional probability, Bayes' theorem (a worked example follows this outline).
3. Random variables, the distribution function, the probability density function.
4. Characteristics of a random variable: expected value, variance.
5. Basic discrete probability distributions.
6. Basic continuous probability distributions.
7. Independence of random variables.
8. Transformations and sums of random variables.
9. Random vectors, the probability distribution of a random vector.
10. Characteristics of a random vector: the vector of expected values, the correlation matrix.
11. Descriptive statistics.
12. Point estimation of a parameter, the maximum-likelihood method.
13. Interval estimation of a parameter.
14. Fundamentals of hypothesis testing.
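To make item 2 of the outline concrete, here is a minimal worked example of Bayes' theorem in Python. The diagnostic-test numbers (prevalence, sensitivity, specificity) are invented for the illustration and are not part of the course materials.

```python
# Hypothetical diagnostic-test example of Bayes' theorem.
# All probabilities below are invented for illustration.

prevalence = 0.01      # P(disease)
sensitivity = 0.95     # P(positive | disease)
specificity = 0.90     # P(negative | no disease)

# Law of total probability: P(positive)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' theorem: P(disease | positive)
posterior = sensitivity * prevalence / p_positive
print(f"P(disease | positive) = {posterior:.3f}")   # ~0.088
```

Even with a fairly accurate test, the posterior stays below 10% because the disease is rare; this base-rate effect is the standard motivating example for conditional probability.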
Exercises outlines
1. Random events, the definition of probability, basic probability spaces.
2. Conditional probability, Bayes' theorem.
3. Random variables, the distribution function, the probability density function.
4. Characteristics of a random variable: expected value, variance.
5. Basic discrete probability distributions.
6. Basic continuous probability distributions.
7. Independence of random variables.
8. Transformations and sums of random variables.
9. Random vectors, the probability distribution of a random vector.
10. Characteristics of a random vector: the vector of expected values, the correlation matrix.
11. Descriptive statistics.
12. Point estimation of a parameter, the maximum-likelihood method (a short sketch follows this outline).
13. Interval estimation of a parameter.
14. Fundamentals of hypothesis testing.
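As a companion to items 12 and 13, the following Python sketch estimates a parameter by maximum likelihood and then constructs an interval estimate. The exponential sample, its size, and the 95% confidence level are assumptions made for the illustration.

```python
# Point and interval estimation on simulated data (illustrative setup).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=200)   # true mean = 2.0

# Item 12: for the exponential distribution, maximizing the
# log-likelihood gives the rate estimate lambda_hat = 1 / sample mean.
lam_hat = 1.0 / sample.mean()
print(f"ML estimate of the rate: {lam_hat:.3f}")

# Item 13: a 95% confidence interval for the mean, based on the
# Student t-distribution.
n = sample.size
half_width = stats.t.ppf(0.975, df=n - 1) * sample.std(ddof=1) / np.sqrt(n)
print(f"95% CI for the mean: ({sample.mean() - half_width:.3f}, "
      f"{sample.mean() + half_width:.3f})")
```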
Literature
- M. Navara: Pravděpodobnost a matematická statistika. ČVUT, Praha 2007.
- V. Dupač, M. Hušková: Pravděpodobnost a matematická statistika. Karolinum, Praha 1999.
Requirements
Computation of basic integrals.
Probability, Statistics, and Theory of Information - A0B01PSI
Credits | 6 |
Semesters | Winter |
Completion | Assessment + Examination |
Language of teaching | Czech |
Extent of teaching | 4+2 |
Annotation
Basics of probability theory, mathematical statistics, information theory, and coding. Includes descriptions of probability, random variables and their distributions, and characteristics of and operations with random variables. Basics of mathematical statistics: point and interval estimates, methods of parameter estimation and hypothesis testing, the least-squares method. Basic notions and results of the theory of Markov chains. Shannon entropy, mutual and conditional information.
Study targets
Basics of probability theory and their application in statistical estimates and tests.
The use of Markov chains in modeling.
Basic notions of information theory.
Course outlines
1. Basic notions of probability theory. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
2. Random variables and their description. Random vector. Probability distribution function.
3. Quantile function. Mixture of random variables.
4. Characteristics of random variables and their properties. Operations with random variables. Basic types of distributions.
5. Characteristics of random vectors. Covariance, correlation. Chebyshev inequality. Law of large numbers. Central limit theorem.
6. Basic notions of statistics. Sample mean, sample variance. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood. EM algorithm.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains (a numerical sketch follows this outline).
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains. Overview of applications.
12. Shannon entropy. Entropy rate of a stationary information source.
13. Fundamentals of coding. Kraft inequality. Huffman coding.
14. Mutual information, capacity of an information channel.
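The sketch below illustrates items 9-11 numerically: a three-state Markov chain, its stationary distribution, and the convergence of transition-matrix powers. The transition matrix is invented for the example.

```python
# Stationary distribution of a small Markov chain (illustrative matrix).
import numpy as np

P = np.array([[0.5, 0.3, 0.2],    # P[i, j] = P(next = j | now = i);
              [0.2, 0.6, 0.2],    # each row sums to 1
              [0.1, 0.4, 0.5]])

# The stationary distribution pi solves pi @ P = pi: take the left
# eigenvector of P for eigenvalue 1 and normalize it to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi /= pi.sum()
print("stationary distribution:", np.round(pi, 4))

# Item 11 (asymptotics): for an irreducible aperiodic chain, powers
# of P converge to a matrix whose rows all equal pi.
print(np.round(np.linalg.matrix_power(P, 50), 4))
```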
Exercises outlines
1. Elementary probability.
2. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
3. Mixture of random variables. Mean. Unary operations with random variables.
4. Dispersion (variance). Random vector, joint distribution. Binary operations with random variables.
5. Sample mean, sample variance. Chebyshev inequality. Central limit theorem.
6. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains.
12. Shannon entropy. Entropy rate of a stationary information source.
13. Fundamentals of coding. Kraft inequality. Huffman coding (a sketch follows this outline).
14. Mutual information, capacity of an information channel.
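To accompany items 12 and 13 of the exercises, here is a compact Python sketch that computes the Shannon entropy of a source and builds a Huffman code for it; the symbol probabilities are invented for the illustration.

```python
# Shannon entropy and Huffman coding for a toy source (illustrative probabilities).
import heapq
import math

probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}

# Shannon entropy H(X) = -sum p log2 p: a lower bound on the expected
# codeword length of any prefix code for this source.
H = -sum(p * math.log2(p) for p in probs.values())

def huffman(probs):
    """Return a prefix code {symbol: bitstring} via Huffman's algorithm."""
    # Heap entries: (probability, tiebreak index, {symbol: code so far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least probable
        p2, i, c2 = heapq.heappop(heap)   # subtrees, prefixing 0 and 1
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, i, merged))
    return heap[0][2]

code = huffman(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(f"H = {H:.3f} bits, average code length = {avg_len:.2f} bits")
print(code)
```

For these probabilities the average length (1.75 bits) is within one bit of the entropy (about 1.74 bits), as the source-coding theorem guarantees.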
Literature
[1] Papoulis, A.: Probability and Statistics, Prentice-Hall, 1990.
[2] Stewart W.J.: Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling. Princeton University Press 2009.
[3] David J.C. MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.
Requirements
Linear Algebra, Calculus, Discrete Mathematics
Probability, Statistics and Information Theory - A8B01PSI
Credits | 6 |
Semesters | Winter |
Completion | Assessment + Examination |
Language of teaching | Czech |
Extent of teaching | 4P+2S |
Annotation
Basics of probability theory, mathematical statistics, information theory, and coding. Includes descriptions of probability, random variables and their distributions, and characteristics of and operations with random variables. Basics of mathematical statistics: point and interval estimates, methods of parameter estimation and hypothesis testing, the least-squares method. Basic notions and results of the theory of Markov chains. Shannon entropy, mutual and conditional information.
Study targets
Basics of probability theory and their application in statistical estimates and tests.
The use of Markov chains in modeling.
Basic notions of information theory.
Course outlines
1. Basic notions of probability theory. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
2. Random variables and their description. Random vector. Probability distribution function.
3. Quantile function. Mixture of random variables.
4. Characteristics of random variables and their properties. Operations with random variables. Basic types of distributions.
5. Characteristics of random vectors. Covariance, correlation. Chebyshev inequality. Law of large numbers. Central limit theorem (a numerical check follows this outline).
6. Basic notions of statistics. Sample mean, sample variance. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood. EM algorithm.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains. Overview of applications.
12. Shannon entropy. Entropy rate of a stationary information source.
13. Fundamentals of coding. Kraft inequality. Huffman coding.
14. Mutual information, capacity of an information channel.
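As a numerical companion to item 5, the sketch below compares the distribution-free Chebyshev bound with the tail frequency actually observed for the mean of simulated dice rolls, which the central limit theorem predicts; the sample size and number of trials are assumptions of the example.

```python
# Chebyshev bound vs. observed tail frequency for a sample mean.
import numpy as np

rng = np.random.default_rng(1)
n, trials = 100, 20_000
rolls = rng.integers(1, 7, size=(trials, n))   # fair six-sided die, values 1..6
means = rolls.mean(axis=1)

mu = 3.5                      # mean of one roll
sigma = np.sqrt(35 / 12)      # standard deviation of one roll
k = 2.0

# Chebyshev: P(|Xbar - mu| >= k * sd(Xbar)) <= 1/k^2, sd(Xbar) = sigma/sqrt(n).
bound = 1 / k**2
observed = np.mean(np.abs(means - mu) >= k * sigma / np.sqrt(n))
print(f"Chebyshev bound: {bound:.3f}, observed frequency: {observed:.3f}")
# By the CLT, Xbar is approximately normal, so the observed frequency is
# close to the normal tail P(|Z| >= 2) ~ 0.0455, far below the bound 0.25.
```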
Exercises outlines
1. Elementary probability.
2. Kolmogorov model of probability. Independence, conditional probability, Bayes formula.
3. Mixture of random variables. Mean. Unary operations with random variables.
4. Dispersion (variance). Random vector, joint distribution. Binary operations with random variables.
5. Sample mean, sample variance. Chebyshev inequality. Central limit theorem.
6. Interval estimates of mean and variance.
7. Method of moments, method of maximum likelihood.
8. Hypothesis testing. Goodness-of-fit tests, tests of correlation, non-parametric tests.
9. Discrete random processes. Stationary processes. Markov chains.
10. Classification of states of Markov chains.
11. Asymptotic properties of Markov chains.
12. Shannon entropy. Entropy rate of a stationary information source.
13. Fundamentals of coding. Kraft inequality. Huffman coding.
14. Mutual information, capacity of an information channel (a sketch follows this outline).
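To accompany item 14 of the exercises, here is a short sketch evaluating the capacity of a binary symmetric channel, C = 1 - h(p), where h is the binary entropy function; the crossover probability is invented for the illustration.

```python
# Capacity of a binary symmetric channel (illustrative crossover probability).
import math

def h(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                 # probability that a transmitted bit is flipped
capacity = 1 - h(p)     # achieved by a uniform input distribution
print(f"BSC capacity at p = {p}: {capacity:.4f} bits per channel use")  # ~0.5310
```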
Literature
[1] Papoulis, A.: Probability and Statistics, Prentice-Hall, 1990.
[2] Stewart W.J.: Probability, Markov Chains, Queues, and Simulation: The Mathematical Basis of Performance Modeling. Princeton University Press 2009.
[3] David J.C. MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.
Requirements
Linear Algebra, Calculus, Discrete Mathematics