This lecture will be a review for most people. I will cover some basic concepts and definitions: interpreting uncertainties, statistical versus systematic uncertainties, the mean and variance of a distribution, and propagation of uncertainties.
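As a quick illustration of propagation of uncertainties, the sketch below (with made-up numbers) applies the standard first-order formula for a product of two independent quantities, where the relative uncertainties add in quadrature:

```python
import math

# Hypothetical example: propagate independent uncertainties through
# f = x * y using the first-order formula
#   (sigma_f / f)^2 = (sigma_x / x)^2 + (sigma_y / y)^2
def product_uncertainty(x, sx, y, sy):
    f = x * y
    rel = math.sqrt((sx / x) ** 2 + (sy / y) ** 2)
    return f, f * rel

# 5% relative uncertainty on each factor
f, sf = product_uncertainty(10.0, 0.5, 4.0, 0.2)
# f = 40.0, and sf/f = sqrt(0.05^2 + 0.05^2) ≈ 7.1%
```

The same recipe, partial-derivative terms added in quadrature, applies to any differentiable function of independent variables.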
I will review some basic ideas of probability, including Bayes' Law. Then I will go through some common and useful probability distributions and their properties: the binomial, Poisson, and Gaussian distributions. The latter two represent different limits of the binomial distribution.
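To make the "different limits" concrete, here is a small numerical sketch (the values of n and p are arbitrary) comparing the binomial distribution to its Poisson limit, reached at large n and small p with mu = n*p held fixed:

```python
import math

def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, mu):
    return mu**k * math.exp(-mu) / math.factorial(k)

# For large n and small p with mu = n*p held fixed, the binomial
# distribution approaches the Poisson distribution.
n, p = 1000, 0.003            # arbitrary illustrative values; mu = 3
mu = n * p
max_diff = max(abs(binomial_pmf(k, n, p) - poisson_pmf(k, mu))
               for k in range(10))
# max_diff is small: the two distributions nearly coincide in this limit
```

(The other limit, the Poisson at large mu approaching a Gaussian of mean mu and variance mu, can be checked the same way.)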
During the first part of the lecture, I will cover estimation of the mean of a distribution and the uncertainty on the mean (the "error on the mean") assuming Gaussian errors. In the second part, I will start to discuss error matrices; this topic will be continued after the break.
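A toy simulation (with arbitrary parameters) illustrating the "error on the mean": the spread of sample means drawn from a Gaussian parent distribution shrinks like sigma/sqrt(N):

```python
import math
import random

# Toy simulation: draw many samples of size N from a Gaussian parent
# distribution and look at the spread of the resulting sample means.
random.seed(1)                      # fixed seed for reproducibility
sigma, N, trials = 2.0, 100, 2000   # arbitrary illustrative values
means = []
for _ in range(trials):
    sample = [random.gauss(0.0, sigma) for _ in range(N)]
    means.append(sum(sample) / N)
m = sum(means) / trials
spread = math.sqrt(sum((mi - m) ** 2 for mi in means) / (trials - 1))
expected = sigma / math.sqrt(N)     # the "error on the mean" = 0.2
# spread comes out close to expected
```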
Homework 2 available
After a brief review of covariance in error propagation (introducing one new quantity, the dimensionless "correlation coefficient"), I will pick up where we left off last class. We will discuss the properties of error ellipses and error matrices for the case where the random variables are correlated. Finally, I will enumerate practical uses of error matrices; we will go through a number of examples next class.
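As a preview of the correlation coefficient, this sketch (with hypothetical data) computes rho = cov(x, y) / (sigma_x * sigma_y), which is dimensionless and lies between -1 and +1:

```python
import math

# Hypothetical, nearly linear data set
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

# sample covariance and standard deviations
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
sx = math.sqrt(sum((a - mx) ** 2 for a in x) / (n - 1))
sy = math.sqrt(sum((b - my) ** 2 for b in y) / (n - 1))

rho = cov / (sx * sy)   # dimensionless correlation coefficient
# rho is close to +1 here, reflecting the strong linear correlation
```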
I will cover several examples of the use of error matrices.
I will cover two useful methods of parameter estimation. The "maximum likelihood" method is quite general and powerful, although it can have some practical drawbacks. The "least squares" method, which can be considered a special case of the likelihood method, works well in many common cases.
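A minimal sketch of the maximum likelihood method, using hypothetical exponential-decay data where the analytic answer is known (the MLE of the lifetime is the sample mean):

```python
import math

# Hypothetical decay times; for an exponential with lifetime tau the
# likelihood is prod (1/tau) * exp(-t_i / tau), so the log-likelihood
# is sum of [-ln(tau) - t_i / tau].
times = [0.5, 1.2, 0.8, 2.0, 1.5]

def log_likelihood(tau):
    return sum(-math.log(tau) - t / tau for t in times)

# crude scan over candidate tau values; in practice one would use a
# numerical optimizer, but a scan shows the idea
best_tau = max((0.01 * i for i in range(1, 500)), key=log_likelihood)
sample_mean = sum(times) / len(times)   # analytic MLE = 1.2
# best_tau agrees with sample_mean
```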
Homework 3 available
I will first discuss how one determines uncertainties on parameters estimated by the maximum likelihood method. Then we will turn our attention to least squares: I will cover the estimation of parameters by a matrix method in the linear case (a least-squares fit to functions linear in the parameters) and the determination of the uncertainties in this case.
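A sketch of the linear least-squares matrix method in its simplest setting: a straight-line fit y = a + b*x with equal uncertainties (hypothetical data). The 2x2 normal equations are solved directly, and the inverse matrix supplies the parameter uncertainties:

```python
import math

# Hypothetical data for a straight-line fit y = a + b*x, all points
# with the same uncertainty sigma
x = [0.0, 1.0, 2.0, 3.0]
y = [1.1, 2.9, 5.1, 7.0]
sigma = 0.1
w = 1.0 / sigma**2                  # common weight 1/sigma^2

# elements of the 2x2 normal-equations matrix and right-hand side
S   = w * len(x)
Sx  = w * sum(x)
Sy  = w * sum(y)
Sxx = w * sum(xi * xi for xi in x)
Sxy = w * sum(xi * yi for xi, yi in zip(x, y))

D = S * Sxx - Sx**2                 # determinant of the 2x2 matrix
a = (Sxx * Sy - Sx * Sxy) / D       # intercept
b = (S * Sxy - Sx * Sy) / D         # slope

# the diagonal of the inverse matrix gives the parameter variances
sa = math.sqrt(Sxx / D)             # uncertainty on the intercept
sb = math.sqrt(S / D)               # uncertainty on the slope
```

The same structure generalizes to any function linear in its parameters: build the design matrix, weight by the inverse covariance of the data, and invert the normal-equations matrix.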
I will first discuss how one determines confidence intervals for measured parameters: this can usually be done using a "Neyman construction". Next I will cover the basics of hypothesis testing using chi-squared.
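A toy example of a chi-squared goodness-of-fit test (with made-up data), comparing measurements of known uncertainty against a hypothesized constant value:

```python
# Hypothetical measurements with known uncertainties, tested against
# the hypothesis that the true value is a constant
y     = [4.8, 5.3, 4.6, 5.1, 5.4]
sigma = [0.3, 0.3, 0.3, 0.3, 0.3]
hypo  = 5.0                          # hypothesized true value

chi2 = sum(((yi - hypo) / si) ** 2 for yi, si in zip(y, sigma))
ndof = len(y)                        # no parameters were fitted here
# chi2 / ndof near 1 indicates the data are consistent with the
# hypothesis; a much larger value would argue against it
```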