Linear Programming Notes

Just like the MITx: 6.041x – Introduction to Probability – The Science of Uncertainty post, this is the kind of post in which I share my notes on a particular subject. This time the subject is Linear Programming (LP), also called Linear Optimization. I publish them in the hope of helping beginners (just like me when I first started 🙂 ) understand the subject by unraveling the logic behind LP. This is only a small part of the notes, and I will share the rest as I sift through them. If anything strange or interesting jumps out at you, please feel free to let me know.

R.I.P Osman Darcan and Barbaros Tansel 😦

MITx: 6.041x – Introduction to Probability – The Science of Uncertainty

I am currently taking a course from MIT called Introduction to Probability – The Science of Uncertainty, and it is one of the best courses I have taken so far. Professor John Tsitsiklis and the rest of the course staff do a great job. The way the subjects are taught is quite different from the Turkish education system, which made me realize what our system lacks: questioning, multiple ways to reach an answer rather than one absolute way, developing intuition, and application.

Here is some information about the content and aim of the course:

About this course 

The world is full of uncertainty: accidents, storms, unruly financial markets, noisy communications. The world is also full of data. Probabilistic modeling and the related field of statistical inference are the keys to analyzing data and making scientifically sound predictions.

Probabilistic models use the language of mathematics. But instead of relying on the traditional “theorem – proof” format, we develop the material in an intuitive — but still rigorous and mathematically precise — manner. Furthermore, while the applications are multiple and evident, we emphasize the basic concepts and methodologies that are universally applicable.

The course covers all of the basic probability concepts, including:

  • multiple discrete or continuous random variables, expectations, and conditional distributions
  • laws of large numbers
  • the main tools of Bayesian inference methods
  • an introduction to random processes (Poisson processes and Markov chains)

The contents of this course are essentially the same as those of the corresponding MIT class (Probabilistic Systems Analysis and Applied Probability) — a course that has been offered and continuously refined over more than 50 years. It is a challenging class, but it will enable you to apply the tools of probability theory to real-world applications or your research.

The course covers:

  • The basic structure and elements of probabilistic models
  • Random variables, their distributions, means, and variances
  • Probabilistic calculations
  • Inference methods
  • Laws of large numbers and their applications
  • Random processes

https://www.edx.org/course/introduction-probability-science-mitx-6-041x-1#!

Here are my notes:

1-2 Conditioning and Bayes’ rule
-Independence
-Counting: Permutation, Combination, Partitioning
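
To make the conditioning and Bayes' rule material concrete, here is a minimal Python sketch of Bayes' rule with the total probability theorem in the denominator; the prior and likelihood numbers are made up purely for illustration.

```python
# Bayes' rule with total probability in the denominator.
# Hypothetical numbers: a test for a condition with a 1% prior.
p_A = 0.01            # prior P(A): the condition is present
p_B_given_A = 0.95    # likelihood P(B | A): test positive given condition
p_B_given_notA = 0.05 # false positive rate P(B | A^c)

# Total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c)
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' rule: P(A|B) = P(B|A)P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(f"P(A | B) = {p_A_given_B:.3f}")  # about 0.16 with these numbers
```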

[Photo of handwritten notes: IMG_5518]

3- Discrete Random Variables
-Probability mass functions and expectations
-Variance; Conditioning on an event; Multiple r.v.’s
-Conditioning on a random variable; Independence of r.v.’s
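
As a small illustration of PMFs, expectations, and variance from this unit, here is a sketch that computes E[X] and var(X) directly from a made-up PMF.

```python
import numpy as np

# A made-up PMF for a discrete random variable X.
values = np.array([0, 1, 2, 3])
pmf    = np.array([0.1, 0.4, 0.3, 0.2])   # must sum to 1
assert np.isclose(pmf.sum(), 1.0)

# E[X] = sum over x of x * p_X(x)
mean = np.sum(values * pmf)

# var(X) = E[X^2] - (E[X])^2
var = np.sum(values**2 * pmf) - mean**2

print(f"E[X] = {mean:.2f}, var(X) = {var:.2f}")
```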

4- Continuous Random Variables
-Probability density functions
-Conditioning on an event; Multiple r.v.’s
-Conditioning on a random variable; Independence; Bayes’ rule
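
For the continuous case, a quick numerical sketch: assuming an Exponential(λ) density as the example, the code below checks that the pdf integrates to 1 and that E[X] comes out close to 1/λ.

```python
import numpy as np

# Exponential pdf f_X(x) = lam * exp(-lam * x) for x >= 0 (lam chosen arbitrarily).
lam = 2.0
dx = 1e-4
x = np.arange(0.0, 20.0, dx)        # grid wide enough for the tail to be negligible
pdf = lam * np.exp(-lam * x)

total = np.sum(pdf) * dx            # should be close to 1
mean  = np.sum(x * pdf) * dx        # should be close to 1/lam = 0.5

print(f"integral of pdf = {total:.4f}, E[X] = {mean:.4f}")
```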

5- Further topics on random variables
-Derived distributions
-Sums of r.v.’s; Covariance and correlation
-Conditional expectation and variance revisited; Sum of a random number of r.v.’s
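
The "sum of a random number of r.v.'s" result, E[S] = E[N]·E[X] by the law of iterated expectations, is easy to check by simulation; the Poisson N and uniform X_i below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# S = X_1 + ... + X_N where N ~ Poisson(4) and X_i ~ Uniform(0, 1), all independent.
n_trials = 50_000
N = rng.poisson(lam=4.0, size=n_trials)
S = np.array([rng.uniform(0.0, 1.0, size=n).sum() for n in N])

# Law of iterated expectations: E[S] = E[N] * E[X] = 4 * 0.5 = 2
print(f"simulated E[S] = {S.mean():.3f}  (theory: 2.0)")
```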

6- Bayesian inference
-Introduction to Bayesian inference
-Linear models with normal noise
-Least mean squares (LMS) estimation
-Linear least mean squares (LLMS) estimation
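
Here is a small sketch of the LLMS estimator X̂ = E[X] + (cov(X,Y)/var(Y))(Y − E[Y]) on a made-up linear model with normal noise, comparing the empirical slope and mean squared error to their theoretical values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up linear model with normal noise: Y = X + W, X ~ N(0, 4), W ~ N(0, 1).
n = 200_000
X = rng.normal(0.0, 2.0, size=n)
W = rng.normal(0.0, 1.0, size=n)
Y = X + W

# LLMS estimator: X_hat = E[X] + cov(X, Y) / var(Y) * (Y - E[Y])
slope = np.cov(X, Y)[0, 1] / np.var(Y)
X_hat = X.mean() + slope * (Y - Y.mean())

# Theory for this model: cov(X, Y) / var(Y) = 4 / (4 + 1) = 0.8, MSE = 4 - 16/5 = 0.8
print(f"estimated slope = {slope:.3f}  (theory: 0.8)")
print(f"empirical MSE   = {np.mean((X - X_hat)**2):.3f}  (theory: 0.8)")
```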

7- Limit theorems and classical statistics
-Inequalities, convergence, and the Weak Law of Large Numbers
-The Central Limit Theorem (CLT)
-An introduction to classical statistics
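
A quick simulation of the CLT: standardize the sum of many i.i.d. Bernoulli r.v.'s and compare a tail probability to the standard normal CDF. The sample size and p below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sum of n i.i.d. Bernoulli(p) r.v.'s, standardized; the CLT says this is
# approximately standard normal when n is large.
n, p, trials = 100, 0.3, 50_000
S = rng.binomial(n, p, size=trials)
Z = (S - n * p) / np.sqrt(n * p * (1 - p))

# P(Z <= 1) should be close to the standard normal CDF at 1 (about 0.841).
print(f"P(Z <= 1) = {(Z <= 1).mean():.3f}  (normal CDF at 1 is about 0.841)")
```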

8- Bernoulli and Poisson processes
-The Bernoulli process
-The Poisson process
-More on the Poisson process
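
One way to simulate a Poisson process is through its i.i.d. exponential interarrival times; the sketch below checks that the number of arrivals in [0, T] has mean and variance close to λT. The rate and horizon are made-up values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Poisson process of rate lam: interarrival times are i.i.d. Exponential(lam),
# so the number of arrivals in [0, T] should be roughly Poisson(lam * T).
lam, T, trials = 2.0, 10.0, 20_000
counts = []
for _ in range(trials):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)   # next interarrival time
        if t > T:
            break
        n += 1
    counts.append(n)

counts = np.array(counts)
print(f"mean arrivals in [0, T] = {counts.mean():.2f}  (theory: lam*T = 20.0)")
print(f"variance of arrivals    = {counts.var():.2f}  (theory: 20.0)")
```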

9- Markov chains
-Finite-state Markov chains
-Steady-state behavior of Markov chains
-Absorption probabilities and expected time to absorption
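
Finally, a sketch of steady-state behavior: for a made-up 3-state transition matrix, repeatedly applying π ← πP converges to the stationary distribution (the chain below is irreducible and aperiodic, so this works from any starting distribution).

```python
import numpy as np

# A made-up 3-state transition matrix (rows sum to 1); row i gives P(next state | current = i).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.7, 0.2],
              [0.2, 0.4, 0.4]])

# The steady-state distribution pi solves pi = pi P with sum(pi) = 1.
# Simple approach: iterate the distribution until it stops changing.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    pi = pi @ P

print("steady-state distribution:", np.round(pi, 4))
print("check pi P == pi:", np.allclose(pi @ P, pi))
```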