
5.6, due on October 31

Difficult: I thought the hardest part of this section was understanding the formulas for the different distributions and their parameters alpha, beta, etc. Specifically, it was hard to get intuition for how the distributions change when several parameters shift at once. Reflective: I wonder how they came up with these distributions... where does each term in the formulas come from? I like how, in the previous homework, we showed that the Poisson distribution is in fact the limit of the binomial distribution as time is divided into infinitely small intervals. It would be nice to get a similar understanding of the relationship between, for example, the gamma distribution and the beta distribution.
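One concrete connection I found on my own (not from the book): if X ~ Gamma(alpha) and Y ~ Gamma(beta) are independent with the same rate, then X/(X + Y) ~ Beta(alpha, beta). Here is a quick numerical sketch of that, assuming numpy and scipy are available; the parameter values are just ones I picked for illustration.

import numpy as np
from scipy import stats

alpha, beta = 2.0, 5.0                   # hypothetical shape parameters, chosen for illustration
rng = np.random.default_rng(0)

x = rng.gamma(alpha, size=100_000)       # X ~ Gamma(alpha), rate 1
y = rng.gamma(beta, size=100_000)        # Y ~ Gamma(beta), rate 1, independent of X
ratio = x / (x + y)                      # claim: ratio ~ Beta(alpha, beta)

print(ratio.mean(), alpha / (alpha + beta))              # sample mean vs. the Beta mean
print(stats.kstest(ratio, "beta", args=(alpha, beta)))   # KS test against Beta(alpha, beta)

If the claim is right, the sample mean should land near alpha/(alpha + beta) and the KS statistic should be tiny.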

5.5, due on October 29

Difficult: I think the hardest part of this section was understanding what exactly an indicator variable is. Also, it would be nice to have a chance in class to look at different problems and decide which distribution would best describe each situation. Reflective: I thought it was interesting that the Poisson distribution has a very similar relationship to lambda as the binomial distribution has to p. I would love to understand better where exactly the formula for the Poisson distribution comes from.
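On that last point, here is the limit argument as I understand it (my sketch, not the book's derivation): chop a time interval into n tiny pieces, let a success happen independently with probability lambda/n in each piece, so the count is Binomial(n, lambda/n); then C(n, k)(lambda/n)^k (1 - lambda/n)^(n - k) tends to e^(-lambda) lambda^k / k! as n grows. (And the way indicator variables finally clicked for me: 1_A equals 1 when A happens and 0 otherwise, so E[1_A] = P(A).) A quick numerical check, assuming scipy is available and with a rate I made up:

from scipy import stats

lam, k = 3.0, 4                                 # hypothetical rate and count, just for illustration
for n in (10, 100, 1000, 10_000):
    print(n, stats.binom.pmf(k, n, lam / n))    # Binomial(n, lam/n) pmf at k
print("Poisson:", stats.poisson.pmf(k, lam))    # the limit: e^(-lam) lam^k / k!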

5.4, due on October 26

Reflective: There are a lot of propositions and theorems in this section that will take some time for me to wrap my head around -- the ones on the Law of the Unconscious Statistician and on variance in particular. This was definitely a harder section. Difficult: The idea of thinking of a random variable as a function is a new concept to me. Does that make the term "random variable" a misnomer? Is it a variable or a function?
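To make the "random variable as a function" idea concrete for myself, here is a little sketch with an example of my own (two dice, X = the sum). The point is that X literally is a function from outcomes to numbers, and LOTUS lets me compute E[X^2] from the distribution of X without ever working out the distribution of X^2 separately.

from fractions import Fraction
from itertools import product
from collections import Counter

# Sample space: ordered pairs of fair dice rolls; X(omega) = sum of the pair.
omega = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(omega))                      # each outcome equally likely
X = {w: w[0] + w[1] for w in omega}              # the random variable, literally a function

direct = sum(p * X[w] ** 2 for w in omega)       # E[X^2] straight from the definition on omega
dist = Counter(X.values())                       # the distribution of X
lotus = sum(Fraction(c, len(omega)) * x ** 2 for x, c in dist.items())   # E[X^2] via LOTUS
mean = sum(p * X[w] for w in omega)              # E[X]

print(direct, lotus)                             # the two computations agree
print("Var(X) =", direct - mean ** 2)            # E[X^2] - E[X]^2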

5.3, due on October 24

Reflective: This was an awesome section. I loved reading about the real-life examples of what happens when you forget to condition, and the prosecutor's fallacy -- that shows the power of Bayes' Rule! I feel like this knowledge -- knowing the importance of conditioning probabilities and avoiding the prosecutor's fallacy -- is one of the most important things I will take out of ACME. Difficult: I had a harder time understanding the proofs of independence. It would help to go through some examples of how to prove that two events are independent (especially when it involves complementary events).
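For the complement case specifically, the argument I eventually pieced together: if A and B are independent, then P(A ∩ B^c) = P(A) - P(A ∩ B) = P(A) - P(A)P(B) = P(A)(1 - P(B)) = P(A)P(B^c), so A and B^c are independent too. A tiny sanity check of that identity on a toy example of my own (two fair coin flips):

from fractions import Fraction
from itertools import product

# Toy sample space: two fair coin flips, all outcomes equally likely.
omega = list(product("HT", repeat=2))
P = lambda event: Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"          # first flip is heads
B = lambda w: w[1] == "H"          # second flip is heads (independent of A)
Bc = lambda w: not B(w)            # complement of B

print(P(lambda w: A(w) and B(w)) == P(A) * P(B))     # A and B are independent
print(P(lambda w: A(w) and Bc(w)) == P(A) * P(Bc))   # so A and B^c are independent too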

5.2, due on October 22

Difficult: For some reason, I had a hard time wrapping my head around how P(A|B) = P(A ∩ B)/P(B). I think a picture would help me understand it better. I understand conceptually what conditional probability is, but I still don't quite understand the formula, specifically what it means to divide by a probability. Reflective: I thought Bayes' Rule was great. It reduced the problem of finding the probability that a car was autonomous to a much simpler, more understandable formula.
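The picture that helped me: conditioning on B throws away every outcome outside B, and dividing by P(B) just rescales what is left so the probabilities add up to 1 again. A concrete example I made up: roll one fair die, let A = {roll a 2} and B = {roll is even}; then P(A|B) = P(A ∩ B)/P(B) = (1/6)/(1/2) = 1/3, i.e. one of the three equally likely even faces.

from fractions import Fraction

omega = range(1, 7)                                # one fair die
P = lambda event: Fraction(sum(1 for w in omega if event(w)), 6)

A = lambda w: w == 2                               # roll a 2
B = lambda w: w % 2 == 0                           # roll is even

cond = P(lambda w: A(w) and B(w)) / P(B)           # P(A|B) = P(A and B) / P(B)
print(cond)                                        # 1/3: one of the three even faces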

5.1, due on October 19

Difficult: I had a hard time understanding how part iii of Proposition 5.1.9 holds. How is it that P(E ∪ F) = P(E) + P(F) - P(E ∩ F)? Where does that third term come from? Reflective: I think it is interesting that permutations and combinations were taught in chapter 1 instead of this chapter. It seems like they fit the material here better.
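The way I think about the third term now: P(E) + P(F) counts every outcome in E ∩ F twice, so subtracting P(E ∩ F) once fixes the double count. (More carefully, E ∪ F is the disjoint union of E and F \ E, and P(F \ E) = P(F) - P(E ∩ F).) A quick enumeration check on a die example of my own:

from fractions import Fraction

omega = set(range(1, 7))
E = {2, 4, 6}                      # even rolls
F = {4, 5, 6}                      # rolls of at least 4 (overlaps E on {4, 6})
P = lambda A: Fraction(len(A), len(omega))

print(P(E | F) == P(E) + P(F) - P(E & F))   # inclusion-exclusion holds: True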

4.5, due on October 15

Difficult: I still have to think about it for a second to get the difference between NP, NP-hard, and NP-complete problems. It will take a while to build up the intuition for that. Reflective: One question I have is, how do you prove things about the difficulty of problems? It is straightforward to prove that a solution exists in O(n^2) if you know what that solution is. But how can you prove that no solution exists that is faster than a certain complexity?
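On that last question, the one lower-bound argument I do know (a standard example, not from this section): a sorting algorithm that only compares elements has to distinguish all n! possible orderings, and a binary decision tree of depth d has at most 2^d leaves, so in the worst case it needs at least log2(n!) comparisons, which grows like n log2(n). That proves no comparison sort can beat O(n log n). A quick look at how that bound grows:

import math

for n in (8, 32, 128):
    lower_bound = math.log2(math.factorial(n))   # minimum worst-case comparisons for any comparison sort
    print(n, round(lower_bound), round(n * math.log2(n)))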

4.4, due on October 12

Difficult: Some of the mathematical notation describing the Huffman algorithm and average word length was unfamiliar, but it made sense once I worked through the symbols. Reflective: I never thought of Morse code as a sort of Huffman encoding, but it totally makes sense. An interesting exercise would be to take the Morse code alphabet, compare it with a contemporaneous dictionary, and see whether the encoding is optimal by comparing its average word length to that of a Huffman code.
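I couldn't resist sketching the comparison. The frequencies below are numbers I made up for illustration (not real letter frequencies): build the Huffman code with a heap, and the average word length is just the frequency-weighted code length; the same formula would apply to the Morse code lengths.

import heapq

# Hypothetical symbol frequencies (probabilities), made up for illustration.
freqs = {"e": 0.40, "t": 0.25, "a": 0.15, "o": 0.12, "n": 0.08}

# Each heap entry: (weight, tie-breaker, {symbol: code-so-far}).
heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
heapq.heapify(heap)
counter = len(heap)

while len(heap) > 1:
    w1, _, c1 = heapq.heappop(heap)                         # the two lightest subtrees
    w2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + code for s, code in c1.items()}      # prepend 0 on one side...
    merged.update({s: "1" + code for s, code in c2.items()})  # ...and 1 on the other
    heapq.heappush(heap, (w1 + w2, counter, merged))
    counter += 1

codes = heap[0][2]
awl = sum(freqs[s] * len(code) for s, code in codes.items())   # average word length
print(codes)
print("AWL:", awl)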

4.2, due on October 10

Difficult: I had a hard time understanding the stack/queue implementation of the BFS and the DFS. I didn't understand what they meant when they talked about popping a path off the stack (or queue). Reflective: I thought Dijkstra's algorithm would be more complicated than it actually was! It was pretty cool, though, that you can improve the algorithm's time complexity by using a specialized priority queue.
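The way "popping a path" finally clicked for me: the queue holds whole partial paths rather than single nodes; each step pops one path off the front, looks at its last node, and pushes extended copies of the path back on. A sketch on a made-up graph (popping from the back instead, i.e. using the list like a stack, would give DFS):

from collections import deque

# A small made-up graph as an adjacency list.
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["E"],
    "E": [],
}

def bfs_path(graph, start, goal):
    queue = deque([[start]])            # the queue holds whole paths, not nodes
    visited = {start}
    while queue:
        path = queue.popleft()          # "pop a path off the queue"
        node = path[-1]                 # the path's current endpoint
        if node == goal:
            return path
        for nxt in graph[node]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])   # extend the path and push it back on
    return None

print(bfs_path(graph, "A", "E"))        # ['A', 'C', 'E']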

4.1, due on October 8

Difficult: I thought the mathematical representation of grouping was a bit difficult to grasp. Other than that, the section wasn't too bad. Reflective: I already know I'm going to like this chapter. Some key points I liked: the choice of how to group matrix multiplications can greatly affect the complexity of an algorithm. Also, I've always heard that any problem that can be solved with recursion has an iterative solution, but I never understood that until I understood top-down and bottom-up programming. The examples in the section were very helpful.
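The grouping point is easy to see with made-up dimensions. Multiplying an n x m matrix by an m x p matrix costs roughly n*m*p scalar multiplications, so for A (2x100), B (100x2), C (2x100) the two groupings of ABC cost wildly different amounts:

def mult_cost(n, m, p):
    # Rough cost of multiplying an (n x m) matrix by an (m x p) matrix.
    return n * m * p

# Hypothetical dimensions: A is 2x100, B is 100x2, C is 2x100.
ab_then_c = mult_cost(2, 100, 2) + mult_cost(2, 2, 100)      # (AB)C
a_then_bc = mult_cost(100, 2, 100) + mult_cost(2, 100, 100)  # A(BC)
print(ab_then_c, a_then_bc)   # 800 vs 40000 scalar multiplications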

3.4, due on October 5

Difficult: The most difficult part of today's reading was the proof of Proposition 3.4.7. The steps made sense, but I had a hard time understanding the setup. Reflective: Okay, I have a real question about this section. If we can build a priority queue in O(n) time, and we pop things off a priority queue in O(n) time, does that mean we have an O(n) sorting algorithm? That's what it seems like! But as far as I know, that isn't the case.
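The resolution, if I have the heap costs right: building the priority queue is O(n), but each pop has to re-heapify, which is O(log n), so emptying it is O(n log n) total, not O(n). That procedure is exactly heapsort, so it is consistent with the O(n log n) bound on comparison sorting. A sketch with Python's heapq:

import heapq

def heap_sort(items):
    heap = list(items)
    heapq.heapify(heap)                 # build the priority queue: O(n)
    return [heapq.heappop(heap) for _ in range(len(heap))]   # n pops, O(log n) each

print(heap_sort([5, 1, 4, 1, 5, 9, 2, 6]))   # [1, 1, 2, 4, 5, 5, 6, 9]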