1.1-1.2, due on September 7
1.1
I read the entire assignment.
Difficult:
The most difficult part of the material for me was building intuition for little-o. I understood the mathematical definition, but the intuition behind it is harder to grasp than the intuition behind big-O (which I'm more familiar with).
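One handle that helps me (my own sketch, not from the reading): f is o(g) exactly when the ratio f(n)/g(n) goes to 0 as n grows, whereas f is merely O(g) when the ratio stays bounded. Watching the ratio numerically makes the difference concrete:

```python
# Little-o vs big-O via the ratio f(n)/g(n), with g(n) = n^2.
# f(n) = n      is o(n^2): the ratio 1/n shrinks toward 0.
# f(n) = 5*n^2  is O(n^2) but NOT o(n^2): the ratio is stuck at 5.
def ratio(f, g, n):
    return f(n) / g(n)

f_linear = lambda n: n
f_scaled = lambda n: 5 * n ** 2
g = lambda n: n ** 2

for n in (10, 1000, 100000):
    print(n, ratio(f_linear, g, n), ratio(f_scaled, g, n))
```

So little-o says "strictly slower growth, even after any constant rescaling," while big-O only says "no faster growth, up to a constant."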
Reflective:
A couple of things stood out to me. First, I find it interesting that even if f(x) = 10*g(x), f(x) is still O(g(x)). It's pretty cool that big-O cares about asymptotic growth as x -> infinity rather than about constant multiples. Also, I think it's hugely valuable to be able to determine the time complexity of an algorithm analytically, rather than just making a rough guess.
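The constant-multiple fact falls right out of the definition: f is O(g) if there exist constants C and x0 with f(x) <= C*g(x) for all x >= x0. For f(x) = 10*g(x), the witness C = 10 works trivially, which is exactly why the factor of 10 disappears. A quick numerical sanity check (my own sketch):

```python
# Big-O definition check for f(x) = 10*g(x) with g(x) = x^2:
# f is O(g) if f(x) <= C*g(x) for all x >= x0, for SOME constants C, x0.
# Choosing C = 10 and x0 = 1 makes the inequality hold with equality,
# so the constant factor is absorbed into the witness C.
def g(x):
    return x ** 2

def f(x):
    return 10 * g(x)

C, x0 = 10, 1
assert all(f(x) <= C * g(x) for x in range(x0, 10_000))
```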
1.2
I read the entire assignment.
Difficult:
The most difficult part of the material for me was understanding to what level of detail we count operations. It seems kind of redundant to count constant-order operations that execute only once. And when we compare something with len(a), we only counted it as one comparison. What about the cost of computing len(a) itself? Why don't we factor that in? I'm having a hard time pinning down what counts as a "primitive operation" -- the convention seems a bit arbitrary and disconnected from actual computational time.
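On the len(a) question specifically: in CPython, list objects store their length, so len(a) really is a single O(1) lookup, which makes charging it one step less arbitrary than it first looks. Here's a toy counter (my own sketch, using the convention that each len() call, index access, and comparison costs one step; the final failed loop test is not counted):

```python
# Toy primitive-operation counter for a linear search.
# Convention assumed here: len(a), each index access a[i], and each
# comparison cost one step apiece. CPython's len() on a list is O(1)
# because the list object caches its length.
def linear_search(a, target):
    steps = 0
    steps += 1                # one step: len(a)
    n = len(a)
    i = 0
    while i < n:
        steps += 1            # loop-condition comparison i < n
        steps += 1            # index access a[i]
        steps += 1            # comparison a[i] == target
        if a[i] == target:
            return i, steps
        i += 1
    return -1, steps

idx, steps = linear_search([3, 1, 4, 1, 5], 4)
```

The exact tally depends entirely on the convention you pick, but any fixed convention changes the count by at most a constant factor per loop iteration, which big-O then discards -- which is why the choice is allowed to be somewhat arbitrary.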
Reflective: This is perfect, because it answers the one question that 1.1 left me with: it's great to know what happens to the complexity as n -> infinity, but an algorithm that takes 10*n^2 steps is still ten times as slow as one that takes n^2 steps. Don't we care about that? Is there any way to analyze it? Apparently, yes! My only remaining question is: is there any way to analytically derive a distribution of running times between the min (best-case) and max (worst-case) complexity?
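One concrete handle on that min/max question (my own sketch, not from the reading): for a given algorithm you can count the exact number of comparisons in the best and worst cases, and then averaging over inputs gives the expected count in between. For linear search, the best case is the target at the front (1 comparison) and the worst case is the target absent (len(a) comparisons):

```python
# Count only element comparisons in a linear search, to get exact
# best-case and worst-case tallies rather than just asymptotic bounds.
def comparisons(a, target):
    count = 0
    for x in a:
        count += 1
        if x == target:
            break
    return count

a = [3, 1, 4, 1, 5]
best = comparisons(a, 3)   # target at the front: 1 comparison
worst = comparisons(a, 9)  # target absent: len(a) comparisons
```

With a probability model over inputs (e.g. target equally likely at each position), the per-position counts 1, 2, ..., n give a full distribution, which is essentially what average-case analysis does.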