8.8, due Dec 10

Difficult: I think one thing that is essential to understanding this section is how a space V_l decomposes as the direct sum of V_j and its orthogonal complement V_j^perp, which is something we covered only briefly in the previous lecture and had no homework questions about. I could stand to gain a better foundation on that topic. Reflective: I think the crucial line in the chapter was this one: "Converting from sons to daughters in (V_l intersect L^2([0,1))) corresponds to changing basis from the standard basis in F^n to the [wavelet basis]." That seems really important, but I still lack intuition for what a wavelet basis looks like.
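
For intuition, here is a minimal NumPy sketch (my own construction, not necessarily the book's normalization) of the discrete Haar wavelet basis for F^n with n = 8: one constant "father" vector plus "daughter" wavelets at each dyadic scale. Changing from the standard basis to the wavelet basis is then just multiplication by W^T, since W is orthogonal.

```python
import numpy as np

def haar_basis(n):
    """Columns: an orthonormal Haar basis for R^n (n a power of 2):
    one constant (father) vector plus wavelets (daughters) at each scale."""
    B = [np.ones(n) / np.sqrt(n)]               # father: constant on [0, 1)
    scale = n
    while scale > 1:
        half = scale // 2
        for start in range(0, n, scale):
            w = np.zeros(n)
            w[start:start + half] = 1            # +1 on the left half-interval
            w[start + half:start + scale] = -1   # -1 on the right half-interval
            B.append(w / np.sqrt(scale))
        scale = half
    return np.column_stack(B)

W = haar_basis(8)
x = np.array([4., 6., 10., 12., 8., 6., 5., 5.])  # a vector in the standard basis
c = W.T @ x                       # its coordinates in the wavelet basis
print(np.allclose(W @ c, x))      # True: same vector, two different bases
```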

8.7, due Dec 6

Difficult: The hard part of this section for me was understanding formula 8.48 and the Vector Space Structure section -- specifically the part about scaling relations. Reflective: I can see how these kinds of functions could be better for approximating functions with local behavior, whereas expressing such functions in a Fourier basis would be very computationally expensive. I can also see how the simplicity of these functions makes them easier to work with (differentiate, evaluate, etc.).
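
For the scaling relations, a quick numerical check helps: the Haar father function phi (the indicator of [0, 1)) satisfies phi(x) = phi(2x) + phi(2x - 1), i.e. it is the sum of two half-width copies of itself, and the mother wavelet is the corresponding difference. A sketch (I am assuming this matches the normalization used in formula 8.48):

```python
import numpy as np

def phi(x):
    """Haar father (scaling) function: the indicator of [0, 1)."""
    return np.where((0 <= x) & (x < 1), 1.0, 0.0)

def psi(x):
    """Haar mother wavelet: the difference of the two half-width copies."""
    return phi(2 * x) - phi(2 * x - 1)

# Scaling relation: phi(x) = phi(2x) + phi(2x - 1) -- the father at one
# resolution is built out of the father at the next finer resolution.
x = np.linspace(-0.5, 1.5, 1001)
print(np.allclose(phi(x), phi(2 * x) + phi(2 * x - 1)))  # True
print(psi(np.array([0.25, 0.75])))                       # [ 1. -1.]
```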

8.6, due Dec 5

Difficult: This section was actually pretty reasonable. The only part I had trouble with was understanding their method of antialiasing -- I don't understand exactly how you can remove all Fourier coefficients outside a certain interval. Reflective: I really liked their examples of aliasing -- the spinning rotor blades, and even the plotting of functions. Those were creative but very applicable examples.
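
One way to picture it concretely (a sketch with a made-up signal and cutoff, not the book's method verbatim): compute the discrete Fourier coefficients of the samples, zero out every coefficient whose frequency lies outside the chosen interval, and transform back.

```python
import numpy as np

fs = 100                         # sampling rate in Hz (chosen for illustration)
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 30 * t)

cutoff = 10                      # keep only coefficients with |f| <= 10 Hz
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
X[freqs > cutoff] = 0            # remove all coefficients outside the interval
x_filtered = np.fft.irfft(X, n=len(x))

print(np.allclose(x_filtered, np.sin(2 * np.pi * 3 * t)))  # True: 30 Hz term gone
```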

8.5, due Dec 3

Difficult: I understood the mathematical concept of a convolution. And I know how convolutional neural networks work like the back of my hand. But after reading this section, I still don't understand how the two mesh together. I was on board with the description given in the intro to 8.5, but then filters and feature extraction were barely mentioned again for the rest of the section. Ex. 8.5.11 helped, but more examples like that in class would make a big difference. Reflective: I enjoyed what I did understand of Ex. 8.5.11, which explained that you could apply a convolution to a set of samples to remove some frequencies while leaving others untouched. I still missed the "How", though.
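
Here is the kind of concrete "how" that would have helped (a sketch, not Ex. 8.5.11 itself; the signal and kernel are made up): a 4-point moving-average kernel has a frequency-response zero at a quarter of the sampling rate, so convolving with it removes a 25 Hz component while leaving a 2 Hz component almost untouched.

```python
import numpy as np

fs = 100                             # sampling rate in Hz (for illustration)
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 2 * t) + np.sin(2 * np.pi * 25 * t)

# 4-point moving average: its frequency response has zeros at fs/4 = 25 Hz
# and fs/2 = 50 Hz, so it nulls the 25 Hz component of x.
h = np.ones(4) / 4
y = np.convolve(x, h, mode="same")

X, Y = np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(y))
print(X[2], X[25])   # both peaks ~50 before filtering
print(Y[2], Y[25])   # 2 Hz peak still ~50; 25 Hz peak nearly gone (edge effects aside)
```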