Posts

8.8, due Dec 10

Difficult: I think one thing that is essential to understanding this section is understanding how the space V_l is the direct sum of V_j and V_j^perp, which is something we covered only briefly in the previous lecture and had no homework questions about. I could stand to gain a better foundation on that topic. Reflective: I think the crucial line in the chapter was this one: "Converting from sons to daughters in (V_l intersect L^2([0,1))) corresponds to changing basis from the standard basis in F^n to the [wavelet basis]." That seems really important, but I still lack intuition for what a wavelet basis looks like.
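
I tried to build a little of that intuition with a numpy sketch. This is my own construction, not the book's: the Haar wavelet basis for n = 4, with columns normalized so the basis is orthonormal (the book's scaling and ordering may differ).

```python
import numpy as np

# Haar wavelet basis for F^4, one basis vector per column:
# a constant "father," then daughters at successively finer scales.
# Normalized so the columns are orthonormal; the book may scale differently.
W = np.array([
    [1,  1,  np.sqrt(2),           0],
    [1,  1, -np.sqrt(2),           0],
    [1, -1,           0,  np.sqrt(2)],
    [1, -1,           0, -np.sqrt(2)],
]) / 2.0
assert np.allclose(W.T @ W, np.eye(4))  # orthonormal: W^T W = I

# A sample vector = coordinates in the standard (son) basis.
v = np.array([4.0, 2.0, 5.0, 5.0])

# "Converting from sons to daughters" is the change of basis W c = v,
# and since W is orthonormal, c = W^T v.
c = W.T @ v
print(c)                      # [8. -2. 1.414... 0.] -- coarse + detail terms
assert np.allclose(W @ c, v)  # reconstruct the samples from the coefficients
```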

8.7, due Dec 6

Difficult: The hard part of this section for me was understanding formula 8.48 and the Vector Space Structure section -- specifically the part about scaling relations. Reflective: I can see how these kinds of functions could be better for approximating functions with local behavior, whereas expressing such functions in a Fourier basis would be very computationally expensive. I can also see how the simplicity of these functions makes them easier to work with (differentiate, evaluate, etc.).
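
On the scaling relations: for the Haar father function phi = the indicator of [0, 1), I believe the relation is phi(x) = phi(2x) + phi(2x - 1), with the wavelet given by psi(x) = phi(2x) - phi(2x - 1). A tiny numerical check I wrote myself (not from the book):

```python
import numpy as np

def phi(x):
    """Haar scaling (father) function: indicator of [0, 1)."""
    return np.where((0 <= x) & (x < 1), 1.0, 0.0)

def psi(x):
    """Haar wavelet, built from the father via the scaling relation."""
    return phi(2 * x) - phi(2 * x - 1)

# Scaling relation: phi(x) = phi(2x) + phi(2x - 1) for every x.
x = np.linspace(-1, 2, 1001)
assert np.allclose(phi(x), phi(2 * x) + phi(2 * x - 1))

# Locality: psi vanishes outside [0, 1), so its coefficients only "see"
# local behavior -- unlike sines and cosines, which are supported everywhere.
print(psi(np.array([0.1, 0.6, 1.5])))  # [ 1. -1.  0.]
```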

8.6, due Dec 5

Difficult: This section was actually pretty reasonable. The only part I had trouble with was understanding their method of antialiasing. I don't understand exactly how you can remove all the Fourier coefficients outside a certain interval. Reflective: I really liked their examples of aliasing -- the spinning rotor blades, and even the plotting of functions. Those were creative but very applicable examples.
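
After writing this, I think I at least see the mechanics numerically: take the DFT, zero the coefficients whose frequencies fall outside the interval, and transform back. A sketch of that guess (my own example, using numpy's FFT):

```python
import numpy as np

# A signal with a low frequency (3 cycles) plus a high frequency (40 cycles).
n, T = 256, 1.0
t = np.arange(n) / n * T
f = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

F = np.fft.fft(f)
freqs = np.fft.fftfreq(n, d=T / n)  # signed frequency of each coefficient

# The low-pass step: zero every coefficient outside [-10, 10] cycles.
F[np.abs(freqs) > 10] = 0

# Transform back: only the 3-cycle component survives.
f_filtered = np.fft.ifft(F).real
assert np.allclose(f_filtered, np.sin(2 * np.pi * 3 * t), atol=1e-10)
```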

8.5, due Dec 3

Difficult: I understand the mathematical concept of a convolution, and I know how convolutional neural networks work like the back of my hand. But after reading this section, I still don't understand how the two mesh together. I was on board with the description given in the intro to 8.5, but then filters and feature extraction were barely mentioned again for the rest of the section. Ex. 8.5.11 helped, but more examples like that in class would make a big difference. Reflective: I enjoyed what I did understand of Ex. 8.5.11, which showed that you can apply a convolution to a set of samples to remove some frequencies while leaving others untouched. I still missed the "how", though.
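
My best guess at the "how" (my own sketch, not the book's example) is the convolution theorem: convolving the samples with a filter multiplies each Fourier coefficient of the signal by the filter's coefficient at that frequency, so a filter whose transform is tiny at high frequencies removes them.

```python
import numpy as np

n = 256
t = np.arange(n) / n
f = np.sin(2 * np.pi * 2 * t) + np.sin(2 * np.pi * 64 * t)  # low + high freq

# A simple 8-tap averaging filter, padded to length n.
h = np.zeros(n)
h[:8] = 1 / 8

# Circular convolution via the convolution theorem:
# DFT(f conv h) = DFT(f) * DFT(h), entrywise.
H = np.fft.fft(h)
conv = np.fft.ifft(np.fft.fft(f) * H).real

# Each frequency of f is scaled by the filter's response at that frequency.
print(abs(H[2]), abs(H[64]))  # ~0.99 at 2 cycles, ~0 at 64 cycles

# So the 64-cycle component is wiped out, while the 2-cycle component
# passes through almost untouched (up to a small phase shift).
assert abs(np.fft.fft(conv)[64]) < 1e-9
```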

8.4, due Nov 30

Difficult: I had a hard time following the proof of the fast Fourier transform algorithm. I did think it was neat that we were able to take advantage of mathematical symmetry to improve performance from O(n^2) to O(n log n)! Reflective: I thought it was pretty cool that because the DFT can be represented as an orthonormal matrix, it is not only invertible but very easy to invert!
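
The symmetry clicked better for me once I wrote the recursion as code. A minimal radix-2 sketch of my own (n must be a power of 2; same unnormalized convention as numpy, which may differ from the book's):

```python
import numpy as np

def fft(f):
    """Radix-2 Cooley-Tukey FFT; len(f) must be a power of 2."""
    n = len(f)
    if n == 1:
        return f.astype(complex)
    # Split into even- and odd-indexed samples: two DFTs of size n/2.
    even, odd = fft(f[0::2]), fft(f[1::2])
    # The symmetry w^(k + n/2) = -w^k lets each half-size output be
    # reused twice, giving T(n) = 2 T(n/2) + O(n) = O(n log n).
    w = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    return np.concatenate([even + w * odd, even - w * odd])

f = np.random.rand(16)
F = fft(f)
assert np.allclose(F, np.fft.fft(f))

# Easy inversion: the normalized DFT matrix is orthonormal (unitary), so its
# inverse is its conjugate transpose -- i.e. ifft(F) = conj(fft(conj(F))) / n.
assert np.allclose(np.conj(fft(np.conj(F))) / 16, f)
```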

8.3, due Nov 28

Difficult: My lack of understanding in this section began when I couldn't understand the first equality in Lemma 8.3.1... and I still hadn't caught back up by the end of the section. Looking forward to class tomorrow! Reflective: I didn't understand this section well enough to adequately reflect on it, but the formula for the Dirichlet kernel looked very similar to the truncated Fourier series. Lemma 8.3.6 showed their relationship, but I guess I'm still missing the intuition for what a Dirichlet kernel is and what its purpose is.
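
For my own sanity I checked numerically that the sum-of-exponentials definition matches the closed form, assuming the standard convention D_N(x) = sum_{k=-N}^{N} e^{ikx} = sin((N + 1/2)x) / sin(x/2), which I believe is what the first equality in Lemma 8.3.1 is saying (the book's normalization may differ):

```python
import numpy as np

N = 5
x = np.linspace(0.1, np.pi, 200)  # avoid x = 0, where sin(x/2) vanishes

# Dirichlet kernel as a sum of complex exponentials, sum_{k=-N}^{N} e^{ikx}.
D_sum = sum(np.exp(1j * k * x) for k in range(-N, N + 1)).real

# Closed form from the geometric-series argument:
D_closed = np.sin((N + 0.5) * x) / np.sin(x / 2)

assert np.allclose(D_sum, D_closed)

# Why D_N matters (as I read Lemma 8.3.6): the truncated Fourier series
# S_N f is exactly the convolution of f with D_N, so D_N is "what
# truncating the series does" to f.
```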

8.1, due Nov 26

Difficult: I had a hard time distinguishing between k and c_k. As far as I understand, the c_k's are the Fourier coefficients and the k's are the indices...? I'm not sure. Some extra explanation on that would be helpful. Reflective: I can already see how powerful this could be. Anywhere you see a periodic signal, a Fourier transform will give extremely valuable information. I saw an interesting application of this in my deep learning class, where the position of a word in a sentence was encoded by modulating its input embedding at a frequency determined by the word's position. The neural network then learned to perform a Fourier transform to use the word's position as a feature in its learning.
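
Computing a few coefficients numerically helped me keep k and c_k straight: k indexes the frequency e^{ikx}, and c_k measures how much of that frequency is in f. A quick sketch of my own, assuming the convention c_k = (1/2 pi) * integral of f(x) e^{-ikx} over [0, 2 pi) (the book's interval and normalization may differ):

```python
import numpy as np

# Sample f on [0, 2*pi) and approximate c_k by a Riemann sum.
n = 1024
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(3 * x)  # = (e^{i3x} - e^{-i3x}) / (2i), so only k = +/-3 appear

def c(k):
    """k is the frequency index; c_k is the coefficient attached to e^{ikx}."""
    return np.mean(f * np.exp(-1j * k * x))  # the 1/(2*pi) cancels in the mean

for k in range(-4, 5):
    print(k, np.round(c(k), 3))
# c_3 = -0.5j and c_{-3} = +0.5j; every other c_k is (numerically) zero.
```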