Did you know that one of the pioneers of neural networks was an econometrician? This was not an easy book to find, but it was well worth it. The book is a collection of articles by Halbert White—as in *White* standard errors—that rigorously laid the theoretical foundation for the effectiveness of neural nets, including universal approximation and the consistency and asymptotic normality of learned parameters. These results fill an important gap for those with a background in statistics.
While we're on the topic of AI, did you know that Andrew Ng's online course on deep learning is already available in China, two weeks ahead of its official launch in the United States? It's a telling sign of how much of a leader China has become in the development of AI. I highly encourage anyone interested in what AI is really about to take the course. You get to learn from one of the most important figures in the field—how much better can you get than that?
P.S. It’s also interesting that Andrew chose an economic topic as the very first example.
So #Haro actually exists on the ISS; now give me my Gundam.
If you used the preload trick to enable the Math Kernel Library (MKL) with an earlier version of Anaconda, you will find that under Anaconda 4.4, MKL is limited to a single thread no matter what settings you enter. The solution is to remove the LD_PRELOAD entry for MKL, log out, log back in, and you should be good to go.
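A minimal sketch of the check and fix, assuming the preload line was added to a shell startup file such as `~/.bashrc` (the exact file and library path depend on how you originally set it up):

```shell
# Show whether anything (such as an MKL library) is still being preloaded
echo "LD_PRELOAD is: ${LD_PRELOAD:-<empty>}"

# Clear it for the current session. For the permanent fix, delete the
# "export LD_PRELOAD=..." line for MKL from your shell startup file
# (e.g. ~/.bashrc), then log out and back in.
unset LD_PRELOAD
echo "LD_PRELOAD is now: ${LD_PRELOAD:-<empty>}"
```

After logging back in, Anaconda 4.4's own MKL build should use multiple threads again without any preload.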