I just found a blog on computer vision and machine learning, and it finally gave me a fundamental, clear understanding of the curse of dimensionality.
The post provides a very thorough and fundamental explanation of the essence of the curse of dimensionality. The other posts on that blog also look like must-read articles for ML newbies just like me. 🙂
It would be well worth reading all the material on that blog, and that's what I'm gonna do for the next week.
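To see the curse of dimensionality for yourself, here is a minimal sketch (my own illustrative code, not taken from that blog): as the number of dimensions grows, the distances from a query point to random points concentrate, so the "nearest" and "farthest" neighbors become nearly indistinguishable.

```python
import math
import random

random.seed(0)

def distance_contrast(dim, n_points=200):
    """Relative spread (d_max - d_min) / d_min of the distances from the
    origin to random points in the unit hypercube [0, 1]^dim."""
    dists = []
    for _ in range(n_points):
        p = [random.random() for _ in range(dim)]
        dists.append(math.sqrt(sum(x * x for x in p)))
    return (max(dists) - min(dists)) / min(dists)

for dim in (2, 10, 100, 1000):
    print(f"dim={dim:5d}  contrast={distance_contrast(dim):.3f}")
```

In low dimensions the contrast is large (some points are much closer than others), but by dim=1000 it collapses toward zero: every point sits at roughly the same distance, which is exactly why nearest-neighbor methods degrade in high dimensions.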
I happened to find these undergraduate ML course videos posted on YouTube by UBC. I really have to thank UBC and Professor Nando de Freitas for this valuable sharing. Anyone with a strong interest in ML should definitely take a look through the whole set of sessions. That's what I'm gonna do for the next few months.
Here's Metacademy, a community-driven educational guide currently focused on machine learning and probabilistic AI. It will be a great help for anyone with a strong interest in and enthusiasm for machine learning, just like me. 🙂 It draws an intuitive map of the sequential steps toward whatever concept you query. You'll see once you give it a try.
The Machine Learning Subreddit, with news, research papers, videos, lectures, software, and discussions on:
- Machine Learning
- Data Mining
- Information Retrieval
- Predictive Statistics
- Learning Theory
- Search Engines
- Pattern Recognition
(quoted from the Machine Learning Subreddit page)
A Machine Learning and Probabilistic Graphical Models course provided by the Department of Computer Science and Engineering, University at Buffalo.
A large collection of educational video segments on machine learning – the Machine Learning Video Library. Topics covered include:
Aggregation, Bayesian Learning, Bias-Variance Tradeoff, Bin Model, Data Snooping, Error Measures, Gradient Descent, Learning Curves, Learning Diagram, Learning Paradigms, Linear Classification, Linear Regression, Logistic Regression, Netflix Competition, Neural Networks, Nonlinear Transformation, Occam's Razor, Overfitting, Radial Basis Functions, Regularization, Sampling Bias, Support Vector Machines, Validation, VC Dimension
Thanks to the first session by Prof. Byoung-Tak Zhang (http://bi.snu.ac.kr/) in the '2014 PRML Winter School' program, I luckily found what is probably the most important knowledge warehouse: JMLR. Better create your own account and take a deep look around. 🙂