Machine Learning Meetup Notes: 2010-04-28

  • Mike S presented a mathematical overview of SVMs
    • Started with introduction to linear classification [1]
    • Discussed the kernel trick [2]
    • Loosely derived the loss function and dual loss function for support vector machines [3] (a sketch of the standard formulation follows below)
    • Emphasized two important aspects of SVMs:
      • The dual problem is a quadratic program that is easier to solve than the primal problem
      • After the dual problem is optimized, only the support vectors (the data points whose Lagrangian multipliers are > 0) and their associated multipliers are needed to make predictions on new data (see the code sketch below)
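
For reference, here is a sketch of the standard soft-margin formulation the notes appear to describe (the notation below is assumed, not taken from the talk). The linear classifier is f(x) = sign(w · x + b); the primal minimizes a regularized hinge-loss objective, and its Lagrangian dual is a quadratic program over the multipliers α:

    Primal:      \min_{w, b, \xi} \; \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i
                 \text{s.t. } y_i (w \cdot x_i + b) \ge 1 - \xi_i, \quad \xi_i \ge 0

    Dual:        \max_{\alpha} \; \sum_{i=1}^{n} \alpha_i - \tfrac{1}{2} \sum_{i,j} \alpha_i \alpha_j y_i y_j \, K(x_i, x_j)
                 \text{s.t. } 0 \le \alpha_i \le C, \quad \sum_{i} \alpha_i y_i = 0

    Prediction:  f(x) = \mathrm{sign}\Big( \sum_{i : \alpha_i > 0} \alpha_i y_i \, K(x_i, x) + b \Big)

The kernel trick is visible in the dual: the training data enter only through inner products x_i · x_j, so any kernel K(x_i, x_j) can be substituted for them, and at prediction time only the points with α_i > 0 (the support vectors) contribute.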
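
A minimal runnable sketch of the dual optimization and the support-vector prediction rule, assuming a toy 2D dataset, an RBF kernel, and scipy's general-purpose SLSQP solver standing in for a dedicated QP solver (libSVM uses a specialized solver instead; everything here is illustrative):

    import numpy as np
    from scipy.optimize import minimize

    # Toy 2D data: two small clusters (assumed for illustration).
    X = np.array([[0., 0.], [0., 1.], [1., 0.],
                  [3., 3.], [3., 4.], [4., 3.]])
    y = np.array([-1., -1., -1., 1., 1., 1.])
    C = 1.0  # soft-margin penalty

    def rbf(a, b, gamma=0.5):
        # Kernel trick: replace the dot product a . b with a kernel.
        return np.exp(-gamma * np.sum((a - b) ** 2))

    n = len(X)
    K = np.array([[rbf(X[i], X[j]) for j in range(n)] for i in range(n)])
    Q = (y[:, None] * y[None, :]) * K

    # Dual QP: maximize sum(alpha) - 0.5 alpha^T Q alpha
    # subject to 0 <= alpha_i <= C and sum_i alpha_i y_i = 0.
    # SLSQP minimizes, so the objective is negated.
    res = minimize(lambda a: 0.5 * a @ Q @ a - a.sum(),
                   x0=np.zeros(n),
                   method="SLSQP",
                   bounds=[(0.0, C)] * n,
                   constraints=[{"type": "eq", "fun": lambda a: a @ y}])
    alpha = res.x

    # Support vectors are the points with alpha > 0; they and their
    # multipliers are all that prediction needs.
    sv = alpha > 1e-6
    # Recover the offset b from a margin support vector (0 < alpha < C).
    m = np.where(sv & (alpha < C - 1e-6))[0][0]
    b = y[m] - sum(alpha[i] * y[i] * K[i, m] for i in range(n) if sv[i])

    def predict(x):
        # The decision function sums over support vectors only.
        s = sum(alpha[i] * y[i] * rbf(X[i], x) for i in range(n) if sv[i])
        return np.sign(s + b)

    print(predict(np.array([0.5, 0.5])))  # -1.0
    print(predict(np.array([3.5, 3.5])))  # 1.0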
  • Thomas talked about the KDD conference and its data-mining competition [4]
  • Sai Skyped in and talked a bit about his use of libSVM for classifying user history on his website cssfingerprint.com
  • We talked a little bit about libSVM [5]
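
For context, a small usage sketch of libSVM's Python bindings (the import path below matches the PyPI libsvm packaging and is an assumption; older distributions shipped a bare svmutil module instead; the data and parameters are placeholders):

    from libsvm.svmutil import svm_problem, svm_parameter, svm_train, svm_predict

    # Tiny placeholder dataset in libSVM's sparse format: {feature_index: value}.
    y = [1, -1, 1, -1]
    x = [{1: 0.9, 2: 0.8}, {1: -0.7, 2: -0.6},
         {1: 0.8, 2: 1.0}, {1: -0.9, 2: -0.5}]

    prob = svm_problem(y, x)
    param = svm_parameter('-t 2 -c 1 -g 0.5')  # -t 2: RBF kernel, -c: cost C, -g: gamma
    model = svm_train(prob, param)

    # Predict on the training points; returns labels, accuracy stats, decision values.
    labels, accuracy, values = svm_predict(y, x, model)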
  • Ted posted some links: