30 June 2009

ICML/COLT/UAI 2009 retrospective

This will probably be a bit briefer than my corresponding NAACL post because even by day two of ICML, I was a bit burnt out; I was also constantly swapping in other tasks (grants, etc.). Note that John has already posted his list of papers.
  1. #317: Multi-View Clustering via Canonical Correlation Analysis (Chaudhuri, Kakade, Livescu, Sridharan). This paper shows a new application of CCA: clustering across multiple views. They use some Wikipedia data in experiments, and actually prove that, under certain multi-view-like assumptions, CCA does the "right thing."
  2. #295: Learning Nonlinear Dynamic Models (Langford, Salakhutdinov, Zhang). The cool idea here is to cut a deterministic classifier in half and use its internal state as a sort of sufficient statistic. Think about what happens if you represent your classifier as a circuit (DAG): anywhere you cut along the circuit gives you a representation sufficient for prediction. To avoid building circuits explicitly, they use neural nets, which have an obvious "place to cut" -- namely, the internal nodes.
  3. #364: Online Dictionary Learning for Sparse Coding (Mairal, Bach, Ponce, Sapiro). A new approach to sparse coding; the big take-away is that it's online and fast.
  4. #394: MedLDA: Maximum Margin Supervised Topic Models for Regression and Classification (Zhu, Ahmed, Xing). This is a very cute idea for combining objectives across topic models (namely, the variational objective) and classification (the SVM objective) to learn topics that are good for performing a classification task.
  5. #393: Learning from Measurements in Exponential Families (Liang, Jordan, Klein). Suppose instead of seeing (x,y) pairs, you just see some statistics on (x,y) pairs -- well, you can still learn. (In a sense, this formalizes some work out of the UMass group; see also the Bellare, Druck and McCallum paper at UAI this year.)
  6. #119: Curriculum Learning (Bengio, Louradour, Collobert, Weston). The idea is to present examples in a well thought-out order rather than randomly. It's a cool idea; I've tried it in the context of unsupervised parsing (the unsearn paper at ICML) and it never helped and often hurt (sadly). I curriculum-ified by sentence length, though, which is maybe not a good model, especially when working with WSJ10 -- maybe using vocabulary would help.
  7. #319: A Stochastic Memoizer for Sequence Data (Wood, Archambeau, Gasthaus, James, Teh). If you do anything with Markov models, you should read this paper. The take-away: how can I learn a Markov model with (potentially) infinite memory in a linear amount of time and space, and with good "backoff" properties? Plus, there's some cool new technology in there.
  8. A Uniqueness Theorem for Clustering (Zadeh, Ben-David). I already talked about this issue a bit, but the idea here is that if you fix k, then the clustering axioms become satisfiable, and are satisfied by two well-known algorithms. Fixing k is a bit unsatisfactory, but I think this is a good step in the right direction.
  9. Convex Coding (Bradley, Bagnell). The idea is to make coding convex by making it infinite! And then do something like boosting.
  10. On Smoothing and Inference for Topic Models (Asuncion, Welling, Smyth, Teh). If you do topic models, read this paper: basically, none of the different inference algorithms do any better than the others (perplexity-wise) if you estimate hyperparameters well. Some are, of course, faster though.
  11. Correlated Non-Parametric Latent Feature Models (Doshi-Velez, Ghahramani). This is an Indian-buffet-process-like model that allows factors to be correlated. It's somewhat in line with our own paper from NIPS last year. There's still something a bit unsatisfactory in both our approach and theirs: neither can model the correlation "directly."
  12. Domain Adaptation: Learning Bounds and Algorithms (Mansour, Mohri, Rostamizadeh). Very good work on some learning theory for domain adaptation, based on the idea of stability.
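The two-step recipe in #317 (project each view into a shared space with CCA, then cluster in that space) can be sketched with plain numpy. The regularizer and the synthetic two-view data below are my own illustration, not from the paper:

```python
import numpy as np

def cca_project(X, Y, k, reg=1e-6):
    """Project two views into a shared k-dim space via (regularized) CCA.
    Returns both projections and the top-k canonical correlations."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    Lx, Ly = np.linalg.cholesky(Cxx), np.linalg.cholesky(Cyy)
    # Whiten both views; the SVD of the whitened cross-covariance
    # yields the canonical directions and correlations.
    T = np.linalg.solve(Lx, np.linalg.solve(Ly, Cxy.T).T)
    U, corrs, Vt = np.linalg.svd(T)
    A = np.linalg.solve(Lx.T, U[:, :k])
    B = np.linalg.solve(Ly.T, Vt[:k].T)
    return X @ A, Y @ B, corrs[:k]

rng = np.random.default_rng(0)
z = rng.normal(size=(200, 2))                        # shared latent signal
X = z @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(200, 5))
Y = z @ rng.normal(size=(2, 6)) + 0.1 * rng.normal(size=(200, 6))
Xp, Yp, corrs = cca_project(X, Y, k=2)
# Xp and Yp live in the shared space; run k-means (or any clusterer) there.
```

On data like this, where both views are noisy functions of the same latent signal, the projections of the two views end up highly correlated, which is exactly why clustering them is easier than clustering either raw view.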
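The "cut the circuit" intuition from #295 is easy to see in a toy network (the sizes and random weights are made up for the example): once you have the hidden activations h, the bottom half of the net is no longer needed to finish the prediction.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))    # weights below the cut
W2 = rng.normal(size=(8, 2))    # weights above the cut
x = rng.normal(size=4)

h = np.tanh(x @ W1)             # internal state: everything below the cut
y_from_h = h @ W2               # the top half only ever sees h

y_full = np.tanh(x @ W1) @ W2   # full forward pass, for comparison
# h is a sufficient representation: both routes give the same prediction.
```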
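A stripped-down sketch of the online scheme in #364 (the real algorithm uses a LARS-based lasso solver plus extra bookkeeping; I substitute a few ISTA iterations, and all constants here are illustrative): each incoming sample is sparse-coded against the current dictionary, two sufficient-statistic matrices are updated, and the dictionary columns are refit by block coordinate descent.

```python
import numpy as np

def sparse_code(D, x, lam, n_iter=50):
    """A few ISTA steps: a simple stand-in for the paper's lasso solver."""
    alpha = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        alpha = alpha - D.T @ (D @ alpha - x) / L
        alpha = np.sign(alpha) * np.maximum(np.abs(alpha) - lam / L, 0.0)
    return alpha

def online_dictionary_learning(X, n_atoms, lam=0.1, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    D = rng.normal(size=(d, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    A = np.zeros((n_atoms, n_atoms))       # running sum of alpha alpha^T
    B = np.zeros((d, n_atoms))             # running sum of x alpha^T
    for x in X:                            # one pass over the stream
        alpha = sparse_code(D, x, lam)
        A += np.outer(alpha, alpha)
        B += np.outer(x, alpha)
        for j in range(n_atoms):           # block coordinate descent on columns
            if A[j, j] > 1e-10:
                D[:, j] += (B[:, j] - D @ A[:, j]) / A[j, j]
                D[:, j] /= max(1.0, np.linalg.norm(D[:, j]))
    return D

rng = np.random.default_rng(1)
codes = rng.normal(size=(100, 8)) * (rng.random((100, 8)) < 0.2)
true_D = rng.normal(size=(10, 8))
X = codes @ true_D.T + 0.01 * rng.normal(size=(100, 10))
D = online_dictionary_learning(X, n_atoms=8)
```

The point of the A and B matrices is exactly what makes this online: the dictionary update never needs to revisit past samples, only those two running sums.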
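The "learning from statistics alone" flavor of #393 already shows up in a tiny maximum-entropy example (the three-outcome space and one-hot features are my own toy setup): we never see individual labels, only a measured feature expectation, and we fit an exponential-family model whose expectations match it.

```python
import numpy as np

def fit_to_measurements(F, target, lr=0.5, n_iter=2000):
    """Gradient ascent on the max-ent dual: adjust theta until model
    feature expectations E_p[f] match the measured statistics."""
    theta = np.zeros(F.shape[1])
    p = np.full(F.shape[0], 1.0 / F.shape[0])
    for _ in range(n_iter):
        logits = F @ theta
        p = np.exp(logits - logits.max())
        p /= p.sum()
        theta += lr * (target - F.T @ p)   # measured minus model expectation
    return theta, p

F = np.eye(3)                          # one-hot features over three outcomes
target = np.array([0.5, 0.3, 0.2])     # the only thing we ever observe
theta, p = fit_to_measurements(F, target)
```

With one-hot features the fitted distribution just reproduces the measured marginals; the interesting cases in the paper are when the measurements are coarser than the labels.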
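The curriculum idea in #119 amounts to ranking examples by a difficulty measure and training on a growing easy-to-hard pool; here is a minimal sketch using sentence length as the difficulty score (the staging scheme is my own simplification):

```python
def curriculum_stages(examples, difficulty, n_stages=3):
    """Yield growing training pools: stage i holds the easiest
    i/n_stages fraction of the data, ordered easy-to-hard."""
    ranked = sorted(examples, key=difficulty)
    for i in range(1, n_stages + 1):
        yield ranked[: max(1, i * len(ranked) // n_stages)]

sentences = ["a b", "a b c d e f", "a", "a b c", "a b c d"]
stages = list(curriculum_stages(sentences, lambda s: len(s.split())))
# stages[0] holds only the shortest sentences; stages[-1] is the whole corpus.
```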
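And since single linkage is one of the algorithms picked out by the fixed-k axioms in the uniqueness-theorem paper (item 8), here it is in a few lines. This brute-force version is my own illustration; real implementations use a minimum-spanning-tree construction.

```python
import numpy as np

def single_linkage(points, k):
    """Agglomerative single linkage: repeatedly merge the two clusters
    whose closest pair of points is nearest, stopping at k clusters."""
    P = np.asarray(points, dtype=float)
    clusters = [[i] for i in range(len(P))]
    while len(clusters) > k:
        best = (0, 1, np.inf)
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(np.linalg.norm(P[i] - P[j])
                        for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, _ = best
        clusters[a] += clusters.pop(b)   # b > a, so index a stays valid
    return clusters

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
clusters = single_linkage(pts, k=2)
```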
Okay, that's it. Well, not really: there's lots more good stuff, but those were the things that caught my eye. Feel free to tout your own favorites in the comments.

Comments:

  1. Concerning your #8
    "A Uniqueness Theorem for Clustering" by Reza Bosagh Zadeh and Shai Ben-David:
    The fact that fixing k makes the axioms consistent was already shown (implicitly) in the original Kleinberg paper. Another way of making the Kleinberg axioms consistent (without having to fix k) was shown by Ackerman-Ben-David at NIPS08
    (http://books.nips.cc/papers/files/nips21/NIPS2008_0383.pdf). So the actual contribution of the current UAI paper is not in proposing a consistent version of the axioms, but rather in characterizing Single Linkage in a way that sheds light on when it should be the clustering algorithm of choice. -- Shai Ben-David

  2. Re: #393: Learning from Measurements in Exponential Families

    It's worth noting that the objective optimized by Liang et al. (2009) is identical to the one in the earlier work of Graca et al. (2008). This is also the same objective as the approximation proposed by Bellare, Druck, and McCallum in this year's UAI.

    [1] Kedar Bellare, Gregory Druck, and Andrew McCallum. Alternating projections for learning with expectation constraints. In UAI, 2009.

    [2] Percy Liang, Michael I. Jordan, and Dan Klein. Learning from measurements in exponential families. In ICML, 2009.

    [3] Joao Graca, Kuzman Ganchev, and Ben Taskar. Expectation maximization and posterior constraints. In NIPS 20, 2008.
