my biased thoughts on the fields of natural language processing (NLP), computational linguistics (CL) and related topics (machine learning, math, funding, etc.)

30 June 2009

ICML/COLT/UAI 2009 retrospective

This will probably be a bit briefer than my corresponding NAACL post because even by day two of ICML, I was a bit burnt out; I was also constantly swapping in other tasks (grants, etc.). Note that John has already posted his list of papers.

- #317: Multi-View Clustering via Canonical Correlation Analysis (Chaudhuri, Kakade, Livescu, Sridharan). This paper shows a new application of CCA to clustering across multiple views. They use Wikipedia data in experiments and actually prove that, under certain multi-view assumptions, CCA does the "right thing." (A toy sketch of the recipe follows the list.)
- #295: Learning Nonlinear Dynamic Models (Langford, Salakhutdinov, Zhang). The cool idea here is to cut a deterministic classifier in half and use its internal state as a sort of sufficient statistic. Think about what happens if you represent your classifier as a circuit (DAG); then anywhere you cut along the circuit gives you a representation sufficient for prediction. To avoid building circuits, they use neural nets, which have an obvious "place to cut" -- namely, the internal nodes. (A minimal sketch follows the list.)
- #364: Online Dictionary Learning for Sparse Coding (Mairal, Bach, Ponce, Sapiro). A new approach to sparse coding; the big take-away is that it's online and fast. (A quick example follows the list.)
- #394: MedLDA: Maximum Margin Supervised Topic Models for Regression and Classification (Zhu, Ahmed, Xing). This is a very cute idea for combining the objectives of topic modeling (namely, the variational objective) and classification (the SVM objective) to learn topics that are good for performing a classification task. (A schematic of the combined objective follows the list.)
- #393: Learning from Measurements in Exponential Families (Liang, Jordan, Klein). Suppose that instead of seeing (x,y) pairs, you just see some statistics computed on (x,y) pairs -- well, you can still learn. (In a sense, this formalizes some work out of the UMass group; see also the Bellare, Druck and McCallum paper at UAI this year.) A toy moment-matching sketch follows the list.
- #119: Curriculum Learning (Bengio, Louradour, Collobert, Weston). The idea is to present training examples in a well thought-out order rather than randomly. It's a cool idea; I've tried it in the context of unsupervised parsing (the unsearn paper at ICML) and it never helped and often hurt (sadly). I curriculum-ified by sentence length, though, which is maybe not a good difficulty measure, especially when working with WSJ10 -- maybe using vocabulary would help. (A minimal curriculum sketch follows the list.)
- #319: A Stochastic Memoizer for Sequence Data (Wood, Archambeau, Gasthaus, James, Teh). If you do anything with Markov models, you should read this paper. The take-away: you can learn a Markov model with (potentially) infinite memory in a linear amount of time and space, and with good "backoff" properties. Plus, there's some cool new technology in there. (A naive back-off sketch follows the list.)
- A Uniqueness Theorem for Clustering (Reza Bosagh Zadeh, Shai Ben-David). I already talked about this issue a bit, but the idea here is that if you fix k, then the clustering axioms become satisfiable, and are satisfied by two well-known algorithms. Fixing k is a bit unsatisfactory, but I think this is a good step in the right direction. (Kleinberg's axioms are restated after the list.)
- Convex Coding (David Bradley, J. Andrew Bagnell). The idea is to make coding convex by making it infinite! And then to do something boosting-like.
- On Smoothing and Inference for Topic Models (Arthur Asuncion, Max Welling, Padhraic Smyth, Yee Whye Teh). If you do topic models, read this paper: basically, none of the different inference algorithms does any better than the others (perplexity-wise) if you estimate hyperparameters well. Some are, of course, faster, though.
- Correlated Non-Parametric Latent Feature Models (Finale Doshi-Velez, Zoubin Ghahramani). This is an Indian-buffet-process-like model that allows factors to be correlated. It's somewhat in line with our own paper from NIPS last year. There's still something a bit unsatisfactory in both our approach and theirs: we can't do this "directly."
- Domain Adaptation: Learning Bounds and Algorithms (Yishay Mansour, Mehryar Mohri, Afshin Rostamizadeh). Very good work on learning theory for domain adaptation, based on a discrepancy measure between the source and target distributions. (The definition is sketched after the list.)
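A few sketches to make the items above concrete. For #317, here is the CCA-then-cluster recipe in miniature, on synthetic two-view data; this is my illustration using scikit-learn, not the authors' code, and the data and parameters are made up:

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Two synthetic "views" of the same 300 points: a shared 2-d latent
# signal pushed through different random linear maps, plus noise.
z = rng.normal(size=(300, 2))
view1 = z @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(300, 10))
view2 = z @ rng.normal(size=(2, 12)) + 0.1 * rng.normal(size=(300, 12))

# Project both views onto their top canonical directions, then cluster
# in the shared (maximally correlated) subspace.
cca = CCA(n_components=2).fit(view1, view2)
u, v = cca.transform(view1, view2)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(u)
```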
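For #295, the "cut the classifier in half" idea in its simplest form: train a feed-forward net, then read off the hidden-layer activations as the internal state. A hedged sketch with scikit-learn on made-up data; the paper's actual construction for dynamic models is more involved:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)

def hidden_state(clf, X):
    """The 'cut point': activations of the single hidden layer.

    Everything downstream of the cut predicts y from these values
    alone, which is what makes them a sufficient representation in
    the paper's sense.
    """
    h = X @ clf.coefs_[0] + clf.intercepts_[0]
    return np.maximum(h, 0.0)   # MLPClassifier's default relu

state = hidden_state(clf, X)    # shape (500, 16)
```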
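For #364, scikit-learn's MiniBatchDictionaryLearning is (to my knowledge) based on exactly this Mairal et al. algorithm, so trying it is cheap; the random data below is a stand-in for, say, flattened image patches:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))   # e.g. 8x8 patches, flattened

dico = MiniBatchDictionaryLearning(n_components=100, alpha=1.0,
                                   batch_size=32, random_state=0)
codes = dico.fit_transform(X)     # sparse codes, shape (1000, 100)
atoms = dico.components_          # learned dictionary, shape (100, 64)
```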
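For #394, the flavor of the combination can be written as a single objective. This is a schematic paraphrase from memory, not the paper's notation: $\mathcal{L}(q)$ is the variational bound, $\bar{z}_d$ the mean topic assignment of document $d$, $\eta$ the classifier weights, and $C$ a trade-off constant.

```latex
% Schematic MedLDA-style objective: maximize the topic-model bound
% while paying an SVM-style hinge penalty on the topic representation.
\min_{q,\,\eta}\;\; -\,\mathcal{L}(q) \;+\; C \sum_{d=1}^{D}
  \max\bigl(0,\; 1 - y_d\, \eta^{\top} \bar{z}_d \bigr)
```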
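For #393, a toy version of learning from measurements alone: we never see samples, only measured expectations of a few features, and we fit an exponential family by moment matching. Everything here (state space, features, measurement values) is invented for illustration:

```python
import numpy as np

# We only observe b, the measured expectations of features phi under the
# true distribution -- no raw data. Fit p(s) proportional to
# exp(theta . phi(s)) over a small finite state space; the maximum-
# likelihood gradient is "measurements minus model expectations".

states = np.arange(6)
phi = np.stack([(states >= 3).astype(float),        # feature 1: s >= 3
                (states % 2 == 0).astype(float)]).T  # feature 2: s even
b = np.array([0.4, 0.6])                             # the measurements

theta = np.zeros(2)
for _ in range(1000):
    logits = phi @ theta
    p = np.exp(logits - logits.max())
    p /= p.sum()
    theta += 0.5 * (b - p @ phi)   # moment-matching gradient ascent

# p now (approximately) satisfies E_p[phi] == b.
```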
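For #119, curriculum learning in its smallest form: score examples by difficulty, sort, and feed an online learner easy-to-hard instead of randomly. The difficulty proxy below (feature norm) is a hypothetical stand-in; the paper hand-designs its curricula:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = (X @ rng.normal(size=10) > 0).astype(int)

difficulty = np.linalg.norm(X, axis=1)   # hypothetical easiness proxy
order = np.argsort(difficulty)           # easy -> hard

clf = SGDClassifier(loss="log_loss", random_state=0)
for start in range(0, len(order), 200):  # present in curriculum stages
    batch = order[start:start + 200]
    clf.partial_fit(X[batch], y[batch], classes=np.array([0, 1]))
```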
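For #319, here is the memoizer's promise minus the cleverness: unbounded-context prediction with back-off, done naively. This sketch uses quadratic space and simple additive smoothing; the paper's contribution is getting the same behavior in linear time and space, with Pitman-Yor smoothing in place of the additive recursion:

```python
from collections import defaultdict

# counts[context][symbol] = how often symbol followed context.
counts = defaultdict(lambda: defaultdict(int))

def update(seq):
    """Record every (suffix-context, next-symbol) pair -- quadratic."""
    for i, sym in enumerate(seq):
        for j in range(i + 1):
            counts[seq[j:i]][sym] += 1

def prob(context, sym, alpha=1.0, vocab=27):
    """Back off from the longest matching context to shorter ones."""
    p = 1.0 / vocab                        # uniform base distribution
    for j in range(len(context) + 1):      # shortest -> longest context
        ctx = context[len(context) - j:]
        if ctx not in counts:
            break
        c = counts[ctx]
        total = sum(c.values())
        p = (c[sym] + alpha * p) / (total + alpha)
    return p

update("abracadabra")
print(prob("abra", "c"))   # "abra" was always followed by "c" -> high
```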
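For the uniqueness-theorem paper, it helps to have Kleinberg's three axioms on the table; this is my paraphrase, so check the original for the precise statements. The UAI paper's move, as I understand it, is to replace richness with k-richness (every partition into exactly k clusters is reachable) once k is fixed.

```latex
% Kleinberg's axioms for a clustering function $f$ taking a distance
% function $d$ on $n$ points to a partition (paraphrased):
\begin{itemize}
  \item Scale invariance: $f(\alpha \cdot d) = f(d)$ for every $\alpha > 0$.
  \item Richness: every partition of the points equals $f(d)$ for some $d$.
  \item Consistency: shrinking within-cluster distances and stretching
        between-cluster distances leaves $f(d)$ unchanged.
\end{itemize}
```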
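For the domain adaptation paper, the central quantity, if I recall it correctly, is the discrepancy distance between source distribution $P$ and target distribution $Q$, relative to a hypothesis class $H$ and loss $L$:

```latex
% Discrepancy distance (paraphrased from memory):
\mathrm{disc}_{L}(P, Q) \;=\; \max_{h,\, h' \in H}
  \Bigl|\; \mathbb{E}_{x \sim P}\bigl[L(h(x), h'(x))\bigr]
        - \mathbb{E}_{x \sim Q}\bigl[L(h(x), h'(x))\bigr] \;\Bigr|
```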
Concerning your #8, "A Uniqueness Theorem for Clustering" by Reza Bosagh Zadeh and Shai Ben-David: the fact that fixing k makes the axioms consistent was already shown (implicitly) in the original Kleinberg paper. Another way of making the Kleinberg axioms consistent (without having to fix k) was shown by Ackerman-Ben-David at NIPS08 (http://books.nips.cc/papers/files/nips21/NIPS2008_0383.pdf). So the actual contribution of the current UAI paper is not in proposing a consistent version of the axioms, but rather in characterizing Single Linkage in a way that sheds light on when it should be the clustering algorithm of choice. -- Shai Ben-David
Re: #393: Learning from Measurements in Exponential Families
It's worth noting that the objective optimized by Liang et al. (2009) [2] is identical to the one in the earlier work of Graca et al. (2008) [3]. This is also the same objective as the approximation proposed by Bellare, Druck, and McCallum [1] in this year's UAI.
[1] Kedar Bellare, Gregory Druck, and Andrew McCallum. Alternating projections for learning with expectation constraints. In UAI, 2009.
[2] Percy Liang, Michael I. Jordan, and Dan Klein. Learning from measurements in exponential families. In ICML, 2009.
[3] Joao Graca, Kuzman Ganchev, and Ben Taskar. Expectation maximization and posterior constraints. In NIPS 20, 2008.
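To make the claimed equivalence concrete, the objective in (roughly) the posterior-regularization form of [3] looks like the following; the notation is my paraphrase, not any of the papers', with constraint features phi and target values b shown in inequality form:

```latex
% Model p_theta with latent z, constraint features phi, targets b:
\max_{\theta}\;\; \log p_{\theta}(x) \;-\;
  \min_{q \,:\, \mathbb{E}_{q}[\phi(x, z)] \le b}
  \mathrm{KL}\bigl(q(z) \,\|\, p_{\theta}(z \mid x)\bigr)
```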