- Move to more complex, "intractable" structures.
- More with less: improving with only weak feedback.
- Learning structure: the analogue of learning graphical models.
- Learning from heterogeneous data.
- Measuring sample complexity.
- Trading off expressivity with tractability.
- Integrated prediction and data mining.
- Computational complexity versus sample complexity.
The question that drives me is: to what extent can we let the NLPer forget that there is machine learning going on in the background? I think that, as of now, at least from my perspective, the answer is that the NLPer must still be able to structure a search space for his problem, and devise a way of turning the "true output" into an "optimal policy" by breaking it down into regions. The rest is purely trivial crank turning (no math required). The answers to the questions above will help in this endeavor. A very basic question, hinted at before, is: when do we even need to consider structure? If we don't, life is much easier for the NLPer. But we need to be able to say when we do and when we don't need structure. And when we do, we want it to be as painless as possible.
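To make the division of labor concrete: the NLPer's two jobs are (1) structuring a search space and (2) turning the true output into an optimal policy; everything else is generic crank turning. Here is a minimal, purely illustrative sketch for sequence labeling. All names (`sequence_search_space`, `oracle_policy`, `run`) are hypothetical, not from any particular library:

```python
# Sketch of the NLPer's two jobs in a search-based setup.
# (1) Define a search space: states, available actions, transitions.
# (2) Turn the "true output" into an optimal (oracle) policy.
# The rest -- running the policy over the space -- is generic.

def sequence_search_space(sentence, labels):
    """Search space for sequence labeling: a state is the tuple of
    labels chosen so far; at step t we pick a label for word t."""
    def actions(state):
        return labels if len(state) < len(sentence) else []
    def step(state, action):
        return state + (action,)
    return actions, step

def oracle_policy(true_output):
    """Optimal policy derived from the true output: at a state of
    length t, the optimal action is simply the t-th true label."""
    def policy(state):
        return true_output[len(state)]
    return policy

def run(sentence, labels, policy):
    """Generic crank turning: follow the policy from the empty state
    until no actions remain, then return the constructed output."""
    actions, step = sequence_search_space(sentence, labels)
    state = ()
    while actions(state):
        state = step(state, policy(state))
    return list(state)

sentence = ["the", "dog", "barks"]
labels = ["DET", "NOUN", "VERB"]
truth = ["DET", "NOUN", "VERB"]
print(run(sentence, labels, oracle_policy(truth)))
```

In a real system a learned policy would replace the oracle at test time; the point of the sketch is only that the NLPer writes the first two functions, and the learning machinery behind `run` stays out of sight.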