[ICML 2019] Day 3 - Robotics, Good ol' Sparse Coding, misc. applications, Transfer, Multitask and Active Learning
Quite tired. Will only post pointers to materials that are of potential interest to me, and investigate them on a rainy weekend back in HK.
Test of Time Award:
- Online Dictionary Learning for Sparse Coding **from my professors at ENS**
- slides
- The authors attribute the paper's success and impact to three factors: good timing (datasets were growing larger and larger, so there was a need for more scalable matrix factorization methods), a combination of maths and engineering (they released an efficient software package, the SPAMS toolbox: `pip install spams`), and the flexibility of that package (it was adopted in other domains, in unexpected contexts).
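For a quick feel of the awarded method, scikit-learn ships the same online algorithm (it cites Mairal et al., 2009) as `MiniBatchDictionaryLearning`. A minimal sketch with toy data and illustrative hyperparameters:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Toy data: 1000 signals of dimension 64 (think flattened 8x8 image patches).
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 64))

# Online dictionary learning: the dictionary is updated from mini-batches,
# which is what made the method scale to large datasets.
dico = MiniBatchDictionaryLearning(
    n_components=100,  # overcomplete dictionary (100 atoms > 64 dimensions)
    alpha=1.0,         # l1 penalty encouraging sparse codes
    batch_size=256,    # mini-batch size
    random_state=0,
)
codes = dico.fit_transform(X)  # sparse codes, shape (1000, 100)
D = dico.components_           # learned dictionary atoms, shape (100, 64)
```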
Session Applications
- Exploiting Worker Correlation for Label Aggregation in Crowdsourcing
- Fast and Flexible Inference of Joint Distributions from their Marginals
- Cognitive model priors for predicting human decisions
- slides
- "Which gamble would you rather take: a 50/50 chance of winning/losing $100, or a 100% chance of $0? Although both gambles have equal expected value (payoff), the majority of people systematically prefer the second." (Kahneman & Tversky, 1979)
This paper proposes to use synthetic data sampled from cognitive models to pre-train neural networks (inducing a cognitive inductive bias). The pre-trained networks are then fine-tuned on small datasets of real human decisions (a few hundred or fewer data points collected by behavioral scientists); a rough sketch of this recipe follows the list below. The paper cites prior work from behavioral economics. To investigate further…
- Direct Uncertainty Prediction for Medical Second Opinions
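The pre-train-then-fine-tune recipe is easy to prototype. Below is a minimal sketch, not the paper's code: the prospect-theory simulator (textbook parameters, no probability weighting), the tiny network, and all names are illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky (1979) value function (illustrative parameters):
    concave for gains, convex and steeper (loss-averse) for losses."""
    return np.where(x >= 0, np.abs(x) ** alpha, -lam * np.abs(x) ** alpha)

def synthesize(n):
    """Sample gambles (win amount, loss amount, win probability) and label
    each with the cognitive model's choice: gamble (1) vs. a sure $0 (0)."""
    win = rng.uniform(0, 200, n)
    loss = rng.uniform(-200, 0, n)
    p = rng.uniform(0, 1, n)
    subjective = p * prospect_value(win) + (1 - p) * prospect_value(loss)
    X = np.stack([win, loss, p], axis=1).astype(np.float32)
    y = (subjective > 0).astype(np.float32)
    return torch.from_numpy(X), torch.from_numpy(y)

net = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.BCEWithLogitsLoss()

def train(X, y, epochs, lr):
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(net(X).squeeze(1), y)
        loss.backward()
        opt.step()

# 1) Pre-train on abundant synthetic decisions from the cognitive model.
X_syn, y_syn = synthesize(50_000)
train(X_syn, y_syn, epochs=200, lr=1e-3)

# 2) Fine-tune on a small real dataset (placeholders: X_real, y_real).
# train(X_real, y_real, epochs=50, lr=1e-4)
```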
Session Transfer and Multitask Learning
Session Active Learning
Posters
Like many papers this year, this one aims to provide an interpretable model; interpretability/explainability is definitely a trending topic. In brief, the latent kernel is a composition of base kernels learned through a stochastic kernel process, and the simple composition mechanism makes it possible to produce a natural-language explanation of the model.
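The compositional idea itself is easy to play with. A minimal sketch using scikit-learn's GP kernels, with a hand-picked composition rather than the paper's learned one:

```python
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, DotProduct

# Base kernels compose via + (additive structure) and * (interaction),
# so the expression doubles as a human-readable description of the model:
# "a smooth long-term trend, plus a periodic component whose amplitude
#  grows linearly over time".
kernel = (
    RBF(length_scale=50.0)
    + ExpSineSquared(length_scale=1.0, periodicity=12.0) * DotProduct()
)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
# gp.fit(X, y) would tune hyperparameters; the paper's contribution is to
# search over such compositions stochastically instead of fixing them by hand.
```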