Archive for the ‘talks’ Category

Aalto tutorial 19th January 2015

January 19, 2015

Slides are here.  They are a bit preliminary!  This is gradually being reworked for MLSS 2015 in Sydney in February.

I’ve added a section on the generalised Indian Buffet Process theory from Lancelot James, my own “simplified” version for computer scientists.
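For anyone who hasn't met the Indian Buffet Process before, here is a minimal sketch of its "restaurant" generative process.  This is the standard IBP rather than Lancelot's generalised version, and the function and parameter names are my own:

```python
import numpy as np

def sample_ibp(n_customers, alpha, seed=None):
    """Draw a binary feature matrix from the standard Indian Buffet Process.

    Customer i takes an existing dish k with probability m_k / i, where m_k
    is how many earlier customers took dish k, then tries Poisson(alpha / i)
    brand-new dishes.
    """
    rng = np.random.default_rng(seed)
    dish_counts = []   # dish_counts[k] = number of customers who took dish k
    rows = []          # one list of dish indices per customer
    for i in range(1, n_customers + 1):
        taken = [k for k, m_k in enumerate(dish_counts) if rng.random() < m_k / i]
        n_new = rng.poisson(alpha / i)
        taken.extend(range(len(dish_counts), len(dish_counts) + n_new))
        for k in taken:
            if k < len(dish_counts):
                dish_counts[k] += 1
            else:
                dish_counts.append(1)
        rows.append(taken)
    # assemble the customers-by-dishes binary matrix
    Z = np.zeros((n_customers, len(dish_counts)), dtype=int)
    for i, taken in enumerate(rows):
        Z[i, taken] = 1
    return Z

print(sample_ibp(5, alpha=2.0, seed=0))
```

The expected number of dishes grows roughly as alpha times the harmonic number of the customer count, which is the sparse, unbounded-feature behaviour the slides build on.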

Talk at Hong Kong Univ. of Science and Tech.

December 4, 2014

Just gave a talk at the great Asian university HKUST, where I was visiting Lancelot James, and ran into old Berkeley compatriot Dekai Wu, who has long been an HKUST professor in NLP and machine translation.  It was interesting to reminisce about the “probability wars” of earlier AI, something Lancelot wasn’t aware of.  The talk (here in PDF) is substantially updated from the RMIT talk.

Lancelot, Wray and Dekai at HKUST

A tutorial on non-parametric methods

October 30, 2014

Right now, probably the best general-purpose methods for what Frank Wood and Yee Whye Teh called the Graphical Pitman-Yor Process (see their AI&Stats 2009 paper) are ours, which use table indicators.  I gave a tutorial on the hierarchical Dirichlet Process and the hierarchical Pitman-Yor Process at Monash last week, and the slides are here.  I’m still developing these, so they will be improved.  I haven’t made the extension from hierarchical to graphical yet, but it’s in Lan Du‘s PhD thesis.  The examples given in the tutorial are about topic models and n-gram models.
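For readers new to this material, here is a minimal sketch of the basic Pitman-Yor Chinese restaurant process that the hierarchical models build on.  This is plain seating by predictive probabilities, not the table-indicator samplers covered in the tutorial, and the names and parameters below are my own:

```python
import random

def pitman_yor_crp(n_customers, discount, concentration, seed=None):
    """Seat customers via the Pitman-Yor Chinese restaurant process.

    A new customer joins existing table k with probability proportional to
    (n_k - discount), and opens a new table with probability proportional
    to (concentration + discount * number_of_tables).
    """
    rng = random.Random(seed)
    tables = []  # tables[k] = number of customers seated at table k
    for _ in range(n_customers):
        weights = [n_k - discount for n_k in tables]
        weights.append(concentration + discount * len(tables))
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(tables):
            tables.append(1)   # opened a new table
        else:
            tables[k] += 1
    return tables

# table sizes; with discount > 0 they show the heavy-tailed behaviour
# that makes Pitman-Yor a good fit for words in topic and n-gram models
print(pitman_yor_crp(1000, discount=0.5, concentration=1.0, seed=42))
```

Setting the discount to zero recovers the ordinary Dirichlet Process restaurant; in the hierarchical versions each restaurant's new tables are themselves customers of a parent restaurant.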

Talk at RMIT on “Latent Semantic Modelling”

October 16, 2014

I realised non-parametric methods are rather like deep neural networks.  Anyway, here’s a more high-level talk given at RMIT summarising recent work.

Some favourite tutorials

September 10, 2014

First, if you are starting out, you need to see How to do good research, get it published in SIGKDD and get it cited!  This is an amazing tutorial from Eamonn Keogh back in 2009, but nothing changes, right?   Lots of gems in there.

Here are some tutorials from others that I highly recommend, from the fabulous VideoLectures.net website.  The titles are pretty good descriptions of the content.  Ideally, this is what students need to know for research in topic models and related material like Bayesian probability, graphical models, MCMC, etc.

Also, if you’re into graphical models, the best set of lectures on theoretical material I know of is from Prof. Steffen Lauritzen (now retired), Graphical Models and Inference, a course presented at Oxford’s Department of Statistics some time ago.  He is the most outstanding researcher in this field.

KDD 2014

August 28, 2014

My first KDD conference in a while!

Swapnil Mishra and I had to go to present our paper, “Experiments with Non-parametric Topic Models” (… a link to the ACM page … we paid a lot of money to make this paper available free for all).  The conference slides we presented are here, and the software, called hca, is available on MLOSS.  The talk on VideoLectures.net is up now.

Also ran into Aaron Li and Alex Smola, who spoke after us.  I really have to implement their trick!