Posts Tagged ‘tutorial’


Machine Learning tutorial at ACSW 2020

February 5, 2020

Australasian Computer Science Week is a collection of computer science events for Australian and New Zealand CS researchers. I’m giving a tutorial on Machine Learning as part of their HDR/ECR programme. Machine Learning has gone crazy in the last few years, growing exponentially and with ever-shrinking publication cycles; it’s entering its own singularity, I believe. So my slides are quite general, covering the big issues rather than lots of detail. It’s a longer talk, so I’ll be skipping a few slides. I left some of the maths in for interested readers, though I won’t cover it much in the talk. As always, there are way too many perspectives and variations to pick from, but I’m focusing on probabilistic interpretations.


ALTA 2016 Tutorial: Simpler Non-parametric Bayesian Models

April 21, 2017

The tutorial I ran at ALTA late in 2016 was recorded.

Part 1 and part 2 are up on YouTube, about an hour each.



Basic tutorial: Oldie but a goody …

November 7, 2015

A student reminded me of Gregor Heinrich’s excellent introduction to topic modelling, including a great treatment of the underlying foundations like Dirichlet distributions and multinomials. Great reading for all students! See

  • G. Heinrich, “Parameter estimation for text analysis,” technical report, Fraunhofer IGD, 15 September 2009; available on his publication page.

MLSS 2015 Sydney tutorial

February 23, 2015

This Sydney 2015 MLSS summer school is organised by Edwin Bonilla and held in Sydney, Feb 16-25. My tutorial is titled “Models for Probability/Discrete Vectors with Bayesian Non-parametric Methods.” The final version of my slides is here in PDF.


Aalto tutorial 19th January 2015

January 19, 2015

Slides are here. They are a bit preliminary! This is gradually being reworked for MLSS 2015 in Sydney this February.

I’ve added a section on the generalised Indian Buffet Process theory from Lancelot James, my own “simplified” version for computer scientists.


A tutorial on non-parametric methods

October 30, 2014

Right now, probably the best general-purpose methods for what Frank Wood and Yee Whye Teh called the Graphical Pitman-Yor Process (see their AI&Stats 2009 paper) are ours using table indicators. I gave a tutorial on the hierarchical Dirichlet Process and the hierarchical Pitman-Yor Process at Monash last week, and the slides are here. I’m still developing these, so they will be improved. I haven’t made the extension from hierarchical to graphical yet, but it’s in Lan Du’s PhD thesis. The examples given in the tutorial are about topic models and n-gram models.