Written by Tom SF Haines

Monday, 30 May 2011 
Given that I have uploaded a Latent Dirichlet Allocation implementation to my code store that uses Gibbs sampling, it seemed remiss to omit an implementation that uses the mean field variational method (note that this is not the same variational method used by the original LDA paper, and it's not exactly the same graphical model either - in both cases I would consider these to be (minor) improvements). I have just fixed this - it can be obtained from the usual place, in the Google code repository linked from the menu, under the directory lda_var. I have also moved the Gibbs sampling version to the directory lda_gibbs.
Also, two updates in one month - there must be something wrong with me!
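For concreteness, the mean field update at the heart of this kind of implementation can be sketched as below. This is a generic illustration of the per-document variational E-step, not the code in the repository - the function name, the topic-word matrix beta, and the symmetric prior alpha are all my own choices:

```python
import numpy as np
from scipy.special import digamma

def lda_variational_estep(doc, beta, alpha, iters=50):
    """Mean field update for a single document.

    doc   - array of word indices.
    beta  - (K, V) topic-word probabilities.
    alpha - symmetric Dirichlet prior on topic proportions.
    Returns gamma (variational Dirichlet over the document's topic
    proportions) and phi (per-token topic responsibilities)."""
    K = beta.shape[0]
    N = len(doc)
    gamma = np.full(K, alpha + N / K)     # standard initialisation
    phi = np.full((N, K), 1.0 / K)
    for _ in range(iters):
        # phi_nk proportional to beta_{k, w_n} * exp(digamma(gamma_k))
        log_phi = np.log(beta[:, doc].T + 1e-100) + digamma(gamma)
        log_phi -= log_phi.max(axis=1, keepdims=True)
        phi = np.exp(log_phi)
        phi /= phi.sum(axis=1, keepdims=True)
        # gamma_k = alpha + sum_n phi_nk
        gamma = alpha + phi.sum(axis=0)
    return gamma, phi
```

Alternating these two updates until convergence, then re-estimating beta from the accumulated phi counts, gives the usual variational EM loop.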

Last Updated ( Monday, 30 May 2011 )

Written by Tom SF Haines

Sunday, 08 May 2011 
Sometime last year I published a paper in which I indicated that the code would be available on my website. Given that the paper was in a minor workshop I might have delayed... slightly. Well, better late than never, and I guess this is a good time to put the paper, and other related stuff, online as well.
The paper itself is Video Topic Modelling with Behavioural Segmentation by T. S. F. Haines and T. Xiang, and appears in the ACM Workshop on Multimodal Pervasive Video Analysis, 2010. You may download it as a PDF by clicking on the paper's title.
The code itself can be found over at my Google code project, which is linked from the source code link on this website. You can also download the presentation I gave here, which includes various video files demonstrating it in action on the Mile End data set. Additionally, there is a two minute long demonstration video I created, which can be obtained here.

Last Updated ( Sunday, 08 May 2011 )


Written by Tom SF Haines

Thursday, 31 March 2011 
A little while ago, early 2010 if I recall correctly, I had cause to implement LDA, using the Gibbs sampling method. I have finally found the time to clean the code up and stick it in my code repository, so you can now find an LDA implementation available at http://code.google.com/p/haines/. It's nothing fancy, but it is well commented, with a few test examples, and works as well as any other implementation (well, a variational approach would be better, computationally speaking, but LDA is simple enough that it is probably not worth the effort).
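The collapsed Gibbs update this kind of implementation is built on fits in a few lines. The sketch below is a generic illustration of the standard sampler, not the repository code - all the names are my own:

```python
import numpy as np

def lda_gibbs(docs, K, V, alpha=0.1, beta=0.1, sweeps=200, rng=None):
    """Collapsed Gibbs sampler for LDA.

    docs - list of arrays of word indices in [0, V).
    Returns doc-topic counts and topic-word counts after the
    final sweep, from which theta and phi can be estimated."""
    rng = np.random.default_rng(rng)
    # Token-level topic assignments, initialised at random.
    z = [rng.integers(K, size=len(d)) for d in docs]
    ndk = np.zeros((len(docs), K))       # doc-topic counts
    nkw = np.zeros((K, V))               # topic-word counts
    nk = np.zeros(K)                     # per-topic totals
    for d, (doc, zd) in enumerate(zip(docs, z)):
        for w, k in zip(doc, zd):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(sweeps):
        for d, (doc, zd) in enumerate(zip(docs, z)):
            for n, w in enumerate(doc):
                k = zd[n]    # remove this token's current assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Conditional: p(k) propto (ndk + alpha) (nkw + beta) / (nk + V beta)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                zd[n] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```

Both the Dirichlet priors over topic proportions and word distributions are collapsed out, which is why only the count matrices appear in the conditional.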

Last Updated ( Thursday, 31 March 2011 )


Written by Tom SF Haines

Friday, 04 March 2011 
Well, this website now looks less silly and bland, and maybe even halfway decent. All I've done is tweak the colours and change the background, but the effect is quite spectacular. I've been wanting to do this since the New Year, but see my last post for the reasons that didn't happen.
I am now intending to update with a greater frequency, not because I want to but because I have a backlog of stuff I want to put up here, and it's not getting any shorter. It's mostly research stuff - code and papers.
And yes, in case anyone is wondering, the background is an upside down tree. It's a crop of a photo I took a few hundred meters from my parents' house, just before Christmas when it had snowed heavily - I went out and photographed all sorts of things. But a tree is kinda appropriate, given that this website is mostly research, most of which involves graphical models. Ok, a tree isn't a graph, and it's a little too real to be considered a model, but it looks sweet, so I don't care. :P


Written by Tom SF Haines

Friday, 04 March 2011 
My one (if I am being optimistic) reader may have noticed I have been away. Though, in all probability this fictional being did not, given that I update this website so infrequently that no updates for two months is not just normal - it is to be expected. But the crux of it is that since just before Christmas my house in London lost internet, due to some cable being broken. Virgin Media then lied to us, constantly, for about two months, by which time we had given up and changed provider. I've detailed the scale of this screw up in the extended version of this story, but now that I am back I will start updating my website again. At my usual ponderous pace. :)

Last Updated ( Friday, 04 March 2011 )



Written by Tom SF Haines

Wednesday, 05 January 2011 
I recently needed to do some work with the multivariate Gaussian distribution, in a fully Bayesian context. Surprisingly, I could not find a good reference for what I would consider a fairly basic subject area, so I wrote one (the book Bayesian Data Analysis has it all, but it's hardly easy to tease out the relevant details). By good I mean all equations given, with consistent notation throughout and, most importantly for my use (Gibbs sampling, where I want to collapse out everything possible), how to integrate out a Gaussian drawn from the conjugate prior, to determine the probability of a newly presented sample having been drawn from the Gaussian being estimated.
Gaussian Conjugate Prior Cheat Sheet.
Additionally, I wrote some python code to demonstrate it in action.
Gaussian Conjugate Prior Example.
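For reference, the key result - updating the normal-inverse-Wishart prior and integrating out the Gaussian to get a multivariate Student-t predictive - can be sketched as below. This is a generic illustration of the standard formulas, not the linked example code, and the function names are my own:

```python
import numpy as np
from scipy.special import gammaln

def niw_posterior(X, mu0, kappa0, nu0, psi0):
    """Update a normal-inverse-Wishart prior with data X of shape (n, d)."""
    n, d = X.shape
    xbar = X.mean(axis=0)
    S = (X - xbar).T @ (X - xbar)            # scatter matrix
    kappa_n = kappa0 + n
    nu_n = nu0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    diff = (xbar - mu0)[:, None]
    psi_n = psi0 + S + (kappa0 * n / kappa_n) * (diff @ diff.T)
    return mu_n, kappa_n, nu_n, psi_n

def predictive_logpdf(x, mu_n, kappa_n, nu_n, psi_n):
    """Log density of a new point under the posterior predictive:
    a multivariate Student-t with nu_n - d + 1 degrees of freedom."""
    d = len(mu_n)
    dof = nu_n - d + 1
    scale = psi_n * (kappa_n + 1) / (kappa_n * dof)
    diff = x - mu_n
    maha = diff @ np.linalg.solve(scale, diff)   # Mahalanobis distance
    _, logdet = np.linalg.slogdet(scale)
    return (gammaln((dof + d) / 2) - gammaln(dof / 2)
            - 0.5 * (d * np.log(dof * np.pi) + logdet)
            - 0.5 * (dof + d) * np.log1p(maha / dof))
```

Because the predictive is available in closed form, a collapsed Gibbs sampler never needs to instantiate the Gaussian's mean and covariance at all - only the sufficient statistics above.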
Now that I have gone to the effort of writing this I am going to have to sit down at some point and implement a Dirichlet process Gaussian mixture model, which is a pretty damn good density estimation method, and it would be nice to have it in my toolbox. When I do I will of course upload it to my Google code repository.


