Instead of publications, perhaps citation frequency after the medal, or some other measure of influence. But my last question still stands: why don't we see the 'also-rans' doing similar things?

rakesh

However, if a Bayesian learner (in this case, a language learner) can make predictions about parameters using a Dirac decomposition, and that Dirac decomposition is consistent, then perhaps the learner can make inferences about the structure of the language, and perhaps even generate negative evidence when an input or output violates Dirac-decomposition consistency.

I’m rusty and shooting from the hip here, but that’s what stood out for me.

Btw, this theorem of Freedman's needs an infinite set of outcomes; I use a finite set of outcomes. On the other hand, I work in a stationary environment, while statisticians assume that the parameter corresponds to an i.i.d. distribution. I believe Freedman's argument, with minor changes, also applies to my setup: for a co-meager set of pairs of a parameter (= ergodic distribution) theta and a belief lambda over parameters, the Bayesian estimator calculated with lambda will not converge to theta.

Btw II. I like Freedman's Theorem, but I think it is relevant only if you are not a religiously committed Bayesian. To care about a negligible set of parameters, you need to be somebody who uses the Bayesian estimator only as a tool, but does not really believe in the prior over parameters.
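The point about not "really believing in the prior" can be made concrete with a toy simulation (my own illustrative setup, not anything from Freedman's paper): a learner whose prior assigns zero mass to the true parameter can never be consistent, yet from inside the prior this failure is literally a probability-zero event and therefore invisible.

```python
# Toy illustration (hypothetical setup, not Freedman's actual construction):
# a Bayesian learner whose prior misses the true parameter entirely.
# Data are i.i.d. coin flips with true bias 0.7, but the prior is supported
# only on {0.2, 0.4}. The posterior then piles up on 0.4 -- the support
# point closest to the truth in KL divergence -- and never finds 0.7.
import math
import random

random.seed(0)
true_p = 0.7                          # data-generating parameter
support = [0.2, 0.4]                  # prior support excludes true_p
log_post = {p: 0.0 for p in support}  # log-unnormalized posterior, flat prior

for _ in range(5000):
    x = 1 if random.random() < true_p else 0
    for p in support:
        log_post[p] += math.log(p if x == 1 else 1.0 - p)

# Normalize in log space for numerical stability.
m = max(log_post.values())
weights = {p: math.exp(v - m) for p, v in log_post.items()}
z = sum(weights.values())
post = {p: w / z for p, w in weights.items()}
print(post)  # essentially all mass on 0.4, none anywhere near the truth 0.7
```

A committed Bayesian shrugs: the prior said this world was impossible. A Bayesian-as-tool-user sees an estimator that confidently converges to the wrong answer.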

Thanks!

The key point is that consistency should mean that the posterior converges to a point mass with respect to the data-generating distribution, not the Bayesian's own marginal distribution (as in Doob's result). The latter is solipsistic. If you do it w.r.t. the data-generating process, you find that the set of pairs (data-generating process, prior) that lead to consistency is a topological null set.
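For contrast, here is a minimal sketch (illustrative only, assuming a conjugate Beta-Bernoulli model) of the Doob-style marginal guarantee: if theta really is drawn from the prior, the posterior does concentrate around it.

```python
# Sketch of Doob-style consistency under the Bayesian's own marginal:
# draw theta from the prior itself, then watch the posterior concentrate.
# Model: theta ~ Beta(1, 1) (uniform prior), data x_i ~ Bernoulli(theta).
import random

random.seed(1)
theta = random.random()            # theta drawn from the Uniform(0,1) prior
n = 20000
k = sum(1 for _ in range(n) if random.random() < theta)

# Conjugate update: the posterior is Beta(1 + k, 1 + n - k).
post_mean = (1 + k) / (2 + n)
print(abs(post_mean - theta))      # small: the posterior tracks theta
```

Doob's theorem says this works for prior-almost-every theta; the trouble above is that "prior-almost-every" can still exclude a topologically large set of data-generating processes.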

See my old blog post:

http://normaldeviate.wordpress.com/2012/06/14/freedmans-neglected-theorem/

Best wishes

Larry Wasserman

I blush with embarrassment! It should have been 1 cent rather than 10 cents.

Rakesh
