[R] Accurate and interpretable modelling of conditional distributions (predicting densities) by decomposing joint distribution into mixed moments

I am developing a methodology for very accurate modeling of joint distributions by decomposing them in a basis of orthonormal polynomials, where the coefficients have an interpretation similar to (mixed) moments (expected value, variance, skewness, kurtosis, ...), e.g. to model their relations or their time evolution for non-stationary time series.
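To make the idea concrete, here is a minimal Python sketch of one way such a decomposition can look (my own illustration under stated assumptions, not the paper's implementation): each variable is normalized to roughly uniform [0,1] by its empirical CDF, the joint density is expanded in products of orthonormal (shifted Legendre) polynomials, and each coefficient is estimated as the sample mean of the corresponding basis product, which is what gives it a mixed-moment interpretation. Helper names like `to_uniform` and `fit_joint` are placeholders.

```python
import numpy as np
from numpy.polynomial.legendre import legval

def f(j, x):
    """j-th orthonormal (shifted Legendre) polynomial on [0,1]:
    f_0 = 1 (normalization), f_1 ~ expected value, f_2 ~ variance, f_3 ~ skewness, ..."""
    c = np.zeros(j + 1); c[j] = 1.0
    return np.sqrt(2 * j + 1) * legval(2 * np.asarray(x, float) - 1, c)

def to_uniform(v):
    """Normalize a variable to roughly uniform [0,1] via its empirical CDF (ranks)."""
    return (np.argsort(np.argsort(v)) + 1.0) / (len(v) + 1.0)

def fit_joint(x, y, m=4):
    """Mixed-moment coefficients a[j,k] = mean of f_j(x)*f_k(y) over the sample
    (the MSE-optimal coefficient estimate in an orthonormal basis)."""
    return np.array([[np.mean(f(j, x) * f(k, y)) for k in range(m)] for j in range(m)])

def density(a, x, y):
    """Joint density estimate on [0,1]^2: rho(x,y) = sum_jk a[j,k] f_j(x) f_k(y)."""
    m = a.shape[0]
    return sum(a[j, k] * f(j, x) * f(k, y) for j in range(m) for k in range(m))

# Toy usage on synthetic dependent data.
rng = np.random.default_rng(0)
u = rng.normal(size=2000)
x, y = to_uniform(u), to_uniform(u + 0.5 * rng.normal(size=2000))
a = fit_joint(x, y)
print(a[0, 0], a[1, 1])   # a[0,0] ~ 1 (normalization), a[1,1] ~ rank-correlation strength
```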

The likelihood of such predictions, viewed as conditional distributions, can be seen to grow nicely as information from successive variables is added.

While people are used to predicting single values, which can be put into a spreadsheet, we can get better predictions by modelling entire (conditional) probability distributions – starting with additionally getting a variance that evaluates the uncertainty of the predicted value (e.g. taken as the expected value of that distribution).
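As a hedged illustration of that point (again a sketch on top of the previous one, not the paper's code): from the joint coefficients a[j,k] one can read off the conditional density of y given x and integrate it numerically to obtain both a point prediction (expected value) and a variance quantifying its uncertainty. The helper names and the crude clipping of negative density values are my own choices.

```python
import numpy as np
from numpy.polynomial.legendre import legval

def f(j, x):                       # same orthonormal basis on [0,1] as in the sketch above
    c = np.zeros(j + 1); c[j] = 1.0
    return np.sqrt(2 * j + 1) * legval(2 * np.asarray(x, float) - 1, c)

def conditional_coeffs(a, x):
    """Coefficients b[k] of rho(y | x): b[k] ~ sum_j a[j,k] f_j(x), rescaled so b[0] = 1."""
    b = np.array([sum(a[j, k] * f(j, x) for j in range(a.shape[0]))
                  for k in range(a.shape[1])])
    return b / b[0]

def mean_and_variance(b, grid=1000):
    """Expected value and variance of (normalized) y under the predicted conditional density,
    by numerical integration on [0,1]; sqrt(var) serves as the uncertainty of the prediction."""
    yg = (np.arange(grid) + 0.5) / grid
    rho = sum(b[k] * f(k, yg) for k in range(len(b)))
    rho = np.maximum(rho, 0)       # crude fix: clip negative density values
    rho /= rho.mean()              # renormalize so the density integrates to 1
    mean = np.mean(yg * rho)
    var = np.mean((yg - mean) ** 2 * rho)
    return mean, var

# e.g. with `a` from the previous sketch:  m, v = mean_and_variance(conditional_coeffs(a, 0.3))
```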

Using such an orthonormal basis to model the density, we can predict its coefficients ("moments") independently – the difference from standard value prediction is just separately predicting (by MSE) e.g. a few moments, here as plain linear combinations for interpretability (a neural network could be used instead), and finally combining them into the predicted density.
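A sketch of that last step, under my reading of the description: for each k the target f_k(y) is regressed on a feature vector of the conditioning variables by ordinary least squares (which under MSE estimates E[f_k(y) | features], i.e. the k-th conditional "moment"), and the separately predicted moments are then summed into a density. The feature construction and helper names are illustrative assumptions, not the author's exact setup.

```python
import numpy as np
from numpy.polynomial.legendre import legval

def f(j, x):                       # same orthonormal basis on [0,1] as in the sketches above
    c = np.zeros(j + 1); c[j] = 1.0
    return np.sqrt(2 * j + 1) * legval(2 * np.asarray(x, float) - 1, c)

def fit_moment_models(Phi, y, m=4):
    """For k = 1..m-1, least-squares (MSE) fit of f_k(y) on feature matrix Phi;
    the fitted linear combination predicts the k-th coefficient ("moment") of
    rho(y | features), and its weights remain interpretable."""
    return [np.linalg.lstsq(Phi, f(k, y), rcond=None)[0] for k in range(1, m)]

def predict_density(Phi_new, betas, grid=200):
    """Combine the separately predicted moments into conditional densities on a y-grid."""
    yg = (np.arange(grid) + 0.5) / grid
    rho = np.ones((Phi_new.shape[0], grid))              # f_0 term, coefficient 1
    for k, beta in enumerate(betas, start=1):
        rho += np.outer(Phi_new @ beta, f(k, yg))        # b_k(features) * f_k(y)
    return yg, np.maximum(rho, 0)                        # clip negative values

# Example features: low-order basis functions of normalized conditioning variables x1, x2:
# Phi = np.column_stack([np.ones_like(x1), f(1, x1), f(2, x1), f(1, x2), f(2, x2)])
```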

I have an implementation and keep developing it further – what kinds of data would you suggest applying it to (preferably with complex low-dimensional statistical dependencies)? Which ML methods would be good to compare it with?

Slides, a recent paper, and an overview:

https://i.imgur.com/2xNPCIm.png

submitted by /u/jarekduda