Is it common practice to transform variables before training a mixture model? For example, if a variable looks log-normally distributed, is there any harm in log-transforming it before feeding it into a Gaussian mixture model? What about other transformations?
And for those of you who work with mixture models, how common is it to do things like this?
Sorry if this is a basic question; no one around me works with this stuff, and I wanted to hear from people with experience.
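For what it's worth, here is a small sketch of the idea in question, assuming scikit-learn's `GaussianMixture` and synthetic data (the cluster parameters are made up for illustration). If a feature is log-normal within each cluster, a GMM fit in log space is the correctly specified model, while a GMM fit on the raw values has to approximate skewed components with symmetric Gaussians. Comparing likelihoods across the two spaces needs the Jacobian of the log transform:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical data: two clusters, each log-normal in the raw space
# (i.e. Gaussian after taking logs), with well-separated log-means.
x = np.concatenate([
    rng.lognormal(mean=0.0, sigma=0.3, size=500),
    rng.lognormal(mean=2.0, sigma=0.3, size=500),
]).reshape(-1, 1)

# Fit one GMM on the raw values and one on the log-transformed values.
gmm_raw = GaussianMixture(n_components=2, random_state=0).fit(x)
gmm_log = GaussianMixture(n_components=2, random_state=0).fit(np.log(x))

# To compare fits in the ORIGINAL space, the log-space density needs a
# change-of-variables correction: log p(x) = log p(log x) - log x.
ll_raw = gmm_raw.score_samples(x)
ll_log = gmm_log.score_samples(np.log(x)) - np.log(x).ravel()

print("mean log-likelihood, raw space fit:", ll_raw.mean())
print("mean log-likelihood, log space fit:", ll_log.mean())
```

On data like this, the log-space fit typically achieves a higher corrected likelihood, since its Gaussian components match the true cluster shapes. The Jacobian term is the part people most often forget when comparing models trained on transformed versus raw variables.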
submitted by /u/Minimum_Zucchini