Hello everyone, I trained models on my data with several techniques: dimensional analysis (DA), support vector machine (SVM), multi-layer perceptron neural network (ANN), and XGBoost (XGB). The lowest RMSE on the test set was achieved by XGB. However, when I tried some combinations of input data, the XGB curve differed from those of the other models. For example: https://i.redd.it/ol6azkkrggd31.png

This phenomenon is theoretically described by a power law of the form y(x) = a * x ^ b. What are the possible causes for the curve predicted by XGB not following the other models?

Assumptions:
(i) imbalanced continuous data in x, or even in y (the target)? Histogram: https://i.imgur.com/EGg8eNN.png
(ii) hyperparameters (a thorough mapping of max_depth, min_child_weight, gamma, and eta, plus added regularization parameters, was tested)
(iii) the nature of decision tree split conditions

Is there a way I can better generalize my model (e.g. by post-pruning) so that it fits? Many thanks!

submitted by /u/drainbamagex
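Regarding assumption (iii): a minimal sketch, using scikit-learn's DecisionTreeRegressor as a stand-in for a single XGB tree (XGBoost ensembles behave similarly, just with many summed trees), of why tree-based models produce piecewise-constant predictions that cannot follow a smooth power law between training points or extrapolate beyond the training range. The constants a=2, b=0.5 and the x range are hypothetical, chosen only for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic power-law data y = a * x^b (hypothetical a=2, b=0.5).
rng = np.random.default_rng(0)
x = rng.uniform(1.0, 10.0, 200).reshape(-1, 1)
y = 2.0 * x.ravel() ** 0.5

tree = DecisionTreeRegressor(max_depth=4).fit(x, y)

# Tree predictions are piecewise constant: a depth-4 tree has at most
# 2^4 = 16 leaves, so a dense grid of inputs maps onto at most 16
# distinct output values, unlike the smooth power law.
grid = np.linspace(1.0, 10.0, 100).reshape(-1, 1)
preds = tree.predict(grid)
print(len(np.unique(preds)))  # far fewer unique values than 100

# Trees also cannot extrapolate: for x far outside the training range,
# the prediction is clamped to the value of the nearest leaf, while the
# true power law keeps growing (2 * 100^0.5 = 20).
print(tree.predict([[100.0]]), 2.0 * 100.0 ** 0.5)
```

This is one reason an XGB curve can bend away from SVM/ANN fits in sparsely sampled regions of x even when its aggregate test RMSE is lowest: between and beyond split thresholds the model can only output constant leaf values, not continue the power-law trend.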