[D] Machine learning desktop
Helloooooo, first-time poster here. I just finished programming my first decision tree model on the Titanic data set and I'm starting to learn about random forests now. Super excited to keep learning more!!
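For context, this is roughly the kind of thing I've been doing (a minimal sketch with scikit-learn and the seaborn copy of the Titanic data; the feature selection is just illustrative, not my exact code):

    # Minimal sketch: a decision tree next to a random forest on the Titanic data,
    # assuming scikit-learn and the seaborn copy of the dataset are installed.
    import seaborn as sns
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier

    # Load the Titanic data shipped with seaborn and keep a few numeric features.
    df = sns.load_dataset("titanic").dropna(subset=["age"])
    X = df[["pclass", "age", "sibsp", "parch", "fare"]]
    y = df["survived"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Single decision tree.
    tree = DecisionTreeClassifier(max_depth=4, random_state=42)
    tree.fit(X_train, y_train)
    print("Decision tree accuracy:", tree.score(X_test, y_test))

    # Random forest: an ensemble of trees trained on bootstrap samples.
    forest = RandomForestClassifier(n_estimators=200, random_state=42)
    forest.fit(X_train, y_train)
    print("Random forest accuracy:", forest.score(X_test, y_test))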
My question is: what are the performance losses or gains of running two 1070s in NVLink or SLI vs. a single 1080 Ti for training machine learning models? I've read that GPU VRAM can be a limitation. The 1080 Ti has 11 GB of VRAM, and the two 1070s would have 8 + 8 = 16 GB. Does the math actually work out like that?
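Something like this is what I had in mind for the two-1070 setup (a minimal PyTorch sketch using the built-in nn.DataParallel; the toy model and sizes are just placeholders). My understanding is that each card ends up holding its own full copy of the model and only the batch gets split, which is why I'm not sure the 8 + 8 = 16 math holds:

    # Minimal sketch of two-GPU data-parallel training in PyTorch (nn.DataParallel).
    # From what I've read, each GPU gets its own full replica of the model plus a
    # slice of the batch, so per-model memory is bounded by a single card's VRAM.
    import torch
    import torch.nn as nn

    # Toy model standing in for whatever is actually being trained.
    model = nn.Sequential(nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))

    if torch.cuda.device_count() > 1:
        # Replicates the model on each visible GPU and splits the batch across them.
        model = nn.DataParallel(model)
    model = model.to("cuda" if torch.cuda.is_available() else "cpu")

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    # One dummy training step just to show the loop shape.
    device = next(model.parameters()).device
    x = torch.randn(64, 784, device=device)
    y = torch.randint(0, 10, (64,), device=device)

    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    print("loss:", loss.item())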
Also, is it even worth it to have a desktop for machine learning projects? Are AWS and other services like that much better? What do you think?
submitted by /u/el_guy_el