
[D] Hosting multiple large models online

I want to host some of my models as APIs for a side project, and I'm wondering if anyone has done this on typical web hosts and how well it went. For example, I'm looking to host some BERT models, which take a few seconds per query even on my GPU, so is inference doable on web-host CPUs?

The TalkToTransformer website seems to be hosting GPT-2 models and handles queries quite well; does anyone know how that is set up?

Obviously I'd like to keep costs low; otherwise I'd just rent a server with a GPU.
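Not OP's code, but one pattern that makes slow CPU inference more tolerable behind an API is batching concurrent requests (amortizing per-call overhead, which is roughly how high-traffic demos keep up) plus caching repeated queries. A minimal stdlib-only sketch, where the hypothetical `run_model_batch` stub stands in for a real BERT forward pass:

```python
import queue
import threading
from functools import lru_cache

def run_model_batch(texts):
    # Stub standing in for real batched BERT inference (hypothetical;
    # replace with your actual model call).
    return [f"label-for:{t}" for t in texts]

@lru_cache(maxsize=4096)
def predict_cached(text):
    # A cache hit skips inference entirely for repeated queries.
    return run_model_batch([text])[0]

class BatchWorker:
    """Collects incoming requests and runs them through the model in
    batches, so slow per-query CPU inference is shared across callers."""

    def __init__(self, max_batch=8, wait_s=0.05):
        self.q = queue.Queue()
        self.max_batch = max_batch
        self.wait_s = wait_s
        threading.Thread(target=self._loop, daemon=True).start()

    def submit(self, text):
        # Called from request-handler threads; blocks until the batch
        # containing this request has been processed.
        done = threading.Event()
        slot = {"text": text, "event": done, "result": None}
        self.q.put(slot)
        done.wait()
        return slot["result"]

    def _loop(self):
        while True:
            batch = [self.q.get()]  # block until at least one request
            try:
                # Briefly wait for more requests to fill the batch.
                while len(batch) < self.max_batch:
                    batch.append(self.q.get(timeout=self.wait_s))
            except queue.Empty:
                pass
            results = run_model_batch([s["text"] for s in batch])
            for slot, res in zip(batch, results):
                slot["result"] = res
                slot["event"].set()
```

This is a sketch, not a production server; frameworks with built-in dynamic batching exist, and distilling or quantizing the model is the other usual lever for CPU latency.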

submitted by /u/mukaj