I have feature vectors with 2048 elements, and my features may change over time (e.g., new features are added). I use Faiss as a search engine, but I am not quite sure how to save these vectors. Right now I am syncing a local folder with an AWS S3 bucket. I don't think this is optimal, because I have to sync the files every time I run a similarity search, which takes a while.
Maybe I should use a vector database (like https://github.com/a-mma/AquilaDB or some other), or is there a more efficient way to sync my local and S3 storage?
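Short of moving to a vector database, one way to avoid syncing on every search is to download the index only when the remote copy has actually changed. S3 exposes an ETag per object (via `head_object` in boto3), and for single-part uploads that ETag is the MD5 of the file, so a local hash comparison can decide whether a download is needed. A minimal offline sketch of that check; `needs_download` and the file names are hypothetical, and the boto3 call is only referenced in a comment:

```python
import hashlib
from pathlib import Path

def local_md5(path: str) -> str:
    # MD5 of the local index file. For single-part S3 uploads this matches
    # the object's ETag; multipart uploads use a different ETag scheme.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_download(local_path: str, remote_etag: str) -> bool:
    # Pull the index from S3 only when the remote copy differs.
    # remote_etag would come from boto3, e.g.:
    #   s3.head_object(Bucket=..., Key=...)["ETag"].strip('"')
    if not Path(local_path).exists():
        return True
    return local_md5(local_path) != remote_etag
```

With this in place, the search path loads the already-present local index and skips the network entirely unless the ETag check says the index was rebuilt.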
submitted by /u/_pydl_