Are there any deep learning frameworks that support unsigned 64-bit integers?
I know it’s very specific, but I’m working on a chess engine (think AlphaZero stuff), and chess engines often represent the board with several unsigned 64-bit integers (bitboards). This allows extremely efficient move-generation routines: https://www.chessprogramming.org/Move_Generation I want to do this move generation on the GPU, so the GPU kernels must support unsigned 64-bit integers and their manipulations (bit shifts, etc.).
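For context, here's a minimal plain-Python sketch of the bitboard idea (not engine code, just an illustration): each bit of a 64-bit integer marks one square, and king moves fall out of a handful of shifts. The file masks keep shifts from wrapping around the board edges. Python ints are arbitrary precision, so everything is masked back to 64 bits at the end.

```python
FILE_A = 0x0101010101010101  # all squares on the a-file
FILE_H = 0x8080808080808080  # all squares on the h-file
MASK64 = 0xFFFFFFFFFFFFFFFF  # clamp Python's big ints to 64 bits

def king_moves(king_bb):
    """All king destination squares from a one-bit bitboard."""
    no_a = king_bb & ~FILE_A  # can't step west off the a-file
    no_h = king_bb & ~FILE_H  # can't step east off the h-file
    moves = (
        (king_bb << 8) | (king_bb >> 8)            # north, south
        | (no_h << 1) | (no_h << 9) | (no_h >> 7)  # east, NE, SE
        | (no_a >> 1) | (no_a >> 9) | (no_a << 7)  # west, SW, NW
    )
    return moves & MASK64

# king on e1 (square index 4): reachable squares are d1, f1, d2, e2, f2
print(hex(king_moves(1 << 4)))
```

This is exactly the kind of operation I'd like to run on the GPU, batched over many positions.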
I looked at TensorFlow and PyTorch, and neither seems to support unsigned 64-bit integers, unless I’m wrong?
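One workaround I've been considering (unverified, so treat this as a sketch): store the bitboards as `torch.int64`. Under two's complement, `&`, `|`, `^`, `<<`, `+`, and `-` produce the same bit patterns as uint64; only right shift differs, since PyTorch's `bitwise_right_shift` is arithmetic (sign-extending), so a logical shift has to be emulated by masking off the high bits. The helper names here are my own.

```python
import torch

def to_i64(u):
    """Reinterpret a Python uint64 bit pattern as a signed 64-bit value."""
    return u - (1 << 64) if u >= (1 << 63) else u

def lsr(x, n):
    """Logical (unsigned) right shift for int64 tensors, 1 <= n <= 63."""
    mask = to_i64((1 << (64 - n)) - 1)  # clears the n sign-extended high bits
    return (x >> n) & mask

# h-file bitboard, stored in int64 with the same bit pattern as the uint64
boards = torch.tensor([to_i64(0x8080808080808080)], dtype=torch.int64)

# logical shift by 7 should give the a-file, 0x0101010101010101
print(hex(lsr(boards, 7).item() & ((1 << 64) - 1)))
```

If that's sound, only the right-shift sites in the move generator would need special handling; everything else could use int64 directly.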
Maybe there’s an easier way to do this with CUDA? But I don’t know how to hook CUDA up with PyTorch/TensorFlow efficiently. I can use either TensorFlow or PyTorch, and I work in Python.
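In case it helps frame the question, here's roughly what I imagine the CUDA route looking like with PyTorch's inline extension loader (`torch.utils.cpp_extension.load_inline`). The idea is to pass bitboards as int64 tensors and reinterpret them as `unsigned long long` inside the kernel, so the dtype limitation only exists at the Python boundary. The function and module names (`shift_north`, `bitboard_ext`) are placeholders of mine, and I haven't tested whether this compiles as written.

```python
import torch
from torch.utils.cpp_extension import load_inline

cuda_src = r"""
#include <torch/extension.h>

// Treat each int64 element as an unsigned 64-bit bitboard on the device.
__global__ void shift_north_kernel(const unsigned long long* in,
                                   unsigned long long* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] << 8;  // shift every piece one rank north
}

torch::Tensor shift_north(torch::Tensor boards) {
    TORCH_CHECK(boards.is_cuda(), "boards must be a CUDA tensor");
    auto b = boards.contiguous();
    auto out = torch::empty_like(b);
    int n = b.numel();
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    shift_north_kernel<<<blocks, threads>>>(
        reinterpret_cast<const unsigned long long*>(b.data_ptr<int64_t>()),
        reinterpret_cast<unsigned long long*>(out.data_ptr<int64_t>()), n);
    return out;
}
"""

cpp_src = "torch::Tensor shift_north(torch::Tensor boards);"

ext = load_inline(name="bitboard_ext", cpp_sources=cpp_src,
                  cuda_sources=cuda_src, functions=["shift_north"])

boards = torch.tensor([1 << 4], dtype=torch.int64, device="cuda")  # king on e1
print(hex(ext.shift_north(boards).item()))  # expect bit 12 set (e2)
```

Is this the sanest way to wire custom uint64 kernels into PyTorch, or is there something more direct I'm missing?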
submitted by /u/Pawnbrake