[P] Question about unsigned 64-bit integers
Are there any deep learning frameworks that support unsigned 64-bit integers?
I know it’s very specific, but I’m working on a chess engine (think AlphaZero stuff), and chess engines typically represent the board with several unsigned 64-bit integers ("bitboards", one bit per square). This allows extremely efficient move-generation routines: https://www.chessprogramming.org/Move_Generation I want to do this move generation on the GPU, so the GPU kernels must support unsigned 64-bit integers and the usual bitwise manipulations (shifts, AND/OR/XOR, etc.).
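To make it concrete, here is the kind of operation I mean, sketched in plain Python on the CPU (function and constant names are just mine for illustration; a real engine does this for every piece type):

```python
# A bitboard is a 64-bit integer: bit i marks square i (a1 = bit 0, h8 = bit 63).
# Python ints are arbitrary-precision, so we mask to emulate 64-bit wraparound.
FULL64 = 0xFFFF_FFFF_FFFF_FFFF

def white_pawn_single_pushes(pawns: int, empty: int) -> int:
    """Shift every white pawn one rank up, keep only empty target squares."""
    return ((pawns << 8) & FULL64) & empty

# Example: all white pawns on their starting rank (rank 2), board otherwise empty.
pawns = 0x0000_0000_0000_FF00
empty = ~pawns & FULL64
pushes = white_pawn_single_pushes(pawns, empty)
print(hex(pushes))  # all of rank 3 → 0xff0000
```

The whole appeal is that one shift-and-mask generates the moves for all pawns at once, which is exactly the sort of thing I’d like to batch on the GPU.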
I looked at TensorFlow and PyTorch, and neither seems to support unsigned 64-bit integers, unless I’m wrong?
Maybe there’s an easier way to do this with CUDA directly? But I don’t know how to hook custom CUDA code up with PyTorch/TensorFlow efficiently. I can use either framework, and I work in Python.
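One workaround I’ve been considering (not sure it’s the intended approach): store the bitboards as signed `torch.int64`. In two’s complement, left shifts, AND, OR and XOR behave identically for signed and unsigned values; only right shifts (arithmetic vs. logical) and comparisons differ, and a logical right shift can be emulated by masking off the sign-extended bits. The helper names below are mine, not part of any API:

```python
import torch

def to_i64(u: int) -> int:
    """Reinterpret a Python uint64 value as a signed int64 (two's complement)."""
    return u - (1 << 64) if u >= (1 << 63) else u

def lsr64(x: torch.Tensor, n: int) -> torch.Tensor:
    """Logical (unsigned) right shift of an int64 tensor by n bits (1 <= n <= 63).

    PyTorch's >> on signed ints sign-extends, so we mask away the top n bits.
    """
    mask = (1 << (64 - n)) - 1  # fits in int64 since n >= 1
    return (x >> n) & mask

# Example: a "pawn" on h8 (bit 63, which is the sign bit when stored as int64).
board = torch.tensor([to_i64(0x8000_0000_0000_0000)], dtype=torch.int64)
shifted = lsr64(board, 8)  # one rank down, no sign-extension garbage
print(hex(shifted.item() & 0xFFFF_FFFF_FFFF_FFFF))  # 0x80000000000000
```

If that’s viable, the shifts and masks would all run as ordinary elementwise int64 tensor ops on the GPU, no custom CUDA needed, but I’d love to hear whether people actually do this.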