[P] Planar manipulator video dataset for testing goal-oriented prediction
I’d like to share the following toy dataset with you, which is readily available both as raw videos and as TFRecord files:
The dataset consists of 90 000 color videos showing a planar robot manipulator executing articulated manipulation tasks. More precisely, the manipulator grasps a circular object of random color and size and places it on top of a square object/platform whose color and size are likewise random. The initial configurations (location, size and color) of the objects were randomly sampled during generation. Unlike datasets such as moving MNIST, each sample comprises a goal-oriented task as described, which makes it better suited for testing the prediction capabilities of an ML model. For instance, you can use it as a toy dataset to investigate the capacity and output behavior of a deep neural network before testing it on real-world data.
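If you want to use the TFRecord version, here is a minimal parsing sketch with `tf.data`. Note that the feature keys and encoding (`video`, `num_frames`, `height`, `width`, raw uint8 frame bytes) are assumptions for illustration; inspect one record to confirm the actual schema.

```python
import numpy as np
import tensorflow as tf

# Hypothetical schema -- the actual feature keys and encoding in the
# dataset's TFRecord files may differ; inspect one record to confirm.
FEATURE_DESCRIPTION = {
    "video": tf.io.FixedLenFeature([], tf.string),      # raw uint8 frame bytes
    "num_frames": tf.io.FixedLenFeature([], tf.int64),
    "height": tf.io.FixedLenFeature([], tf.int64),
    "width": tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    """Parse one serialized tf.train.Example into a (T, H, W, 3) uint8 tensor."""
    parsed = tf.io.parse_single_example(serialized, FEATURE_DESCRIPTION)
    video = tf.io.decode_raw(parsed["video"], tf.uint8)
    shape = tf.stack([parsed["num_frames"], parsed["height"], parsed["width"], 3])
    return tf.reshape(video, shape)

# Round-trip check with a synthetic clip standing in for a real record.
frames = np.random.randint(0, 256, size=(10, 64, 64, 3), dtype=np.uint8)
example = tf.train.Example(features=tf.train.Features(feature={
    "video": tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[frames.tobytes()])),
    "num_frames": tf.train.Feature(int64_list=tf.train.Int64List(value=[10])),
    "height": tf.train.Feature(int64_list=tf.train.Int64List(value=[64])),
    "width": tf.train.Feature(int64_list=tf.train.Int64List(value=[64])),
}))
video = parse_example(example.SerializeToString())
print(video.shape)
```

For the real files you would build an input pipeline along the lines of `tf.data.TFRecordDataset(paths).map(parse_example)`, adjusting the schema to whatever the records actually contain.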
Feel free to use, share and comment 🙂