Abstract: This paper presents a multivariate dataset of 2866 food flipping movements, performed by 4 chefs and 5 home cooks, with different grilled foods and two utensils (spatula and tweezers). The 3D trajectories of strategic points on the utensils were tracked using optoelectronic motion capture. The pinching force of the tweezers and the bending force and torsion torque of the spatula were also recorded, as well as videos and the subjects' gaze. These data were collected using a custom experimental setup that allowed the execution of flipping movements with freshly cooked food without placing the sensors near the hazardous cooking area. In addition, the 2D position of the food was computed from the videos. The action of flipping food is indeed gaining the attention of both researchers and manufacturers of foodservice technology. The reported dataset contains valuable measurements (1) to characterize and model flipping movements as performed by humans, (2) to develop bio-inspired methods to control a cooking robot, or (3) to study new algorithms for human action recognition.