Source code released for MDM, a machine learning system that generates realistic human motion

A team of researchers from Tel Aviv University has published the source code of MDM (Motion Diffusion Model), a machine learning system for generating realistic human motion. The code is written in Python using the PyTorch framework and is distributed under the MIT license. Experiments can be run either with the ready-made pretrained models or by training models from scratch with the provided scripts, for example on HumanML3D, a dataset of 3D human motion sequences paired with text descriptions. Training requires a GPU with CUDA support.
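As a minimal sketch of the typical workflow with such a PyTorch distribution (the checkpoint file name below is a hypothetical placeholder, not the repository's actual artifact):

```python
import torch

# Training requires a CUDA-capable GPU, per the project notes; fall back
# to CPU only for inspection.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical checkpoint name; the real pretrained models ship with the
# repository and their internal layout may differ.
checkpoint = torch.load("mdm_humanml3d.pt", map_location=device)
print(list(checkpoint.keys()))  # inspect the stored weights and metadata
```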

Animating human motion with traditional techniques is difficult: the space of possible movements is vast and hard to describe formally, and human perception is highly sensitive to unnatural motion. Previous attempts to apply generative machine learning models to the task suffered from quality problems and limited expressiveness.

The proposed system applies diffusion models to motion generation. Diffusion models are inherently well suited to modeling human motion, but they are not without drawbacks, such as high computational cost and limited controllability. To mitigate these shortcomings, MDM uses a transformer network and predicts the clean sample, rather than the noise, at each diffusion step, which makes it easier to suppress artifacts such as feet sliding or losing contact with the ground.
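To make the distinction concrete, here is a minimal sketch of one reverse-diffusion step in PyTorch in which the network predicts the clean sample directly; the function name, tensor layout, and schedule arguments are illustrative assumptions, not the authors' actual implementation:

```python
import torch

def ddpm_step_with_sample_prediction(model, x_t, t, alphas_cumprod, betas):
    """One reverse step where the network predicts the clean sample x0
    (as MDM does) instead of the added noise. Standard DDPM posterior."""
    alphas = 1.0 - betas
    a_t = alphas_cumprod[t]
    a_prev = alphas_cumprod[t - 1] if t > 0 else torch.tensor(1.0)
    beta_t = betas[t]

    # The network returns its estimate of the final, noise-free motion.
    # Because x0_hat is an explicit motion sequence, geometric losses
    # (e.g., penalizing feet that slide or leave the ground) can be
    # applied to it directly during training.
    x0_hat = model(x_t, t)

    # DDPM posterior mean, expressed in terms of the predicted sample.
    coef_x0 = (a_prev.sqrt() * beta_t) / (1.0 - a_t)
    coef_xt = (alphas[t].sqrt() * (1.0 - a_prev)) / (1.0 - a_t)
    mean = coef_x0 * x0_hat + coef_xt * x_t

    if t == 0:
        return mean
    var = beta_t * (1.0 - a_prev) / (1.0 - a_t)
    return mean + var.sqrt() * torch.randn_like(x_t)
```

Because the network outputs an explicit motion sequence at every step, constraints on joint positions and foot contact can be evaluated on that output directly, which is awkward when the network predicts only noise.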

Generation can be controlled with a natural-language description of an action (for example, "a person walks forward and bends down to pick something up from the ground") or with predefined action classes such as "running" and "jumping." The system can also be used to edit motions and fill in missing parts. In a user study in which participants were asked to pick the better result from several options, the synthesized motions were preferred over real ones in 42% of cases.
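The paper conditions generation on embeddings from a CLIP text encoder; the sketch below shows how a prompt might be encoded with the openai/CLIP package, while the downstream `generate` call is a hypothetical placeholder for the sampling entry point:

```python
import torch
import clip  # https://github.com/openai/CLIP

device = "cuda" if torch.cuda.is_available() else "cpu"
clip_model, _ = clip.load("ViT-B/32", device=device)

prompt = "a person walks forward and bends down to pick something up from the ground"
tokens = clip.tokenize([prompt]).to(device)
with torch.no_grad():
    text_emb = clip_model.encode_text(tokens)  # shape (1, 512) for ViT-B/32

# motion = generate(text_emb)  # hypothetical sampling entry point
```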

Source: opennet.ru
