Avatars Grow Legs: Generating Smooth Human Motion from Sparse Tracking Inputs with Diffusion Model
Yuming Du, Robin Kips, Albert Pumarola, Sebastian Starke, Ali Thabet, Artsiom Sanakoyeu
IEEE Conference on Decision and Control (CDC)
The aim of decentralized gradient descent (DGD) is to minimize a sum of n functions held by interconnected agents. We study the stability of DGD in open contexts where agents can join or leave the system, resulting each time in the addition or removal of their function from the global objective. Assuming all functions are smooth, strongly convex, and their minimizers all lie in a given ball, we characterize the sensitivity of the global minimizer of the sum of these functions to the removal or addition of a function and provide bounds in O(min(κ^0.5, κ/n^0.5, κ^1.5/n)), where κ is the condition number. We also show that the states of all agents can eventually be bounded independently of the sequence of arrivals and departures. The magnitude of the bound scales with the strength of the interconnection, which also determines the accuracy of the final solution in the absence of arrivals and departures, thus exposing a potential trade-off between accuracy and sensitivity. Our analysis relies on the formulation of DGD as gradient descent on an auxiliary function. The tightness of our results is analyzed using the PESTO Toolbox.
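As a purely illustrative aid, the following is a minimal NumPy sketch of the standard DGD update the abstract refers to: each agent mixes its state with its neighbors' states and then takes a local gradient step. The quadratic local objectives, Metropolis mixing weights, fully connected graph, and step size are assumptions made for this example only and are not the paper's setting; arrivals and departures of agents are not simulated here.

```python
# Illustrative sketch of decentralized gradient descent (DGD) on scalar
# quadratic local objectives f_i(x) = 0.5 * a_i * (x - b_i)^2.
# All modeling choices below are assumptions for the example, not the paper's setup.
import numpy as np

def dgd_step(x, a, b, alpha, W):
    """One DGD iteration: consensus mixing followed by a local gradient step."""
    grads = a * (x - b)              # gradient of each agent's local quadratic
    return W @ x - alpha * grads

def metropolis_weights(adjacency):
    """Doubly stochastic mixing matrix for an undirected communication graph."""
    n = adjacency.shape[0]
    deg = adjacency.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adjacency[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    return W

rng = np.random.default_rng(0)
n = 5
adjacency = np.ones((n, n)) - np.eye(n)   # fully connected graph, for simplicity
W = metropolis_weights(adjacency)
a = rng.uniform(1.0, 3.0, n)              # strong-convexity parameters
b = rng.uniform(-1.0, 1.0, n)             # local minimizers, all inside a ball
x = np.zeros(n)                           # each agent's local state

for _ in range(200):
    x = dgd_step(x, a, b, alpha=0.05, W=W)

# With a fixed step size, the agents' states settle in a neighborhood of the
# global minimizer of sum_i f_i; the neighborhood shrinks as alpha decreases.
print(x, np.sum(a * b) / np.sum(a))       # local states vs. exact minimizer
```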
Bilge Acun, Benjamin Lee, Fiodar Kazhamiaka, Kiwan Maeng, Manoj Chakkaravarthy, Udit Gupta, David Brooks, Carole-Jean Wu
Ilkan Esiyok, Pascal Berrang, Katriel Cohn-Gordon, Robert Künnemann