bnew

This is our fastest gait ever at ~0.6 m/s, > 30% speed boost since our last video in December.

We’ve improved our vestibular system, our foot trajectory, and our ground contact logic. We’ve upgraded our motion planner, and made cuts to the loop latency across the bot. Optimus is more stable and more confident overall - even during turns. We also added a slight torso & arms sway.

If these challenges speak to you, join @AnandSwa & our amazing Controls team!

http://tesla.com/ai
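
A quick back-of-the-envelope check of the quoted figures (my own arithmetic, assuming ">30% speed boost" means the new speed is at least 1.3× the old one): the December gait would have been at most roughly 0.46 m/s.

```python
# Hypothetical sanity check of the numbers in the post above
# (assumption: ">30% boost" means new_speed >= 1.3 * old_speed).
new_speed = 0.6                        # m/s, quoted above
implied_old_speed = new_speed / 1.3    # upper bound on the December speed
print(f"December gait was at most ~{implied_old_speed:.2f} m/s")  # ~0.46 m/s
```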
 

bnew

1/6
Humanoid Locomotion as Next Token Prediction

We cast real-world humanoid control as a next token prediction problem, akin to predicting the next word in language. Our model is a causal transformer trained via autoregressive prediction of sensorimotor trajectories. To account for

3/6
the multi-modal nature of the data, we perform prediction in a modality-aligned way, and for each input token predict the next token from the same modality. This general formulation enables us to leverage data with missing modalities, like video trajectories without actions. We

4/6
train our model on a collection of simulated trajectories coming from prior neural network policies, model-based controllers, motion capture data, and YouTube videos of humans. We show that our model enables a full-sized humanoid to walk in San Francisco zero-shot. Our model can

5/6
transfer to the real world even when trained on only 27 hours of walking data, and can generalize to commands not seen during training like walking backward.

6/6
paper page:
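
Reading the abstract above as a recipe, the core idea is: interleave observation and action tokens from a trajectory, run a causal transformer over them, and train each token to predict the next token of its own modality, skipping the action loss for trajectories (e.g. video-only data) that have no actions. Below is a minimal toy sketch of that idea; the interleaving scheme, module names, dimensions, and continuous MSE loss are my own assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the paper's code): a causal transformer over
# interleaved observation/action tokens, where each token predicts the next
# token of its *own* modality, and the action loss is dropped for trajectories
# without recorded actions (e.g. YouTube video data).
import torch
import torch.nn as nn


class ModalityAlignedTransformer(nn.Module):
    def __init__(self, obs_dim=64, act_dim=16, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.obs_in = nn.Linear(obs_dim, d_model)    # embed observations
        self.act_in = nn.Linear(act_dim, d_model)    # embed actions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.obs_head = nn.Linear(d_model, obs_dim)  # predicts next observation
        self.act_head = nn.Linear(d_model, act_dim)  # predicts next action

    def forward(self, obs, act):
        # obs: (B, T, obs_dim), act: (B, T, act_dim)
        B, T, _ = obs.shape
        # Interleave tokens as (obs_0, act_0, obs_1, act_1, ...)
        tokens = torch.stack([self.obs_in(obs), self.act_in(act)], dim=2).reshape(B, 2 * T, -1)
        causal = nn.Transformer.generate_square_subsequent_mask(2 * T).to(obs.device)
        h = self.backbone(tokens, mask=causal).reshape(B, T, 2, -1)
        # Modality-aligned prediction: obs_t -> obs_{t+1}, act_t -> act_{t+1}
        obs_pred = self.obs_head(h[:, :-1, 0])
        act_pred = self.act_head(h[:, :-1, 1])
        return obs_pred, act_pred


def trajectory_loss(model, obs, act, has_actions=True):
    """Autoregressive regression loss; for action-free trajectories (e.g. video),
    `act` would be a placeholder/mask token and its loss term is skipped."""
    obs_pred, act_pred = model(obs, act)
    loss = nn.functional.mse_loss(obs_pred, obs[:, 1:])
    if has_actions:
        loss = loss + nn.functional.mse_loss(act_pred, act[:, 1:])
    return loss


# Toy usage on random data
model = ModalityAlignedTransformer()
obs, act = torch.randn(2, 16, 64), torch.randn(2, 16, 16)
print(trajectory_loss(model, obs, act).item())
```

The real system is presumably trained on the mixed sources listed in the thread (prior policies, model-based controllers, mocap, YouTube); this sketch only illustrates the modality-aligned targets and the missing-action masking.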
 

bnew

Hugging Face is launching an open source robotics project led by former Tesla scientist

Carl Franzen @carlfranzen

March 7, 2024 6:29 AM


[Image: Workers in yellow jumpsuits assemble large spherical yellow smiley face humanoid robots in a factory. Credit: VentureBeat made with Midjourney Niji V6]

Hugging Face, the New York City startup that maintains the popular open source repository of machine learning and AI code of the same name, as well as the open source ChatGPT rival Hugging Chat, is launching a new robotics project under former Tesla staff scientist Remi Cadene, according to a post from Cadene on X this morning.



Fittingly, Cadene said the Hugging Face robot project would be “open-source, not as in Open AI,” in keeping with Hugging Face’s stated ethos and as a playful jab at OpenAI’s recent response to a lawsuit from co-founder turned rival Tesla CEO Elon Musk (Cadene’s boss until recently).



Now hiring robotics engineers

He also said he was “looking for engineers” in Paris, France, and posted a link to a job listing for an “Embodied Robotics Engineer,” which gives more clues, reading in part:

At Hugging Face, we believe ML doesn’t have to be constrained to computers and servers, and that’s why we’re expanding our team with a new opportunity for a Robotics Engineer focusing on Machine Learning/AI.

In this role, you will be responsible for designing, building, and maintaining open-source and low cost robotic systems that integrate AI technologies, specifically in deep learning and embodied AI. You will collaborate closely with ML engineers, researchers, and product teams to develop innovative solutions that push the boundaries of what’s possible in robotics and AI.


The listing also calls upon hires to “Design, build, and maintain open-source and low cost robotic systems integrating deep learning and embodied AI technologies” and “Build low cost robots with off the shelf electronic components and controllers and 3D printed parts.”



Ambitious expansion

The move signals a major departure and an ambitious expansion for Hugging Face, which until now has primarily focused on software, not hardware.

It comes as investment and interest in humanoid and general robotics are heating up: Tesla is pursuing its own humanoid robot, Optimus (which Cadene says he worked on as part of its Autopilot group, repurposing some of the work done to make its cars move autonomously), and a rival called Figure recently raised an eye-watering $675 million from OpenAI and others to build its own robots.

Research on robots has also accelerated markedly in recent months as engineers look to the generative AI boom, and to large language models (LLMs) and machine learning (ML) programs, for ways to train robots more quickly, cheaply and accurately. There is a general and growing interest in the tech industry in “embodied” AI that moves models off screens and devices and into machines capable of autonomously navigating the world and assisting humans with physically demanding, tedious or otherwise non-software tasks, including household chores, hard labor, manufacturing and more.

Cadene worked at Tesla for nearly three years, from April 2021 through March 2024, according to his LinkedIn profile.

We’ve reached out to confirm the news with Hugging Face and ask for further information on the project. We will update when we hear back.
 

bnew

1/1
An artificial intelligence robot from China that can walk in the forest.



1/1
Take a look at this biped robot developed by Chinese tech company LimX Dynamics. It has completed dynamic tests in the wild, overcoming complex terrain in forests and mountains.
 