bnew





1/11
@watneyrobotics
FOLD! FOLD! FOLD!
An overnight timelapse of one of our live deployments teleoperated from over 7000 miles away. Our robots fold 24/7/365 with no human intervention, handling long-tail edge cases with no downtime.



https://video.twimg.com/ext_tw_video/1861138021095219204/pu/vid/avc1/1280x720/sLlF9tfyreiiETtr.mp4

2/11
@watneyrobotics
Teleoperation is hard. Teleoperation with high packet loss, variable latency, and constrained bandwidth is almost impossible. We built our system from the ground up (in Rust) to perform dynamic tasks with unprecedented dexterity in real world environments.
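
For readers curious how a teleop link can stay usable under packet loss and variable latency, here is a minimal sketch of the common "latest command wins" pattern over UDP. It is not Watney Robotics' stack (the tweet says theirs is built in Rust); the port, message format, and apply_joint_targets helper are hypothetical, and the point is only that sequence-numbered commands let the robot drop stale packets and hold its last pose when packets go missing.

```python
# Minimal sketch (not Watney's implementation) of a loss-tolerant teleop receiver:
# each incoming command carries a sequence number; the robot executes a command
# only if it is newer than the last one applied, and keeps holding the previous
# command when packets are lost or arrive late.

import json
import socket

HOST, PORT = "0.0.0.0", 9999          # hypothetical listening address

def apply_joint_targets(joints):
    """Stand-in for the robot-side motion command."""
    print("applying", joints)

def serve_commands():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    sock.settimeout(0.1)              # no packet? fall through and keep last pose
    last_seq = -1
    while True:
        try:
            data, _addr = sock.recvfrom(4096)
        except socket.timeout:
            continue                  # tolerate loss: nothing new to apply
        msg = json.loads(data)
        if msg["seq"] <= last_seq:
            continue                  # stale or duplicate packet, drop it
        last_seq = msg["seq"]
        apply_joint_targets(msg["joints"])
```

On the operator side, each outgoing command would simply be stamped with an increasing sequence number before being sent.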



3/11
@watneyrobotics
We are extending our teleoperation infrastructure to enable remote inference for cloud-hosted robot foundation models. Robot policies that had to be invoked on edge can now be run from the cloud with no performance loss.



4/11
@watneyrobotics
We've also collected the world's largest dataset of cloth manipulation in a real world environment... Stay tuned!



5/11
@pranaysuyash
then what does teleoperated mean if not human intervention? the two claims in the post together don't make sense



6/11
@ziademarcus
which third world country do you use for slave labor to “teleoperate” these “without human intervention”

rust, btw



7/11
@Bunagayafrost
i think you're saying, it's being monitored by humans for safety and edge cases, but it handled folding autonomously without any intervention, right?



8/11
@MikeBravoCharly
How do they fold without a chin.



9/11
@MaxRovensky
> teleoperated
> with no human intervention

bruh



10/11
@NaturallyDragon
"teleoperated" "with no human intervention" 🤣



11/11
@sweepwhale
it's coming




 

bnew


https://old.reddit.com/r/nextfukkin...ne_technology_has_come_a_long_way_looks_like/
 

bnew





1/11
@UnitreeRobotics
Unitree B2-W Talent Awakening! 🥳
One year after mass production kicked off, Unitree’s B2-W Industrial Wheel has been upgraded with more exciting capabilities.
Please always use robots safely and in a friendly manner.
#Unitree #Quadruped #Robotdog #Parkour #EmbodiedAI #IndustrialRobot #InspectionRobot #IntelligentRobot #FoundationModels #LeggedRobot #WheeledLegs



https://video.twimg.com/ext_tw_video/1871092619549548544/pu/vid/avc1/1920x1080/V7GIVOa5H6spKq5_.mp4

2/11
@_CryptoPete_






3/11
@xsaltwedgex
Uhhhhh… is this real or CGI?!?
😳



4/11
@AEInfinity_
What are potential use cases?



5/11
@JohnFStifter
@sierracatalina1 🤣 run



6/11
@dexteryy
Build it tank-sized with an added soul (AGI), and the Tachikoma becomes real.





7/11
@ViralMuzik1989
Black Mirror robo dog episode coming IRL next door 😅



8/11
@Soulnebulize
🤯



9/11
@7n39_igolnik
Want a big one to drive to work.





10/11
@pinchthaddeus
how is this possible overnight when there was no evidence of this step change in dexterity before this



11/11
@moral_minority
What are its capabilities beyond driving around?




 

bnew



Nvidia unveils Isaac GR00T blueprint to accelerate humanoid robotics​


Dean Takahashi@deantak

January 6, 2025 8:08 PM





Nvidia Isaac GR00T makes it easier to design humanoid robots.

Image Credit: Nvidia


Nvidia has announced an Isaac GR00T blueprint to accelerate humanoid robotics development.

In his CES 2025 keynote, Nvidia CEO Jensen Huang said that Isaac GR00T workflows for synthetic data and Nvidia Cosmos world foundation models will supercharge the development of general humanoid robots.


Robots on the march​


Over the next two decades, the market for humanoid robots is expected to reach $38 billion. To address this significant demand, particularly in the industrial and manufacturing sectors, Nvidia is releasing a collection of robot foundation models, data pipelines and simulation frameworks to accelerate next-generation humanoid robot development.

The Nvidia Isaac GR00T blueprint for synthetic motion generation helps developers generate exponentially large synthetic motion data to train their humanoids using imitation learning.

Imitation learning — a subset of robot learning — enables humanoids to acquire new skills by observing and mimicking expert human demonstrations. Collecting these extensive, high-quality datasets in the real world is tedious, time-consuming and often prohibitively expensive.

Implementing the Isaac GR00T blueprint for synthetic motion generation allows developers to easily generate exponentially large synthetic datasets from just a small number of human demonstrations.
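
To make the "small number of demonstrations, exponentially large dataset" idea concrete, here is a rough illustrative sketch (not the GR00T-Mimic code): each recorded trajectory is replayed many times with small random perturbations, multiplying a handful of human demos into thousands of training trajectories. The array shapes and noise model are assumptions for illustration.

```python
# Illustrative data-multiplication sketch: perturb each recorded demonstration
# many times to build a much larger synthetic set. Not the actual GR00T-Mimic
# implementation; shapes and noise scale are made up for the example.

import numpy as np

def augment_demos(demos, variants_per_demo=1000, noise_std=0.01, seed=0):
    """demos: list of (timesteps, action_dim) arrays of recorded actions."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for traj in demos:
        for _ in range(variants_per_demo):
            synthetic.append(traj + rng.normal(0.0, noise_std, size=traj.shape))
    return synthetic

demos = [np.zeros((50, 7)) for _ in range(10)]   # 10 stand-in demonstrations
print(len(augment_demos(demos)))                 # -> 10,000 synthetic trajectories
```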

Starting with the GR00T-Teleop workflow, users can tap into Apple Vision Pro to capture human actions in a digital twin. These human actions are mimicked by a robot in simulation and recorded for use as ground truth.

The GR00T-Mimic workflow then multiplies the captured human demonstration into a larger synthetic motion dataset. Finally, the GR00T-Gen workflow, built on the Nvidia Omniverse and Nvidia Cosmos platforms, exponentially expands this dataset through domain randomization and 3D upscaling.
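
The domain randomization step can be pictured with a small sketch: every simulated episode samples different visual and physical parameters, so a policy trained on the resulting data does not overfit to a single rendering of the world. The specific parameters and ranges below are invented for illustration and are not GR00T-Gen's.

```python
# Toy domain-randomization sketch: sample new simulation parameters per episode.
# Parameter names and ranges are illustrative, not from GR00T-Gen.

import random

rng = random.Random(0)

def sample_domain():
    return {
        "friction":          rng.uniform(0.5, 1.5),
        "object_mass_kg":    rng.uniform(0.1, 2.0),
        "light_intensity":   rng.uniform(0.3, 1.0),
        "camera_jitter_deg": rng.uniform(-5.0, 5.0),
        "table_texture":     rng.choice(["wood", "metal", "fabric"]),
    }

for episode in range(3):
    print(f"episode {episode}:", sample_domain())
```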

The dataset can then be used as an input to the robot policy, which teaches robots how to move and interact with their environment effectively and safely in Nvidia Isaac Lab, an open-source and modular framework for robot learning.
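
In its simplest form, teaching a robot policy from such a dataset by imitation learning is a supervised behavior-cloning loop: a network is trained to reproduce the demonstrated action for each observation. The sketch below uses PyTorch with random stand-in data and hypothetical observation/action sizes; it is not Isaac Lab's actual API.

```python
# Bare-bones behavior cloning: fit a policy to (observation, action) pairs from
# the demonstration dataset. Stand-in data and sizes; not Isaac Lab's API.

import torch
import torch.nn as nn

obs_dim, act_dim = 48, 23            # hypothetical humanoid observation/action sizes
policy = nn.Sequential(nn.Linear(obs_dim, 256), nn.ReLU(), nn.Linear(256, act_dim))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

obs = torch.randn(1024, obs_dim)      # stand-in for demonstration observations
actions = torch.randn(1024, act_dim)  # stand-in for demonstrated actions

for epoch in range(10):
    loss = nn.functional.mse_loss(policy(obs), actions)  # match demonstrated actions
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```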


World foundation models narrow the sim-to-real gap​


[Image: Which one is not the robot?]

Also at CES, Nvidia announced Cosmos, a platform featuring a family of open, pretrained world foundation models purpose-built for generating physics-aware videos and world states for physical AI development. It includes autoregressive and diffusion models in a variety of sizes and input-data formats. The models were trained on 18 quadrillion tokens, including 2 million hours of autonomous driving, robotics, drone footage and synthetic data.

In addition to helping generate large datasets, Cosmos can reduce the simulation-to-real gap by upscaling images from 3D to real. Combining Omniverse — a developer platform of application programming interfaces and microservices for building 3D applications and services — with Cosmos is critical, because it helps minimize potential hallucinations commonly associated with world models by providing crucial safeguards through its highly controllable, physically accurate simulations.


An expanding ecosystem​


[Image: Nvidia GR00T generates synthetic data for robots.]

Collectively, Nvidia Isaac GR00T, Omniverse and Cosmos are helping physical AI and humanoid innovation take a giant leap forward. Major robotics companies including Boston Dynamics and Figure have started adopting and demonstrating results with Isaac GR00T.

Humanoid software, hardware and robot manufacturers can apply for early access to Nvidia’s humanoid robot developer program.





1/11
@TheHumanoidHub
Jensen Huang, on the CES 2025 stage with 14 humanoid robots standing in the background, announced the NVIDIA Isaac GR00T Blueprint.

It's a simulation workflow for synthetic motion generation, enabling developers to create large datasets for training humanoids using imitation learning.



https://video.twimg.com/ext_tw_video/1876489889535180800/pu/vid/avc1/1280x720/Vjmd5uZd6tgoGvx_.mp4

2/11
@victor_explore
bet those humanoids still can't fold laundry though 🤖



3/11
@BriscoeCrainIV
It’s happening so fast! 🦾



4/11
@HunterReveur
Impressive. I’m now wondering if we’ll see a public release by summer’25.



5/11
@kianerfaan
iron man 2 vibes





6/11
@Mnewbis
Straight up thought this was William Shatner for a second 😂



7/11
@LawStud0842619
Why no @Tesla_Optimus ?



8/11
@EERandomness
Interesting that Optimus was not in the lineup?



9/11
@BLUECOW009
How do I get this developer kit?



10/11
@argoexp
ARBE anyone? Nvidia just invested in them yesterday



11/11
@__plotnikova
Really impressive how fast this is happening




 

bnew



1/11
@TheHumanoidHub
NVIDIA just introduced Cosmos, a platform for world foundation models designed for robotics.

⦿ It features advanced tokenizers, an AI-accelerated data pipeline, and integration with NVIDIA Omniverse. Humanoid makers 1X, Figure, and Agility are among the first to adopt Cosmos.

⦿ Cosmos generates synthetic, physics-based data, accelerating model training and customization.

⦿ It also features a CUDA-accelerated data processing pipeline that enables developers to process, curate, and label 20 million hours of videos in 14 days using the NVIDIA Blackwell platform.
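
As a back-of-the-envelope check on that figure (just arithmetic on the tweet's numbers, not an NVIDIA specification), processing 20 million hours of video in 14 days implies roughly a 60,000x real-time aggregate rate across the cluster:

```python
video_hours = 20_000_000
wall_clock_hours = 14 * 24
print(f"{video_hours / wall_clock_hours:,.0f}x real time")   # ≈ 59,524x
```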



https://video.twimg.com/ext_tw_video/1876485245312339968/pu/vid/avc1/1280x720/Res7shCLkFbL1Vsm.mp4

2/11
@TheHumanoidHub
Technical Blog:
NVIDIA Launches Cosmos World Foundation Model Platform to Accelerate Physical AI Development



3/11
@Brenten55
This Blew Me Away !!! 😮

Entire video:
https://invidious.poast.org/live/k82RwXqZHY8?si=XBzhsKBcFFgyfCJK



4/11
@lwasinam
Speaking robot: Our new AI model translates vision and language into robotic actions



5/11
@steve_ike_
NVIDIA is cooking! Wonder if this helps some of the automakers who have fallen behind in self driving tech catch up?



6/11
@leo_grundstrom
Can this be used for video generation?



7/11
@robot_machines
This is the foreseeable future of robotics.



8/11
@AppyPieInc
Game-changer for robotics! Cosmos is setting a new standard for scalable training and deployment. Excited to see how pioneers like 1X and Agility leverage this!



9/11
@BriscoeCrainIV
I’m buying more NVIDIA in the morning! 😂



10/11
@lateboomer88
This sounds like an early version of Skynet from Terminator.



11/11
@saumil_chandira
Enabling both offline and online agents in one keynote




 

bnew






1/11
@TheHumanoidHub
EngineAI posted a new video on LinkedIn:

SE01 robot walking around among people at the company's office campus in Shenzhen.

[Quoted tweet]
Finally, a humanoid robot with a natural, human-like walking gait.

Chinese company EngineAI just unveiled their life-size general-purpose humanoid SE01.


https://video.twimg.com/ext_tw_video/1877059859025375236/pu/vid/avc1/720x720/Iu_smDs6rckRXLUc.mp4
https://video.twimg.com/ext_tw_video/1849351674361757696/pu/vid/avc1/1280x720/D429gHgPIpumy_IL.mp4

2/11
@anatomyumea




3/11
@TheHumanoidHub
Optional Beyoncé walk upgrade, but it voids the knee warranty.



4/11
@noxlonga
terminator walks freely - mistake number one 🚨⚡



5/11
@Dieter75
That thing makes me want to reach for my RPG.



6/11
@joehansenxx
Starting to look scary actually!



7/11
@stlgotmynikeson
Oh boy



8/11
@BlackApple
Impressive



9/11
@LarryPanozzo
“Return to your homes.
A curfew is in effect.
Return to your homes.
A curfew is in effect.”



10/11
@fmh_iii
Is it possible to give that robot some shoes? It would soften the impact and reduce noise. Would SE01 still be able to stabilize properly with shoes on?



11/11
@J65516006
@quality_scents




 

bnew






1/7
@TheAIVeteran
You've seen it walking through the street today; here is another look at the SE01 robot by EngineAI.

You can have Gemini 2.0 reverse-engineer those actuators in AI Studio for fun.

"A product with human-like bionic design, offering exceptional flexibility and powerful joint dynamics. With 32 degrees of freedom throughout its body and a maximum joint torque of up to 330 N・m, it easily handles even the most complex and challenging movements.

Natural Gait, Human-like Walking
Powerful and Sustained Performance
Enhanced Training, Accelerated Evolution

ENGINEAI——Dedicated to developing world-leading general-purpose humanoid robots while continuously accelerating innovation in the embodied intelligence revolution."
-Engine AI



https://video.twimg.com/ext_tw_video/1877081630478839808/pu/vid/avc1/640x360/1ME3CoktGtRaSgFT.mp4

2/7
@TheAIVeteran
One of my favorite robot videos lately. Great music selection. Here's the SE 01 humanoid robot by Engine AI.

When the engineer hits the enter key and the plastic wrapped robot starts walking, he legit has a "wtf" moment. It was their third walking test at speed.

"The SE01 is dedicated to being the most human-like general-purpose humanoid AI Hardware. It features a set of self-developed harmonic joint module and combines reinforcement learning with imitation learning, achieving unprecedented excellence in natural gait through end-to-end neural network models.

The design of the SE01 narrows the gap between humanoid robotics and humans, making the robot more agile. With smooth, swift, and graceful walking, the SE01 leaves behind the stereotypical unnatural gait of past robots. Our relentless pursuit of technical excellence at EngineAI is set to establish new industry standards."
-Engine AI



https://video.twimg.com/ext_tw_video/1877081668823154688/pu/vid/avc1/640x360/JP6PoC8l1gekR80h.mp4
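
One common way to combine reinforcement learning with imitation learning for gait, sketched below in the spirit of the description above but not as EngineAI's actual method, is to shape the RL reward with an imitation term that keeps the policy close to a human reference motion. The weights, target speed, and state fields are illustrative assumptions.

```python
# Illustrative DeepMimic-style reward shaping: mix a task term (walk at the
# commanded speed) with an imitation term (stay near a reference pose).
# Weights and fields are assumptions, not EngineAI's method.

import numpy as np

def gait_reward(state, ref_pose, target_speed=1.2, w_task=0.3, w_imitate=0.7):
    task = np.exp(-abs(state["forward_velocity"] - target_speed))
    pose_error = np.linalg.norm(state["joint_angles"] - ref_pose)
    imitate = np.exp(-2.0 * pose_error)
    return w_task * task + w_imitate * imitate

state = {"forward_velocity": 1.0, "joint_angles": np.zeros(12)}
print(gait_reward(state, ref_pose=np.zeros(12)))
```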

3/7
@TheAIVeteran
Bonus look at Engine AI's PM01 robot, the smaller one.

"EngineAI proudly unveils the PM01, our next-gen lightweight, high-dynamic, open-source humanoid robotic platform. With its interactive display, agile motion, and robust support for secondary development, PM01 is designed to be the most versatile tool for developers worldwide.

PM01 is now available for purchase! We invite developers, researchers, and businesses to explore the future of robotics with PM01. Let’s push the boundaries of what robotics can achieve across different industries and use cases.

ENGINEAI——Dedicated to developing world-leading general-purpose humanoid robots while continuously accelerating innovation in the embodied intelligence revolution."
-Engine AI



https://video.twimg.com/ext_tw_video/1877084513253445632/pu/vid/avc1/1280x720/4iJV1Vi-a3QkcWCC.mp4

4/7
@IOI101IOI101I
I like to think of a future where kids will be in their garage building robots the way we used to build soapbox cars.



5/7
@TheAIVeteran
Really, if we build robust software now, then robots could end up essentially plug-and-play with their hardware. There are already modular robots that can have different attachments added, such as the Borg robot. So it may not be too far off. There are already lots of robotics toys out there, though.



6/7
@RayZrDev
I wish they'd make another season of Westworld.



7/7
@TheAIVeteran
Soon, we can just make Westworld for real. Maybe without the shooting, though.




 

bnew




Robots Are Learning to Conduct Surgery on Their Own by Watching Videos​


Just don't tell them about porn.

By Thomas Maxwell | Published December 30, 2024

Researchers have demonstrated how AI can train robots to mimic human doctors to perform surgeries. Image credit: Johns Hopkins University

The artificial intelligence boom is already starting to creep into the medical field in the form of AI-based visit summaries and analysis of patient conditions. Now, new research demonstrates how AI training techniques similar to those used for ChatGPT could be used to train surgical robots to operate on their own.

Researchers from Johns Hopkins University and Stanford University built a training model using video recordings of human-controlled robotic arms performing surgical tasks. By learning to imitate the actions shown in the videos, the researchers believe they can reduce the need to program each individual movement required for a procedure. From the Washington Post:

The robots learned to manipulate needles, tie knots and suture wounds on their own. Moreover, the trained robots went beyond mere imitation, correcting their own slip-ups without being told ― for example, picking up a dropped needle. Scientists have already begun the next stage of work: combining all of the different skills in full surgeries performed on animal cadavers.
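
To picture the training setup, here is a deliberately simplified sketch of imitation learning from video: a small vision network maps a camera frame to the next instrument motion and is fit on (frame, action) pairs extracted from recordings of human-teleoperated procedures. The network size, action dimension, and stand-in tensors are assumptions, not the Johns Hopkins/Stanford system.

```python
# Simplified video-imitation sketch: predict the next instrument action from a
# camera frame, supervised by recorded human-teleoperated actions.
# Architecture and data are stand-ins, not the published system.

import torch
import torch.nn as nn

class FrameToAction(nn.Module):
    def __init__(self, act_dim=6):                    # e.g. 6-DoF instrument motion
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, act_dim)

    def forward(self, frames):                        # frames: (batch, 3, H, W)
        return self.head(self.encoder(frames))

model = FrameToAction()
frames = torch.randn(8, 3, 128, 128)                  # stand-in video frames
actions = torch.randn(8, 6)                           # stand-in recorded actions
loss = nn.functional.mse_loss(model(frames), actions)
loss.backward()
```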

To be sure, robots have been used in the operating room for years now—back in 2018, the “surgery on a grape” meme highlighted how robotic arms can assist with surgeries by providing a heightened level of precision. Approximately 876,000 robot-assisted surgeries were conducted in 2020. Robotic instruments can reach places and perform tasks in the body where a surgeon’s hand will never fit, and they do not suffer from tremors. Slim, precise instruments can help avoid nerve damage. But robotic instruments are typically guided manually by a surgeon with a controller. The surgeon is always in charge.

The concern by skeptics of more autonomous robots is that AI models like ChatGPT are not “intelligent,” but rather simply mimic what they have already seen before, and do not understand the underlying concepts they are dealing with. The infinite variety of pathologies in an incalculable variety of human hosts poses a challenge, then—what if the AI model has not seen a specific scenario before? Something can go wrong during surgery in a split second, and what if the AI has not been trained to respond?

At the very least, autonomous robots used in surgeries would need to be approved by the Food and Drug Administration. In other cases where doctors are using AI to summarize their patient visits and make recommendations, FDA approval is not required because the doctor is technically supposed to review and endorse any information they produce. That is concerning because there is already evidence that AI bots will make bad recommendations, or hallucinate and include information in meeting transcripts that was never uttered. How often will a tired, overworked doctor rubber-stamp whatever an AI produces without scrutinizing it closely?

It feels reminiscent of recent reports regarding how soldiers in Israel are relying on AI to identify attack targets without scrutinizing the information very closely. “Soldiers who were poorly trained in using the technology attacked human targets without corroborating [the AI] predictions at all,” a Washington Post story reads. “At certain times the only corroboration required was that the target was a male.” Things can go awry when humans become complacent and are not sufficiently in the loop.

Healthcare is another field with high stakes—certainly higher than the consumer market. If Gmail summarizes an email incorrectly, it is not the end of the world. AI systems incorrectly diagnosing a health problem, or making a mistake during surgery, is a much more serious problem. Who in that case is liable? The Post interviewed the director of robotic surgery at the University of Miami, and this is what he had to say:

“The stakes are so high,” he said, “because this is a life and death issue.” The anatomy of every patient differs, as does the way a disease behaves in patients.

“I look at [the images from] CT scans and MRIs and then do surgery,” by controlling robotic arms, Parekh said. “If you want the robot to do the surgery itself, it will have to understand all of the imaging, how to read the CT scans and MRIs.” In addition, robots will need to learn how to perform keyhole, or laparoscopic, surgery that uses very small incisions.

The idea that AI will ever be infallible is hard to take seriously when no technology is ever perfect. Certainly, this autonomous technology is interesting from a research perspective, but the blowback from a botched surgery conducted by an autonomous robot would be monumental. Who do you punish when something goes wrong, who has their medical license revoked? Humans are not infallible either, but at least patients have the peace of mind of knowing they have gone through years of training and can be held accountable if something goes wrong. AI models are crude simulacrums of humans, behave sometimes unpredictably, and have no moral compass.

Another concern is whether relying too much on autonomous robots to conduct surgeries could end up resulting in doctors having their own abilities and knowledge atrophy; similar to how facilitating dating through apps results in relevant social skills becoming rusty.

If doctors are tired and overworked—one reason researchers suggested this technology could be valuable—perhaps the systemic problems causing a shortage should be addressed instead. It has been widely reported that the U.S. is experiencing an extreme shortage of doctors due to the increasing inaccessibility of the field. The country is on track to experience a shortage of 10,000 to 20,000 surgeons by 2036, according to the Association of American Medical Colleges.
 

bnew

A Chinese robotics company has introduced a robotic dog capable of running 100 meters in under 10 seconds.



 