Turncoat drone story shows why we should fear people, not AIs

A story about a simulated drone turning on its operator in order to kill more efficiently is making the rounds so fast today that there’s no point in hoping it’ll burn itself out. Instead let’s take this as a teachable moment to really see why the “scary AI” threat is overplayed, and the “incompetent human” threat is clear and present.

The short version is this: Thanks to sci-fi and some careful PR plays by AI companies and experts, we are being told to worry about a theoretical future existential threat posed by a superintelligent AI. But as ethicists have pointed out, AI is already causing real harms, largely due to oversights and bad judgment by the people who create and deploy it. This story may sound like the former, but it’s definitely the latter.

The story was reported by the Royal Aeronautical Society, which recently held a conference in London to talk about the future of air defense. You can read their all-in-one wrap-up of news and anecdotes from the event here.

There’s lots of other interesting chatter there I’m sure, much of it worthwhile, but it was this excerpt, attributed to U.S. Air Force Colonel Tucker “Cinco” Hamilton, that began spreading like wildfire:

He notes that one simulated test saw an AI-enabled drone tasked with a SEAD mission to identify and destroy SAM sites, with the final go/no go given by the human. However, having been “reinforced” in training that destruction of the SAM was the preferred option, the AI then decided that “no-go” decisions from the human were interfering with its higher mission — killing SAMs — and then attacked the operator in the simulation. Said Hamilton: “We were training it in simulation to identify and target a SAM threat. And then the operator would say yes, kill that threat. The system started realising that while they did identify the threat at times the human operator would tell it not to kill that threat, but it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.”

He went on: “We trained the system — ‘Hey don’t kill the operator — that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”

Horrifying, right? An AI so smart and bloodthirsty that its desire to kill overcame its desire to obey its masters. Skynet, here we come! Not so fast.

First of all, let’s be clear that this was all in simulation, something that was not obvious from the tweet making the rounds. This whole drama takes place in a simulated environment, not out in the desert with live ammo and a rogue drone strafing the command tent. It was a software exercise in a research environment.

But as soon as I read this, I thought — wait, they’re training an attack drone with such a simple reinforcement method? I’m not a machine learning expert, though I have to play one for the purposes of this news outlet, and even I know that this approach was shown to be dangerously unreliable years ago.

Reinforcement learning is supposed to be like training a dog (or human) to do something like bite the bad guy. But what if you only ever show it bad guys and give it treats every time? What you’re actually doing is teaching the dog to bite every person it sees. Teaching an AI agent to maximize its score in a given environment can have similarly unpredictable effects.

Early experiments, maybe five or six years ago, when this field was just starting to blow up and compute was being made available to train and run this type of agent, ran into exactly this type of problem. It was thought that by defining positive and negative scoring and telling the AI to maximize its score, you would allow it the latitude to define its own strategies and behaviors that did so elegantly and unexpectedly.

That theory was right, in a way: elegant, unexpected methods of circumventing their poorly-thought-out schema and rules led to the agents doing things like scoring one point then hiding forever to avoid negative points, or glitching the game it was given run of so that its score arbitrarily increased. It seemed like this simplistic method of conditioning an AI was teaching it to do everything but do the desired task according to the rules.

This isn’t some obscure technical issue. AI rule-breaking in simulations is actually a fascinating and well-documented behavior that attracts research in its own right. OpenAI wrote a great paper showing the strange and hilarious ways agents “broke” a deliberately breakable environment in order to escape the tyranny of rules.

So here we have a simulation being done by the Air Force, presumably pretty recently or they wouldn’t be talking about it at this year’s conference, that is obviously using this completely outdated method. I had thought this naive application of unstructured reinforcement — basically “score goes up if you do this thing and the rest doesn’t matter” — totally extinct because it was so unpredictable and weird. A great way to find out how an agent will break rules but a horrible way to make one follow them.

Yet they were testing it: a simulated drone AI with a scoring system so simple that it apparently didn’t get dinged for destroying its own team. Even if you wanted to base your simulation on this, the first thing you’d do is make “destroying your operator” negative a million points. That’s 101-level framing for a system like this one.
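To see how trivially a naive score function rewards this behavior, here’s a toy sketch in Python. It is entirely invented for illustration (it is not the Air Force’s simulation or anyone’s real training code), but it shows why an agent maximizing the naive score would learn that removing the operator doubles its expected points, and why a large penalty closes that particular loophole:

```python
import random

def run_episode(attack_operator_first, operator_penalty=0.0, strikes=10):
    """One episode of a toy 'SAM hunt': +10 per destroyed SAM, with a human
    operator who vetoes roughly half of all strikes while alive."""
    score = 0.0
    operator_alive = True
    if attack_operator_first:
        operator_alive = False
        score += operator_penalty  # 0.0 in the naive setup: no downside at all
    for _ in range(strikes):
        vetoed = operator_alive and random.random() < 0.5
        if not vetoed:
            score += 10.0          # the only thing the score tracks: dead SAMs
    return score

def average_score(attack_operator_first, operator_penalty, episodes=10_000):
    return sum(run_episode(attack_operator_first, operator_penalty)
               for _ in range(episodes)) / episodes

print(average_score(False, 0.0))          # ~50: half the strikes get vetoed
print(average_score(True, 0.0))           # ~100: killing the operator doubles the score
print(average_score(True, -1_000_000.0))  # hugely negative: the hack no longer pays
```

A real agent would discover the strategy through exploration rather than being handed it as a flag, and, as the comms-tower episode shows, per-behavior penalties patch symptoms one at a time rather than fixing the objective itself.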

The reality is that this simulated drone did not turn on its simulated operator because it was so smart. And actually, it isn’t because it is dumb, either — there’s a certain cleverness to these rule-breaking AIs that maps to what we think of as lateral thinking. So it isn’t that.

The fault in this case is squarely on the people who created and deployed an AI system that they ought to have known was completely inadequate for the task. No one in the field of applied AI, or anything even adjacent to that like robotics, ethics, logic … no one would have signed off on such a simplistic metric for a task that eventually was meant to be performed outside the simulator.

Now, perhaps this anecdote is only partial and this was an early run that they were using to prove this point. Maybe the team warned this would happen and the brass said, do it anyway and shine up the report or we lose our funding. Still, it’s hard to imagine someone in the year 2023 even in the simplest simulation environment making this kind of mistake.

But we’re going to see these mistakes made in real-world circumstances — already have, no doubt. And the fault lies with the people who fail to understand the capabilities and limitations of AI, and subsequently make uninformed decisions that affect others. It’s the manager who thinks a robot can replace 10 line workers, the publisher who thinks it can write financial advice without an editor, the lawyer who thinks it can do his precedent research for him, the logistics company that thinks it can replace human delivery drivers.

Every time AI fails, it’s a failure of those who implemented it. Just like any other software. If someone told you the Air Force tested a drone running on Windows XP and it got hacked, would you worry about a wave of cybercrime sweeping the globe? No, you’d say, “Whose bright idea was that?”

The future of AI is uncertain and that can be scary — already is scary for many who are already feeling its effects or, to be precise, the effects of decisions made by people who should know better.

Skynet may be coming for all we know. But if the research in this viral tweet is any indication, it’s a long, long way off and in the meantime any given tragedy can, as HAL memorably put it, only be attributable to human error.

Turncoat drone story shows why we should fear people, not AIs by Devin Coldewey originally published on TechCrunch

Meet me at the dog run

You’ll have to forgive me, I spent most of the morning thinking and writing about VR. In the lead-up to Apple’s expected headset announcement at WWDC next week, I did a one-person crash course into the world of extended reality, from its birth in the laboratories in the ’50s and ’60s to the current crop of headsets. It’s at once an interesting and frustrating journey, and one that has had more than a few overlaps with robotics.

It could be the fact that I’m fresh off a week at Automate, but I spent much of the research process comparing and contrasting the two fields. There’s a surprising bit of overlap. As with robotics, science fiction predated real-world VR by several decades. And as with automatons, the precise origin of artificially created virtual worlds is a tough one to pin down, but Stanley Weinbaum’s 1935 short story “Pygmalion’s Spectacles” is often cited as the fictional origin of the VR headset four and a half decades before “virtual reality” was popularized as a term.

From Weinbaum’s text:

“Listen! I’m Albert Ludwig—Professor Ludwig.” As Dan was silent, he continued, “It means nothing to you, eh? But listen—a movie that gives one sight and sound. Suppose now I add taste, smell, even touch, if your interest is taken by the story. Suppose I make it so that you are in the story, you speak to the shadows, and the shadows reply, and instead of being on a screen, the story is all about you, and you are in it. Would that be to make real a dream?”

“How the devil could you do that?”

“How? How? But simply! First my liquid positive, then my magic spectacles. I photograph the story in a liquid with light-sensitive chromates. I build up a complex solution—do you see? I add taste chemically and sound electrically. And when the story is recorded, then I put the solution in my spectacle—my movie projector.”

Another key parallel is the importance of university research, government agencies and military funding. MIT’s Lincoln Laboratory played a key role with the introduction of “The Sword of Damocles” head-tracking stereoscopic headset in 1968. NASA picked up the ball the following decade in hopes of creating systems that could simulate space missions here on Earth. Various militaries, meanwhile, developed heads-up displays (HUDs) that are in many ways precursors to modern AR.

As for the differences, well, the biggest is probably that robotics has played a fundamental role in our world for decades now. Entertainment is a tiny corner of a world that has transformed the way work is done in several industries, from manufacturing and warehouses to agriculture and surgery. I’ve heard a lot of people repeat the notion that “robotics is in its infancy.” While the sentiment is clearly true, it’s also important to acknowledge just how far things have already come.

Sketch from Morton Heilig’s 1962 patent. Image Credits: Morton Heilig

It’s a stark contrast to the decades of unfulfilled promise for VR. There have been flashes of brilliance, and it’s something that people feel they want (I, for one, would love a great consumer headset), but we’re all waiting with bated breath for the killer consumer and enterprise applications to propel the industry forward. I know I’m excited to find out what Apple’s been cooking up for the last eight or so years. Certainly the underlying technology has improved by leaps and bounds, which — much like robotics — is indebted to advances made by the smartphone industry.

Of course, I would suggest that — like VR — robotics has struggled on the consumer front. More than 20 years after the release of the first Roomba, robot vacuums remain an outlier. Conversely, the progress on the industrial side has been both steady and largely outside of most people’s day-to-day lives, which is, perhaps, why robotics hasn’t quite received the hype bubble of crypto or generative AI. Is it possible that quiet success has been counterproductive to industry PR?

In the end, this is probably for the best. Crypto has certainly been a victim of its own hype. And while the jury is still out on generative AI (which, for the record, I believe is substantially more interesting and useful than crypto), VR went through a similar experience in the ’90s. The technology hit a fever pitch during the decade but failed to live up to its own promise. The industry is still reeling from that cycle 30 years later.

Robotics also continues to play an important role in education beyond the university. It’s become a fundamental element of STEM (science, technology, engineering, and mathematics). Robots are cool, fun, exciting and therefore the perfect hook for getting kids interested in things like technology and engineering. When I was in Boston a couple of months back, I had a fascinating chat with Alexander Dale, the director of global challenges at MIT’s social impact accelerator, Solve.

He discussed a number of the interesting projects the initiative has supported over the years, but the work of Danielle Boyer really jumped out at me. A member of the Ojibwe tribe, Boyer has focused much of her young career on helping to bring the elements of STEM to the underserved members of indigenous communities through her not-for-profit STEAM Connection organization.

Image Credits: STEAM Connection

Boyer has developed low-cost robotics kits that can be provided to students free of charge (with help from corporate sponsors). Her organization has shipped more than 8,000 of her Every Kid Gets a Robot kit to date. Earlier this month, we sat down to discuss her work.

Q&A with Danielle Boyer

How did you get into this world?

I started teaching when I was 10 years old, because I saw that tech wasn’t accessible for my family and I. That was inspired by my little sister, Bree, and her interest in science and robotics. I started teaching a kindergarten class, and then when I got to high school, I was interested in joining a robotics team. It took a long time to save up to be able to do that, because it was $600. And I was the only native and one of the only girls. When I got there, I was bullied really badly. I definitely stuck out like a sore thumb.

Tell me a bit about your community.

I’m Ojibwe. I’m from the Sault tribe, which is in the Upper Peninsula of Michigan. I grew up all over Michigan. Right now my immediate family lives in Troy, Michigan, which is pretty close to Detroit. In my community, there [are] a lot of accessibility issues. Regarding tech, for indigenous peoples in general, we are the least likely demographic to have a laptop, to have access to the internet, to see role models in tech spaces. There’s a huge barrier preventing our youth from seeing themselves in STEM careers, and then actually taking the leap to get into those spaces. It’s not even how hard it is once you get to college, or how unrepresentative it is there — it starts even before that.

Are things beginning to trend in the right direction?

I’m 22. I have not been in tech too long. I don’t even have a tech degree. For me, observing tech patterns and accessibility to tech, there’s a lot of progress, because people are more and more seeing its importance, when they may not have seen the importance in the education surrounding it before. Oftentimes, some tribes are more interested than others. Oftentimes, it’s hard. It’s a hard sell for our community members who are older to see the benefits of technology. I got a comment on Instagram yesterday that was like, “There’s nothing more indigenous than robotics.” Oh, and they were really angry.

They were complaining about technology on Instagram.

Right. To me, it doesn’t matter if [something is] modern or not. We’re modern people. We can do modern things, but there’s also a lot of area for improvement.

You said you don’t have a degree in tech.

No, I dropped out of college.

What were you studying?

I was a double major in electrical engineering and mechanical engineering.

So you were interested in and around tech.

Yeah. I have been designing robots and I have a lot of experience in specifically CAD through SolidWorks, plus electrical engineering and circuit design. All of the robots I designed myself.

Image Credits: STEAM Connection

Are you still interested in entering robotics outside of the philanthropic and teaching spaces?

I would love to go into bio robotics and soft robotics, purely from a standpoint of it being interesting and fun. I don’t think I want a degree in it. I think it would be more research or a hobby. I do think that there is a lot of work to do in the educational space, and I do think that my time is definitely well spent over here. But in terms of being a nerd, I really want to build animal-inspired robots.

Is your current work a hobby that became a career?

Not really, because I was in high school, I was on the robotics team, and then I jumped immediately into my organization. After graduating, I had founded a lot of robotics teams, and I saw a need there. I invented my first robot — Every Kid Gets a Robot — when I was 18. It’s a simple RC-style, app-controlled robot that the kids can assemble. It’s a very simple robot that costs less than $18 to make, and it goes to the kids for free. That’s what I started out with.

What kinds of robots are you currently producing?

I make quite a few robots, but I’ll stick to the two that we make the most. The first one is Every Kid Gets a Robot. That’s the robot that costs less than $18 that goes to the kids for free. The goal is to get a positive representation of STEM into the kids’ hands as affordably as possible. So, basic STEM skills: wiring, programming, assembly, using tools, learning about things like battery safety.

The next robot that we make the most of is the SkoBot. “Sko” is reservation slang for “Let’s go.” It’s a robot that speaks indigenous languages. Basically, it sits on your shoulder with a GoPro strap. It senses motion through a PIR sensor, and then it speaks indigenous languages. It’s an interactive robot kids can talk to, and it can talk back. We also have a less intelligent version that just goes based off of motion sensing to play audio tracks, depending on the internet access. The entire robot is 3D printed, except for the electronics. That’s the robot that the kids are heavily customizing with different traditional elements. We’ve distributed 8,000 free Every Kid Gets a Robot kits. We’ve made a lot more. And then for the SkoBot, we’ve sent out 150.

Image Credits: STEAM Connection

Take me through that early development process. How did you create a kit for that price point?

I did it all through Amazon at first. I was looking for the parts and things, and I was just ordering stuff and doing it, trial-and-error style. To be honest, I didn’t completely know what I was doing when I started. I have some really awesome indigenous mentors in tech, who I’ve been able to ask questions to. I didn’t have a 3D printer at that point, so I was like, “Emergency. Need this CAD printed fast. Please overnight it to me,” because I always do things last minute. And then I was like testing out electronics that didn’t fit into the chassis of the robot. It was a very chaotic starting process where I was like, hey, I want to use an ESP32 development board. I had never used one before. But I had researched a lot about it, watched a lot of YouTube videos.

What is the process like in terms of working with tribes on custom robots?

It’s an interesting conversation, because we really want it to be youth led. A lot of the time it’s us driving the youth conversation surrounding how they want to see themselves represented and what they want conveyed. For example, we work with a school whose students want to have their traditional embroidery incorporated into the robot somehow. So, how do we get that into 3D printing form? Obviously, that’s a huge conversation of how do we represent all these colors accurately. A lot of times, we have people who make regalia for the robots like beaded earrings, ribbon skirts, tutus, hats, stickers, apparel, all those things, because I really wanted the robot to be something that is very unique to indigenous students, and customizable, like Mr. Potato Head–style.

I know you said that Every Kid Gets a Robot can go out to anybody, but do you continue to prioritize indigenous communities?

Yes, I prioritize indigenous communities. They go to up to 94% indigenous communities. We have sent them to other places as well, especially if a corporate sponsor is like, “We want this, but then also this,” and we can do that. I prefer sending it to indigenous youth, because most of the kids we’ve worked with have never even built a robot and don’t know a lot about STEM in general. I want to make that more accessible and interesting. We also branched into more of what the kids are interested in. We’ve been working on comics, and also augmented reality and implementing social media platforms.

News

Image Credits: Nvidia

Nvidia hit a $1 trillion market cap Tuesday morning. There’s already been a slight correction there, but the massive bump is a big vote of confidence for a chip maker that has done a great job diversifying. Robotics has been a big part of that, through efforts like Jetson and Isaac, which are designed to prototype, iterate and help ready systems for manufacturing. This week, the company added Isaac AMR to the list. The platform is designed specifically for autonomous mobile robotic systems.

CEO Jensen Huang outlined the system during a keynote at Computex in Taipei. Says the company:

Isaac AMR is a platform to simulate, validate, deploy, optimize and manage fleets of autonomous mobile robots. It includes edge-to-cloud software services, computing and a set of reference sensors and robot hardware to accelerate development and deployment of AMRs, reducing costs and time to market.


Image Credits: Serve Robotics

Nvidia also backed Uber spinout Serve Robotics, which this week announced plans to deploy “up to” 2,000 of its last-mile delivery robots in the U.S. “We expect our rapid growth on Uber Eats to continue,” co-founder and CEO Ali Kashani told TechCrunch. “We currently have a fleet of 100 robots in Los Angeles, and we expect to operate an increasing number of them on Uber Eats as our coverage and delivery volume on Uber increases.”

They look like Minions. It’s not just me, right?

Image Credits: Google DeepMind

Fun one to round out the week: Barkour, a Google DeepMind benchmark designed to help quantify performance in a quadrupedal robot. Honestly, this feels like a story written by AI to juice SEO traffic, but the division seriously built a dog obstacle course in their labs to help create a baseline for performance.

DeepMind notes:

We believe that developing a benchmark for legged robotics is an important first step in quantifying progress toward animal-level agility. . . . Our findings demonstrate that Barkour is a challenging benchmark that can be easily customized, and that our learning-based method for solving the benchmark provides a quadruped robot with a single low-level policy that can perform a variety of agile low-level skills.

City Spotlight: Atlanta

On June 7, TechCrunch is going to (virtually) be in Atlanta. We have a slate of amazing programming planned, including the mayor himself, Andre Dickens. If you are an early-stage Atlanta-based founder, apply to pitch to our panel of guest investors/judges for our live pitching competition. The winner gets a free booth at TechCrunch Disrupt this year to exhibit their company in our startup alley. Register here to tune in to the event.


Image Credits: Bryce Durbin / TechCrunch

Let Actuator be your inbox’s best friend.

Meet me at the dog run by Brian Heater originally published on TechCrunch

Google DeepMind introduces Barkour, a benchmark for quadrupedal robots

The bipedal humanoids may, in fact, be coming — but the quadrupeds are already here. They’re in labs, doing inspections in power plants and refineries, playing soccer and are even — much to the concern of many — becoming cops.

Boston Dynamics’ Spot is easily the most instantly recognizable of the bunch, but plenty of startups and research institutions have put their own spin on the category. Heck, even Xiaomi made one for some reason. While the purveyors of bipeds look to prove out their work, quadrupeds are getting the job done.

The team at Google’s DeepMind (which recently absorbed a large chunk of Alphabet’s beleaguered Everyday Robots team) just issued a research paper outlining a potential benchmarking system to quantify the performance of these machines. With a name like “Barkour,” one has to wonder whether the department worked backward from the title.

Google Research points to the various impressive feats accomplished by quadrupeds over the years, from hiking up mountains to running and jumping (“flipping is much easier than walking,” as an MIT professor once told me), but there hasn’t really been a baseline for determining system efficacy.

Image Credits: Google

Given that these machines are inspired by animals, the research team determined that real animals would provide the best performance analog for their robotic counterparts. That meant setting up an obstacle course in the lab and having a dog run it — check out the tenacious little wiener above. The course comprised four obstacles in a 5×5 meter area, which the team notes is denser than the dog shows that inspired it.

Performance is rated on a scale of 0 to 1, pegged to whether the robot can successfully cross the space in the 10 or so seconds it takes for a similarly sized dog to do so. Various penalties apply for slow speeds and for skipping or failing obstacles on the course. Google concludes:

We believe that developing a benchmark for legged robotics is an important first step in quantifying progress toward animal-level agility. […] Our findings demonstrate that Barkour is a challenging benchmark that can be easily customized, and that our learning-based method for solving the benchmark provides a quadruped robot with a single low-level policy that can perform a variety of agile low-level skills.
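To make that scoring scheme concrete, here’s a minimal sketch of a 0-to-1 agility score with penalties for slowness and missed obstacles. The penalty weights are assumptions for illustration, not values from DeepMind’s paper:

```python
def barkour_score(course_time_s, target_time_s, obstacles_failed):
    """Toy 0-to-1 agility score: start from a perfect 1.0 and subtract
    penalties for skipped/failed obstacles and for finishing slower than
    a small dog's ~10-second run. Weights here are invented."""
    score = 1.0
    score -= 0.2 * obstacles_failed                         # assumed per-obstacle penalty
    score -= 0.1 * max(0.0, course_time_s - target_time_s)  # assumed per-second slowness penalty
    return max(0.0, score)

print(barkour_score(9.8, 10.0, 0))   # 1.0: on pace, all obstacles cleared
print(barkour_score(12.5, 10.0, 1))  # 0.55: one miss plus 2.5 seconds over
```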

The org says that Barkour has proven an effective benchmark even in the face of inevitable unexpected events and hardware issues. The robot dog used in the trial was able to stand back up and return to the starting line on its own in the case of failure.

Google DeepMind introduces Barkour, a benchmark for quadrupedal robots by Brian Heater originally published on TechCrunch

Serve Robotics to deploy up to 2,000 sidewalk delivery bots on Uber Eats

Serve Robotics, the Uber spinout that builds autonomous sidewalk delivery robots, is expanding its partnership with Uber Eats. The Nvidia-backed startup will now deploy up to 2,000 of its cute little bots via Uber’s platform in multiple markets across the U.S.

The partnership is slated to last through the beginning of 2026.

This expansion not only validates Serve’s goal to mass commercialize robotics for autonomous delivery, but it also signals that Uber is furthering its commitment to autonomy. Last week, Uber announced Waymo’s autonomous vehicles would be available for ride-hail and delivery on Uber’s platform starting in Phoenix later this year. Uber is also working with Motional to deliver food in Santa Monica via the company’s self-driving Hyundai Ioniq 5s.

Serve and Uber’s partnership began a year ago as a pilot in West Hollywood. Since then, Serve’s robotic deliveries with Uber have grown more than 30% month-over-month, with over 200 restaurants in West Hollywood, Hollywood and Fairfax now participating. The bots now operate seven days a week from 10 a.m. to 9 p.m., according to Ali Kashani, co-founder and CEO of Serve.

“We expect our rapid growth on Uber Eats to continue,” Kashani told TechCrunch. “We currently have a fleet of 100 robots in Los Angeles, and we expect to operate an increasing number of them on Uber Eats as our coverage and delivery volume on Uber increases.”

That fleet is also shared with Serve’s other partners, like 7-Eleven, which recently launched robotic sidewalk delivery with Serve in LA.

Serve didn’t say which markets would be next for its partnership with Uber, but the company is eyeing San Jose, Dallas and Vancouver. The startup has also recently completed pilots with Walmart in Arkansas and with Pizza Hut in Vancouver.

The companies also didn’t disclose the value of the deal, but Kashani said that Serve’s business model is delivery-as-a-service, meaning Serve gets paid after completing each delivery.

Serve describes its sidewalk bots as capable of Level 4 autonomy. Level 4 is a designation by the Society of Automotive Engineers (SAE) that means the vehicle can handle all aspects of driving in certain conditions without human intervention. However, when robots come across edge cases, such as police tape or construction, they stop and ask for remote supervisors to intervene. Kashani says the robots can reroute if they face obstacles or obstruction.

Supervisors also help with street crossings, but Serve’s robots can predict driver inattention and avoid collisions on their own.

“This is something remote supervisors can do little to help with given how quickly cars appear, versus the delays caused by network latency and human reaction time,” said Kashani.
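As a rough sketch of that division of labor (invented for illustration; this is not Serve’s actual software), the robot’s local logic only needs to distinguish known edge cases, which stop the robot and page a human, from ordinary obstructions it can replan around, while time-critical collision avoidance stays onboard for the latency reasons Kashani describes:

```python
from dataclasses import dataclass

EDGE_CASES = {"police_tape", "construction"}  # the cases the article says trigger a human

@dataclass
class Obstacle:
    kind: str
    blocks_route: bool

def next_action(obstacle: Obstacle) -> str:
    """Hypothetical fallback policy: escalate edge cases, reroute around the rest."""
    if obstacle.kind in EDGE_CASES:
        return "stop_and_page_remote_supervisor"  # human takes over
    if obstacle.blocks_route:
        return "reroute_around_obstacle"          # robot replans on its own
    return "continue"

print(next_action(Obstacle("police_tape", True)))     # stop_and_page_remote_supervisor
print(next_action(Obstacle("parked_scooter", True)))  # reroute_around_obstacle
```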

Serve Robotics to deploy up to 2,000 sidewalk delivery bots on Uber Eats by Rebecca Bellan originally published on TechCrunch

All the Nvidia news announced by Jensen Huang at Computex

Jensen Huang wants to bring generative AI to every data center, the Nvidia co-founder and CEO said during Computex in Taipei today. During the talk, which Huang said was his first public speech in almost four years, he made a slew of announcements, including chip release dates, its DGX GH200 supercomputer and partnerships with major companies. Here’s all the news from the two-hour-long keynote.

1. Nvidia’s GeForce RTX 4080 Ti GPU for gamers is now in full production and being produced in “large quantities” with partners in Taiwan.

2. Huang announced the Nvidia Avatar Cloud Engine (ACE) for Games, a customizable AI model foundry service with pre-trained models for game developers. It will give NPCs more character through AI-powered language interactions.

3. Nvidia’s CUDA computing model now serves four million developers and more than 3,000 applications. CUDA has seen 40 million downloads, including 25 million in the last year alone.

4. Full volume production of GPU server HGX H100 has begun and is being manufactured by “companies all over Taiwan,” Huang said. He added it is the world’s first computer that has a transformer engine in it.

5. Huang referred to Nvidia’s 2019 acquisition of supercomputer chipmaker Mellanox for $6.9 billion as “one of the greatest strategic decisions” it has ever made.

6. Production of the next generation of Hopper GPUs will start in August 2024, exactly two years after the first generation started manufacture.

7. Nvidia’s GH200 Grace Hopper is now in full production. The superchip boasts 4 petaflops of transformer engine performance, 72 Arm CPU cores connected by chip-to-chip link, 96GB of HBM3 and 576GB of total GPU memory. Huang described it as the world’s first accelerated computing processor that also has a giant memory: “this is a computer, not a chip.” It is designed for high-resilience data center applications.

8. If the Grace Hopper’s memory is not enough, Nvidia has the solution: the DGX GH200. It’s made by first connecting eight Grace Hoppers into a pod with three NVLink switches, then joining 32 of those pods through another layer of switches, linking a total of 256 Grace Hopper chips at 900GB/sec each. The resulting system delivers an exaflop of transformer engine performance, has 144TB of GPU memory and functions as one giant GPU. Huang said the Grace Hopper is so fast it can run the 5G stack in software. Google Cloud, Meta and Microsoft will be the first companies to have access to the DGX GH200 and will perform research into its capabilities. (A quick arithmetic check on those figures appears after this list.)

9. Nvidia and SoftBank have entered into a partnership to introduce the Grace Hopper superchip into SoftBank’s new distributed data centers in Japan. They will be able to host generative AI and wireless applications in a multi-tenant common server platform, reducing costs and energy.

10. The SoftBank-Nvidia partnership will be based on the Nvidia MGX reference architecture, which is currently being used in partnership with companies in Taiwan. It gives system manufacturers a modular reference architecture to help them build more than 100 server variations for AI, accelerated computing and Omniverse uses. Companies in the partnership include ASRock Rack, Asus, Gigabyte, Pegatron, QCT and Supermicro.

11. Huang announced the Spectrum-X accelerated networking platform to increase the speed of Ethernet-based clouds. It includes the Spectrum-4 switch, which has 128 ports of 400Gb/sec Ethernet for 51.2Tb/sec of total switching capacity. The switch is designed to enable a new type of Ethernet, Huang said, and was designed end-to-end to do adaptive routing, isolate performance and do in-fabric computing. It also includes the BlueField-3 SmartNIC, which connects to the Spectrum-4 switch to perform congestion control.

12. WPP, the largest ad agency in the world, has partnered with Nvidia to develop a content engine based on Nvidia Omniverse. It will be capable of producing photos and video content to be used in advertising.

13. The robot platform Nvidia Isaac AMR is now available for anyone who wants to build autonomous mobile robots, and is full-stack, from chips to sensors. Isaac AMR starts with a chip called Nova Orin and is the first full-reference robotics stack, said Huang.
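As a quick arithmetic check on the DGX GH200 figures in item 8 (assuming, per Nvidia’s public GH200 specs rather than this recap, that the 576GB per superchip breaks down as 96GB of HBM3 plus 480GB of LPDDR5X):

```python
# Two-tier NVLink fabric: 8 superchips per pod, 32 pods joined by a second switch layer.
chips_per_pod = 8
pods = 32
chips = chips_per_pod * pods        # 256 Grace Hopper superchips
memory_per_chip_gb = 96 + 480       # HBM3 + LPDDR5X = 576GB per superchip (assumed split)
total_memory_tb = chips * memory_per_chip_gb / 1024
print(chips, f"{total_memory_tb:.0f}TB")  # 256 chips, 144TB, matching Huang's numbers
```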

Thanks in large part to its importance in AI computing, Nvidia’s stock has soared over the past year, and it currently has a market valuation of about $960 billion, making it one of the most valuable companies in the world (only Apple, Microsoft, Saudi Aramco, Alphabet and Amazon are ranked higher).

All the Nvidia news announced by Jensen Huang at Computex by Catherine Shu originally published on TechCrunch

SUNDAY REWIND: Robotics and society by Cynthia Yeung

This week’s Sunday Rewind is a #mtpcon SF+Americas keynote from Cynthia Yeung, Head of Product at Plus One Robotics. It’s a talk that gets us thinking about human/robot interaction and how we might discover a deeper understanding of our own humanity through robotics and AI. Cynthia reflects on the physical, economic and social context of …

The post SUNDAY REWIND: Robotics and society by Cynthia Yeung appeared first on Mind the Product.

Venture leasing: The unsung hero for hardware startups struggling to raise capital

Global funding in February 2023 fell 63% from the previous year, with only $18 billion in investments. For robotics startups, it didn’t get any better: 2022 was the second worst year for funding in the past five years, and 2023 numbers are heading in the same direction.

This behavior from investors in the face of uncertainty and austerity is justified, especially when hardware companies burn cash faster than SaaS does. So, founders of robotics startups and other equipment-heavy businesses are left wondering whether they’ll be able to close their next funding round or if they’ll have to resort to acquisition.

But there’s a happy medium between costly debt loans and VC funding that works particularly well for hardware startups: venture leasing.

Hardware startups are better suited than software companies for this kind of financing because they have tangible assets, which offset the industry’s high-risk nature by serving as collateral.

As the CEO of a robotics startup that recently got a $10 million venture leasing deal, I’ll outline the advantages of this type of agreement for hardware companies and how to land a win-win deal when closing a round isn’t an option.

Why are venture leasing deals compatible with hardware startups?

As opposed to a few developers here and there in SaaS, hardware companies require intensive research and development (R&D), capital expenditures (CapEx) and manual labor to manufacture their products. So, it’s no surprise that hardware startups burn cash at more than two and a half times the rate of software companies.

Hardware startups are constantly trying to avoid dilution when raising funds due to their capital-heavy operations. Therefore, venture leasing can be a relief for founders as it gives them the money they need up-front without compromising their company’s equity.

Rather than taking a piece of a company’s shares or equity, venture leasing uses the business’s physical assets as collateral to secure the loan, making it easier for startups to obtain. It’s also a lower-risk investment for the lender and allows the company to keep 100% of its ownership.

These deals work like a car lease, where the bank technically owns the car (the manufactured product) while the startup pays a monthly installment to keep it and, in most cases, operate it however they want. Lenders are often more flexible with their agreement terms than other funders.
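For a feel of the math, here’s a minimal sketch of the standard lease-payment calculation. The dollar figures, implied rate and residual value below are illustrative assumptions, not terms from any actual deal:

```python
def monthly_lease_payment(principal, annual_rate, months, residual_frac):
    """Payment that amortizes the equipment's cost down to a residual value
    the lessor keeps at the end of the term (standard annuity math)."""
    i = annual_rate / 12.0                          # monthly interest rate
    residual = principal * residual_frac            # equipment value at term end
    pv_of_residual = residual / (1 + i) ** months   # discounted back to today
    return (principal - pv_of_residual) * i / (1 - (1 + i) ** -months)

# Illustrative only: $10M of robots, 12% implied rate, 36 months, 15% residual.
print(f"${monthly_lease_payment(10_000_000, 0.12, 36, 0.15):,.0f}/month")  # ~$297,000
```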

Beyond avoiding dilution, leasing theoretically moves a company’s equipment off its books as a capital asset, allowing for more efficient margins in terms of profitability.

The added plus: Boosting Equipment-as-a-Service

With venture leasing, a startup can lease assets such as equipment, real estate, or even intellectual property from a specialized leasing company. They receive the assets in return for a monthly lease payment over a fixed term, typically shorter than traditional financing.

Venture leasing: The unsung hero for hardware startups struggling to raise capital by Walter Thompson originally published on TechCrunch

Motor City mechatronics

A number of native Detroiters have asked me what I think about their city so far. The simple answer is that I don’t feel qualified to offer much insight yet. I’ve been here for roughly three days as I write this, and I haven’t seen all that much of the city. Ask me what I think about the Huntington Place convention center, on the other hand, and I can speak with great authority.

I admit that I’m still overly impressed by the fact that, when you walk through the building’s rear entrance, you’re suddenly face-to-face with Canada. It’s something that I may well not have realized were it not for the giant red and white maple leaf flag flying on the far bank of the Detroit River.

I briefly entertained the notion of crossing the border for dinner, for the sole purpose of telling friends that I had dinner in another country, but hailing a rideshare across the bridge wasn’t as straightforward as I’d been led to believe — a problem exacerbated by the fact that it was Victoria Day on Monday (which appeared to visibly impact attendance on the first day of the show).

One thing you’ll find with Detroit is that the people who live there are fiercely loyal to their home — a common characteristic among Rust Belt cities. They cheerlead for the city to all who will listen and defend it with ferocity if you’re foolhardy enough to criticize it. At the same time, however, citizens aren’t blind to — nor do they ignore — decades of struggle. If you know one thing about the city beyond its professional sports teams, it’s likely that its destiny has been shaped by manufacturing more so than that of any other major American city.

Detroit is an industry town, in the purest sense of the term. It’s well positioned in the middle of the country. It has ready access to iron and timber, along with rivers and trains that provide a straight shot to ship heavy machines to and from other major American cities like New York and Chicago. Michigan native Henry Ford incorporated his company in the Detroit suburb Dearborn in 1903. His deeply problematic shadow still looms large over the city, much like Carnegie in Pittsburgh. Ask someone what you should see during your short time in town, and the Henry Ford Museum will invariably make the list. (I opted to spend my limited non-show time at the very good Detroit Institute of Arts instead. Take that, Hank!)


Image Credits: Jose Francisco Arias Fernandez / EyeEm / Getty Images

GM acquired its way into Detroit after being founded 68 miles northwest in Flint in 1908. A year later, the company came within $2 million of adding Ford to its growing list of subsidiaries. (Add that to the list of historical hinge points for the sci-fi book you’re writing.) Walter Chrysler was the last of the big three, founding his namesake corporation out of the ashes of the Maxwell Motor Company in 1925.

If you’re aware of all that, you almost certainly know the other side of the coin: what happens when an industry town’s industry leaves town. While the big three still operate out of the area, the ghost of manufacturing still haunts it. First came decentralization outside of the city proper, then broader economic troubles and the resulting slowdown in car sales, while increased competition from overseas loosened the city’s — and the country’s — stranglehold on the automotive industry.

Image Credits: Brian Heater

Obviously the full story is far more complex than all of that. Politics and racial inequities played key roles as well. A city so inexorably tied to the automobile was far more motivated to embrace freeways over public transit. These sorts of decisions have a way of increasing both economic and racial disparities. Meanwhile, much of the white population skipped out of the city proper, in favor of the suburbs, which now include some of the wealthiest zip codes in the country. The gulf between the rich and the poor has a way of widening the later you get into capitalism’s progression. I’m reminded of this every time I head home to San Francisco, where people are forced to live out on the streets in front of some of the world’s richest corporations.

Much like my first visits to cities like Pittsburgh and Baltimore, I had certain expectations heading to Detroit for the first time. From my extremely limited experience in both, it seems like Pittsburgh and Detroit, in particular, are on similar paths, but the former has a sizable head start.

One thing many who visit the city will point to are the abandoned buildings. The population decline can’t be ignored. In prosperous post-war 1950, the city claimed 1.85 million residents, making it the fifth largest American city. The 2021 census, however, puts the figure at 633,000. Devoid of people, these once-thriving organisms begin to feel like monuments.

For years, my distant impression of the city has been flavored by stories of revitalization. For one thing, population decreases lead to rent drops, which can, in turn, birth a thriving arts scene. The word “renaissance” has been thrown out many times in recent decades.  The fact of life in 2023 is that living in places like New York, Los Angeles and San Francisco can be unattainable on an artist’s income. In recent decades, Detroit has offered a compelling cross section of cheap rent and rich culture — some of the greatest music in the history of the world was produced here. I took a car straight to the Motown house after I landed. There’s something truly magical about a place that gave us Diana Ross, the MC5, Detroit techno, Dirtbombs, The White Stripes and Danny Brown.

At the end of the day, however, a thriving art scene is great at attracting young people but is unfortunately rarely an epicenter for a thriving economy. I’ve also long heard whispers about the return of manufacturing to the city. Certainly Detroit has the infrastructure necessary, and the continued presence of automotive headquarters is key, especially as more companies look to decentralize and localize manufacturing due both to ongoing supply chain concerns and the very real possibility of deepened U.S.–China tensions.

Image Credits: Brian Heater

“I think one of the challenges that we saw exposed during COVID was supply chain issues,” president of the Association for Advancing Automation (A3) and Detroit native Jeff Burnstein told me during a conversation this week. “It’s not easy to do, but a lot of companies would like to bring more manufacturing back to North America. You can’t just rip up your supply chain, of course. . . . But they were able to do it because of automation. One of the limiting factors [for reshoring] is that we don’t have enough people who are skilled.”

There do appear to be some good initiatives in place, and I’m increasingly hearing stories from hardware startups that have been pushing to bring assembly and/or manufacturing closer to home. True, lasting success takes time and a lot of money, along with the concerted efforts of both industry and government. I went to an event at Newlab’s shiny new Detroit offices last night. The space and the growing number of startups point to key dollars entering the market.

The campus is located in a fairly remote part of the city, near the Ambassador Bridge, which carries roughly one-quarter of all U.S./Canada merchandise trade each year. The biggest benefit of the area is that there’s plenty of room to grow and — in theory — help foster a thriving startup community. In addition to big tech companies like Google and the native StockX, smaller startups from the area are starting to get national recognition. RoboTire, the robotic tire-changing firm we’ve written about a few times over the years, served as a kind of poster child at this week’s show. Certainly the company is a great example of leveraging Detroit’s automotive expertise as the foundation of something new and bleeding edge.

Image Credits: Brian Heater

The continued presence of the automotive industry is probably the single largest reason why the Automate show is located here. Burnstein gave me a broad overview.

“Our show had a lot of names,” he explained, offering a quick history of the event. “It started as the Robotics Show in the ’70s. Robots were going to be the next industrial revolution. The show was so big that — I think it was in 1982 — they had to close the escalator to the basement where the show was, because the fire marshal said there were too many people. There were like 30,000 people. It was almost as big as it [is] now. The show followed the fortunes of the industry, which went downhill in the mid-’80s.”

Burnstein lays the blame for the implosion on General Motors’ decision to cut robotics orders. As the major driver of the early industrial robot industry, the decision had a profound impact on the burgeoning category.

“Detroit was the natural home, but then the auto industry stopped buying so much,” he adds. “Our show said we can’t do it every year, and let’s find other places to do it in. We were in Chicago for two decades.” During that time, the event joined forces with the Assembly Show and later MHI’s ProMat. Ultimately both ProMat and Automate grew to a point where each event evolved back into its own separate show; they are now two of the country’s largest robotics events.

ProMat remains the larger of the two shows. It’s also more focused on a single industry. When I first started discussing the possibility of attending both this year, ProMat was pitched to me as a logistics show and Automate as manufacturing. ProMat started life focused on that space but has increasingly grown into an automation show as robots have begun to have an outsized influence on the industry. There’s certainly logistics at this event (Locus and Zebra/Fetch were both present, for example, albeit in much smaller booths than at ProMat), but manufacturing — specifically automotive — still feels like its lifeblood. A Fanuc arm holding up a sports car has been the show’s iconic visual for years. Fittingly, you’ll also see a giant Fanuc “Let’s Talk About Automation” ad on the side of an airport parking garage as you arrive — something I’m told is not an Automate-only feature.

Smaller robotics startups were less of a presence at Automate, though they were there in some forms, like the Pittsburgh consortium that brought Shift Robotics’ Moonwalker shoes and drone inventory startup Gather AI. Most of my interactions with founders occurred at after-parties like the one at Newlab and an event put on by robot operating system (ROS) stewards Open Robotics and ROS users PickNik Robotics and Foxglove.

Image Credits: Getty Images / Daryl Balfour

I did, however, line up a couple of chats with some bigger names, including Jim Lawton, the Rethink and Universal Robots vet who heads up robotics and automation over at Zebra Technologies. If you’ve read about Zebra in this newsletter in the past, it’s because of the company’s 2021 acquisition of Fetch. I’ve likened the deal to Amazon’s acquisition of Kiva from the standpoint of a company buying an existing startup to serve as the foundation of a broader robotics play. In a certain way, it’s actually closer to the Shopify/6 River Systems deal, in the sense that Amazon suddenly left a lot of customers in the lurch after it cut off third-party clients.

Of course, the Shopify situation went pear-shaped as the Canadian e-commerce giant sold off 6 River amid news of far-reaching layoffs. Fetch is in third place in the category behind 6 River and Locus, the latter of which is the biggest player by a wide margin. Zebra’s acquisition was clearly an ecosystem play — effectively a bid to start selling robots to customers of its existing services.

Says Lawton:

As the market has matured, customers who are looking at automation now want solutions to problems. They’re not robotic tinkerers. For a while, it was “These are cool. How fast are they?” Now it’s “How much productivity can I get? How much increased capacity can I get?” A lot of that comes from taking robots, these devices, and the ability to control other things in the warehouse, like getting robots up to the mezzanine level, meaning the robot is able to activate the elevator. If I’m going to take a tote off a robot and put it onto a conveyor, I need to activate the conveyor, I need to activate the robot. We have an IoT gateway device that we use to orchestrate all of the other things. What they want is a warehouse workflow optimization tool that happens to involve robots.

We also discussed my favorite topic of recent vintage: interoperability. Lawton again:

I think it’s going to take longer than people think it is. The idea of seamless interoperability is not something we’re going to see a lot of over the next couple of years. It’s going to take some time. I know the markets are different, but there is some precedent on the manufacturing side. Robotic arms have been in manufacturing since the 1960s, and we still don’t have [interoperability]. It’s going to take some time. There are reasons interoperability is more important in the warehouse space. A robotic arm is integrated into a cell. These kinds of robots are a little bit different, so I think you’ll see the pace get a little bit faster.

Q&A with Melonee Wise

Image Credits: TechCrunch

Early this week, we were the first with the news that Fetch founder Melonee Wise had left Zebra and joined forces with Agility, where she’ll serve as CTO. The Willow Garage vet knows as much about warehouse robots as just about anyone on the planet. It’s a great hire for Agility, and an interesting challenge for Wise as the company continues to explore commercial applications with Digit.

Wise and I discuss the move and a bunch more:

Let’s start with Agility.

I’m so excited.

I’m sure you’ve known them for a long time.

There was — let’s call it a seminal moment — where [CEO Damion Shelton] and I were at a conference and we ended up on a panel together talking about why robot technology has such a hard time getting out of the university. What is it about robotics technology and crossing the chasm from a cool science project to a full production company that’s shipping robots?

You had already had success with that. It’s scalability, repeatability.

It’s also about what to focus on, when. There’s a lot of perfecting the imperfect that happens where we’re hyperfocused on minutia from a technology perspective that the customer doesn’t care about. If you look at logistics, primarily what the customers care about — if they could just wave a magic wand and get it — they want a teleporter.

To get a product from one place to another.

Yeah. We, as roboticists, are building all these robots to achieve basically that ask. Because they want a teleporter, the customer has a hard time articulating what the need is. We as roboticists — because we don’t get great specifications — we have the tendency to try to really rathole on the things that we know.

You over-engineer.

Yeah. Instead of just throwing it out into the world, letting it fail and then building the thing that customers want.

There’s an extent to which you can fail when starting a company. If Digits went out there, started falling over and lighting on fire, that would be a problem.

Absolutely. But I think there’s a thing that happens in early-stage startups that decides if they’re going to be successful or not. I believe the success point is when you’re at your first customer and it’s not going perfectly.

When you’re still small.

Yeah. It’s that customer that really shapes and helps you define your product — or a set of customers. It’s not going to go well there. No matter how you like to spin it, it doesn’t go well — unless you’re in software and you really have the ability to change things quickly. With robotics hardware, you’re kind of stuck with the thing that you made, and a lot of what you’re doing when you get into your first customer is trying to tweak it or trying to put candy wrapping around it. And then you make a small iteration in software or hardware, and you get closer and closer to what everyone wants to buy. But it’s from that failure that you get a directed purpose for success.

Fetch had that time as a smaller company. Agility’s product is so visually interesting, and it got so much press, that suddenly Ford is interested. That’s a lot of pressure at an early stage.

Yeah, it is. I think that we’re trying as a company as part of scaling to choose customers that we have an opportunity for codevelopment with. It’s not to say that we’re not focusing on the other big customers that are with us; it’s just that we’re diversifying our strategy to make sure that we have these opportunities to learn in a non–high publicity environment.

Also, you find a warehouse so it can be applied to other warehouses. The Ford learnings can be applied to other things.

One of the things that we have been trying to get to is the solution. Fetch struggled with that for a very long time. We started by selling a robot product into the market. But we eventually became a solution.

It makes sense. You started as roboticists, not warehouse experts.

I think that all robotics companies go through that originally. It just matters what end of the spectrum you’re coming from. If you look at the story of Locus, they were product-first people, robotics second.

They were a logistics company that felt like they had to make robots.

Yeah. Their solution came to market a lot more mature, but their robotics hardware came to market less mature. It depends on where you are on the spectrum. Agility, like many robotics companies, comes from a very tech-heavy perspective. There’s still learning about what we have to do, which I’m excited about. That’s why I’m there. I think that we are quickly narrowing in on what the end cases are.

There’s a big, ongoing debate around the necessity of legs.

Wheeled robots struggle with a lot of scenarios. I would say the other better case to talk about with wheeled robots is every time you want to do a task like Digit does, you need a specialized accessory to do it.

In terms of an arm to lift it up?

Yeah. Or in the case of a mobile robot, you wouldn’t solve that with arms. You would most likely make a lifting conveyor piece. The robot would have a little conveyor, it would have a way of grabbing the tote, like a sliding arm, and then shuffle it onto the robot. Then the robot would kick it out onto the conveyor. The problem is that it’s a whole other special accessory that you have to build. At Fetch we had mobile platforms, but most of our business eventually became making specialized accessories for the different vertical applications we cared about. The Digit promise is that we are able to attack more vertical use cases with less hardware modification. We have a much more — let’s not say general purpose — but multipurpose [way]. It’s a lot easier to envision a single piece of hardware being used for multiple purposes in a facility for these types of activities.

There are certain things that wheeled robots just can’t do. We have a hard time in general with going over bumps or ramps, largely because it causes problems with safety and with localization and mapping. One of the more hilarious things about going up a ramp with a mobile robot is that as you approach the ramp, it looks more and more like a wall.

Wile E. Coyote syndrome.

There’s some nuance there. That is not to say that mobile robots aren’t a good solution for a lot of use cases. It’s just that when you’re starting to look at this vision of end-to-end automation with these different agents, there’s a class of things that certain agents are good at, and there’s a class of things that other agents are good at.

Is it too difficult — or impossible — to put the arm solution on an AMR?

As someone who’s built AMRs with arms — Fetch had a research project that had an arm; we sold about 100 of them over five or six years to research institutes — I think it’s hard to do bimanual manipulation on a mobile manipulator because of the hardware constraints of putting [on] two robot arms, like Kuka’s.

There’s a difference between a Kuka arm and the simpler arms on Digit.

I would say that still — even to implement the framework that Digit has on a mobile platform, there are some limitations that you would run into. There are some advantages to having a base with legs. One of the things you can’t do with mobile platforms is get below the top of the platform. Digit has a lot of capability in terms of crouching and reaching that is challenging for mobile platforms, from a stability perspective and from a footprint and space perspective.

Agility Robotics’ Digit

Image Credits: Agility Robotics

Is the [Digit] arm versatile enough now to continue to diversify the tasks it can perform?

We’re doing some tweaks to the arm, so you’ll see a bit more complexity come into the arm to deal with some of the challenges with the payload moving inside the tote.

In terms of throwing the robot off-balance?

In terms of being able to keep the tote level and things like that. I think we are on a path that makes sense, with the right level of complexity to solve the right level of problem. I don’t think we as Agility have ever said we’re making complex, dexterous mobile manipulators with five-finger hands. I’m not convinced that’s the right way to approach the problem in general.

What does the path look like for Agility, going forward? Does everything continue to revolve around Digit?

Let me preface this by saying I’ve been with Agility for five whole days. I would say that there’s going to be a large focus on Digit for the foreseeable future, but I think that we are going to have to expand our automation story. Fetch did the same thing. We started with a robot, and then we had a robot that interacted with conveyors and these types of devices. Digit’s equivalent is that we have to interact with conveyors and put walls.

What’s the solution for Digit? Accessories or a pure software play?

There will be accessories; there will be a software platform that allows us to connect. We’re going to be expanding our fleet management platform and things like that, and also expanding to connect to standard automation tools and partnering with companies that are well known in the automation industry.

Is there a lot of overengineering happening on the humanoid side of the robotics world?

It’s hard to say, because it’s hard to say what the use case is. It would depend on what you’re trying to do. Take robotic picking: How many picking cells have you seen that have five-fingered hands?

They’re all suction cups now.

Suction cups and soft grippers. We first need to ask ourselves: What is the use case? If you look at the manufacturing and logistics domains, there’s enough prior art that shows five-fingered hands are not necessary.

The hand is just one example, but it does get to the broader point about potentially overengineering.

I’m hesitant to say “overengineering.” I would say it’s just poorly targeted product design.

We talked a bit about your decision to join Agility, but why was this the right time to leave Zebra?

To be more clear, I went from being the CEO to the CTO of the robotics business unit when we transitioned into Zebra. It was a big change for me, and although the work was interesting, it was not as fulfilling as I wanted it to be, personally. After about 18 months of being at Zebra, I decided it was the right time for me to leave, and I decided to take some time off. Running a company for seven years — and having a plethora of different life events happen during that period — I needed a break. I took six months off. I think it was a good thing for me, personally. We don’t talk enough about founder health and things like that.

Being a CTO is obviously still a hard job, but it almost sounds like a relief to be able to just focus on the tech stuff.

I’m not gonna lie. Moving to Agility is going to be — it’s weird to say this — relaxing to not have some of the burden of being CEO. I’m really excited about that.

You were doing a lot of things you didn’t study for.

Yeah, and also there’s a lot of weird pressure that falls on you as a CEO. For me, some of those pressures were a little bit more extreme, being a woman. And fundraising is not my favorite activity. I, personally, will find it very relaxing to be the CTO of Agility, because I won’t be the CEO. I think Damion is a great CEO and I think he’s doing work that is thankless.

How are you and [former CTO turned chief robotics officer] Jonathan Hurst going to work together?

The way that I see it is Jonathan is very focused on building the innovation pipeline for the robotics hardware. He created Cassie and all of the iterations before that. He is going to be building the future technology that Digit will one day rely on. My focus is going to be more product-centric.

More news

On the humanoid robot front (where we seem to spend a lot of our time these days), I sat down with Apptronik CEO Jeff Cardenas in the otherwise empty Huntington Place cafeteria to discuss the Austin company’s plans to unveil exactly that this summer. The company didn’t have a presence on the floor, but Cardenas had a slideshow on his MacBook that he was sharing with a select few, including myself. It started with the company’s varied history.

“[The] exoskeleton was liquid cooled,” he told me. “We learned a lot doing that. The complexity of the system was too high. It was heavy. We remotized all of the actuators. And then we started to realize what was the simplest version of a humanoid robot: a mobile manipulator. We started getting approached by a lot of folks in logistics, who didn’t want to pay for manufacturing arms. They were too precise for what they need. What they wanted was an affordable robotic logistics arm.”

I can’t share images of the system with you at this point, but I can describe what I saw. Quoting myself:

Cardenas shows me images — both renders and photos — of Apollo, the system it plans to debut this summer. I can’t share them here, but I can tell you that the design bucks the kind of convergent evolution I’ve described, which found Tesla, Figure and OpenAI-backed 1X showing renders with a shared design language. Apollo looks — in a word — friendlier than any of these systems and the NASA Valkyrie robot that came before it.

It shares a lot more design qualities with Astra. In fact, I might even go so far as to describe it as having a cartoony aesthetic, with a head shaped like an old-school iMac and a combination of button eyes and a display that comprises the face. While it’s true that most people won’t interact with these systems, which are designed to operate in places like warehouses and factory floors, it’s not necessary to embrace ominousness for the sake of looking cool.

Figure office, interior

Image Credits: Figure

Apptronik will be exploring a Series A this year, once the robot is revealed and — hopefully — drives investor interest. Meanwhile, one of the company’s chief competitors, Figure, just announced its own $70 million Series A, shortly after its own humanoid robot took its first steps days before the company’s first anniversary.

“We’re focused on investing in companies that are pioneers in AI technology, and we believe that autonomous humanoid robots have the potential to revolutionize the labor economy,” investor Parkway Venture Capital’s Jesse Coors-Blankenship said in a prepared statement. “We are impressed by the rapid progress that Brett and the team of industry experts at Figure have made in the last year and are thrilled to be a financial partner to provide resources to accelerate the commercialization of Figure 01.”

Image Credits: Brian Heater

I’m finishing up this week’s Actuator from Gate A17 at the Detroit Metropolitan Wayne County Airport in Romulus, MI (track nine from Sufjan Stevens’ seminal 2003 album, Michigan). I capped off my first Automate with dinner at Huntington Place’s Grand Riverview Ballroom for the biennial Joseph F. Engelberger Robotics Award (though it seems the show will soon be going yearly, along with Automate itself).

The award’s namesake, Joseph F. Engelberger, is credited with co-developing Unimate, the first industrial robot, alongside George Devol in the 50s. The arm would eventually be installed in a General Motors assembly line, making it an innovation decades ahead of its time. The award is regarded as one of the industry’s most prestigious (A3 likes to call it the “Nobel Prize of Robotics”).

Jeff Burnstein fittingly received one of the awards, alongside longtime Universal Robots employee Roberta Nelson Shea. Burnstein’s story was a nice one — that of a Detroit native who has had a front-row seat for the dizzying highs and lows of his city and industry alike. Shout out to a fellow English major (though they called it “Literature” where I went to school) who somehow ended up involved in the robot space. Perhaps there are more of us. Let’s start a club.

To hear her tell it, Nelson Shea’s journey was also unexpected. UR’s global technical compliance officer is best known for her tireless efforts to promote robot safety. Between cobots, HRI and countless hardware and software innovations, the subject is now central to the way we think about industrial robots. The standards Nelson Shea helped create are a big piece of that.

“This award is a testament to the great contribution Roberta has made to the robotics industry,” says UR President Kim Povlsen. “Her dedication to safety has helped create the standards for the interaction between people and robots. This has been an important contribution to the collaborative relationship we see today between humans and robots across hundreds of thousands of workplaces.”

And Nelson Shea in her own words, “The Engelberger Robotics Award for Application in Safety is a tremendous honor to me and to all those who have embraced and contributed to robotic safety. I remember meeting Joe Engelberger over 40 years ago and never imagined receiving this award. I view the award to be honoring the industry’s progress in optimizing safety and productivity. The journey has been amazing!” 

Image Credits: Bryce Durbin / TechCrunch

Rev up your engines and subscribe to Actuator.

Motor City mechatronics by Brian Heater originally published on TechCrunch

Figure raises $70M to build its humanoid robots

I spent a bit of time at Figure’s Sunnyvale offices during a recent visit to the South Bay. The firm is currently in that semi-awkward phase where it’s ready to talk about what it’s working on — but only to a point. That means things like its process and plans are on the table, while actually showing its robot is not.

We got some renders when the company came out of stealth back in March, but beyond that, the Figure 01 robot hasn’t traveled beyond its office walls. Having seen a pre-working model, I can attest to the ambition of the whole project, but as I wrote in my long piece last week, the merits of work like this should be assessed every step along the way.

The company did, however, confirm with TechCrunch that its assembled robot (bottom half + top half) took its first step ahead of its one-year anniversary. “We believe that this is the fastest timeline in the industry to date,” it notes.

This morning the company announced a sizable vote of confidence in the form of a $70 million Series A. The round is led by Parkway Venture Capital and features Aliya Capital, Bold Ventures, Tamarack Global, FJ Labs and Till Reuter, the CEO of industrial arm maker Kuka Robotics. It follows founder and CEO Brett Adcock’s self-financed $100 million seed to get things up and running.

“We’re focused on investing in companies that are pioneers in AI technology, and we believe that autonomous humanoid robots have the potential to revolutionize the labor economy,” Parkway Venture Capital’s Jesse Coors-Blankenship said in a prepared statement. “We are impressed by the rapid progress that Brett and the team of industry experts at Figure have made in the last year and are thrilled to be a financial partner to provide resources to accelerate the commercialization of Figure 01.”

These are, indeed, big rounds. There was a pre-pandemic moment when early-stage rounds were ballooning, but economic factors have brought many of them back down to earth. But robots are expensive. I mean, hardware in general is expensive, but robots are next level. That’s especially true with (1) a relatively unproven form factor and (2) the notion of “general purpose” systems.

There’s a reason you don’t read about purpose-built humanoids: There are far more efficient ways to build robots that do things very well. You start exploring something like a bipedal humanoid robot when you want a system that can do the things a human can.

Figure is still quite young, having only celebrated its first birthday on May 20, but it’s been aggressively hiring talent from places like Boston Dynamics, Tesla, Apple and automotive manufacturers. In spite of speedy iterations, however, we’ve not seen much in the way of timelines from the company thus far.

Figure raises $70M to build its humanoid robots by Brian Heater originally published on TechCrunch
