Disney/Lucasfilm donates $1.5 million to FIRST

A day after the big Episode IX reveal, Disney and its subsidiary Lucasfilm announced that they will be donating $1.5 million to FIRST. The non-profit was founded by Dean Kamen in 1989 to help teach STEM through initiatives like robotics competitions.

Disney’s money will go toward providing education and outreach to the underserved communities on which FIRST focuses. Details are pretty thin on precisely what the partnership will entail, but Disney certainly has a lot to gain from this sort of outreach, and Lucasfilm knows a thing or two about robots.

The Star Wars: Force for Change announcement was made in conjunction with Lucasfilm’s annual Star Wars Celebration in Chicago. Yesterday the event hosted a panel with the cast of the upcoming film that included a teaser trailer and title reveal.

“Star Wars has always inspired young people to look past what is and imagine a world beyond,” Lucasfilm president Kathleen Kennedy said in a release tied to the news. “It is crucial that we pass on the importance of science and technology to young people—they will be the ones who will have to confront the global challenges that lie ahead. To support this effort, Lucasfilm and Disney are teaming up with FIRST to bring learning opportunities and mentorship to the next generation of innovators.”

It’s been a good week for FIRST investments. Just yesterday Amazon announced its own commitment to the group’s robotics offerings.

IAM Robotics puts a unique spin on warehouse automation

Before robots get to do the fun stuff, they’re going to be tasked with all of the things humans don’t want to do. It’s a driving tenet of automation — developing robotics and AI designed to replace dull, dirty and dangerous tasks. It’s no surprise, then, that warehouses and fulfillment centers have been major drivers in the field.

Earlier this week, we reported that Amazon would be acquiring Canvas, adding another piece to its portfolio and bolstering the 100,000 or so robots it currently deploys across 25 or so fulfillment centers. Even Boston Dynamics has been getting into the game, acquiring a vision systems company in order to outfit its Handle robot for warehouse life.

As in so much of the robotics world, Pittsburgh is a key player in automation. IAM Robotics is one of the more compelling local entrants in the space. We paid the company a visit on a recent trip to town. Located in a small office outside of the city, the startup offers a unique take on increasingly important pick-and-place robotics, combining a robotic arm with a mobile system.

“What’s unique about IAM robotics is we’re the only ones with a mobile robot that is also capable of manipulating objects and moving things around the warehouse by itself,” CEO Joel Reed told TechCrunch. “It doesn’t require a person in the loop to actually physically handle things. And what’s unique about that is we’re empowering machine with AI and computer vision technologies to make those decisions by itself. So it’s fully autonomous, it’s driving around, using its own ability to see.”

The startup has mostly operated quietly, in spite of a $20 million venture round led by KCK late last year. After a quick demo in the office, it’s easy to see why early investors have found promise in the company. Still, the demo marks a pretty stark contrast with the Bossa Nova warehouse where we spent the previous day.

There are a couple of small rows of groceries in a corner of the office space, a few feet away from where the rest of IAM’s staff is at work. A pair of the company’s Swift robots go to work, traveling up and down the small, makeshift aisle. When the robot locates the desired product on a shelf, a long, multi-segmented arm drops down, positioning itself in front of a box. The suction cup tip attaches to the product, then the arm swivels back around to release it into a bin.

Used correctly, the Swift could help companies staff difficult-to-fill positions, while adding a layer of efficiency in the warehouse. “Our customers or prospective customers are looking to automate to both reduce costs, but also to alleviate this manual labor shortage,” says Reed. “So we have a younger generation that’s just more interested in doing jobs like gig economy jobs, drive for Uber, Lyft, those kinds of things, because they can make more money than they could in working at a warehouse.”

Amazon teams with FIRST for robotics education grants

Earlier this week, Amazon confirmed that it would be acquiring Canvas, a little-known, Colorado-based startup that creates autonomous driving systems for warehouse fulfillment. It’s the latest in a long line of moves that find the company looking seriously at automation for its future retail plans.

Last year, the company announced the launch of Amazon Future Engineer, a program designed to facilitate STEM education in classrooms. To help launch the program, Amazon is partnering with STEM non-profit FIRST to create a series of robotics grants specifically targeted at underrepresented populations.

The plan kicks off with 100 schools in 21 states, starting in fall of this year. The program features support for FIRST robotics teams and teacher education, along with an additional $10,000 in funding and a tour of local fulfillment centers. The company says it currently has more than 100,000 robots deployed across its warehouses.

“Amazon is helping FIRST in our goal to make robotics teams and programs available in every school,” FIRST founder Dean Kamen said in a release tied to the news. “In FIRST, every kid on every team can go pro. They gain a hands-on learning pathway in technology, computer science and engineering that propels them forward and inspires innovation.”

Earlier this year, the company announced that it would be investing in New York-area classrooms, ahead of the (since abandoned) launch of its HQ2.

Uber spent $457 million on self-driving and flying car R&D last year

Uber spent $457 million last year on research and development of autonomous vehicles, flying cars (known as eVTOLs) and other “technology programs” and will continue to invest heavily in the futuristic tech even though it expects to rely on human drivers for years to come, according to the company’s IPO prospectus filed Thursday.

R&D costs at Uber ATG, the company’s autonomous vehicle unit, its eVTOL unit Uber Elevate and other related technology represented one-third of its total R&D spend. Uber’s total R&D costs in 2018 were more than $1.5 billion.
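The “one-third” figure is a round number. Here’s a quick back-of-the-envelope check, a minimal sketch using only the figures cited above; since the $1.5 billion total is reported as a floor, the true share is at most roughly 30 percent.

```python
# Sanity check of the "one-third" claim using the figures cited above.
# The $1.5 billion total is reported as "more than," so this share is an upper bound.
av_and_evtol_rd = 457_000_000        # ATG, Elevate and other technology programs
total_rd_2018 = 1_500_000_000        # Uber's reported 2018 R&D spend (lower bound)

print(f"Share of R&D: {av_and_evtol_rd / total_rd_2018:.1%}")  # roughly 30.5%
```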

Uber filed its S-1 on Thursday, laying the groundwork for the transportation company to go public next month. This comes less than one month after competitor Lyft’s debut on the public market. Uber is listing on the New York Stock Exchange under the symbol “UBER,” but has yet to disclose the anticipated initial public offering price.

Uber believes that autonomous vehicles will be an important part of its offerings over the long term, arguing that AVs can increase safety, make rides more efficient and lower prices for customers.

However, the transportation company struck a more conservative tone in the prospectus on how and when autonomous vehicles will be deployed, a striking difference from the early days of Uber ATG when former CEO Travis Kalanick called AVs an existential risk to the business.

Uber contends there will be a long period of “hybrid autonomy” and it will continue to rely on human drivers for its core business for the foreseeable future. Uber said even when autonomous vehicle taxis are deployed, it will still need human drivers for situations that “involve substantial traffic, complex routes, or unusual weather conditions.” Human drivers will also be needed during concerts, sporting events and other high-demand events that will “likely exceed the capacity of a highly utilized, fully autonomous vehicle fleet,” the company wrote in the S-1.

Here’s an excerpt from the S-1:

Along the way to a potential future autonomous vehicle world, we believe that there will be a long period of hybrid autonomy, in which autonomous vehicles will be deployed gradually against specific use cases while Drivers continue to serve most consumer demand. As we solve specific autonomous use cases, we will deploy autonomous vehicles against them. Such situations may include trips along a standard, well-mapped route in a predictable environment in good weather.

Uber contends it is well-suited to balance that potentially awkward in-between phase when both human drivers and autonomous vehicles will co-exist on its platform.

“Drivers are therefore a critical and differentiating advantage for us and will continue to be our valued partners for the long-term,” Uber wrote.

Despite Uber’s forecast and more tempered tone, the company is pushing ahead on autonomous vehicles.

Uber ATG was founded in 2015 in Pittsburgh with just 40 researchers from Carnegie Robotics and Carnegie Mellon University. Today, Uber ATG has more than 1,000 employees spread out in offices in Pittsburgh, San Francisco and Toronto.

Uber acknowledged under the risk factors section of the S-1 that it could fail to develop and successfully commercialize autonomous vehicle technologies or could be undercut by competitors, which would threaten its ride-hailing and delivery businesses.

Uber’s view of which companies pose the biggest threat was particularly interesting. The company named nearly a dozen potential competitors, a list that contained a few of the usual suspects like Waymo, GM Cruise and Zoox, as well as lesser-known startups such as May Mobility and Anthony Levandowski’s new company, Pronto.ai. Other competitors listed in the S-1 include Tesla, Apple, Aptiv, Aurora and Nuro. Argo AI, the subsidiary of Ford, was not listed.

ATG has built more than 250 self-driving vehicles and has three partnerships (Volvo, Toyota and Daimler) that illustrate the company’s multi-tiered approach to AVs.

Uber has a first-party agreement with Volvo. Under the agreement announced in August 2016, Uber owns Volvo vehicles, has added its AV tech and plans to deploy those cars on its own network.

Its partnership with Daimler is on the other extreme. In that partnership, announced in January 2017, Daimler will introduce a fleet of its own AVs on the Uber network. This is similar to Lyft’s partnership with Aptiv.

Finally, there’s Toyota, a newer partnership announced in August 2018 that is something of a hybrid of the other two. Uber says it expects to integrate its autonomous vehicle technologies into purpose-built Toyota vehicles to be deployed on its network.

Claire Delaunay will be speaking at TC Sessions: Robotics + AI next week at UC Berkeley

We’re a week out from our third-annual TC Sessions: Robotics event, and we still have some surprises left to announce. I know, we’re just as surprised as you are. We’ve already announced that Marc Raibert, Colin Angle, Melonee Wise and Anthony Levandowski will be joining us in Berkeley next week, and today we’re adding Claire Delaunay to the list of distinguished names.

Delaunay is VP of engineering at NVIDIA. Prior to NVIDIA, she worked as the Director of Engineering at Uber, after the ride-sharing service acquired her startup, Otto. She has also worked as the robotics program lead at Google.

She is currently the head of NVIDIA Isaac. The company’s robotics platform is designed to make it easier for companies of various experience levels and means to develop robots. Delaunay will discuss the platform and showcase some of NVIDIA’s in-house robotics reference devices, including Kaya and Carter.

Speaking of NVIDIA, TechCrunch is partnering with the company on April 17 (the day before the conference) to host a Deep Learning for Robotics workshop at UC Berkeley. This in-person workshop will teach you how to implement and deploy an end-to-end project through eight hours of instructor-led, hands-on training. Click here to learn more about the workshop.

Hear from Delaunay and other awesome speakers next week at TC Sessions: Robotics+AI. Purchase your $349 tickets now before prices go up $100 at the door. Student tickets are just $45 – book yours now.

MIT’s recycling sorting robot can ‘feel’ the difference between materials

RoCycle is, naturally, short for “recycling robot,” the latest offering out of MIT’s CSAIL. The pick-and-place ’bot utilizes a unique combination of sensors to help distinguish objects by material, in order to sort them ahead of the recycling process.

Built on top of a Rethink Robotics Baxter, the system utilizes a Teflon gripper with built-in sensors that are capable of determining an object’s makeup based on size and stiffness. It’s not perfect just yet.

Here’s how CSAIL describes the technology:

The team’s gripper first uses its “strain sensor” to estimate an object’s size, and then uses its two pressure sensors to measure the force needed to grasp an object. These metrics – along with calibration data on the size and stiffnesses of objects of different material types – is what gives the gripper a sense of what material the object is made of. (Since the tactile sensors are also conductive, they can detect metal by how much it changes the electrical signal.)

The school notes that the system has a pretty good 85 percent accuracy rate when attempting to identify stationary objects.

That number drops a good deal, down to 63 percent, when objects are moving on a conveyor belt, which is the kind of real-world scenario in which such a system would likely be deployed. Much of the inaccuracy stems from products like cans, where an outer paper label covers tin or aluminum.
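To make the sorting logic a bit more concrete, here is a minimal, hypothetical sketch of how readings like these could be combined into a material guess. The sensor values, calibration table and thresholds below are illustrative assumptions, not CSAIL’s actual implementation.

```python
# Hypothetical sketch of RoCycle-style material classification.
# The calibration table and thresholds are invented for illustration;
# they are not the CSAIL team's actual values or code.

# Rough calibration data: (typical size in cm, typical stiffness score)
CALIBRATION = {
    "paper":   (10.0, 0.2),
    "plastic": (8.0, 0.5),
    "metal":   (7.0, 0.9),
}

def classify(size_cm, stiffness, conductivity_change):
    """Guess a material from strain, pressure and conductivity readings."""
    # The conductive tactile sensors can flag metal directly: a large
    # change in the electrical signal is treated as a metal detection.
    if conductivity_change > 0.5:          # illustrative threshold
        return "metal"

    # Otherwise, pick the closest calibration entry by size and stiffness.
    def distance(entry):
        ref_size, ref_stiffness = entry
        return ((size_cm - ref_size) / 10.0) ** 2 + (stiffness - ref_stiffness) ** 2

    return min(CALIBRATION, key=lambda material: distance(CALIBRATION[material]))

# Example: a large, squishy, non-conductive object reads as paper.
print(classify(size_cm=11.0, stiffness=0.25, conductivity_change=0.05))
```

A paper-labeled can is exactly the hard case here: its size and stiffness can look paper-like while only the conductivity signal hints at metal, which is consistent with the accuracy drop described above.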

The team is working to improve those numbers by adding more sensors. It also seems likely that such a technology would be used in tandem with existing systems, which use magnets to separate out metals and vision to determine material properties. CSAIL plans to combine the technology with a vision system in its own future experiments.

Hanson recruits SoftBank Robotics scientist as CTO

Hanson Robotics announced this morning that it has hired Amit Kumar Pandey, SoftBank Europe’s Chief Scientist, as Chief Technology Officer (CTO) and Chief Science Officer (CSO). In his six years at SoftBank, Pandey ran various robotics initiatives, including the company’s Social Interaction and Intelligence of the Robot program.

At Hanson, he will be charged with leading the company’s commercial products and fostering its AI applications.

The Hong Kong-based robotics company is likely best known for Sophia, the uncanny humanoid robot with a see-through head designed to carry on a conversation. Earlier this year, the company took to Kickstarter to launch Little Sophia, a home version of the robot designed to help teach kids STEM.

“One of the greatest technological revolutions of the 21st century is the emergence of robotics and AI together as an innovative ecosystem,” Pandey said in a release tied to the news. “With helpful AI and robotics, we hope to improve the quality of people’s lives, leading toward a healthier, safer, happier and better educated society. By joining Hanson Robotics, I am confident we are moving closer toward this greater innovative ecosystem.”

Prior to SoftBank, Pandey focused on humanoid robotics research at the French National Centre for Scientific Research (LAAS-CNRS).

Amazon acquires autonomous warehouse robotics startup Canvas Technology

Amazon has acquired Boulder, Colo.-based warehouse robotics startup Canvas Technology, TechCrunch has learned. The deal makes a lot of sense from the outside, adding another important piece to Amazon Robotics’ growing portfolio of fulfillment center machines.

Amazon confirmed the acquisition with TechCrunch. “We are inspired by Canvas Technology’s innovations, and share a common vision for a future where people work alongside robotics to further improve safety and the workplace experience,” a spokesperson said in a statement. “We look forward to working with Canvas Technology’s fantastic team to keep inventing for customers.”

Founded in 2015, Canvas has already showcased some impressive technologies, including a fully autonomous cart system that positions the startup as a direct competitor with the likes of Bay Area-based Fetch. The startup raised a $15 million Series A led by Playground Global.

The Canvas Autonomous Cart was on display at Playground’s open house roughly this time last year, doing an impressive job avoiding people and obstacles in the crowded space. The system utilizes 3D imaging and an in-house software solution that can be applied to other hardware — essentially operating like a self-driving car in a warehouse setting.

Canvas should make a nice addition to Amazon Robotics’ offerings. The division was created after the company’s 2012 acquisition of Kiva Systems, whose shelving robots now serve as a kind of robotic epicenter to Amazon’s many fulfillment centers.

On our recent tour of the company’s JFK8 fulfillment center in Staten Island, the company noted that it currently has some 100,000 systems deployed across 25 fulfillment centers. That number is a combination of Amazon’s own systems, along with devices from third parties, including Japanese industrial giant Fanuc. Clearly, however, the company is looking to put its own stamp on the systems going forward in a push to increase delivery efficiency through automation.

Safety has been a big factor, as well. It always is with these sorts of collaborative robotics, of course, but Amazon’s fulfillment centers have a built-in extra layer of scrutiny. Earlier this year we got a sneak peek at the company’s robotic safety vest, designed to give employees an added layer of protection when entering the fenced-off section of the floor that sports the Kiva systems.

Canvas, on the other hand, brings its own built-in safety with its autonomous vision system. The hardware is designed to more directly interact with workers on the floor. It’s easy to imagine Amazon adapting the technology for some of its existing systems, as well.

Notably, Canvas co-founder and CTO Nima Keivan will be onstage at our Robotics + AI event next week in Berkeley, where he will no doubt be able to provide some extra insight for his robotics startup panel.

The startup behind Walmart’s shelf-scanning robots

Earlier this week, Walmart announced a wide-scale expansion of in-store technology. The news came, of course, as the superstore chain is fighting Amazon’s encroachment with everything it has. The list includes several robotics technologies, a category in which Amazon has been invested for several years, beginning with the acquisition of Kiva Systems in 2012.

Pittsburgh startup Bossa Nova Robotics has been a big player in Walmart’s ambitions for a while now, as well. Back in 2017, the retail giant announced that it would be rolling out Bossa Nova’s shelf-scanning robots in 50 of its stores.

We paid a visit to the company on a recent trip to the Steel City. The company’s out-of-the-way headquarters are housed in a massive warehouse. Past the nondescript front offices, where employees work at computers, is a cavernous area that’s home to a workshop for iterating its robots, coupled with large aisles of shelves designed to mimic the stores where the technologies will eventually be deployed.

The shelves are stocked with groceries that will look immediately familiar to anyone who’s spent time in an American supermarket. There are clothing items on hangers, as well, along with a series of technology toys that offer a small peek into Bossa Nova’s not-so-distant history.

There’s Penbo, the interactive robotic penguin, who’s perpetually pregnant with an egg. And, of course, Prime 8, the programmable gorilla that runs on his two prominent forearms. The products made it to market with limited success, mostly failing to predict the fickle whims of tech-crazy kids.

While the company was created in 2005, when its founders were PhD students at nearby Carnegie Mellon, it wasn’t until it opted for something more mundane that it truly found its way.

Capitalizing on the “dull” aspect of automation’s “Three Ds,” the company developed a system capable of taking inventory of store shelves.

“If you think about it, what’s the last innovation that’s happened in a store?” CTO and co-founder Sarjoun Skaff asks rhetorically. “I can think of self-checkouts and barcodes, really. And retailers have always been very good at tracking a product through the supply chain — from the manufacturer, on the boat, through customs and the distribution center. When it hits the stores, it’s a black box. It’s anyone’s guess whether it’s on the shelf, it’s in a back room, whether there’s some theft or a misplaced product.”

Bossa Nova’s system is capable of tracking boxes and making sure things are where they should be. It’s a surprisingly complex task, especially in stores as large as the ones run by Walmart. It’s also incredibly dull and repetitive, and as someone who’s worked in retail, I can attest to this.

Like many in the automation business, however, Skaff balks at the suggestion that the technology will replace human employees — at least in the short term.

“Our robot doesn’t have arms right now, so it’s not replacing the manual labor of restocking a shelf,” he says. “It’s displacing the tedious task of looking for problems, which is really mind-numbing. And in large stores, it’s really hard for a human to compete. Associates have had to do it, because there was no alternative. As soon as we can tell you where the problems are, you can spend your time fixing them, restocking the shelves and spending more time with shoppers.”

Walmart’s certainly been invested in the company. The retailer was Bossa Nova’s first major partner, after piloting the technology in some smaller stores. Even with a relatively small number like 50 locations, the deal was an acceleration from 0-60 for a startup of Bossa Nova’s size, as it worked to create custom versions of its autonomous scanners.

A full list of partner stores hasn’t been released, but if you’re lucky, you can catch one of its robots cruising up and down the aisles of a local supermarket, three times a day. As for whether Walmart might eventually absorb Bossa Nova, as Amazon did with Kiva, Skaff is cautious.

“What we would like to do is serve the entire industry, because fundamentally, the problem we’re solving is universal,” he explains. “The entire world has the problem, and nobody has a solution. For large stores, the best solution is a combination of robotics and AI.”

Uber, Lyft, and the challenge of transportation startup profits

How much does transportation cost you?

In most cities, bus or subway fare might set you back $3 or so. A tank of gas, maybe $30 or $40 depending on your car. An hour of street parking? Sometimes it’s free, sometimes it’s a few bucks. And you can usually snag an economy seat on a round-trip U.S. domestic flight for under $300.

These numbers probably ring true for most people. There’s just one problem: Everything you know about the cost of transportation is wrong.

Despite a massive infusion of venture capital into the transportation sector over the past few years, mobility startups are starting to learn what every transportation business has known for generations: transportation profits are elusive, and the system is mainly held together by subsidies. Will this be the first generation of transportation businesses to escape history?