Where May Mobility’s self-driving shuttles might show up next

May Mobility might be operating low-speed self-driving shuttles in three U.S. cities, but its founders don’t view this as just another startup racing to deploy autonomous vehicle technology.

They describe the Ann Arbor-based company as a transportation service provider. As May Mobility’s co-founder and COO Alisyn Malek told TechCrunch, they’re in the “business of moving people.” Autonomous vehicle technology is just the “killer feature” to help them do that.

TechCrunch recently spent the day with May Mobility in Detroit, where it first launched, to get a closer look at its operations, learn where it might be headed next and why companies in the industry are starting to back off previously ambitious timelines.

Malek will elaborate on which markets are most appealing to May Mobility onstage at TC Sessions: Mobility on July 10 in San Jose. She’ll join Lia Theodosiou-Pisanelli, head of partner product and programs at Aurora, to talk about which products make the most sense for autonomous vehicle technology.

Watch a plane land itself truly autonomously for the first time

A team of German researchers has created an automatic landing system for small aircraft that lets them touch down not only without a pilot, but without any of the ground-based tech that lets other planes do it. It could open up a new era of autonomous flight — and make ordinary landings safer to boot.

Now it would be natural to think that with the sophisticated autopilot systems that we have today, a plane could land itself quite easily. And that’s kind of true — but the autoland systems on full-size aircraft aren’t really autonomous. They rely on a set of radio signals emitted by stations only found at major airports: the Instrument Landing System, or ILS.

These signals tell the plane exactly where the runway is, even in poor visibility, but even so an “automatic” landing is rarely performed. Instead, the pilots — as they do elsewhere — use the autopilot system as an assist, in this case to help them locate the runway and descend properly. A plane can land automatically using ILS and other systems, but it’s rare, and even then it isn’t truly autonomous — it’s more like the airport is flying the plane by wire.

But researchers at Technische Universität München (TUM, or think of it as Munich Tech) have created a system that can land a plane without relying on ground systems at all, and demonstrated it with a pilot on board — or rather, passenger, since he kept his hands in his lap the whole time.


The automated plane comes in for a landing.

A plane making an autonomous landing needs to know exactly where the runway is, naturally, but it can’t rely on GPS — too imprecise — and if it can’t use ILS and other ground systems, what’s left? Well, the computer can find the runway the way pilots do: with its eyes. In this case, both visible-light and infrared cameras on the nose of the plane.

TUM’s tests used a Diamond DA42 that the team outfitted with a custom automatic control system and a computer vision processor, both built for the purpose and together called C2Land. The computer, trained to recognize and characterize a runway using the cameras, put that know-how to work in May, taking the plane in for a flawless landing.
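TUM hasn’t published the internals of C2Land’s vision processing, but the general task — pick the runway out of a camera frame and measure the aircraft’s lateral offset from it — can be sketched. Below is a minimal, hypothetical illustration in Python with OpenCV; the Otsu thresholding, the brightest-blob heuristic, and the function name are all assumptions for illustration, not TUM’s algorithm.

```python
import cv2
import numpy as np

# Hypothetical sketch of the vision step: find a bright runway region
# in an 8-bit grayscale infrared frame and estimate the lateral offset
# of the runway centerline from the image center. NOT the C2Land
# algorithm, just an illustration of the general approach.

def estimate_runway_offset(ir_frame: np.ndarray) -> float:
    """Return the runway's horizontal offset in pixels from the image
    center; a negative value means the runway lies to the left."""
    blurred = cv2.GaussianBlur(ir_frame, (5, 5), 0)
    # Assumption: the runway stands out thermally against terrain.
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("no candidate runway region found")
    runway = max(contours, key=cv2.contourArea)  # largest bright blob
    # Fit a line through the blob and see where it sits laterally.
    vx, vy, x0, y0 = cv2.fitLine(runway, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    return float(x0) - ir_frame.shape[1] / 2.0
```

An offset like this would be one input among many to the control system, which also has to manage glideslope, airspeed, and flare.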


As test pilot Thomas Wimmer put it in a TUM news release: “The cameras already recognize the runway at a great distance from the airport. The system then guides the aircraft through the landing approach on a completely automatic basis and lands it precisely on the runway’s centerline.”

You can see the full flight in the video below.

This is a major milestone in automated flight, since until now planes have had to rely on extensive ground-based systems to perform a landing like this one — which means automated landings aren’t currently possible at smaller airports or should something go wrong with the ILS. A small plane like this one is more likely to be at a small airport with no such system, and should a heavy fog roll in, an autoland system like this might be preferable to a pilot who can’t see in infrared.

Right now the tech is very much still experimental, not even at the level where it could be distributed and tested widely, let alone certified by aviation authorities. But the safety benefits are obvious and even as a backup or augmentation to the existing, rarely used autoland systems it would likely be a welcome addition.

These humanoid robots can autonomously navigate cinder block mazes thanks to IHMC

Programming robots to walk on flat, even ground is difficult enough, but Florida’s Institute for Human and Machine Cognition (IHMC) is tackling the grander challenge of making sure bipedal robots can successfully navigate rough terrain. The research organization has been demonstrating its work in this area since 2016, but its latest video (via Engadget) shows the progress it’s made.

In the new video, IHMC’s autonomous footstep-planning software is at work on both Boston Dynamics’ Atlas robot and the NASA-developed Valkyrie robot (humanoid robots have the coolest names). The video shows the robots navigating a heaping pile of cinder blocks, as well as narrower paths, which are trickier because they leave the planner far fewer viable footstep options.

Basically, IHMC manages these complex navigation operations by specifying a beginning and end point for the robot, then mapping possible paths on a footstep-by-footstep basis, evaluating the cost of each and ultimately arriving at the best possible path – all of which can occur relatively quickly on modern hardware.
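To make that expand-score-select loop concrete, here’s a toy sketch in Python. It is not IHMC’s planner, which reasons over continuous footstep poses and full-body constraints; this is just a grid-based cost search where the height map, the step-height limit, and the cost weights are all illustrative assumptions.

```python
import heapq

# Toy footstep planner: A* over a 2D grid of terrain heights. Each
# candidate step is scored, with large height changes penalized, and
# the cheapest footstep sequence from start to goal is returned.

def plan_footsteps(heights, start, goal, max_step_height=0.3):
    rows, cols = len(heights), len(heights[0])
    frontier = [(0.0, start, [start])]  # (cost + heuristic, cell, path)
    best_cost = {start: 0.0}
    while frontier:
        _, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        r, c = cell
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr, dc) == (0, 0) or not (0 <= nr < rows and 0 <= nc < cols):
                    continue
                climb = abs(heights[nr][nc] - heights[r][c])
                if climb > max_step_height:
                    continue  # step too tall for the robot's legs
                step_cost = 1.0 + 5.0 * climb  # flat steps are cheapest
                new_cost = best_cost[cell] + step_cost
                if new_cost < best_cost.get((nr, nc), float("inf")):
                    best_cost[(nr, nc)] = new_cost
                    # Chebyshev distance: admissible on 8-connected grids
                    h = max(abs(goal[0] - nr), abs(goal[1] - nc))
                    heapq.heappush(frontier,
                                   (new_cost + h, (nr, nc), path + [(nr, nc)]))
    return None  # no footstep sequence reaches the goal

# Tiny demo: plan across a 3x3 height map with a small obstacle.
grid = [[0.0, 0.0, 0.5],
        [0.0, 0.2, 0.5],
        [0.0, 0.2, 0.0]]
print(plan_footsteps(grid, (0, 0), (2, 2)))  # e.g. [(0, 0), (1, 1), (2, 2)]
```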

These robots can also quickly adapt to changes in the environment and path blockages thanks to IHMC’s work, and the software can even manage single-path, tightrope-style walking (albeit on a narrow row of cinder blocks, not an actual rope).

There’s still work to be done – the team at IHMC says it’s seeing about a 50 percent success rate on narrow paths, a much higher 90 percent on rough terrain, and a near-perfect track record on flat ground.

BMW and Daimler partner on autonomous driving, first results of team-up in market by 2024

Global automakers BMW and Daimler will join forces in a new long-term partnership to co-develop automated driving technologies, including levels of automation all the way up to SAE Level 4, which is defined as full self-driving with no human intervention required, but only under exactly defined conditions or domains – a steering wheel and brakes not necessarily even present in the car.

This BMW/Daimler partnership includes developing automated driving technologies that precede Level 4, too, including advanced driver-assistance features like smart cruise control and automated parking. And while it isn’t within the scope of this specific arrangement, the two carmakers also say that talks continue about expanding their cooperation to cover highly automated driving in denser urban areas and city driving conditions.

It’s a non-exclusive arrangement, which is the new normal in autonomous vehicle technology development, where cross-manufacturer partnerships have been increasingly common, and where we’ve also seen legacy automakers turn with fair frequency to startups and younger technology companies to supplement their in-house development efforts.

Daimler and BMW aim to develop a “scalable platform for automated driving” through their combined efforts, which the companies say is open for participation from both other automakers and tech providers. The resulting platform will also be made available to other OEMs under license.

Independently, Daimler is currently working on deploying its first Level 4/Level 5 self-driving vehicle pilot program in an urban environment in partnership with Bosch, and aims to have that operational this year. BMW’s next big automated driving push will come alongside its iNEXT line of vehicles, with Level 3 technology targeted for release along with the first of those models in 2021. Both partners expect to implement the results of this new partnership in their own respective model series beginning in 2024, however.

NASA picks a dozen science and tech projects to bring to the surface of the Moon

With the Artemis mission scheduled to put boots on lunar regolith as soon as 2024, NASA has a lot of launching to do — and you can be sure none of those launches will go to waste. The agency just announced 12 new science and technology projects to send to the Moon’s surface, including a new rover.

The 12 projects are being sent up as part of the Commercial Lunar Payload Services program, which is — as NASA Administrator Jim Bridenstine has emphasized strongly — part of an intentional increase in reliance on private companies. If a company already has a component or rover or craft ready to go that meets a program’s requirements, why should NASA build its own from scratch at great cost?

In this case the selected projects cover a wide range of origins and intentions. Some are repurposed or spare parts from other missions, like the Lunar Surface Electromagnetics Experiment. LuSEE is related to the Parker Solar Probe’s STEREO/Waves instrument and pieces from MAVEN, re-engineered to make observations and measurements on the Moon.

Others are quite new. Astrobotic, which was also recently awarded an $80 million contract to develop its Peregrine lunar lander, will now also be putting together a rover, which it calls MoonRanger (no relation to the NES game). This little bot will autonomously traverse the landscape within half a mile or so of its base and map it in 3D.

The new funding from NASA amounts to $5.6 million, which isn’t a lot to develop a lunar rover from scratch — no doubt Astrobotic is using its own funds and working with its partner, Carnegie Mellon University, to make sure the rover isn’t a bargain-bin device. With veteran rover engineer Red Whittaker on board, it should be a good one.

“MoonRanger offers a means to accomplish far-ranging science of significance, and will exhibit an enabling capability on missions to the Moon for NASA and the commercial sector. The autonomy techniques demonstrated by MoonRanger will enable new kinds of exploration missions that will ultimately herald in a new era on the Moon,” said Whittaker in an Astrobotic news release.

The distance to the lunar surface isn’t so great that controlling a rover directly from Earth is nearly impossible, as it is on Mars (round-trip signal time to the Moon is only about 2.6 seconds), but if the rover can get from here to there without someone in Houston twiddling a joystick, why shouldn’t it?

To be clear, this is different from the upcoming CubeRover project and others that are floating around in Astrobotic and Whittaker’s figurative orbits.

“MoonRanger is a 13 kg, microwave-sized rover with advanced autonomous capabilities,” Astrobotic’s Mike Provenzano told me. “The CubeRover is a 2 kg, shoebox-sized rover developed for light payloads and geared for affordable science and exploration activities.”

Both have flight contracts: CubeRover is scheduled to go up on the first Peregrine mission in 2021, while MoonRanger’s ride is TBD.

Another NASA selection is the Planetary Science Institute’s Heimdall, a new camera system that will point downward during the lander’s descent and collect super-high-resolution imagery of the regolith before, during, and after landing.


“The camera system will return the highest resolution images of the undisturbed lunar surface yet obtained, which is important for understanding regolith properties. We will be able to essentially video the landing in high resolution for the first time, so we can understand how the plume behaves – how far it spreads, how long particles are lofted. This information is crucial for the safety of future landings,” said the project’s R. Aileen Yingst in a PSI release.

The regolith is naturally the subject of much curiosity, since if we’re to establish a semi-permanent presence on the Moon, we’ll have to deal with it one way or another. So projects like Honeybee’s PlanetVac, which can suck up and test materials right at landing, or the Regolith Adherence Characterization, which will see how the stuff sticks to various materials, will be invaluable.


RadSat-G deployed from the ISS for its year-long mission to test radiation tolerance on its computer systems.

Several projects are continuations of existing work that’s a great fit for lunar missions. For example, the lunar surface is constantly being bombarded with all kinds of radiation, since the Moon lacks any kind of atmosphere. That’s not a problem for machinery like wheels or even solar cells, but for computers radiation can be highly destructive. So Brock LaMere’s work in radiation-tolerant computers will be highly relevant to landers, rovers, and payloads.

LaMere’s work has already been tested in space via the Nanoracks facility aboard the International Space Station, and the new NASA funding will allow it to be tested on the lunar surface. If we’re going to be sending computers up there that people’s lives will depend on, we’d better be completely sure they aren’t going to crash because of a random EM flux.

The rest of the projects are characterized here, with varying degrees of detail. No doubt we’ll learn more soon as the funding disbursed by NASA over the next year or so helps flesh them out.

Viziblezone wants to make pedestrians more visible to autonomous cars

Israel-based Viziblezone wants to make it easier for self-driving cars to spot pedestrians, even if they aren’t yet in the field of view of the car’s sensors. I haven’t heard of all that many pedestrian detection startups, but there’s a first for everything, and if nothing else, Viziblezone is tackling an important issue, with millions of pedestrians injured by cars every year.

To warn cars of nearby pedestrians, Viziblezone, which is part of OurCrowd’s Labs/02 incubator, is relying on smartphones and detectors inside the vehicle. The company argues that its solution will become especially useful once self-driving cars become the norm.

“Viziblezone offers a cost-effective, software-based ‘pedestrian detector’ that effectively turns in-vehicle and mobile phone RF facilities into a kind of ‘Iron Dome’ for people on the streets and sidewalks,” said Gabi Ofir, the company’s CEO and founder, whose resume includes 20 years of work on communication protocols at Motorola. “By utilizing the wide distribution of mobile devices among pedestrians, it transforms them into ‘smart beacons’ that cars can see and then avoid.”

Viziblezone is obviously looking ahead to self-driving cars, but it argues that the system, when coupled with smartphones in the car, could be used now. Since it doesn’t rely on visual sensors, it works in any weather and at ranges of up to 150 meters.
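The company hasn’t published how it turns phone radio signals into range estimates. One common, simple approach to RF ranging is the log-distance path-loss model applied to received signal strength; the sketch below is purely illustrative of that technique, and every constant in it is an assumption, not Viziblezone’s method.

```python
import math

# Illustrative RF ranging via the log-distance path-loss model.
# All constants are assumed calibration values, not Viziblezone's.

def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -40.0,
                        path_loss_exponent: float = 2.7) -> float:
    """Rough range from received signal strength.

    tx_power_dbm: expected RSSI at 1 m (device-specific calibration).
    path_loss_exponent: ~2 in free space, higher on cluttered streets.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Example: a -80 dBm reading implies roughly 30 m under these assumptions.
print(round(estimate_distance_m(-80.0), 1))
```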


MIT develops tiny ‘walking’ motor that helps more complex robots self-assemble

It’s becoming increasingly apparent that robots of the future will be less ‘Wall-E’ and more ‘Voltron meets ant swarm’ – case in point, this new ambulatory motor created by MIT professor Neil Gershenfeld and his students. The motor above is little more than a magnet and coil with some structural parts, but it can ‘walk’ back and forth on its own, or drive the gears of a more complicated machine.

On its own, this little moving motor is impressive enough, but its real potential lies in what could happen were it to be assembled with others of its ilk, and with other building-block robotics components made up of similarly simple parts, which is the vision of Gershenfeld and his research team. They’ve already shown that other core components can be assembled from the same limited set of fundamental ingredients, and in the future, the idea is that these tiny machines could automatically self-assemble into larger structures capable of carrying out specific tasks.


These tiny bots can also move gears, which is key to having them build bigger, more complex systems. Credit: MIT.

That’s right: this little moving motor and its ilk might one day become part of a system that can be an agricultural robot one minute and a disaster-relief bot the next. That’s an end state that will take a lot of work to achieve, but Gershenfeld is already working with MIT graduate student Will Langford on a machine that combines 3D printing with automated circuit assembly, and that can handle much more sophisticated creation of fully functional robots using only digital blueprints as input.

Automated self-assembly is a tempting carrot in the world of cutting-edge robotics, and it’s obvious why. Here’s hoping they don’t achieve T-1000 efficacy without proper behavioural limitations in place.

Team studies drone strikes on airplanes by firing them into a wall at 500 MPH

Bird strikes are a very real danger to planes in flight, and consequently aircraft are required to undergo bird strike testing — but what about drones? With UAV interference at airports on the rise, drone strike testing may soon be likewise mandatory, and if it’s anything like what these German researchers are doing, it’ll involve shooting the craft out of air cannons at high speed.

The work being done at Fraunhofer EMI in Freiburg is meant to establish some basic parameters for how these things ought to be tested.

Bird strikes, for example, are tested by firing a frozen chicken or turkey out of an air cannon. It’s not pretty, but it has to be done. Even so, it’s not a very good analogue for a drone strike.

“From a mechanical point of view, drones behave differently to birds and also weigh considerably more. It is therefore uncertain, whether an aircraft that has been successfully tested against bird strike, would also survive a collision with a drone,” explained Fraunhofer’s Sebastian Schopferer in a news release.

The team chose to load an air cannon up with drone batteries and engines, since those make up most of any given UAV’s mass. The propellers and arms on which they’re mounted are generally pretty light and will break easily — compared with a battery weighing the better part of a kilogram, they won’t add much to the damage.


The remains of a drone engine and battery after being propelled into the plate on the left at hundreds of miles per hour.

The drones were fired at speeds from 250 to 570 miles per hour (115 to 255 meters per second by the team’s measurement) at aluminum plates up to 8 millimeters thick. Unsurprisingly, there was “substantial deformation” of the plates, and the wingless drones were “completely destroyed.” Said destruction was recorded by a high-speed camera, though unfortunately the footage was not made available.
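Some back-of-the-envelope arithmetic shows why these impacts are so violent. Assuming a battery of roughly 0.7 kg (the article only says “the better part of a kilogram” — the exact mass is an assumption), the kinetic energy at the tested speeds works out as follows:

```python
# Kinetic energy of an assumed 0.7 kg battery at the tested speeds.
# KE = 1/2 * m * v^2

mass_kg = 0.7  # assumed mass; "the better part of a kilogram"
for v in (115.0, 255.0):  # low and high ends of the tested range, m/s
    ke_kj = 0.5 * mass_kg * v**2 / 1000.0
    print(f"{v:>5.0f} m/s -> {ke_kj:5.1f} kJ")
# ~4.6 kJ at the low end, ~22.8 kJ at the high end, delivered by a
# dense, rigid mass rather than the soft body of a similarly sized bird.
```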

It’s necessary to do a variety of tests to determine what’s practical and what’s unnecessary or irrelevant — why spend the extra time and money firing the drones at 570 when 500 does the same level of damage? Does including the arms and propellers make a difference? At what speed is the plate in danger of being pierced, necessitating additional protective measures? And so on. A new rig is being constructed that will allow acceleration (and deceleration) of larger UAVs.

With enough testing the team hopes that not only could such things be standardized, but simulations could be built that would allow engineers to virtually test different surfaces or materials without a costly and explosive test rig.

Startups at the speed of light: Lidar CEOs put their industry in perspective

As autonomous cars and robots loom over the landscapes of cities and jobs alike, the technologies that empower them are forming sub-industries of their own. One of those is lidar, which has become an indispensable tool to autonomy, spawning dozens of companies and attracting hundreds of millions in venture funding.

But like all industries built on top of fast-moving technologies, the lidar and sensing business by definition rests somewhat upon a foundation of shifting sands. New research appears weekly advancing the art, and no less frequently are new partnerships minted, as car manufacturers like Audi and BMW scramble to keep ahead of their peers in the emerging autonomy economy.

To compete in the lidar industry means not just to create and follow through on difficult research and engineering, but to be prepared to react with agility as the market shifts in response to trends, regulations, and disasters.

I talked with several CEOs and investors in the lidar space to find out how the industry is changing, how they plan to compete, and what the next few years have in store.

Their opinions and predictions sometimes synced up and at other times diverged completely. For some, the future lies manifestly in partnerships they have already established and hope to nurture, while others feel that it’s too early for automakers to commit, and they’re stringing startups along one non-exclusive contract at a time.

All agreed that the technology itself is obviously important, but not so important that investors will wait forever for engineers to get it out of the lab.

And while some felt a sensor company has no business building a full-stack autonomy solution, others suggested that’s the only way to attract customers navigating a strange new market.

It’s a flourishing market but one, they all agreed, that will experience a major consolidation in the next year. In short, it’s a wild west of ideas, plentiful money, and a bright future — for some.

The evolution of lidar

I’ve previously written an introduction to lidar, but in short, lidar units project lasers out into the world and measure how they are reflected, producing a 3D picture of the environment around them.
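In code, the core time-of-flight principle is compact: half the pulse’s round-trip time multiplied by the speed of light gives range, and the beam’s pointing angles turn that range into a 3D point. The minimal sketch below is a generic illustration of that principle, not any particular vendor’s design, and it omits everything that makes real lidar hard — scanning, timing jitter, noise, and multiple returns.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_to_point(round_trip_s: float, azimuth_rad: float, elevation_rad: float):
    """Convert one pulse's round-trip time and beam angles to x, y, z."""
    r = C * round_trip_s / 2.0  # halved: the pulse travels out and back
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return x, y, z

# A pulse returning after ~667 nanoseconds corresponds to ~100 m range.
print(tof_to_point(667e-9, 0.0, 0.0))
```

Sweep that single measurement across millions of pulses per second and you get the familiar lidar point cloud.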

Check out the breakout sessions at TC Sessions: Mobility

TC Sessions: Mobility on July 10 in San Jose is fast approaching. Get ready for a superb lineup of speakers like Dmitri Dolgov (Waymo), Eric Allison (Uber) and Summer Craze Fowler (Argo AI). See the full agenda here.

In addition to the outstanding main stage content, TechCrunch is proud to partner with today’s leading mobility players for a full day of breakout sessions. These breakout sessions will give attendees deeper insights into overcoming some of mobility’s biggest challenges, with answers coming directly from today’s industry leaders.

Breakout Session Lineup


How much data is needed to make Autonomous Driving a Reality?
Presented by: Scale AI

We are in the early days of autonomous vehicles, and what’s necessary to go into production is still very much undecided. Simply proving that these vehicles are safer than human drivers will require more than 1 billion miles driven. Data is a key ingredient for any AI problem, and autonomy is the mother of all AI problems. How much data is really needed to make autonomy safe, reliable, and widespread, and how will our understanding of data change as that becomes a closer reality? Sponsored by Scale AI.


Think Big by Starting Small: Micromobility Implications to the Future of Mobility

Presented by: Deloitte

A host of new micromobility services have emerged to address a broader range of transportation needs – bikesharing, electric scooters and beyond. The urban emergence of micromobility offers powerful lessons on finding the right balance between fostering innovations that will ultimately benefit consumers and broader transportation systems, and safeguarding public interests. Sponsored by Deloitte.


If You Build It, Will They Buy? – The Role of the FleetTech Partner in the Future Mobility Ecosystem with Brendan P. Keegan
Presented by: Merchants Fleet

The future will bring a convergence of new technologies, services, and connectivity to the mobility space – but who will manage and connect it all? Explore how FleetTech is creating the mobility ecosystem to help organizations embrace technologies – adopting your innovations through trials and pilots and bringing them to market. Sponsored by Merchants Fleet.


The Economics of Going Electric: Constructing NextGen EV Business Models
Presented by: ABB

How do we make the rapidly growing EV industry operational and scalable? Join ABB, HPE and Microsoft for a discussion on how government, industry, providers and suppliers are addressing market shifts and identifying solutions to build successful business models that support the future of mobility. Moderated and sponsored by ABB.


Bringing Efficiency to Closed-Course AV Testing with Atul Acharya
Presented by: AAA Northern California, Nevada & Utah

Looking to jump-start or accelerate your automated vehicle test operations? AAA has built its expertise by operating GoMentum Station and performing safety assessments on multiple AVs and proving grounds. Join AAA as it shares the collective technical and operational learnings and testing results that will bring efficiency to your testing efforts. Sponsored by AAA Northern California, Nevada & Utah.


Friction-Free Urban Mobility
Presented by: Arrive

What does the future of seamless, urban mobility look like? How do mobility-as-a-service providers and connected vehicles work together to power transportation in a smart city? And which platform will aggregate all of the providers? In what promises to be a thought-provoking discussion, Arrive’s COO Dan Roarty will lay the foundation for what a city’s connected future will look like and outline key steps needed to achieve it. Sponsored by Arrive.


Michigan’s Mobility Ecosystem
Presented by: PlanetM

Revolutionary things can happen when some of the brightest minds in technology come together in one room. This Breakout Session will offer key insights into Michigan’s mobility ecosystem: the people, places and resources dedicated to the evolution of transportation mobility. Following a brief discussion, attendees will have the opportunity to connect with the people and companies moving the world forward through technology innovation and collaboration. Sponsored by PlanetM.


We hope to see you at TC Sessions: Mobility on July 10. Tickets are still on sale but selling fast. Book your $395 general admission ticket here. Students, grab a $45 ticket here.