Space startup Wyvern wants to make data about Earth’s health much more accessible

The private space industry is in the midst of a revolution driven by cube satellites (CubeSats): affordable, lightweight satellites that are much easier than traditional spacecraft to design, build and launch. That shift is paving the way for new businesses like Wyvern, an Alberta-based startup offering a very specific service that wouldn’t even have been possible a decade ago: relatively low-cost access to hyperspectral imaging from low Earth orbit, a method of capturing image data of Earth across many more spectral bands than we’re able to see with our eyes or traditional optics.

Wyvern’s founding team, which includes CEO Chris Robson, CTO Kristen Cote, CFO Callie Lissinna and VP of Engineering/COO Kurtis Broda, gained experience building satellites during their schooling, including work on Ex-Alta 1, the first satellite designed and built in Alberta to fly in space. They’ve also developed proprietary optical technology to produce the kind of imagery that will best serve the clients they’re pursuing. Their first target market is farmers, who will be able to log into the commercial version of the product and get up-to-date hyperspectral imaging of their fields. That data can help them optimize yield, detect changes in soil makeup (like a nitrogen deficiency), or even spot invasive plants and insects.

“We’re doing all sorts of things that directly affect the bottom line [for] farmers,” explained Robson in an interview. “If you can detect them, and you can quantify them, then the farmers can make decisions on how to act and ultimately how to increase the bottom line. A lot of those things you can’t do with multi-spectral [imaging] right now. For example, you can’t speciate with multi-spectral, so you can’t detect invasive species.”

Multi-spectral imaging, in contrast to hyperspectral imaging, typically measures light in somewhere between three and 15 bands, while hyperspectral can capture as many as hundreds of narrow, contiguous bands. That extra spectral resolution is what enables more specialized tasks, like identifying the species of plants growing in an observed area from a satellite’s perspective.
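To make that difference concrete, here’s a minimal, illustrative sketch of the kind of calculation hundreds of narrow bands enable. This is not Wyvern’s pipeline; the band count, wavelengths and the red-edge index below are assumptions chosen purely for illustration.

```python
import numpy as np

# Illustrative hyperspectral "cube": 200 narrow bands spanning
# 400-1000 nm (roughly 3 nm apart). Real sensors differ.
n_bands = 200
wavelengths = np.linspace(400, 1000, n_bands)  # band centers, in nm
cube = np.random.rand(128, 128, n_bands)       # stand-in reflectance data

def band_at(target_nm: float) -> int:
    """Index of the band whose center is closest to a target wavelength."""
    return int(np.argmin(np.abs(wavelengths - target_nm)))

# A narrow-band red-edge index contrasts reflectance at ~705 nm and
# ~750 nm, a spectral region where plant species and stress levels
# separate. A 3-to-15-band multi-spectral sensor usually averages over
# this region with one broad band, losing the distinction.
red_edge, nir = band_at(705), band_at(750)
ndre = (cube[:, :, nir] - cube[:, :, red_edge]) / (
    cube[:, :, nir] + cube[:, :, red_edge] + 1e-9
)
print(ndre.shape)  # one index value per pixel: (128, 128)
```

The point of the sketch: with contiguous narrow bands you can sample the spectrum wherever a material’s signature lives, which is roughly what “speciating” plants from orbit depends on.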

Hyperspectral imaging is already a proven technology in use around the world for exactly these purposes, but the main way it’s captured is via drone airplanes, which Robson says is much more costly and less efficient than using CubeSats in orbit.

“Drone airplanes are really expensive, and with us, we’re able to provide it for 10 times less than a lot of these drones [currently in use],” he said.

Wyvern’s business model will focus on owning and operating the satellites, and on providing the data they capture to customers in a way that’s easy for anyone to access and use.

“Our key differentiator is the fact that we allow access to actual actionable information,” Robson said. “Which means that if you want to order imagery, you do it through a web browser, instead of calling somebody up and waiting one to three days to get a price on it, and to find out whether they could even do what you’re asking.”

Robson says this has only become possible and affordable because of advances in optics (“Our optical system allows us to basically put what should be a big satellite into the form factor of a small one without breaking the laws of physics,” Robson told me), small satellites, data storage and monitoring stations, along with privatized launches that make space accessible by hitching a ride alongside other clients.

Wyvern will also occupy its own underserved niche providing this highly specialized data, first to agricultural clients, then expanding to five other verticals including forestry, water quality monitoring, environmental monitoring and defense. This isn’t something more generalist satellite imaging providers like Planet Labs are likely to pursue, Robson said, because it’s an entirely different kind of business with entirely different equipment, clientele and needs. Eventually, Wyvern hopes to open up access to the data it’s gathering even more broadly.

“You have the right to access [information regarding] the health of the Earth regardless of who you are, what government you’re under, what country you’re a part of or where you are in the world,” he said. “You have the right to see how other humans are treating the Earth, and to see how you’re treating the Earth and how your country is behaving. But you also have the right to take care of the Earth, because we’re super predators. We’re the most intelligent species. We have the responsibility of being stewards of the Earth. And part of that, though, is being almost omniscient about what’s going on with the Earth, in the same way that we understand what’s going on in our bodies. That’s what we want for people.”

Right now, Wyvern is very early on the trajectory of making this happen – they’re working on their first round of funding, speaking to potential customers and finalizing their initial product validation work. But with actual experience building and launching satellites, and a demonstrated appetite for what they want to build, they seem to be off to a promising start.

One of NASA’s robotic astronaut helpers just flew on its own in space for the first time

NASA’s very own free-floating Companion Cube equivalent took its first tentative ‘steps’ in space today, demonstrating its ability to maneuver on its own in zero gravity inside the International Space Station. The robot, called ‘Bumble,’ is one of a series of Astrobee robots NASA developed to work alongside astronauts on the ISS, and it’s the first of them ever to fly on its own in space.

Bumble’s first flight wouldn’t necessarily wow at an airshow – the robot essentially flew a foot forward and rotated a bit. But these are important basic maneuvers for making sure the robot’s propulsion system is working and tuned correctly. Eventually, the plan is for these robots to operate autonomously, doing basic maintenance work and supporting experiments, so each needs to be operating exactly as intended before it starts freely sharing space with tender human astronauts.

The Astrobee line currently counts three robots among its members: Bumble, Honey (also already on the ISS) and Queen, which is headed up in July on the next resupply mission, if all goes to plan. Each is equipped with cameras to document experiments performed by humans, and they can network to move equipment around. The robots can also dock at a companion station to charge, and each has a little perching arm that lets it grab on to things to anchor itself or hold items.

Bumble blinks!

The one-foot-cube bots were developed at NASA’s Ames Research Center and, once fully operational, should free up astronauts to focus on work only humans can handle – and there’s plenty of that to go around on the ISS in terms of experimentation and research.

NASA’s historic Apollo 11 launch comes to your phone in AR

We’re nearing the 50th anniversary of Apollo 11, which launched on July 16 and landed on the Moon on July 20, and to celebrate, there’s a new interactive augmented reality app called 321 LAUNCH that will bring a faithful recreation of the launch to your mobile device, and any flat surface you happen to have to hand.

There are actually two AR experiences launching as part of this project. The first is the launch simulator, which will start on July 16th, the launch’s anniversary, and provide eight days of live broadcast content through July 24th, the anniversary of the crew’s return to Earth. The other will be part of the USA TODAY app, since USA TODAY is putting the broadcast together along with FLORIDA TODAY. This embedded AR content will focus on educational material about the technology behind NASA’s Apollo program, including the Saturn V rocket that sent the mission to the Moon.

The 321 LAUNCH app is available now, and provides a launch simulation ahead of the live broadcast that lets you follow step-by-step as the rocket is assembled, moved to the pad and ultimately launched. Helpful descriptions provide a great summary of what’s happening at each step, and you can do this anywhere you find a flat surface.

It’s an easy, accessible way to experience the launch and learn more about the technology NASA developed to make it happen, along with some general info about spaceflight and what goes into launching any rocket. Plus, it’s a free download.

NASA’s X-59 supersonic jet will have a 4K TV instead of a forward window

NASA’s X-59 QueSST experimental quiet supersonic aircraft will have a cockpit like no other — featuring a big 4K screen where you’d normally have a front window. Why? Because this is one weird-looking plane.

The X-59, which is being developed by Lockheed Martin on a $247 million budget, is meant to go significantly faster than sound without producing a sonic boom, or indeed any noise “louder than a car door closing,” at least to observers on the ground.

Naturally, in order to do this, the craft has to be as aerodynamic as possible, which precludes the cockpit bump often found in fighter jets. In fact, the design can’t even put the pilot up front with a big window, because the long, tapered nose would leave no room for one.

The cockpit is more like a section taken out of the plane just over the leading edge of the rather small and exotically shaped wings. So while the view out the sides will be lovely, the view forward will be nothing but nose.

To fix that, the plane will be equipped with several displays: the lower ones just like you might expect on a modern aircraft, and above them a 4K monitor that’s part of what’s called the eXternal Visibility System, or XVS. It shows imagery stitched together from two cameras on the craft’s exterior, combined with high-definition terrain data loaded up ahead of time.
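NASA hasn’t detailed the XVS internals here, but conceptually it’s a compositing problem: merge two exterior camera views and overlay pre-loaded terrain imagery. Below is a deliberately simplified sketch of that idea; the function name, the plain averaging “stitch” and the alpha blend are all assumptions for illustration, not the actual XVS pipeline.

```python
import numpy as np

def composite_xvs_frame(cam_a, cam_b, terrain, terrain_alpha=0.3):
    """Toy stand-in for building one XVS-style display frame.

    cam_a, cam_b: (H, W, 3) float arrays in [0, 1], frames from two
        exterior cameras, assumed already rectified to a shared view.
    terrain: (H, W, 3) float array, pre-rendered terrain imagery for
        the aircraft's current position, loaded ahead of time.
    """
    # Naive "stitch": average the two camera views. A real system
    # would warp and blend using camera calibration, not a plain mean.
    stitched = (cam_a + cam_b) / 2.0
    # Overlay the synthetic terrain as a translucent context layer.
    return (1 - terrain_alpha) * stitched + terrain_alpha * terrain

# Scaled-down stand-in for a 4K frame, just to exercise the function.
h, w = 270, 480
frame = composite_xvs_frame(
    np.random.rand(h, w, 3),
    np.random.rand(h, w, 3),
    np.random.rand(h, w, 3),
)
print(frame.shape)  # (270, 480, 3)
```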

It’s not quite the real thing, but pilots spend a lot of time in simulators, so they’ll be used to it. And the real world is right outside the other windows if they need a reality check.

Lockheed and NASA’s plane is currently in the construction phase, though no doubt some parts are still being designed as well. The program has committed to a 2021 flight date, an ambitious goal considering this is the first such experimental aircraft, or X-plane, that the agency has developed in some 30 years. If successful, it could be the precursor to other quiet supersonic craft, and could bring back supersonic overland flight in the future.

That’s if Boom doesn’t beat them to it.

Orbit Fab becomes first startup to supply water to ISS, paving the way for satellite refueling

Not even two years into its existence, orbital fuel supply startup Orbit Fab has chalked up a big win – successfully supplying the International Space Station with water, a first for any private company. It’s a big deal because providing water to the ISS involved a multi-day refueling process, done in microgravity, using processes and equipment Orbit Fab developed itself.

The key ingredient here, per Kenneth Shields, COO of the ISS U.S. National Laboratory (the contracting agency for Orbit Fab’s refueling test), is that this method of resupply is totally out of spec in terms of how the process was designed to work on the ISS. By creating and successfully demonstrating a system the ISS designers never conceived of, Orbit Fab has shown that both private companies and NASA have the flexibility needed to build business models on existing space infrastructure.

The technology Orbit Fab developed and demonstrated has broader applications than just moving water around in space. Water was used in this case specifically because it’s one of the most inert propellants used in spacecraft thrusters, but the methods could extend to other common propellants and make it possible to refuel satellites in orbit. Orbit Fab is working towards establishing a standard for satellite refueling interfaces in orbital hardware, which could go a long way towards making reusable satellites common practice, instead of sticking with the more or less disposable hardware model used today.

Startups like Orbit Fab are the key to unlocking true commercialization of space: by identifying points in the value chain where innovation or improvement can lead to cost or resource efficiencies, they help ensure that space business is also viable business in terms of profit potential.

NASA’s best-yet photo of asteroid Bennu nails the dramatic lighting

NASA has taken a breathtaking new image of near-Earth asteroid Bennu, captured by its OSIRIS-REx spacecraft from its new, second orbit of the object. The new picture, snapped on June 13, provides a pretty much full-length view of Bennu, with half of its surface lit by the sun and the other half in near-complete shadow.

The image also manages to capture an interesting feature of Bennu – the protruding ‘mole’ of its largest boulder, which sticks out at the bottom end of the asteroid (as oriented in this photo) and breaks up its relatively smooth contours. OSIRIS-REx took this photo from only around 0.4 miles (about 640 meters) away from the rock. From there, the cameras on the probe can capture detail as small as just 1.6 feet across – basically the size of an Xbox One – on the asteroid’s surface.
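For a back-of-the-envelope sense of what those figures imply (my arithmetic, not NASA’s published camera specs), the ratio of the smallest resolvable detail to the orbital distance gives the imaging system’s effective angular resolution:

```python
# Back-of-the-envelope check on the figures quoted above.
distance_m = 0.4 * 1609.34   # 0.4 miles ≈ 643.7 m
detail_m = 1.6 * 0.3048      # 1.6 feet ≈ 0.49 m

# Smallest resolvable detail divided by distance approximates the
# angular resolution in radians (small-angle approximation).
angular_res = detail_m / distance_m
print(f"{angular_res * 1000:.2f} mrad")  # ≈ 0.76 mrad
```

The same ratio also means that halving the orbital distance would roughly halve the size of the smallest features the cameras can pick out.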

This orbit breaks the record for the closest orbit of any object in our solar system by a spacecraft, and it will help NASA researchers investigate the debris plumes spotted coming from the surface of the asteroid as it careens through space. Meanwhile, we immediately benefit thanks to glamour shots like this one.

NASA taps CMU to develop robots to help turn pits on the Moon into potential habitats

Lunar rovers are cool – but imagine how much cooler they’d be if they could also rappel. Researchers at Carnegie Mellon University will try to make rappelling robots a reality, after being selected by NASA for a new $2 million research grant aimed at developing technology to help robots explore ‘pits’ on the Moon.

Yes, pits, as distinct from craters, which are essentially surface features caused by meteorite impacts. These pits are more akin to sinkholes or caves on Earth: they have surface access, but also large hollow caverns and spaces underground that might provide easier access to minerals and water ice – and that might even serve as ready-made shelter for future lunar explorers.

CMU Robotics Institute Professor Red Whittaker put forward a potential mission design that would use intelligent, agile and fast robots to study these pits close up. While the pits have been spotted by lunar orbital observers, those images don’t provide the kind of detail needed to actually determine whether the sinkholes will be useful to future Moon missions, or how.

Whittaker’s draft plan, codenamed ‘Skylight,’ would use robots with a degree of autonomy to self-select where to look in their surface investigations, and they’d also need to act quickly: once lunar night sets in, they’d be offline permanently, so they’d get only about one week of active use under the mission parameters.

NASA’s ambitious mission to send astronauts back to the lunar surface by 2024, and to establish a base on the Moon by 2028, would benefit from the kind of scouting provided by missions like ‘Skylight,’ but timing will be tight – current projections put 2023 as the target for when such a mission might happen.