Suse’s OpenStack Cloud dissipates

Suse, the newly independent open-source company behind the eponymous Linux distribution and an increasingly large set of managed enterprise services, today announced a bit of a new strategy as it looks to stay on top of the changing trends in the enterprise developer space. Over the course of the last few years, Suse put a strong emphasis on the OpenStack platform, an open-source project that essentially allows big enterprises to build something in their own data centers akin to the core services of a public cloud like AWS or Azure. With this new strategy, Suse is transitioning away from OpenStack. It’s ceasing both production of new versions of its OpenStack Cloud and sales of its existing OpenStack product.

“As Suse embarks on the next stage of our growth and evolution as the world’s largest independent open source company, we will grow the business by aligning our strategy to meet the current and future needs of our enterprise customers as they move to increasingly dynamic hybrid and multi-cloud application landscapes and DevOps processes,” the company said in a statement. “We are ideally positioned to execute on this strategy and help our customers embrace the full spectrum of computing environments, from edge to core to cloud.”

Going forward, Suse will focus on its Cloud Application Platform (which is based on the open-source Cloud Foundry platform) and its Kubernetes-based container platform.

Chances are, Suse wouldn’t be shutting down its OpenStack services if it saw growing sales in this segment. But while the hype around OpenStack has died down in recent years, it’s still among the world’s most active open-source projects and runs the production environments of some of the world’s largest companies, including some very large telcos. It took the project a while to find its place in a market where, for the last few years, most of the mindshare went to containers — and especially Kubernetes. At the same time, though, containers are also opening up new opportunities for OpenStack, since you still need some way to manage those containers and the rest of your infrastructure.

The OpenStack Foundation, the umbrella organization that helps guide the project, remains upbeat.

“The market for OpenStack distributions is settling on a core group of highly supported, well-adopted players, just as has happened with Linux and other large-scale, open-source projects,” said OpenStack Foundation COO Mark Collier in a statement. “All companies adjust strategic priorities from time to time, and for those distro providers that continue to focus on providing open-source infrastructure products for containers, VMs and bare metal in private cloud, OpenStack is the market’s leading choice.”

He also notes that analyst firm 451 Research believes there is a combined Kubernetes and OpenStack market of about $11 billion, with $7.7 billion of that focused on OpenStack. “As the overall open-source cloud market continues its march toward eight figures in revenue and beyond — most of it concentrated in OpenStack products and services — it’s clear that the natural consolidation of distros is having no impact on adoption,” Collier argues.

For Suse, though, this marks the end of its OpenStack products. For now, the company remains a top-level Platinum sponsor of the OpenStack Foundation, and Suse’s Alan Clark remains on the Foundation’s board. Suse is involved in some of the other projects under the OpenStack brand, so the company will likely remain a sponsor, but it’s probably a fair guess that it won’t continue to do so at the highest level.

Nadella warns government conference not to betray user trust

Microsoft CEO Satya Nadella, delivering the keynote at the Microsoft Government Leaders Summit in Washington, DC today, had a message for attendees: maintain user trust in their tools and technologies above all else.

He said it is essential to earn user trust, regardless of your business. “Now, of course, the power law here is all around trust because one of the keys for us, as providers of platforms and tools, trust is everything,” he said today. But he says it doesn’t stop with platform providers like Microsoft. Institutions using those tools also have to keep trust top of mind or risk alienating their users.

“That means you need to also ensure that there is trust in the technology that you adopt, and the technology that you create, and that’s what’s going to really define the power law on this equation. If you have trust, you will have exponential benefit. If you erode trust it will exponentially decay,” he said.

He says Microsoft sees trust along three dimensions: privacy, security and ethical use of artificial intelligence. All of these come together in his view to build a basis of trust with your customers.

Nadella said he sees privacy as a human right, pure and simple, and it’s up to vendors to ensure that privacy or lose the trust of their customers. “The investments around data governance is what’s going to define whether you’re serious about privacy or not,” he said. For Microsoft, that means looking at how transparent it is about how it uses data, its terms of service, and how it uses technology to ensure all of that is being carried out at runtime.

He reiterated the call he made last year for a federal privacy law. With GDPR in Europe and California’s CCPA coming on line in January, he sees a centralized federal law as a way to streamline regulations for business.

As for security, as you might expect, he defined it in terms of how Microsoft was implementing it, but the message was clear that you needed security as part of your approach to trust, regardless of how you implement that. He asked several key questions of attendees.

“Cyber is the second area where we not only have to do our work, but you have to [ask], what’s your operational security posture, how have you thought about having the best security technology deployed across the entire chain, whether it’s on the application side, the infrastructure side or on the endpoint side, and most importantly, around identity,” Nadella said.

The final piece, one which he said was just coming into play, was how you use artificial intelligence ethically, a sensitive topic for a government audience, but one he wasn’t afraid to broach. “One of the things people say is, ‘Oh, this AI thing is so unexplainable, especially deep learning.’ But guess what, you created that deep learning [model]. In fact, the data on top of which you train the model, the parameters and the number of parameters you use — a lot of things are in your control. So we should not abdicate our responsibility when creating AI,” he said.

Whether Microsoft or the US government can adhere to these lofty goals is unclear, but Nadella was careful to outline them both for his company’s benefit and this particular audience. It’s up to both of them to follow through.

Satya Nadella looks to the future with edge computing

Speaking today at the Microsoft Government Leaders Summit in Washington, DC, Microsoft CEO Satya Nadella made the case for edge computing, even while pushing the Azure cloud as what he called “the world’s computer.”

While Amazon, Google and other competitors may have something to say about that, marketing hype aside, many companies are still in the midst of transitioning to the cloud. Nadella says the future of computing could actually be at the edge, where computing is done locally before data is transferred to the cloud for AI and machine learning purposes. What goes around, comes around.

But as Nadella sees it, this is not going to be about either edge or cloud. It’s going to be the two technologies working in tandem. “Now, all this is being driven by this new tech paradigm that we describe as the intelligent cloud and the intelligent edge,” he said today.

He said that to truly understand the impact the edge is going to have on computing, you have to look at research, which predicts there will be 50 billion connected devices in the world by 2030, a number even he finds astonishing. “I mean this is pretty stunning. We think about a billion Windows machines or a couple of billion smartphones. This is 50 billion [devices], and that’s the scope,” he said.

The key here is that these 50 billion devices, whether you call them edge devices or the Internet of Things, will be generating tons of data. That means you will have to develop entirely new ways of thinking about how all this flows together. “The capacity at the edge, that ubiquity is going to be transformative in how we think about computation in any business process of ours,” he said. As we generate ever-increasing amounts of data, whether for public sector use cases or any business need, it’s going to be the fuel for artificial intelligence, and he sees the sheer amount of that data driving new AI use cases.

“Of course when you have that rich computational fabric, one of the things that you can do is create this new asset, which is data and AI. There is not going to be a single application, a single experience that you are going to build, that is not going to be driven by AI, and that means you have to really have the ability to reason over large amounts of data to create that AI,” he said.

Nadella would be more than happy to have his audience take care of all that using Microsoft products, whether Azure compute, database, AI tools or edge computers like the Data Box Edge it introduced in 2018. While Nadella is probably right about the future of computing, all of this could apply to any cloud, not just Microsoft’s.

As computing shifts to the edge, it’s going to have a profound impact on the way we think about technology in general, but it’s probably not going to involve being tied to a single vendor, regardless of how comprehensive their offerings may be.

Daily Crunch: Render wins the Startup Battlefield

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.

1. And the winner of Startup Battlefield at Disrupt SF 2019 is… Render

In the beginning, there were 20 startups. After three days of fierce competition, we now have a Battlefield champion.

That winner is Render, which has created a managed cloud platform to serve as an alternative to traditional cloud providers such as AWS, Azure and GCP. And the runner-up is OmniVis, which aims to make cholera detection as quick, simple and cheap as a pregnancy test.

2. Next Insurance raises $250M from Munich Re, becomes a unicorn

Next Insurance sells insurance products to small businesses. And Germany-based Munich Re, one of the world’s largest reinsurers, was the sole investor in its new round.

3. Roku to launch low-cost versions of its soundbar and subwoofer under Walmart’s onn brand

In September, Roku debuted the Smart Soundbar and Wireless Subwoofer, each priced at $180. The Walmart onn-branded Smart Soundbar and Wireless Subwoofer, meanwhile, will cost only $129 each.

4. No one could prevent another ‘WannaCry-style’ attack, says DHS official

Jeanette Manfra, the assistant director for cybersecurity for Homeland Security’s Cybersecurity and Infrastructure Security Agency, said at Disrupt SF that the 2017 WannaCry cyberattack was uniquely challenging because it spread so quickly: “I don’t know that we could ever prevent something like that.”

5. NASA shares 3D Moon data for CG artists and creators

The data set includes not just imagery but depth data, making it simple to build an incredibly detailed 3D map of the Moon.

6. As Sinai Ventures returns first fund, partner Jordan Fudge talks new LA focus

Fudge and co-founder Eric Reiner are centralizing the Sinai Ventures team in Los Angeles for its next fund — a bet on the rising momentum of the local startup ecosystem and their vision to be the city’s leading Series A and B firm. (Extra Crunch membership required.)

7. This week’s TechCrunch podcasts

We’ve got a new episode of Equity recorded at Disrupt, with Alex and Kate discussing why San Francisco remains a startup hub. (And keep an eye out later today for a bonus episode of Original Content.)

Harness launches Continuous Insights to measure software team performance

Jyoti Bansal, CEO and co-founder at Harness, has always been frustrated by the lack of tools to measure software development team performance. Harness is a tool that provides Continuous Delivery as a Service, and its latest offering, Continuous Insights, lets managers know exactly how their teams are performing.

Bansal cites a traditional management maxim: if you can’t measure a process, you can’t fix it. Continuous Insights is designed to provide a way to measure engineering effectiveness. “People want to understand how good their software delivery processes are, and where they are tracking right now, and that’s what this product, Continuous Insights, is about,” Bansal explained.

He says that it is the first product on the market to provide this view of performance without pulling weeks or months of data. “How do you get data around what your current performance is like, and how fast you deliver software, or where the bottlenecks are, and that’s where there are currently a lot of visibility gaps,” he said. He adds, “Continuous Insights makes it extremely easy for engineering teams to clearly measure and track software delivery performance with customizable dashboards.”

Harness measures four key metrics as defined by DevOps Research and Assessment (DORA) in their book Accelerate. These include deployment frequency, lead time, mean-time-to-recovery and change failure rate. “Any organization that can do a better job with these would really out-innovate their peers and competitors,” he said. Conversely, companies doing badly on these four metrics are more likely to fall behind in the market.
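
To make the four metrics more concrete, here is a minimal sketch of how they could be computed from a simple log of deployments. The data shape and field names are illustrative assumptions for this example only; it is not how Harness itself calculates them.

```python
# Illustrative sketch of the four DORA metrics computed from a simple
# deployment log. Field names and data shapes are assumptions for the
# example; this is not how Harness implements Continuous Insights.
from datetime import datetime, timedelta

deployments = [
    # (deployed_at, commit_time, failed, recovered_at)
    (datetime(2019, 9, 2, 10), datetime(2019, 9, 1, 15), False, None),
    (datetime(2019, 9, 4, 12), datetime(2019, 9, 3, 9), True, datetime(2019, 9, 4, 14)),
    (datetime(2019, 9, 6, 16), datetime(2019, 9, 5, 11), False, None),
]

period_days = 7

# Deployment frequency: how often code reaches production over the period.
deployment_frequency = len(deployments) / period_days

# Lead time: average time from commit to deployment.
lead_time = sum(((d - c) for d, c, _, _ in deployments), timedelta()) / len(deployments)

# Change failure rate: share of deployments that caused a failure.
failures = [d for d in deployments if d[2]]
change_failure_rate = len(failures) / len(deployments)

# Mean time to recovery: average time from a failed deployment to recovery.
mttr = sum(((r - d) for d, _, _, r in failures), timedelta()) / max(len(failures), 1)

print(deployment_frequency, lead_time, change_failure_rate, mttr)
```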

Image: Harness

By measuring these four areas, the product not only provides a way to track performance; Bansal also sees it as a way to gamify these metrics, with each team trying to outdo the others on efficiency. While you would think that engineering would be the most data-driven part of an organization, he says that up until now it has lacked the tooling. He hopes that Harness users will be able to bring that kind of rigor to engineering.

Render announces object storage service at TechCrunch Disrupt

It was a big day for startup Render, which participated in the TechCrunch Disrupt Startup Battlefield today. While it was at it, it also announced some upgrades to its managed cloud platform.

First of all, it announced the ability to spin up object storage in the cloud, while greatly simplifying the tasks associated with adding storage. CEO and founder Anurag Goel says that the storage option is something customers have been requesting, and, as with its other services, Render handles a lot of the heavy lifting for them.

“One of the things that our users want us to do next is to build out object storage. Even though they can use things like Amazon S3 and other cloud storage options, they know that Render is going to be easier for them to use. So they really want object storage, and they want everything in one place,” Goel explained.

If you wanted to do that today without Render, you would have to spin up a virtual machine in the cloud, attach the storage, set up backup schedules and take care of all the other associated tasks. What Render is doing with Render Disk is stripping all of that away and managing the process for the customer.

While the startup was at it, it also added support for infrastructure as code. This allows developers to define their infrastructure requirements in a YAML file. When the developer pushes the file to GitHub, Render can build the infrastructure for the customer on the fly based on the contents of this file.
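
To give a sense of what such a file might contain, here is a hypothetical service definition that could live alongside an application’s source code. The field names and structure are illustrative assumptions rather than Render’s documented schema.

```yaml
# Hypothetical infrastructure-as-code definition, for illustration only.
# Field names and structure are assumptions, not Render's documented schema.
services:
  - type: web                      # a web service built from the linked repo
    name: my-api
    env: node                      # runtime to use
    buildCommand: npm install && npm run build
    startCommand: npm start
  - type: worker                   # a background worker from the same repo
    name: my-queue-worker
    env: node
    buildCommand: npm install
    startCommand: npm run worker
```

In this model, committing a change to the file and pushing it to GitHub would be enough for the platform to create or update the corresponding services.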

Finally, Render is offering customers a one-click launch. This could come in handy for companies offering free trials or open-source tools: users can launch an application with a single click from GitHub, and Render will load all of the required files.

Render challenges the cloud’s biggest vendors with cheaper, managed infrastructure

Render, a participant in the TechCrunch Disrupt SF Startup Battlefield, has a big idea. It wants to take on the world’s biggest cloud vendors by offering developers a cheaper alternative that also removes a lot of the complexity around managing cloud infrastructure.

Render’s goal is to help developers, especially those at smaller companies that don’t have large DevOps teams, still take advantage of modern development approaches in the cloud. “We are focused on being the easiest and most flexible provider for teams to run any application in the cloud,” CEO and founder Anurag Goel explained.

He says that one of the biggest pain points for developers and startups, even fairly large startups, is that they have to build up a lot of DevOps expertise when they run applications in the cloud. “That means they are going to hire extremely expensive DevOps engineers or consultants to build out the infrastructure on AWS,” he said. Even after they set up the cloud infrastructure, and move applications there, he points out that there is ongoing maintenance around patching, security and identity access management. “Render abstracts all of that away, and automates all of it,” Goel said.

It’s not easy competing with the big players on scale, but he says so far they have been doing pretty well, and plan to move much of their operations to bare metal servers, which he believes will help stabilize costs further.

“Longer term, we have a lot of ideas [about how to reduce our costs], and the simplest thing we can do is to switch to bare metal to reduce our costs pretty much instantly.” He says the way they have built Render will make that easier to do. The plan now is to start moving their services to bare metal in the fourth quarter this year.

Even though the company only launched in April, it is already seeing great traction. “The response has been great. We’re now doing over 100 million HTTP requests every week. And we have thousands of developers and startups and everyone from people doing small hobby projects to even a major presidential campaign,” he said.

Although he couldn’t share the candidate’s name, he said they were using Render for everything including their infrastructure for hosting their web site and their back-end administration. “Basically all of their cloud infrastructure is on Render,” he said.

Render has raised a $2.2 million seed round and is continuing to add services to the product, including several new services it will announce this week around storage, infrastructure as code and one-click deployment.

Salesforce is building an office tower in Sydney, pledging 1000 new jobs in the next five years

Salesforce announced this week that it’s building another shiny tower. This one will be in Sydney with views of the harbor and the iconic Sydney Opera House. The company has also committed to adding 1000 new jobs in the next five years and to building the tower in a sustainable fashion.

In fact, Salesforce is pledging the new tower will be one of the greenest buildings in the country when it is finished. “The building has achieved Sydney’s first-ever WELL core and shell Platinum pre-certification, the highest obtainable pre-certification, and will achieve a 6 Star Green Star Design and As-Built rating, representing world excellence in sustainable design,” Salesforce’s Elizabeth Pinkham wrote in a blog post announcing the project.

As is Salesforce’s way, it’s going to be the tallest building in the city when it’s done. It will sit at Circular Quay, part of Sydney’s central business district, and will house shops and restaurants on the main floor. As with all of its modern towers, the company is dedicating the top floor to flexible use for employees, customers and partners. The building will also boast a variety of spaces, including a Salesforce Innovation Center for customers, along with social lounges, mindfulness areas and areas for employees to collaborate.

Salesforce has had a presence in Sydney for over 15 years, according to the company, and this tower is an attempt to consolidate that presence into a single, modern space with room to expand over the next five years and add hundreds of new employees.

The announcement comes on the heels of one earlier this year that the company is building a similarly grand project in Dublin to centralize its operations in that city, where it has had a presence since 2001.

AWS IQ matches AWS customers with certified service providers

AWS has a lot going on, and it’s not always easy for customers to deal with the breadth of its service offerings on their own. Today, the company announced a new service called AWS IQ that is designed to connect customers with certified service providers.

“Today I would like to tell you about AWS IQ, a new service that will help you to engage with AWS Certified third party experts for project work,” AWS’s Jeff Barr wrote in a blog post introducing the new feature. This could involve training, support, managed services, professional services or consulting. All of the companies available to help have received associate, specialty or professional certification from AWS, according to the post.

You start by selecting the type of service you are looking for, such as training or professional services, and then the tool walks you through the process of defining your needs, including providing a title, a description and what you are willing to pay for these services. The service then connects the requestor with a set of providers that match the requirements. From there, the requestor can review expert profiles and compare the ratings and offerings in a kind of online marketplace.

Image: AWS IQ start screen (you start by selecting the type of service you want to engage)

Swami Sivasubramanian, vice president at AWS, says they wanted to offer a way for customers and service providers to get together. “We built AWS IQ to serve as a bridge between our customers and experts, enabling them to get to work on new projects faster and easier, and removing many of the hassles and roadblocks that both groups usually encounter when dealing with project-based work,” he said in a statement.

The company sees this as a particularly valuable tool for small and medium-sized businesses, which might lack the in-house AWS expertise and need help with these services. The end result is that everyone should win: customers get direct access to this community of experts, and the experts can more easily connect with potential customers to build their AWS consulting practices.

Microsoft OneDrive Personal Vault rolls out worldwide, launches expandable storage

Earlier this summer, Microsoft introduced an extra layer of security to its Dropbox competitor, OneDrive. The security feature, called OneDrive Personal Vault, allows users to protect their files with two-step verification, like a fingerprint or facial recognition, a PIN code, or a one-time code sent through email, SMS or Microsoft Authenticator. At the time of launch, however, the feature was only available in select markets. Today, it’s rolling out worldwide and introducing new features, including expandable storage.

The company said OneDrive Personal Vault would initially be available in Australia, New Zealand and Canada, but would reach all OneDrive users by the end of the year.

With today’s expansion, it’s a little ahead of schedule as it’s just now the end of September.

Personal Vault is available to all OneDrive users, with some limitations.

Those using OneDrive’s free or standalone 100GB storage plan can store up to three files in Personal Vault. Office 365 subscribers can store as many files as they want, up to their storage limits.

Stronger authentication is the key selling point for Personal Vault, but it also comes with additional security measures. This includes “Scan and Shoot,” which lets you scan documents or shoot photos directly to Personal Vault, bypassing your device storage, like the camera roll. Personal Vault also automatically locks files after a period of inactivity, restricts sharing of saved files to prevent accidental shares, and syncs files to a BitLocker-encrypted area of the hard drive on Windows 10 PCs.

In addition to the global launch of Personal Vault, Microsoft also today introduced new storage options for OneDrive, plus new features like PC Folder backup and dark mode.

Starting today, OneDrive users will be able to add storage to their plans in 200GB increments, starting at $1.99 per month.

Meanwhile, PC Folder backup will allow OneDrive to back up your desktop, documents and pictures folders from your Windows PC to the cloud, similar to rival desktop apps from Dropbox and Google Drive. The option is available on Windows 7, 8 and 10 PCs. On Windows 10, it’s integrated, so users can even opt to enable it during Windows setup or updates.

And OneDrive will now support a dark mode on iOS 13.

Personal Vault is live globally, as of today.