US safety regulators expand Tesla Autopilot investigation

U.S. federal safety regulators have “upgraded” their investigation into Tesla’s Autopilot advanced driver assistance system after discovering new incidents of the EVs crashing into parked first-responder vehicles.

The National Highway Traffic Safety Administration said in a notice Thursday that it was expanding its preliminary evaluation of Tesla Autopilot systems to an engineering analysis. That means NHTSA will extend its existing crash analysis, evaluate additional data sets, perform vehicle evaluations and assess whether Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision, according to the agency.

The escalation is a critical and required step before NHTSA can issue a recall. An estimated 830,000 Tesla vehicles are involved in the probe, according to agency documents.

Tesla did not respond to a request for comment.

In a statement, NHTSA reminded the public that “no commercially available motor vehicles today are capable of driving themselves.”

“Every available vehicle requires the human driver to be in control at all times, and all State laws hold the human driver responsible for operation of their vehicles,” an agency spokesperson said in a statement to TechCrunch. “Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly. NHTSA is empowered with robust enforcement tools to protect the public, to investigate potential safety issues, and act when we find evidence of noncompliance or an unreasonable risk to safety.”

In August 2021, NHTSA opened a preliminary investigation into Tesla Autopilot, citing 11 incidents in which vehicles crashed into parked first responder vehicles while the system was engaged. In those crashes, the Tesla vehicles had either Autopilot or a feature called Traffic Aware Cruise Control engaged.

Most of the incidents took place after dark and occurred despite “scene control measures,” such as emergency vehicle lights, road cones and an illuminated arrow board signaling drivers to change lanes, the documents stated at the time.

Musk talks Tesla demand, EV startups, and scooters in expansive interview

Elon Musk weighed in on Tesla, SpaceX and his multitude of other companies — including that social media business he’s trying to purchase — during a wide-reaching 80-minute interview Tuesday that covered demand for EVs, the need for raw materials, the problem with hydrogen and the most promising EV startups.

While much of what Musk talked about at the Financial Times’ Future of the Car conference in London has been shared before, there were a few items that stood out, including that Tesla could stop taking customer orders for its vehicles.

Tesla may stop taking orders

For Tesla, the issue is not demand but supply. “Right now demand is exceeding production to a ridiculous degree,” Musk said. “We’re actually probably going to limit [or] stop taking orders for anything beyond a certain period of time.”

The automaker still aims to produce 20 million cars annually by 2030, a figure Musk said he chose because it represents 1% of the global fleet.

“But it’s not a promise,” he noted. “It’s an aspiration. I think we’ve got a good chance of getting there.”

That compares with the 930,000 vehicles Tesla produced last year, a milestone Musk said was “roughly equally difficult” to reach as getting to 20 million will be.

Musk admires Volkswagen

When asked to name the most impressive EV startup operating today, Musk named Volkswagen, just one day after Volkswagen CEO Herbert Diess said from the same conference stage that Tesla proved stronger than the German juggernaut expected.

“I think the company making the most progress besides Tesla is actually VW, which is not a startup but can be viewed in some ways as a startup from an electric vehicle standpoint,” Musk said.

He also said that there are several strong companies coming out of China, which accounts for more than a quarter of Tesla’s global sales and where Tesla plans to expand its Shanghai Gigafactory.

“There’s just a lot of super talented, hardworking people in China that strongly believe in manufacturing. They won’t just be burning the midnight oil, they’ll be burning the 3 a.m. oil. They won’t even leave the factory type of thing, whereas in America, people are trying to avoid going to work at all.”

Tesla will remain open source

Musk reiterated his standing invitation to automakers to make use of Tesla’s patents to build upon its Autopilot system.

“We only patent things in order to prevent others from creating this minefield of patents that inhibit progress with electric vehicles,” he said. “But we’re never going to really prosecute anyone for using our patents. So let’s just say you can use any Tesla patents for free, so I think hopefully that’s helpful to others.”

But Tesla needs a year to prove out the technology before automakers may consider licensing it, Musk said.

“The traditional car makers will solve electrification. It’s not fundamentally difficult at this point to make electric cars. The thing that I think they may be interested in licensing is Tesla Autopilot full self-driving, and I think that would save a lot of lives.”

Buying a mining company is not out of the question

As EV makers face a shortage of raw materials to make lithium-ion batteries, Tesla has been signing long-term deals with mining companies worldwide to secure its supply. But the automaker is not above becoming more involved with the earth-moving business.

“It’s not that we wish to buy mining companies, but if that’s the only way to accelerate the transition then we will do that,” Musk said. “There’s no arbitrary limitations on what’s needed to accelerate. We’ll just tackle whatever set of things are needed to accelerate sustainable energy, and doing mining and refining or buying a mining company provided we think we can do it.”

But scooters are out of the question

When asked if Tesla plans to make a vehicle smaller and more affordable than the Model 3, such as a scooter, Musk spoke against the micromobility device.

“Scooters are very dangerous,” he said. “I don’t recommend anyone drive a scooter. If there’s ever an argument between a scooter and a car, it will lose.”

MIT study finds Tesla drivers become inattentive when Autopilot is activated

By the end of this week, potentially thousands of Tesla owners will be testing out the automaker’s newest version of its “Full Self-Driving” beta software, version 10.0.1, on public roads, even as regulators and federal officials investigate the safety of the system after a few high-profile crashes.

A new study from the Massachusetts Institute of Technology lends credence to the idea that the FSD system, which despite its name is not actually an autonomous system but rather an advanced driver assist system (ADAS), may not actually be that safe. Researchers studying glance data from 290 human-initiated Autopilot disengagement epochs found drivers may become inattentive when using partially automated driving systems.

“Visual behavior patterns change before and after [Autopilot] disengagement,” the study reads. “Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead.”

Tesla CEO Elon Musk has said that not everyone who has paid for the FSD software will be able to access the beta version, which promises more automated driving functions. First, Tesla will use telemetry data to capture personal driving metrics over a seven-day period to ensure drivers remain attentive enough. The data may also be used to implement a new safety rating page that tracks the owner’s vehicle, which is linked to their insurance.

The MIT study provides evidence that drivers may not be using Tesla’s Autopilot (AP) as recommended. Because AP includes safety features like traffic-aware cruise control and autosteering, drivers become less attentive and take their hands off the wheel more. The researchers found this type of behavior may be the result of misunderstanding what the AP features can do and what its limitations are, which is reinforced when it performs well. Drivers whose tasks are automated for them may naturally become bored after attempting to sustain visual and physical alertness, which researchers say only creates further inattentiveness.

The report, titled “A model for naturalistic glance behavior around Tesla Autopilot disengagements,” is based on a study that followed Tesla Model S and X owners during their daily routines for a year or more throughout the greater Boston area. The vehicles were equipped with the Real-time Intelligent Driving Environment Recording data acquisition system, which continuously collects data from the CAN bus, a GPS and three 720p video cameras. These sensors provide information like vehicle kinematics, driver interaction with the vehicle controllers, mileage, location, and the driver’s posture, face and the view in front of the vehicle. MIT collected nearly 500,000 miles’ worth of data.
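As a rough illustration of the kind of comparison the researchers describe (the share of off-road glances in a window before versus after a disengagement), here is a minimal Python sketch. The data layout, field names and 10-second window are assumptions made for illustration; the study’s actual pipeline is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Glance:
    start: float      # seconds from trip start
    end: float        # seconds from trip start
    off_road: bool    # True if the glance target was a non-driving-related area

def off_road_proportion(glances, t0, t1):
    """Fraction of glance time in [t0, t1] spent looking off-road."""
    total = off = 0.0
    for g in glances:
        overlap = max(0.0, min(g.end, t1) - max(g.start, t0))
        total += overlap
        if g.off_road:
            off += overlap
    return off / total if total else 0.0

def compare_around_disengagement(glances, t_disengage, window=10.0):
    """Off-road glance proportion in the window before vs. after a
    human-initiated Autopilot disengagement (window length is assumed)."""
    before = off_road_proportion(glances, t_disengage - window, t_disengage)
    after = off_road_proportion(glances, t_disengage, t_disengage + window)
    return before, after
```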

The point of this study is not to shame Tesla, but rather to advocate for driver attention management systems that can give drivers feedback in real time or adapt automation functionality to suit a driver’s level of attention. Currently, Autopilot uses a hands-on-wheel sensing system to monitor driver engagement, but it doesn’t monitor driver attention via eye or head-tracking.

The researchers behind the study have developed a model for glance behavior, “based on naturalistic data, that can help understand the characteristics of shifts in driver attention under automation and support the development of solutions to ensure that drivers remain sufficiently engaged in the driving tasks.” This would not only assist driver monitoring systems in addressing “atypical” glances, but it can also be used as a benchmark to study the safety effects of automation on a driver’s behavior.

Companies like Seeing Machines and Smart Eye already work with automakers like General Motors, Mercedes-Benz and reportedly Ford to bring camera-based driver monitoring systems to cars with ADAS, but also to address problems caused by drunk or impaired driving. The technology exists. The question is, will Tesla use it?

U.S. safety regulator opens investigation into Tesla Autopilot following crashes with parked emergency vehicles

U.S. auto regulators have opened a preliminary investigation into Tesla’s Autopilot advanced driver assistance system, citing 11 incidents in which vehicles crashed into parked first responder vehicles while the system was engaged.

The Tesla vehicles involved in the collisions were confirmed to have had either Autopilot or a feature called Traffic Aware Cruise Control engaged, according to investigation documents posted on the National Highway Traffic Safety Administration’s website. Most of the incidents took place after dark and occurred despite “scene control measures” such as emergency vehicle lights, road cones, and an illuminated arrow board signaling drivers to change lanes.

“The investigation will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” the document says.

The investigation covers around 765,000 Tesla vehicles that span all currently available models: the Tesla Model Y, Model X, Model S and Model 3. The 11 incidents resulted in 17 injuries and one fatality. They occurred between January 2018 and July 2021.

This is not the first time Tesla’s Autopilot has fallen under the scrutiny of NHTSA, the country’s top vehicle safety regulator. In 2017, the agency investigated a fatal 2016 crash, though the EV maker was found not to be at fault in the accident. NHTSA has investigated a further 25 crashes involving Tesla’s ADAS since, the Associated Press reported when it broke the story Monday.

In June, NHTSA issued an order requiring automakers to report crashes involving vehicles equipped with ADAS or Levels 3-5 of automated driving systems.

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” an agency spokesperson told TechCrunch on Monday.

TechCrunch has reached out to Tesla, which has dissolved its media relations division, for comment and will update the story if the company responds.

Drivers for Elon Musk’s Loop get a script about their ‘great leader’

Drivers for Elon Musk’s underground Loop system in Las Vegas have been instructed to bypass passengers’ questions about how long they have been driving for the company, declare ignorance about crashes, and shut down conversations about Musk himself.

Using public records laws, TechCrunch obtained documents that detail daily operations at the Loop, which opened in June to transport attendees around the Las Vegas Convention Center (LVCC) using modified Tesla vehicles. Among the documents is a “Ride Script” that every new recruit must follow when curious passengers ask questions.

The script shows just how serious The Boring Company (TBC), which built and operates the system, is about controlling the public image of the new system, its technology and especially its founder, Elon Musk.

“Your goal is to provide a safe ride for the passengers, not an entertaining ride. Keep conversation to a minimum so you can focus on the road,” advises the document. “Passengers will pepper you with questions. Here are some you may be asked and the recommended responses.”

If riders ask a driver how long they have been with the company, they are instructed to respond with: “Long enough to know these tunnels pretty well!” The document goes on to note: “Passengers will not feel safe if they think you’ve only been driving for a week (even though that could mean hundreds of rides). Accordingly, do not share how long you’ve been employed here, but instead, find a way to evade the question or shift the focus.”

When asked how many crashes the system has experienced (the script uses the term “accidents”), drivers are told to respond: “It’s a very safe system, and I’m not sure. You’d have to reach out to the company.” Riders should expect similarly vague responses if they wonder how many employees or drivers TBC has, or how much the tunnels cost to dig. (About $53 million in total).

The use of Tesla’s advanced driver assistance system, branded “Autopilot,” is clearly a sore point at TBC. Clark County does not currently permit the use of the various driver assistance features anywhere within the Loop system, including automatic emergency braking and the technologies that detect obstacles and keep the vehicle in its lane.

Officials even require mechanics to check the vehicles to ensure these are not activated.

“In addition to completing the actions under the initial inspection checklist, maintenance staff will verify that the automatic features of the vehicle, such as steering and braking/acceleration/deceleration assist (commonly known as Autopilot) are disabled for manual loop operation,” the document reads. These checks will be conducted daily by CWPM technicians, according to the Vehicle Maintenance plan viewed by TechCrunch.

If a passenger asks whether the Loop’s Tesla vehicles use Autopilot, drivers are given a scripted response. However, that response was marked “Public Safety Related Confidential” in the documents TechCrunch received and was redacted, as were many other technical details.

TechCrunch’s repeated requests to officials to explain this decision went unanswered.

He who shall not be named

The script also covers responses to questions about Musk himself: “This category of questions is extremely common and extremely sensitive. Public fascination with our founder is inevitable and may dominate the conversation. Be as brief as possible, and do your best to shut down such conversation. If passengers continue to force the topic, politely say, ‘I’m sorry, but I really can’t comment’ and change the subject.”

Nevertheless, the script provides a number of replies to common Musk questions. Ask what Musk is like and you should expect the answer: “He’s awesome! Inspiring / motivating / etc.”

Follow up with: “Do you like working for him?” and you’ll get a response that could have come straight from North Korea: “Yup, he’s a great leader! He motivates us to do great work.”

Should a customer wonder how involved Musk is in the business, the driver will tell them: “He’s the company founder, and has been very involved and supportive.” Questions about Musk’s erratic tweets will be brushed off: “Elon is a public figure. We’re just here to provide an awesome transportation experience!”

One question, however, seems to hint that not everyone is happy working for Musk: “Is it true what I’ve read about him in the papers that he [is a mean boss / smokes pot / doesn’t let employees take vacations / etc.]?” Your driver’s rather equivocal response will be: “I haven’t seen that article, but that hasn’t been my experience.”

On a side note: While the hundreds of pages of training documents and operational manuals that TechCrunch obtained detail strong policies against drug use and harassment at the Loop, the word “vacation” does not otherwise appear.

Tech that’s allowed

Because Clark County currently forbids the use of automated driving features in the Loop, human drivers could be part of the system for some time. But the system is home to plenty of other advanced technologies, according to design and operational documents submitted to Clark County. Each of the 62 Teslas in the underground Loop has a unique RFID chip — as used in contactless payment systems — that pinpoints its location when it passes over one of 55 antennas installed in the roadway, stations and parking stalls.

Each vehicle also streams data to 24 hotspots through the system, sharing its speed, state of charge, the number of passengers in the car, and whether they are wearing seatbelts. Riders should be aware that every car is also constantly streaming real-time video from a camera inside the passenger cabin. All this data, along with video from 81 fixed cameras throughout the Loop, is fed to an Operations Control Center (OCC) located a few blocks away from the Convention Center. Video is recorded and stored for at least two weeks.

In the OCC, an operator is monitoring the camera feeds and other sensors for security threats or other problems — such as a driver using their own cellphone or speeding. The OCC can communicate with any driver via a Bluetooth headset or an in-car iPad that displays messages, alerts and a map of the car’s location in the tunnels. Vehicles have strict speed limits, ranging from 10 mph within stations to 40 mph on straight tunnel sections, and must maintain at least 6 seconds of separation from the car in front.
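For a sense of what the 6-second rule means in practice, here is a back-of-the-envelope Python sketch converting the stated headway into a following distance at the Loop’s posted limits. This is purely illustrative arithmetic, not TBC’s monitoring logic; the names below are invented.

```python
MPH_TO_MPS = 0.44704   # miles per hour to meters per second
MIN_HEADWAY_S = 6.0    # minimum separation required by the Loop's rules

def min_gap_meters(speed_mph: float) -> float:
    """Following distance implied by a 6-second headway at a given speed."""
    return speed_mph * MPH_TO_MPS * MIN_HEADWAY_S

for limit_mph in (10, 40):  # station limit and straight-tunnel limit from the documents
    print(f"{limit_mph} mph -> keep at least {min_gap_meters(limit_mph):.0f} m behind the car ahead")
# 10 mph -> roughly 27 m; 40 mph -> roughly 107 m
```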

During testing this spring, the documents reveal that Clark County officials found some drivers were not following all the rules. “When asked about the speed limitations, several drivers replied with wrong straightaway and/or curved tunnel speeds. None provided at station, express lane, or ramp speeds,” reads one document. “Drivers were not announcing to the passengers to buckle their seatbelts. When asked, [some were saying] that they are optional or not required.”

Several drivers were also failing to maintain the 6-second safety margin with cars in front. TBC told Clark County that it would provide refresher training in those areas.

TBC, Clark County, and the Las Vegas Convention and Visitors Authority, which oversees the LVCC, did not reply to multiple requests for comment for this story.

The LVCVA recently signed a contract with Alphabet’s spin-out urban advertising agency, Intersection Media, to sell naming rights to the Loop system, which it hopes will net it $4.5 million.

TBC is currently building two extensions to the Loop to serve nearby hotels and ultimately wants to build a transit system covering much of the Strip and downtown Las Vegas with more than 40 stations. That system would be financed by TBC and supported by ticket sales.

Ford takes aim at Tesla, GM with its new hands-free driving system

Ford will debut its new hands-free driving feature on the 2021 F-150 pickup truck and certain 2021 Mustang Mach-E models through a software update later this year, technology that the automaker developed to rival similar systems from Tesla and GM.

That hands-free capability — which uses cameras, radar sensors and software to provide a combination of adaptive cruise control, lane centering and speed sign recognition — has undergone some 500,000 miles of development testing, Ford emphasized in its announcement and in a tweet from CEO Jim Farley, a not-so-subtle dig at Tesla’s approach of rolling out beta software to customers. The system also has an in-cabin camera that monitors eye gaze and head position to help ensure the driver’s eyes remain on the road.

The hands-free system will be available on vehicles equipped with Ford’s Co-Pilot360 Technology and will only work on certain sections of divided highways that Ford has prequalified for hands-free driving. The system, which will be rolled out via software updates later this year, will initially be available on more than 100,000 miles of highways in North America.

The system does come with a price. BlueCruise software, which includes a three-year service period, will cost $600. The price of upgrading the hardware will depend on the vehicle. For instance, F-150 owners will have to plunk down another $995 for the hardware, while owners of the “select” Mustang Mach-E model variant will have to pay an additional $2,600. BlueCruise comes standard on the CA Route 1, Premium and First Edition variants of the Mustang Mach-E.
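Putting the quoted prices together, the all-in cost to add BlueCruise works out as follows. The Python tally below simply sums the figures above and assumes no other fees apply.

```python
software = 600  # BlueCruise software, including the three-year service period
hardware = {"F-150": 995, "select Mustang Mach-E": 2600}

for model, hw_price in hardware.items():
    print(f"{model}: ${software + hw_price:,} all-in to add BlueCruise")
# F-150: $1,595; select Mustang Mach-E: $3,200
```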

While nearly every automaker offers some driver assistance features, Ford is clearly aiming to compete with or capture market share away from GM and Tesla — the two companies with the best-known and most capable ADAS. Convincing customers that its system is worth the expense will be critical to meeting its internal target of selling more than 100,000 vehicles equipped with BlueCruise in the first year, based on company sales and take-rate projections.

GM Super Cruise uses a combination of lidar map data, high-precision GPS, cameras and radar sensors, as well as a driver attention system, which monitors the person behind the wheel to ensure they’re paying attention. Unlike Tesla’s Autopilot driver assistance system, users of Super Cruise do not need to have their hands on the wheel. However, their eyes must remain directed straight ahead.

Tesla’s Autopilot feature also combines sensors like cameras and radar, computing power and software. Autopilot, which comes standard in all new Tesla vehicles, will steer, accelerate and brake automatically within its lane. Tesla uses a torque sensor in the steering wheel to determine if drivers are paying attention, although many owners have found and publicly documented hacks that let them keep their hands off the wheel and eyes off the road ahead. Tesla charges $10,000 for its upgrade to FSD (its own branding meant to stand for full self-driving). FSD is not an autonomous system. It does provide a number of more capable driver assist functions, including automatic lane changes, the ability to recognize and act upon traffic lights and stop signs, and a navigation feature that will suggest lane changes en route and automatically steer the vehicle toward highway interchanges and exits.

Ford said that its system communicates with drivers in different ways, including displaying text and blue lighting cues in the instrument cluster, which it says is effective even for those with color blindness.

The so-called BlueCruise hands-free technology will be offered in other Ford vehicle models in the future, the company said. Drivers who opt for the technology will continue to receive software updates as it is improved. Ford said future improvements will include a feature that will let the vehicle change lanes by tapping the turn signal indicator as well as one that will predict and then adjust vehicle speed for roundabouts and curves. The company also said it plans to offer regular mapping updates.

Tesla is willing to license Autopilot and has already had “preliminary discussions” about it with other automakers

Tesla is open to licensing its software, including Autopilot, its highly automated driving technology, and the neural network training system it has built to improve its autonomous driving technology. Tesla CEO Elon Musk revealed those considerations on the company’s Q4 earnings call on Wednesday, adding that the company has in fact already “had some preliminary discussions about licensing Autopilot to other OEMs.”

The company began rolling out its beta version of the so-called ‘full self-driving’ or FSD version of Autopilot late last year. The standard Autopilot features available in general release are advanced driver assistance (ADAS) capabilities, essentially advanced cruise control designed primarily for highway commutes. Musk said on the call that he expects the company to prove out its FSD capabilities before entering into any licensing agreements, if it does end up pursuing that path.

Musk noted that Tesla’s “philosophy is definitely not to create walled gardens” overall, and pointed out that the company is planning to allow other automakers to use its Supercharger networks, as well as its autonomy software. He characterized Tesla as “more than happy to license” those autonomous technologies to “other car companies,” in fact.

One key technical hurdle to demonstrating reliability that far surpasses a standard human driver is transitioning the neural networks that power the cars’ perception engines to video. That’s a full-stack change across the system, away from neural nets trained on single cameras and single frames.

To this end, the company has developed video labeling software that has had “a huge effect on the efficiency of labeling,” with the ultimate aim of enabling automatic labeling. Musk (who isn’t known for modesty around his company’s achievements, it should be said) noted that Tesla believes “it may be the best neural net training computer in the world by possibly an order of magnitude,” adding that it’s also “something we can offer potentially as a service.”

Training on huge quantities of video data will help Tesla push the reliability of its software from 100% of that of a human driver, to 200% and eventually to “2,000% better than the average human,” Musk said, while again suggesting that it won’t be a technological achievement the company is interested in keeping to itself.

Tesla has increased the price of its “Full Self-Driving” option to $10,000

Tesla has made good on founder and CEO Elon Musk’s promise to boost the price of its “Full Self-Driving” (FSD) software upgrade option, increasing it to $10,000 following the start of the staged rollout of a beta version of the software update last week. The move boosts the price of the package by $2,000, and the price has steadily increased since last May.

The FSD option has been available as an optional add-on to complement Tesla’s Autopilot driver assistance technology, even though the features themselves weren’t available to Tesla owners before the launch of the beta this month. Even so, it’s only in limited beta, but this is the closest Musk and Tesla have come to actually launching something under the FSD moniker – after having teased a fully autonomous mode in production Teslas for years now.

Despite its name, FSD isn’t what most in the industry would define as full, Level 4 or Level 5 autonomy per the standards defined by SAE International and accepted by most working on self-driving. Musk has described it as vehicles having the ability “to be autonomous but requiring supervision and intervention at times,” whereas Levels 4 and 5 (often considered ‘true self-driving’) under SAE standards require no driver intervention.

Still, the technology does appear impressive in some ways according to early user feedback – though testing any kind of self-driving software on public roads via the general public does seem an incredibly risky move. Musk has said that we should see a wide rollout of the FSD tech beyond the beta before year’s end, so he definitely seems confident in its performance.

The price increase might be another sign of his and the company’s confidence. Musk has always maintained that users were getting a discount by handing money over early to Tesla to help it develop technology that would come later, so in many ways it makes sense that the price increase comes now. It also obviously helps Tesla boost margins, though the company is already riding high on earnings that beat both revenue and profit expectations from analysts.

Tesla vehicles recognize and respond to traffic lights, stop signs with latest software update

Properly equipped Tesla vehicles can now recognize and respond to traffic lights and stop signs thanks to a software update that the company started pushing out to owners over the weekend.

The software update had been available to a sliver of Tesla owners, some of whom had posted videos of the new capability. Now, the automaker is pushing the software update (2020.12.6) to the broader fleet.

The feature isn’t available in every Tesla vehicle on the road today. The vehicles must be equipped with the most recent Hardware 3 package and the fully optioned Autopilot package that the company has marketed as “full self-driving.”

The feature, called Traffic Light and Stop Sign Control, is designed to allow the vehicles to recognize and respond to traffic lights and stop signs.

To be clear, Tesla vehicles are not self-driving and this feature has its limits. The feature slows properly equipped Tesla vehicles to a stop when using “traffic-aware cruise control” or “Autosteer.” The vehicle will slow for all detected traffic lights, including green, blinking yellow and off lights, according to the software release notes.

As the vehicle approaches an intersection, a notification will indicate the intention to slow down. The vehicle will then begin to slow down and stop at the red line shown on the driving visualization, which is on the center display.

DragTimes tested a beta version of the feature and shared a video of it.

Owners must pull the Autopilot stalk once or manually press the accelerator pedal to continue through the stop line. Tesla said that the feature is designed to be conservative at first. Owners will notice that it will slow down often and will not attempt to turn through intersections. “Over time, as we learn from the fleet, the feature will control more naturally,” the company wrote in the release notes.
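The described behavior reduces to a small decision flow: slow for any detected traffic control, hold at the stop line, and continue only on driver confirmation. The Python sketch below is a toy rendering of that description, not Tesla’s implementation; the function and its parameters are invented for illustration.

```python
def handle_traffic_control(detected: bool, at_stop_line: bool,
                           driver_confirmed: bool) -> str:
    """Toy decision flow mirroring the release-notes description; not Tesla's code."""
    if not detected:
        return "maintain cruise"
    if not at_stop_line:
        return "notify driver and begin slowing"
    if driver_confirmed:              # stalk pull or accelerator press
        return "proceed through intersection"
    return "hold at stop line"

# Even a green light is handled conservatively until the driver confirms.
print(handle_traffic_control(detected=True, at_stop_line=True, driver_confirmed=False))
# -> hold at stop line
```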

Tesla warns in the release notes that “as with all Autopilot features, you must continue to pay attention and be ready to take immediate action, including braking because this feature may not stop for all traffic controls.”

The software update also improved the driving visualizations displayed in the vehicle. Additional objects such as stop lights, stop signs and select road markings now appear on the screen. The stop sign and stop light visualizations are not a substitute for an attentive driver and will not stop the car, Tesla said in the release notes.