Tesla vehicles recognize and respond to traffic lights, stop signs with latest software update

Properly equipped Tesla vehicles can now recognize and respond to traffic lights and stop signs thanks to a software update that the company started pushing out to owners over the weekend.

The software update had been available to a sliver of Tesla owners, some of whom had posted videos of the new capability. Now, the automaker is pushing the software update (2020.12.6) to the broader fleet.

The feature isn’t available in every Tesla vehicle on the road today. The vehicles must be equipped with the most recent Hardware 3 package and the fully optioned Autopilot package that the company has marketed as “full self-driving.”

The feature, called Traffic Light and Stop Sign Control, is designed to allow the vehicles to recognize and respond to traffic lights and stop signs.

To be clear, Tesla vehicles are not self-driving and this feature has its limits. The feature slows properly equipped Tesla vehicles to a stop when using “traffic-aware cruise control” or “Autosteer.” The vehicle will slow for all detected traffic lights, including green, blinking yellow and off lights, according to the software release notes.

As the vehicle approaches an intersection, a notification will indicate the intention to slow down. The vehicle will then begin to slow down and stop at the red line shown on the driving visualization, which is on the center display.

DragTimes tested a beta version of the feature and shared a video of it, which is posted below.

Owners must pull the Autopilot stalk once or manually press the accelerator pedal to continue through the stop line. Tesla said that the feature is designed to be conservative at first. Owners will notice that it will slow down often and will not attempt to turn through intersections. “Over time, as we learn from the fleet, the feature will control more naturally,” the company wrote in the release notes.

Tesla warns in the release notes that “as with all Autopilot features, you must continue to pay attention and be ready to take immediate action, including braking because this feature may not stop for all traffic controls.”
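The behavior the release notes describe — slow for every detected traffic control and proceed only on explicit driver confirmation — amounts to a simple, conservative control policy. A purely illustrative sketch in Python (the names and the 50-meter slowdown ramp are assumptions, not anything Tesla has published):

```python
# Toy speed planner mimicking the behavior described in the release notes.
# Illustrative only -- not Tesla's implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrafficControl:
    kind: str          # e.g. "red", "green", "blinking_yellow", "off", "stop_sign"
    distance_m: float  # distance to the stop line

def plan_speed(current_speed: float, control: Optional[TrafficControl],
               driver_confirmed: bool) -> float:
    """Return a target speed for the next planning step."""
    if control is None:
        return current_speed   # no control detected: hold speed
    if driver_confirmed:
        return current_speed   # stalk pull / accelerator press: continue through
    # Conservative default: slow for ANY detected control, including
    # green and off lights, ramping down to a stop at the line.
    return max(0.0, current_speed * min(1.0, control.distance_m / 50.0))
```

Note that in this sketch a green light is treated exactly like a red one until the driver confirms, which is what makes the feature feel so conservative at first.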

The software update also improved the driving visualizations, which are displayed in the vehicle. Additional objects such as stop lights, stop signs and select road markings now appear on the screen. The stop sign and stop light visualizations are not a substitute for an attentive driver and will not stop the car, Tesla said in the release notes.

Video of Cybertruck driving on California highway reveals more questions

A video just surfaced showing the Tesla Cybertruck driving around LA. Elon Musk is purportedly driving, but it’s not confirmed that he was behind the wheel while this video was filmed. However, the video reveals several things.

One, there are no mirrors yet.

According to US regulations, passenger vehicles need to have a mirror inside and one on the driver’s side of the vehicle. The Cybertruck in this video does not have a driver’s side mirror.

When Musk unveiled the Cybertruck, he stated that the vehicle uses a video feed in place of a rear-view mirror, which is something other automakers are trying as well. Cadillac has done this for years. It works.

The lack of a mirror on the driver’s side is a bigger question. The vehicle used in the unveiling was missing exterior mirrors, and the one in this video lacks them as well. It’s possible it uses cameras in place of side mirrors, though that has yet to be announced. Other automakers, including Audi, have turned to cameras for European-spec vehicles, as US-market vehicles must have physical mirrors.

Two, there’s a lot of body roll.

The video shows the driver taking a wide turn onto the street. In doing so, the Cybertruck appears to experience a large amount of body roll. A surprising amount, too.

The Cybertruck, like every other Tesla vehicle, has a bank of batteries on the bottom. Supposedly. If that’s the case, the bulk of the weight should be at the bottom, dropping the center of gravity and giving the vehicle a stable driving experience. In the Model X, this results in amazing protection from rolling over when impacted on the side.

The Cybertruck experienced this body roll despite a stance wider than what’s allowed. During the unveiling, it was clear the Cybertruck’s tires stuck out beyond the fenders, which is also not permitted by US standards; tires must be covered by fenders. I assumed Tesla did this for stage presence and to improve stability in testing, and would correct it in the final version. The latest video, however, shows a Cybertruck with tires still sticking out from the fenders. It’s not clear how far the tires protrude, but it’s enough that the driver hits a traffic cone when turning into traffic. If the Cybertruck’s stance is narrowed, will the body roll be worse?

Also, the driver runs a red light because clearly the Cybertruck is living in a post-traffic light world.

Elon Musk says Tesla “early access” full self-driving could arrive by end of year

Tesla CEO Elon Musk said on the company’s earnings call today that the company’s full self-driving mode, in a feature-complete release, could arrive as early as the end of this year. That would be made available in an ‘early access’ mode which is essentially a limited beta, and Musk qualified that this isn’t a sure thing.

“While it’s going to be tight, it still does appear that will be at least in limited in early access release of a feature complete self-driving feature this year,” he said on the call.

He added that it’s “not for sure,” but that it “appears to be on track” for a limited private beta by year’s end.

This follows the release of Tesla’s Smart Summon feature, an automated driverless parking-lot hailing capability. It allows Tesla owners to call their cars from their parking spots to pick them up at a curbside within the parking lot. The feature has been used plenty of times, with mixed results reported from early use, but Musk also said that the company will be releasing an updated version of the software with improvements in the next “week or so.”

The Smart Summon update is an improvement built on the data taken from the “over a million” uses of the feature by Tesla owners already since its release at the end of September.

To make use of the full self-driving mode that Tesla plans to introduce, vehicle owners will have to own the FSD upgrade package, which is a $7,000 upgrade after it increased from $6,000 in August.

Tesla began shipping its new full self-driving computer hardware in all new vehicles beginning in April, moving to its own custom chip. This was to ensure that enabling the FSD feature would be a software-only update, which is something the company had claimed would be possible with a previous generation of self-driving computing hardware, but the challenge has clearly been more difficult than expected – estimated timelines for the feature’s deployment have also slipped multiple times.

Tesla Autopilot design combined with driver inattention caused crash, NTSB says

The National Transportation Safety Board said driver inattention, coupled with the design of Tesla’s advanced driver assistance system Autopilot and an overreliance on the feature, was behind the January 2018 crash of a Model S into a parked fire truck on a highway in Southern California.

The NTSB released the report Wednesday, a day after issuing a preliminary brief that provided important details about the incident, including that the Model S was in Autopilot mode when it crashed into the fire truck.

The crash, involving a 2014 Tesla Model S, occurred January 22, 2018 in Culver City, Calif. The Tesla had Autopilot engaged for nearly 14 minutes when it struck a fire truck that was parked on Interstate 405. The driver was not injured in the crash and the fire truck was unoccupied.

Autopilot includes two important features, Autosteer and Traffic-Aware Cruise Control. Autosteer is a lane-keeping assist system that can only be engaged after Traffic-Aware Cruise Control is activated. The Traffic-Aware Cruise Control is an adaptive cruise control system that modifies speed based on information from the camera and radar sensors.

According to the NTSB, the Model S had Autopilot engaged and was in the HOV lane following another car.

In the 15 seconds prior to the crash, the system detected and followed two different lead vehicles. Data shows that 3 to 4 seconds before the crash, the lead vehicle changed lanes to the right, the NTSB report says. When the Traffic-Aware Cruise Control no longer detected the lead vehicle, the system accelerated the Tesla from about 21 mph toward the preset cruise speed of 80 mph, which had been set by the driver about 5 minutes before the crash, the report says.
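That sequence — losing the lead vehicle and then accelerating toward the preset — follows from how a generic adaptive cruise controller selects its target speed: it tracks the slower of the lead vehicle’s speed and the driver’s set speed. A minimal sketch of the general idea (not Tesla’s code; the function and units are illustrative):

```python
# Generic adaptive-cruise target-speed selection (illustrative sketch,
# not Tesla's implementation).
from typing import Optional

def tacc_target_speed(preset_mph: float,
                      lead_speed_mph: Optional[float] = None) -> float:
    """Follow the lead vehicle when one is detected; otherwise
    resume the driver's preset cruise speed."""
    if lead_speed_mph is None:
        # No lead detected (e.g. the car ahead changed lanes):
        # the controller accelerates toward the preset.
        return preset_mph
    # While following, never exceed the preset.
    return min(lead_speed_mph, preset_mph)
```

With an 80 mph preset and a 21 mph lead vehicle, the target is 21 mph; the moment the lead disappears, the target jumps to 80 mph, which matches the acceleration the report describes.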

The “Autopilot” system detected a stationary object in the Tesla’s path about 0.49 seconds before the crash and the forward collision warning activated, displaying a visual warning and sounding an auditory warning. By the moment of impact, the Tesla had accelerated to 30.9 mph.

Autopilot was engaged in the final 13 minutes and 48 seconds of the trip and yet, the system detected driver-applied steering wheel torque for only 51 seconds of that time, the NTSB said.

While the Tesla Model S owner’s manual contains numerous warnings about the limitations of these features and the need for drivers to keep their hands on the wheel, the driver was not paying attention, the NTSB said. More importantly, Tesla’s Autopilot design permitted the driver to disengage from the driving task, the NTSB concluded.

Tesla Autopilot was engaged before 2018 California crash, NTSB finds

A Tesla Model S was in Autopilot mode —the company’s advanced driver assistance system — when it crashed into a fire truck in Southern California last year, according to a preliminary report released Tuesday by the National Transportation Safety Board.

Reuters was the first to report on the contents of the public documents. A final accident brief, including NTSB’s determination of probable cause, is scheduled to be published Wednesday.

The crash, involving a 2014 Tesla Model S, occurred Jan. 22, 2018 in Culver City, Calif.  The Tesla had Autopilot engaged for nearly 14 minutes when it struck a fire truck that was parked on Interstate 405. The driver was not injured in the crash and the fire truck was unoccupied.

Tesla has not commented on the report. TechCrunch will update if the company provides a statement.

The report found that the driver’s hands were not on the wheel for the vast majority of that time despite receiving numerous alerts. Autopilot was engaged in the final 13 minutes and 48 seconds of the trip and the system detected driver-applied steering wheel torque for only 51 seconds of that time, the NTSB said. Other findings include:

  • The system presented a visual alert regarding hands-off operation of the Autopilot on 4 separate occasions.
  • The system presented a first level auditory warning on one occasion; it occurred following the first visual alert.
  • The longest period during which the system did not detect driver-applied steering wheel torque was 3 minutes and 41 seconds.

In the 2018 crash into the fire truck, the vehicle was operating with “Hardware Version 1” and a firmware version that had been installed via an over-the-air software update on December 28, 2017. The technology provided a number of convenience and safety features, including forward collision, lane departure and side collision warnings and automatic emergency braking, as well as adaptive cruise control and the so-called Autosteer feature, which together make up Autopilot.

While the report didn’t find any evidence that the driver was texting or calling in the moments leading up to the crash, a witness told investigators that he was looking down at what appeared to be a smartphone. It’s possible that the driver was holding a coffee or bagel at the time of the crash, the report said.

Autopilot has come under scrutiny from the NTSB before, notably over a 2016 fatal crash in Florida and a more recent one involving Walter Huang, who died after his Model X crashed into a highway median in California. The National Highway Traffic Safety Administration also opened an inquiry into the 2016 fatal crash and ultimately found no defects in the Autopilot system. The NTSB determined the 2016 fatal crash was caused by a combination of factors that included limitations of the system.

Huang’s family filed a lawsuit in May 2019 against Tesla and the State of California Department of Transportation. The wrongful death lawsuit, filed in California Superior Court, County of Santa Clara, alleges that errors by Tesla’s Autopilot driver assistance system caused the crash.

Elon Musk: Tesla will ‘most likely’ begin computer chip upgrades this year

Tesla CEO Elon Musk said the company will “most likely” begin upgrading older electric vehicles with its new custom chip later this year — a lofty task that will involve retrofitting hundreds of thousands of Model S, X and 3s.

Musk tweeted Sunday night that the upgrades will begin most likely at the end of the fourth quarter.

Musk didn’t provide other details. He has previously said the upgrade would be free for owners who purchased the full self-driving feature, a software package that costs $6,000.

Tesla offers two different advanced driver assistance packages to customers: Autopilot and Full Self-Driving, or FSD. Autopilot is an ADAS that offers a combination of adaptive cruise control and lane steering and is now a standard feature on new cars. FSD includes Summon as well as Navigate on Autopilot, an active guidance system that navigates a car from a highway on-ramp to off-ramp, including handling interchanges and making lane changes.

While Tesla charges for the FSD software package, the vehicles are not fully autonomous. Musk has promised that the advanced driver assistance capabilities on Tesla vehicles will continue to improve until eventually reaching that full automation high-water mark.

The custom chip unveiled in April has been couched as a necessary hardware upgrade to reach that goal. Since March, new Model X and S vehicles have come equipped with the chip. The Model 3 followed a month later.

The custom chip was a milestone for the company. However, it still faces the considerable challenge of upgrading thousands of so-called “Hardware 2” vehicles, not to mention the continuous development of the software.

Tesla started producing electric vehicles with a more robust suite of sensors, radar and cameras — called Hardware 2 — in October 2016 under the premise and the promise that it had the hardware needed to eventually drive autonomously without human intervention. At that time, the company also began selling the upgraded full self-driving package that Musk said would eventually reach that ambitious target.

Tesla shows off next-gen automated emergency braking stopping for pedestrians and cyclists

Back in 2017, Tesla introduced an automated emergency braking (AEB) system for all its vehicles that’s powered by its Autopilot technology and is available even on vehicles whose owners haven’t purchased the actual Autopilot cruise-assist upgrade. Now, the automaker is showing off some of the more advanced features coming in its next-generation AEB update.

These include automatically engaging the brakes when the Autopilot-based system detects a pedestrian crossing the car’s path, and doing the same for a cyclist. Below, you can see those features reportedly working in real-life situations, according to Tesla’s official Twitter account.

These kinds of features aren’t new; they have been present in some form since Volvo included a version in its automated braking system in 2009. Safety organizations and regulators like the Insurance Institute for Highway Safety (IIHS) and the National Highway Traffic Safety Administration (NHTSA) have been testing and advocating for these systems for years, as well.

Not all AEB and driver-assist features are built equally, however, and in theory the versions of these systems built on vehicles with more advanced sensors and on-board computation should be more effective at actually avoiding or preventing collisions in practice. Tesla has made bold claims about the capabilities of its own system, especially when paired with its in-house AI processor technology, which will serve as the ‘brain’ of its future autonomous driving technology.

Why all standard black Tesla cars are about to cost $1,000 more

Tesla will start charging $1,000 for its once-standard black paint color next month, according to a tweet Wednesday by CEO Elon Musk, the latest pricing adjustment by the automaker as it aims for profitability.

Basic white will become the new (and only) free standard paint color, Musk added in a followup tweet. Musk didn’t explain what prompted the change or provide any further details.

Automakers make pricing adjustments and offer incentives as tools to boost margin and sales. Yet, Tesla’s particular style — which mainly involves Musk tweeting out the changes — often feels like a company floating trial balloons to see what sticks, or what its customer base will accept.

Tesla has beefed up its publicity game in recent months on the heels of a disappointing quarter. In April, Tesla reported a wider-than-expected loss of $702 million in the first quarter after disappointing delivery numbers, costs and pricing adjustments to its vehicles threw the automaker off its profitability track.

For instance, the company released Tuesday a video announcing Beach Buggy Racing 2, the latest video game to be added to its arcade app. It also launched a promotion that invited people to its showroom to try out all of its video games.

Earlier this year, Musk tweeted about a price increase to its “full self-driving” or FSD feature. Tesla vehicles are not self-driving. Musk has promised that its advanced driver assistance system Autopilot will continue to improve until eventually reaching that full automation high-water mark. Autopilot now comes standard. The FSD feature, a software upgrade, costs $6,000.

The price of vehicles with standard Autopilot is higher, although it should be noted that the added amount is less than the prior cost of the option.

Meanwhile, Tesla is about to see its federal tax incentive reduced again, a development that could weigh on sales. (It should be noted that Musk stressed during Tesla’s shareholder meeting that there is not a demand problem for its vehicles, notably the Model 3.)

Musk reminded his Twitter followers on Wednesday of the tax credit reduction. Tesla delivered its 200,000th electric vehicle in 2018, a milestone that triggered a countdown for the $7,500 federal tax credit offered to consumers who buy new electric vehicles. The tax credit will drop to $1,875 after June 30.

Is your product’s AI annoying people?

Artificial intelligence (AI) is allowing us all to consider surprising new ways to simplify the lives of our customers. As a product developer, your central focus is always on the customer. But new problems can arise when the specific solution under development helps one customer while alienating others.

We tend to think of AI as an incredible dream assistant for our lives and business operations, but that’s not always the case. Designers of new AI services should consider in what ways, and for whom, these services might be annoying, burdensome or problematic, and whether those affected are the direct customer or others who are intertwined with the customer. When we apply AI services to make tasks easier for our customers but end up making things more difficult for others, that outcome can ultimately cause real harm to our brand perception.

Let’s consider one personal example taken from my own use of Amy.ai, a service (from x.ai) that provides AI assistants named Amy and Andrew Ingram. Amy and Andrew are AI assistants that help schedule meetings for up to four people. This service solves the very relatable problem of scheduling meetings over email, at least for the person who is trying to do the scheduling.

After all, who doesn’t want a personal assistant to whom you can simply say, “Amy, please find the time next week to meet with Tom, Mary, Anushya and Shiveesh.” In this way, you don’t have to arrange a meeting room, send the email, and go back and forth managing everyone’s replies. My own experience showed that while it was easier for me to use Amy to find a good time to meet with my four colleagues, it soon became a headache for those other four people. They resented me for it after being bombarded by countless emails trying to find some mutually agreeable time and place for everyone involved.

Automotive designers are another group that’s incorporating all kinds of new AI systems to enhance the driving experience. For instance, Tesla recently updated its autopilot software to allow a car to change lanes automatically when it sees fit, presumably when the system interprets that the next lane’s traffic is going faster.

In concept, this idea seems advantageous to the driver who can make a safe entrance into faster traffic, while relieving any cognitive burden of having to change lanes manually. Furthermore, by allowing the Tesla system to change lanes, it takes away the desire to play Speed Racer or edge toward competitiveness that one may feel on the highway.

However, drivers in other lanes who are forced to react to the Tesla Autopilot may be annoyed if the Tesla jerks, slows down or behaves outside the normal realm of what people expect on the freeway. Moreover, if they are driving very fast and the Autopilot did not recognize their high rate of speed when the car decided to make the lane change, that other driver can get annoyed. We can all relate to driving 75 mph in the fast lane, only to have someone suddenly pull in front of us at 70 as if they were clueless that the lane was moving at 75.

For two-lane highways that are not busy, the Tesla software might work reasonably well. However, in my experience of driving around the congested freeways of the Bay Area, the system performed horribly whenever it changed lanes in crowded traffic, and I knew that it was angering other drivers most of the time. Even without knowing those irate drivers personally, I care enough about driving etiquette to politely change lanes without getting the finger from them for doing so.

Another example from the Internet world involves Google Duplex, a clever feature for Android phone users that allows AI to make restaurant reservations. From the consumer point of view, having an automated system to make a dinner reservation on one’s behalf sounds excellent. It is advantageous to the person making the reservation because, theoretically, it will save the burden of calling when the restaurant is open and the hassle of dealing with busy signals and callbacks.

However, this tool is also potentially problematic for the restaurant worker who answers the phone. Even though the system may introduce itself as artificial, the burden shifts to the restaurant employee to adapt and master a new and more limited interaction to achieve the same goal – making a simple reservation.

On the one hand, Duplex is bringing customers to the restaurant, but on the other hand, the system is narrowing the scope of interaction between the restaurant and its customer. The restaurant may have other tables on different days, or it may be able to squeeze you in if you leave early, but the system might not handle exceptions like this. Even the idea of an AI bot bothering the host who answers the phone doesn’t seem quite right.

As you think about making the lives of your customers easier, consider how the assistance you are dreaming about might be more of a nightmare for everyone else associated with your primary customer. If there is a question regarding the negative experience of anyone related to your AI product, explore that experience further to determine if there is another better way to still delight them without angering their neighbors.

From a user experience perspective, developing a customer journey map can be a helpful way to explore the actions, thoughts, and emotional experiences of your primary customer or “buyer persona.” Identify the touchpoints in which your system interacts with innocent bystanders who are not your direct customers. For those people unaware of your product, explore their interaction with your buyer persona, specifically their emotional experience.

An aspirational goal should be to delight this adjacent group of people enough that they would move toward being prospects and, eventually, becoming your customers as well. You can also use participant ethnography to analyze the innocent bystander in relation to your product. This is a research method that combines observation of, and participation with, people as they interact with processes and the product.

A guiding design inspiration for this research could be, “How can our AI system behave in such a way that everyone who might come into contact with our product is enchanted and wants to know more?”

That’s just human intelligence, and it’s not artificial.

Elon Musk calls it ‘financially insane’ to buy a car that isn’t an EV capable of full self-driving

During the Tesla Annual Shareholders Meeting that took place on Tuesday, Tesla CEO Elon Musk didn’t mince words about the value proposition of traditional fossil fuel vehicles. He called it “financially insane” to buy any car that isn’t an electric car capable of full autonomy – which, conveniently, is currently a type of vehicle that only Tesla claims to sell.

Musk reiterated a claim he’s made previously about Tesla vehicles: that all of its cars manufactured since October 2016 have everything they need to become fully autonomous, with those built before the release of its new autonomous in-car computer earlier this year needing only a computer swap, trading the Nvidia units they shipped with for the new Tesla-built computer.

The Tesla CEO also reiterated his claim from earlier this year that there will be 1 million robotaxis on the road as of next year, noting that it’s easy to arrive at that number if you consider that it includes all Teslas, including Model X, Model S and Model 3 sold between October 2016 and today.

Regarding Tesla’s progress with self-driving, Musk noted that by the end of the year, Tesla hopes to deliver autonomy such that, while you’ll still have to supervise the driving from the driver’s seat, the car will get you from your garage to your workplace without intervention. He said that by next year the goal is the same thing without requiring supervision, and then, some time after that, pending regulatory cooperation, full autonomy without anyone on board.

Musk ended this musing with a colorful metaphor, likening buying a car that’s powered by traditional fossil fuel and without any path to self-driving to someone today “riding a horse and using a flip phone.”