Early Saturday, Tesla began letting owners request its “Full Self-Driving” software, meaning thousands of drivers will soon be on the road with largely unregulated and mostly untested capabilities.
It’s the first time the company has let ordinary customers upgrade to the software, even though the phrase “Full Self-Driving” is an exaggeration by industry and legal standards. Tesla CEO Elon Musk had previously said owners would be able to request the updated suite of sophisticated driver-assistance features. Owners must agree to let Tesla monitor their driving through the company’s insurance calculator, and Tesla published a detailed guide outlining the criteria that will be used to assess drivers. “Beta access will be granted” if their driving is deemed “good” over a seven-day period, Musk wrote on Twitter.
It’s the latest twist in a saga that has regulators, safety advocates, and relatives of Tesla crash victims worried about the technology’s potential for disaster on real-world roads. Until now, only about 2,000 beta testers have had access to the software.
Those who have purchased the now-$10,000 software upgrade, as well as those paying a Tesla subscription of roughly $100 to $200 per month, can request it this weekend, assuming they first pass Tesla’s safety monitoring, though they will not receive it immediately.
Musk himself has called the technology a “debatable” concept, saying that “full self-driving must function in order for it to be a compelling value proposition.”
Federal investigators are already scrutinizing its predecessor, Autopilot, which navigates vehicles from interstate on-ramp to off-ramp and can park and summon cars, all under driver supervision. Last month, the National Highway Traffic Safety Administration opened an inquiry into a dozen crashes involving parked emergency vehicles that occurred while Autopilot was engaged.
“Full Self-Driving” expands Autopilot’s capabilities to city streets, adding the ability to guide the car turn by turn from point A to point B.
Tesla and NHTSA did not immediately respond to requests for comment. Tesla has consistently claimed, citing its own statistics alongside NHTSA data, that Autopilot is safer than driving a car manually.
“Autopilot is unquestionably safer” than conventional cars, according to Musk. The comparison is imperfect, though, because Autopilot is meant to be used only on certain types of roads and in certain conditions.
Regulators and industry peers have criticized Tesla’s decision to roll out the features quickly to a large number of customers, arguing the company is rushing a technology that demands careful consideration and a focus on safety.
Despite its name, the new software does not meet the auto industry’s or safety regulators’ definitions of “self-driving,” and drivers must remain attentive while it is in use.
“I do believe their product is deceptive and generally contributes to more misuse and abuse,” National Transportation Safety Board Chair Jennifer Homendy said, before addressing Musk directly: “All I ask is that he emphasize safety in the same way that he promotes innovation and new technology… Safety is equally as vital as, if not more crucial than, technological advancement.”
Shortly before the button became available, Tesla posted its “safety score” system on its website so that drivers wishing to sign up could review it. Drivers will be graded on a scale of 0 to 100, with the majority expected to earn 80 or higher. According to the company, scores are based on five factors: forward collision warnings per 1,000 miles, instances of hard braking, aggressive turning, unsafe following, and forced Autopilot disengagements. Tesla then combines these into a score using a formula.
“These are combined to predict the risk of your driving resulting in a future collision,” Tesla said. It was not immediately clear what score would qualify as “good,” as Musk described it, in order to obtain Full Self-Driving.
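Tesla has not disclosed the formula described in the article, so as a purely hypothetical illustration of how five penalty factors could be combined into a 0-to-100 score, here is a sketch in Python. The metric names mirror the five criteria Tesla lists; the weights, the linear-penalty math, and the function name are invented for demonstration only.

```python
def hypothetical_safety_score(
    fcw_per_1000_miles: float,      # forward collision warnings per 1,000 miles
    hard_braking_pct: float,        # share of braking events that are forceful
    aggressive_turning_pct: float,  # share of turns taken aggressively
    unsafe_following_pct: float,    # share of driving time spent following too closely
    forced_disengagements: int,     # forced Autopilot disengagements
) -> float:
    """Combine the five factors into a 0-100 score (higher = safer).

    HYPOTHETICAL: the weights below are illustrative, not Tesla's.
    """
    penalty = (
        1.5 * fcw_per_1000_miles
        + 0.8 * hard_braking_pct
        + 0.5 * aggressive_turning_pct
        + 0.6 * unsafe_following_pct
        + 5.0 * forced_disengagements
    )
    # Clamp to the 0-100 range the article describes.
    return max(0.0, min(100.0, 100.0 - penalty))


# A cautious week of driving with no incidents keeps the score near 100.
print(hypothetical_safety_score(0.0, 0.0, 0.0, 0.0, 0))
```

The key design point such a system implies is that each risk signal only subtracts from a perfect score, so a single forced disengagement can outweigh many miles of otherwise smooth driving.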
Musk has said that frequent use of the company’s Autopilot software will work in drivers’ favor. Owners will be able to track their progress in real time, he said, and will be shown how to meet the standards.
Chamber of Progress, a trade organization, took aim at Tesla’s promotion of the technology late last month.
The group, which is backed by Apple, Alphabet-owned Waymo, and General Motors-backed Cruise, said Tesla’s cars “aren’t truly completely self-driving.” “The fundamental problem is that Tesla drivers, in case after case, take their eyes off the road because they think they are operating a self-driving car. They’re not.”
Tesla, according to Homendy, has shown no active interest in improving the safety of its products. She said the board has issued recommendations in response to deadly crashes in Williston and Delray Beach, Fla., as well as Mountain View, Calif., but that they have gone unheeded.
“Tesla has not responded to any of our inquiries,” she said. “They’ve disregarded us; they haven’t responded to us.”
“And if you don’t fix things and continue to upgrade, that’s a problem,” she added.
Following an investigation into a 2018 crash in which a driver was killed when his car struck a highway barrier, the safety board asked NHTSA to determine whether Tesla’s technology posed an unreasonable safety risk.
NHTSA, Homendy said, needs to take a more active role. The agency now requires that all crashes involving driver-assistance systems be reported.
“It is incumbent on a federal regulator to take action and protect public safety,” Homendy said. “I am pleased that they have requested crash data from all manufacturers, and that they are starting with Tesla by requesting crash data involving emergency vehicles. They must, however, do more.”
A steady stream of videos from early beta testers has circulated on Twitter, showing the still-nascent Full Self-Driving software’s confusion when confronted with unexpected obstacles. Roundabouts and unprotected left turns have tripped up the system, and in some clips the car swerves abruptly toward pedestrians or crosses a double-yellow line into oncoming traffic.
“I want the best for Tesla, but going broad release is not the move, at least not right now,” wrote the user who posted the latter video.
Others said they had suffered personally because of Tesla’s rapid software rollouts and urged the company to reconsider.
Bernadette Saint Jean’s husband, Jean Louis, was killed on the Long Island Expressway in July when a Tesla with driver-assistance features struck him on the side of the road, according to NHTSA.
“Tesla should not be spreading its Autopilot or Traffic-Aware Cruise Control Systems until they can explain why my husband and all of those First Responders had to die and be injured,” Saint Jean, of Queens, said in a statement released through her attorney, Joshua Brian Irwin.
Source: The Washington Post