Federal Regulators Demand Data on Tesla's Elon Mode

Tesla Faces Scrutiny from Safety Regulators Over Elon Mode

In an unprecedented move, federal automotive safety officials have issued Tesla a unique and broad special order over "Elon Mode," seeking detailed information about the feature and the company's driver monitoring systems. Tesla drivers who use the vehicle's driver assistance options such as Autopilot, Full Self-Driving (FSD), or FSD Beta are accustomed to visual prompts on the vehicle's touchscreen.

These prompts are designed to encourage drivers to keep their hands on the steering wheel. If the driver ignores the prompts for an extended period, the warning escalates to an audible chime. If the driver still remains disengaged, the vehicle is programmed to disable its advanced driver assistance functions for the remainder of the trip.

However, the advent of "Elon Mode" has generated both excitement and controversy. This mode allows drivers to use the Autopilot and FSD systems without the recurring steering wheel prompts. The National Highway Traffic Safety Administration (NHTSA) took note of the development and sent Tesla a special order on July 26, inquiring about the nature and scope of this hidden feature.

NHTSA's inquiry extends to details such as how many vehicles and drivers have been authorized to bypass the prompts. The move is seen as a response to growing concerns about the safety of allowing drivers to operate Autopilot and FSD systems without active steering wheel engagement.

John Donaldson, NHTSA's acting chief counsel, expressed apprehension about recent changes to Tesla's driver monitoring system. The concern stems from a setting that allows owners to adjust Autopilot's driver monitoring configuration, potentially letting Autopilot operate for longer periods without the driver's hands on the steering wheel.

Tesla was given until August 25 to provide the requested information. The company met the deadline but asked that its submission be kept confidential, a request NHTSA granted. So far, Tesla has not issued a statement in response to questions from major media outlets, including CNBC.

Prominent figures in the automotive safety community have weighed in on the issue. Philip Koopman, an automotive safety researcher and associate professor of computer science at Carnegie Mellon University, argued that hidden features which weaken safety safeguards have no place in production software. He suggested that NHTSA's concerns are likely centered on the safety of vehicles that can bypass driver monitoring through such hidden settings.

Koopman also highlighted NHTSA's ongoing investigation into incidents involving Tesla's Autopilot systems, specifically collisions with stationary first responder vehicles. NHTSA Acting Administrator Ann Carlson has hinted in a recent interview that a decision may be imminent.

Over the years, Tesla has told regulatory agencies, including NHTSA and the California DMV, that its driver assistance programs, including FSD Beta, fall under the "Level 2" classification and do not make its vehicles autonomous. Despite this classification, the company's marketing practices and statements by CEO Elon Musk have caused public confusion. Musk, who is also associated with the social network X (formerly known as Twitter), often speaks as though Tesla vehicles can actually drive themselves.

In a highly publicized livestreamed test drive on the platform, Musk previewed Tesla's upcoming FSD software (version 12). Throughout the stream, Musk chatted with his passenger Ashok Elluswamy, head of Tesla's Autopilot software engineering division. While the video quality was poor, viewers noted that Musk's hands were not always on the steering wheel.

Such use of Tesla's systems may violate the company's own policies for Autopilot and FSD. Greg Lindsay, an Urban Tech fellow at Cornell, shared concerns about the test drive, pointing out that it could be problematic from a safety perspective. Tesla's website clearly warns drivers that they are responsible for staying alert, keeping their hands on the steering wheel, and maintaining control of the vehicle when using the Autopilot and FSD features.

Bruno Bowden, a machine learning expert and managing partner at Grep VC, an investor in autonomous vehicle startup Wayve, acknowledged the technological progress Tesla showed during the livestream but insisted the company still has a great deal of work ahead to develop reliable, safe automation systems. Notably, the system attempted to run a red light during the test drive, requiring Musk's quick intervention to avoid a potential accident.

In conclusion, as federal regulators scrutinize Tesla's driver assistance and driver monitoring systems, the unveiling of "Elon Mode" has brought obvious safety concerns to the forefront. As this saga continues, the automotive industry and the public are eagerly awaiting further developments, and the future of self-driving technology remains a subject of intense scrutiny and debate.

TechBeams

TechBeams is a team of seasoned technology writers with several years of experience in the field. The team has a passion for exploring the latest trends and developments in the tech industry and sharing their insights with readers. With a background in Information Technology, the TechBeams team brings a unique perspective to their writing and is always looking for ways to make complex concepts accessible to a broad audience.
