A video that has since been taken down following a copyright report showed a Tesla running Full Self-Driving (FSD) Beta software attempting a sudden right turn toward a crossing pedestrian, with the driver intervening just before a collision.
A user, after watching the video, said on social media, “That corner, where that pedestrian was, where this car almost hit him, that’s our corner. That’s where we cross 5th Ave. to head anywhere around our neighborhood. That’s not cool man. Our neighborhood is not a Tesla testing zone. That driver could have hurt or killed somebody.”
Tesla video taken down on report of a copyright owner
“This media has been disabled in response to a report by the copyright owner,” reads a message on the video posted by Twitter user Taylor Ogan.
— Taylor Ogan (@TaylorOgan) September 15, 2021
Tesla has begun testing its Full Self-Driving Beta software on Canadian roads in advance of its wider release in the United States.
Last weekend, Tesla began rolling out its new Full Self-Driving (FSD) Beta v10 software update to its early access fleet.
Elon Musk, CEO of Tesla, has praised the v10 software update as “mind-blowing.”
The video, which had been taken down under copyright rules, was later reposted by another Twitter user.
This is the video of the Tesla FSD Beta attempting to autopilot into a cross-walking pedestrian:
– when passing parked vehicle gently steer away to avoid collision
– when pedestrian enters the street make a sharp turn to face them head on and quickly increase speed to obtain optimal ramming velocity
— 𝕊𝕠𝕔𝕚𝕒𝕝𝕚𝕤𝕥 𝕊𝕪𝕤𝕒𝕕𝕞𝕚𝕟 ☭ (@reset_by_peer) September 17, 2021
According to Electrek, the new Tesla FSD update shows some progress, but it still has some major issues that make it less than spectacular.
Twitter users react to Tesla autopilot video
“To be honest, I didn’t understand what happened here. Did the car deviate from the route or was it supposed to stop and wait for the pedestrian?” a Twitter user said after watching Tesla’s full self-driving video.
Another user wrote, “That Tesla video underscores the need for a pedestrian bill of rights in the age of “autonomous” vehicles.”
“This is the video that shows Tesla FSD Beta try to autopilot the car into a crosswalking pedestrian. Fair use copyright laws or not, someone doesn’t want you to see it,” an LA Times reporter, Russ Mitchell, said in a tweet.
One user complained, “With the videos of those showing off the Tesla’s latest FSD Beta, nearly running a pedestrian over etc, I am at a loss at how members of the public are allowed to use it on the roads.”
According to Reuters, since 2016, the NHTSA has opened 33 investigations into Tesla crashes involving 11 fatalities in which the use of advanced driver assistance systems was suspected. The NHTSA has ruled out the use of Autopilot in three of those non-fatal crashes.