Turing-like Test for Tesla’s FSD?


A Tesla FSD Beta tester has conducted a Turing-like test with version 2020.48.35.7 of FSD. The tester, known as AI DRIVR on YouTube, asked his wife to tail his Tesla on five routes. At the end of each route, she had to guess whether it was her husband or FSD in control of the vehicle.

FSD did well but gave itself away on a handful of occasions:

7:49 – There was some weaving between parked cars.

18:25 – There was a minor swerve around a parked car, but FSD probably got away with that one.

20:50 – FSD had trouble stopping at the right place before an intersection.

However, there was a really nice, human-like right-hand turn at 29:23.

The final score is revealed at 30:15.


This exercise raises some interesting questions. Is the aim of FSD to fool other human drivers? Probably not, but it is a by-product of a well-functioning self-driving system. Should other human drivers be told that a machine is operating the vehicle next to them? Would that make them more tolerant of self-driving vehicles? Some might be, knowing that they are being recorded from several angles.

On a note about the test itself, it became a reverse Turing test when the human was in control, as he tried to ‘fool’ his wife into thinking he was a machine (10:25 in the video).

We expect to see many more exercises like this. They may not be very scientific, but they are interesting nonetheless.
