See how the latest autonomous driving hardware categorizes and prioritizes hazards.

Tesla recently announced it would start installing the hardware needed for autonomous driving in all its cars, a suite that includes 12 ultrasonic sensors and eight cameras.

Not that the hardware can actually be used yet: the software won't be released to customers for some time. It is still being extensively tested, and Tesla has released another video showing what the system is capable of.

And it is truly remarkable. The Model X cruises a complex, gnarly road near Tesla's base in Palo Alto, California, more competently than many human drivers. It easily works out tricky intersections and blind turns, and steers much more smoothly than earlier versions of Autopilot, which steered so twitchily they could easily induce motion sickness on this kind of road.

Even more interesting than that, though, are the three camera feeds displayed at the edge of the screen, showing the obstacles the system picks out and how it categorizes them. Pedestrians, trash cans, road signs, and road markings are all scrutinized and classified according to the hazard they pose.
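To give a rough sense of what that kind of categorization might involve, here is a minimal, purely illustrative sketch in Python: detected objects get a base hazard weight by class, scaled by how close they are, then get sorted by the result. The class names, weights, and Detection type are assumptions made up for illustration; Tesla has not published how its system actually scores hazards.

```python
# Hypothetical sketch of hazard prioritization. Class names, weights,
# and the Detection type are illustrative assumptions only.
from dataclasses import dataclass

# Assumed base hazard weight per object class (higher = more critical).
HAZARD_WEIGHT = {
    "pedestrian": 1.0,
    "vehicle": 0.8,
    "trash_can": 0.4,
    "road_sign": 0.2,
    "lane_marking": 0.1,
}

@dataclass
class Detection:
    label: str         # object class reported by the vision model
    distance_m: float  # estimated distance from the car, in meters

def hazard_score(det: Detection) -> float:
    """Scale the class weight by proximity: nearer objects score higher."""
    proximity = 1.0 / max(det.distance_m, 1.0)
    return HAZARD_WEIGHT.get(det.label, 0.5) * proximity

def prioritize(detections: list[Detection]) -> list[Detection]:
    """Return detections sorted from most to least hazardous."""
    return sorted(detections, key=hazard_score, reverse=True)

if __name__ == "__main__":
    frame = [
        Detection("road_sign", 40.0),
        Detection("pedestrian", 12.0),
        Detection("trash_can", 5.0),
    ]
    for det in prioritize(frame):
        print(f"{det.label:12s} score={hazard_score(det):.3f}")
```

In this toy version, the nearby pedestrian outranks the trash can a few meters away, which in turn outranks the distant road sign. A real system would fold in far more signals (velocity, trajectory, occlusion), but the basic idea of ranking detections by class and context is the same.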

The system still gets flummoxed by a couple of situations. It stops when a runner is close to the edge of the road, on a stretch with a double center line, where most human drivers would simply have steered around, crossing the divider a little. It also stops halfway around a couple of right turns for no apparent reason.

Still, it's no worse than a cautious human driver, which is astonishing considering Tesla has only been working on the technology for a few years. Of course, it remains to be seen how the system reacts in an emergency, and how it deals with low-visibility conditions; the need to improve those abilities was a major factor in the decision to upgrade the hardware.

Anyway, on the basis of this performance, fully autonomous cars may be a lot closer to reality than we think.
