MIT Technology Review

Autonomous cars often proudly claim to be fitted with a long list of sensors—cameras, ultrasound, radar, lidar, you name it. But if you’ve ever wondered why so many sensors are required, look no further than this picture.

You’re looking at what’s known in the autonomous-car industry as an “edge case”—a situation where a vehicle might have behaved unpredictably because its software processed an unusual scenario differently from the way a human would. In this example, image-recognition software applied to data from a regular camera has been fooled into thinking that images of cyclists on the back of a van are genuine human cyclists.

This particular blind spot was identified by researchers at Cognata, a firm that builds software simulators—essentially, highly detailed and programmable computer games—in which automakers can test autonomous-driving algorithms.