We have not yet reached the technological point where we can expect to see driverless vehicles plying our roadways. However, as a recent incident on the QE2 highway in Alberta demonstrated, we might be edging closer than some of us thought. How close are automakers to producing such a vehicle, and more importantly, how close are lawmakers to allowing an AI-driven vehicle on city streets?
The industry uses a six-level scale to define the degree of self-driving autonomy in a vehicle. The levels are:
- Level 0: Your typical car with nothing more than an onboard Nav system or old-style Cruise Control.
- Level 1: This would include vehicles that use a single computer-controlled driving system, such as Adaptive Cruise Control or Lane Centering.
- Level 2: These Partial Driving Automation systems contain multiple driver-assist systems working together, such as Adaptive Cruise Control and Lane Centering.
- Level 3: These Conditional Driving Automation vehicles would include the driver-assist systems from Level 2 but would also be able to make environmental driving decisions, such as passing a slow-moving vehicle or reacting to detected road conditions.
- Level 4: These High Driving Automation vehicles would typically not require human interaction or oversight and include the full suite of computer-controlled driving systems as well as an advanced AI that is capable of reacting to the changing conditions of a road, with traffic, pedestrians, and a wide variety of other variables.
- Level 5: Full Driving Automation vehicles would be fully automated cars without regular human driving controls at all.
The incident in question saw the “driver” of a Tesla Model S pulled over and charged with speeding and dangerous driving. Both front seats in the vehicle were reclined, and both the driver and passenger were sound asleep. So, does Tesla actually make a self-driving vehicle? Though they certainly build systems above Level 0, their vehicles are not capable of truly autonomous driving just yet. Tesla’s Autopilot system, as included on the Model S, is considered a Level 2 system on the above scale. The marketing name “Autopilot” is perhaps a bit ambitious, as such systems require a human in the driver’s seat who is alert and able to take control of the vehicle without any notice. There is a vast array of possible hazards that these relatively dumb systems cannot handle: sudden severe weather, dirt or a rock chip covering a radar sensor or camera, low trailers, lane detours due to road construction, lane markings wearing out or being painted over, and even corners with heavy traffic can all confuse the system.
Tesla, like other automakers, has built in extra safety features to ensure that the driver is still awake and attentive. If the vehicle does not receive input from the driver, it automatically slows down and brings itself to a stop at the side of the roadway. However, as with any computer technology, clever people have devised workarounds and aftermarket applications that help a driver circumvent these safeguards. In this case, it appears the driver may have done exactly that, allowing himself to kick back rather than actively control the vehicle.
Currently, many automakers offer Level 2 systems, but only one Level 3 system is on the market (in the Audi A8). Even then, from a technical standpoint, anything below Level 4 still requires an attentive driver who can take control of the vehicle. Legally speaking, Level 4 systems are not permitted outside of very specific test regions in certain cities, though several companies are currently developing and testing them.
Canada does not currently have any legislation that would permit a driverless (Level 4 or 5) vehicle to operate on public roadways. This is not to say that such vehicles will never be commonplace, only that we are not there yet. Ultimately, automated vehicles will become a mainstay of our roadways, and why not? We expect that AI-controlled vehicles will be safer, and that traffic will flow faster, than with human drivers.

Still, we have serious legal considerations to discuss, and not just among the corporations developing these systems. We, the citizens, should have some input on how these vehicles will operate. There are concerns for the insurance industry as well. Insurers will no doubt be thrilled by technology that lowers accident rates, since every accident costs them money, but questions like “Who is responsible when a Level 4 automated system has an accident?” should be considered in detail today, not when we are scrambling to write laws for an existing technology.

In addition, what regulations will we require, not just for testing these vehicles, but for testing their security against malicious hacks? Given the frequency with which major systems, even at places like the Pentagon and NASA, get breached, I would be wary of the security and safety of such automobiles. For that reason, I am hesitant to endorse future Level 5 automated vehicles unless they include a manual override that can physically disengage all automated systems, along with human (non-computer-affected) controls that can bring the vehicle to a safe stop.
Until the laws change, even if you find yourself in possession of a Level 4 automated vehicle in the future, remember that YOU are the individual tasked with controlling it. No matter how automated the vehicle might be, until the laws say differently there must always be a licensed, insured, sober, and awake driver behind the wheel, capable of taking control without hesitation.
Éamonn Brosnan is a research associate with the Frontier Centre for Public Policy.