Friday 28 December 2012

Self-Drive Engage

Lately I've been thinking a lot about self-driving cars.

You see, the whole point of a self-driving vehicle is that the occupants of the vehicle are absolved from all the responsibility and all of the joy of operating the motor vehicle.

In such a scenario, which is currently playing out in both California and Nevada, the part that I've been thinking about the most is: Who should be responsible for paying the speeding tickets?

It brings in a number of thorny questions, not the least of which is the difference between driving safely and driving legally.  I hope that we can assume that the car will be authorized to drive safely first, and legally second. (Please let me know in the comments below if this is not the case!!)

It also calls into question the goal of the speeding ticket program in general.  If the goal is genuinely to limit the kinetic energy of the vehicle (KE = ½ × mass × velocity²), then let's forget about speeding, and instead record this computed kinetic-energy quantity continuously, along with the GPS co-ordinates, and at the end of the month compare it against the local authority's database of kinetic-energy limits (in place of speed limits).
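As a rough sketch of what that continuous check might look like (the vehicle mass, speed samples, coordinates, and limit values here are all invented for illustration):

```python
# Hypothetical sketch of continuous kinetic-energy logging.
# Mass, speeds, coordinates, and limits are made up for illustration.

def kinetic_energy(mass_kg, speed_ms):
    """KE = 1/2 * m * v^2, in joules."""
    return 0.5 * mass_kg * speed_ms ** 2

# One month of (gps_coord, speed in m/s) samples for a 1500 kg vehicle.
samples = [
    ((37.77, -122.42), 13.0),   # ~47 km/h
    ((37.78, -122.41), 31.0),   # ~112 km/h
]

# The local authority's (hypothetical) kinetic-energy limits per coordinate.
ke_limits = {
    (37.77, -122.42): 200_000.0,   # joules
    (37.78, -122.41): 400_000.0,
}

# Month-end comparison: which samples exceeded the limit, and by how much?
exceedances = [
    (coord, kinetic_energy(1500, v) - ke_limits[coord])
    for coord, v in samples
    if kinetic_energy(1500, v) > ke_limits[coord]
]
```

Note that a heavier vehicle at the same speed carries more kinetic energy, so under this scheme a truck would hit the limit at a lower speed than a compact car — which is arguably the point.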

Behavior Modification

In gaming terms, a (speeding) fine is a way of modifying behavior by producing a sharp negative feedback at random intervals.  This is among the most effective ways we know of reducing an undesired player behavior.

Unfortunately, this technique simply does not work against computer software.  The only people qualified to change the software are its developers, and it requires the vehicle owner's active participation to update the software on a regular basis.

Insurance

I hereby propose an insurance-based licensing scheme for self-drive vehicles.  In order for a vehicle to (legally) use a self-drive mechanism, the owner of the vehicle must purchase insurance from an organization that is both state-licensed and independently audited.  Eligibility for any given insurance policy will be based on the make and model of the vehicle, plus the software package, version, and database of the self-drive mechanism.  At the end of the month, all of the occasions when vehicles under the same policy have exceeded the posted kinetic-energy limit are summed up, and it's the insurance policy fund which pays out to the state, with no additional per-vehicle expenses for the owner.
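The month-end settlement could be sketched like this (the policy IDs, exceedance records, and fine rate are all hypothetical):

```python
# Hypothetical month-end settlement for insurance policy pools.
# Policy IDs, exceedance records, and the fine rate are invented.
from collections import defaultdict

# Each record: (policy_id, joules over the posted kinetic-energy limit).
monthly_exceedances = [
    ("policy-A", 50_000.0),
    ("policy-B", 10_000.0),
    ("policy-A", 25_000.0),
]

FINE_PER_JOULE = 0.001  # assumed rate the state charges the pool

def settle(records):
    """Sum exceedances per policy; each policy fund pays the state directly."""
    totals = defaultdict(float)
    for policy_id, joules_over in records:
        totals[policy_id] += joules_over
    return {pid: total * FINE_PER_JOULE for pid, total in totals.items()}

payouts = settle(monthly_exceedances)
```

The key property is that individual owners never see a per-incident bill; the pool absorbs the fines, and the cost feeds back into next year's premium for that software package.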

This creates a market for insurance policies.  You can purchase cheaper insurance by buying more conservative software, or pay more in insurance but arrive at your destination sooner with more aggressive software.  As technology and software changes and improves, so too will the market for your self-drive insurance match the current conditions in your state.

And if the price of the insurance is too high for your particular vehicle (e.g. it's too old, or too unsafe, or you're currently out-of-state), you can always opt-out and disable the self-drive feature of your vehicle.

Incentives

This proposal creates the right incentives: the software developer must use the best software engineering techniques; the vehicle owner must keep their vehicle updated with the latest software; the insurance socializes the speeding costs amongst all vehicle owners of the same class; and the market ensures an efficient allocation of policies and choice of software programs across all the vehicles in the state's fleet.

The one piece of the puzzle that's missing is the state.  Suppose that the kinetic-energy limit on a particular stretch of road is changed, but the software developers aren't notified in a timely manner.  In this case, the state itself has been negligent, and it's the state itself which should be fined for putting motorists at risk.  In the same way that the state must adequately signpost the speed limit, so it should be its responsibility to notify the state-licensed self-drive software developers.

Speeding?

Of course, I've used speeding as an example of unsafe vehicle behavior, but this regulatory framework extends in a natural way to all vehicle behaviors - stop signs, following distances, red light rules, yielding to buses on residential roads.  Even accident compensation, emission standards, and fuel usage.

The only exceptions I can see are when a vehicle is attempting to drive safely rather than legally.  Without getting all Carl Sagan here, it seems that we could use the black-box data to evaluate all collisions (few) and near-misses (many) to improve the software and improve safety over time.

Failure To Yield

Interestingly, the large majority of vehicle collisions are caused by one simple mechanism, "Failure To Yield".   That's what stop signs and traffic lights and turning circles are all about. A self-drive vehicle, equipped with appropriate sensors, has no reason to stop at stop signs, nor yield at yield signs (if it can negotiate with another self-drive vehicle to yield instead), other than to avoid startling other human drivers.

Reality?

Will it happen?  An insurance-based self-drive licensing scheme?  I don't know.  If anyone knows the actual state of proposed self-drive licensing, please post it in the comments below!


