Miscellaneous

Freiburg speed camera tickets horse

Smart-city automation enforces rules without understanding objects, Kafka arrives on schedule

Image: "Freiburg: Pferd vor Reitschule geblitzt" ("Freiburg: horse flashed outside riding school"), spiegel.de

A traffic camera outside a riding school in Freiburg has discovered the limits of “smart city” automation: it issued a speeding ticket to a horse.

According to Der Spiegel, the city’s speed enforcement system flashed a rider and mount and generated a fine notice as if the subject were a motor vehicle. The episode is funny in the way bureaucracy is funny: the sensor did exactly what it was procured to do—detect motion above a threshold and feed it into a rules engine that prints punishment. What it cannot do is understand what it is looking at.

That gap between measurement and meaning is not a minor bug; it’s the business model. Municipalities buy “objective” enforcement to scale up fines without scaling up discretion. The pitch is always the same: cameras are neutral, consistent, and free of human bias. They are neutral only in the sense that they are indifferent to context. A machine cannot exercise judgment, so the system must either (1) tolerate absurd outcomes or (2) add a human review layer that quietly reintroduces discretion—after the public has already been taught that enforcement is “automatic” and therefore beyond argument.

The Freiburg horse ticket reflects a large administrative impulse: turn law into a machine-readable checklist. But legal rules are written for humans, full of implied categories (“vehicle,” “operator,” “road user”) and exceptions that depend on purpose, location, and intent. When you force that into a pipeline of sensors, databases, and templated notices, you don’t eliminate arbitrariness; you relocate it to procurement specs, calibration settings, and whatever the vendor’s software team guessed lawmakers meant.
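The relocation of arbitrariness can be made concrete with a second hypothetical sketch: even if the vendor adds a classification gate, the implied legal category "vehicle" becomes whatever list someone happened to write:

```python
# Hypothetical illustration: encoding the implied legal category
# "vehicle" as a checklist relocates the judgment call into whoever
# wrote the list. The labels below are the vendor team's guess, not law.

RECOGNIZED_VEHICLES = {"car", "truck", "motorcycle"}

def is_ticketable(subject_label: str) -> bool:
    """Checklist version of the legal category 'motor vehicle'.

    Anything the list's authors did not anticipate (a horse, an
    e-scooter, a tractor) is decided by the list, not by the law's
    purpose or the lawmaker's intent.
    """
    return subject_label in RECOGNIZED_VEHICLES

print(is_ticketable("car"))    # True
print(is_ticketable("horse"))  # False here, but only because someone
                               # happened not to list "horse"
```

The checklist gets the horse case right by accident; the next unanticipated category is decided the same way, silently, in a config file.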

And once the infrastructure exists, it expands. A camera that can’t distinguish a car from a horse will still be marketed as a platform: add more sensors, connect more databases, automate more penalties. The result is a Kafkaesque loop where citizens must prove they are not the thing the system assumed they were—because the system is optimized for throughput, not truth.

Freiburg will presumably void the fine, laugh, and move on. “Smart city” enforcement is often just a dumb sensor with a billing department attached—and the only truly intelligent component is the institution’s ability to convert measurement errors into administrative work for everyone else.