VI – When the Code Enters the Street

The calm interior of the vehicle and the unpredictable street ahead.

Series: Road Without Signs — On the Psychology of Movement in a World Increasingly Governed by Others

Everything works as long as the road is orderly.
As long as the lines are clear, the signs explicit, and behavior predictable.
As long as the world resembles an instruction manual.

But there are cities that have never had a single, definitive set of instructions.
Athens, Belgrade, Naples — places where you don’t drive by rules alone, but by a feel for the moment.

Places where every meter of asphalt has its unwritten code, and every intersection its own subtle character.

This is where code leaves the laboratory for the first time.
And becomes a stranger.

On these streets, pedestrians don’t cross on green, but on eye contact.
Drivers don’t move by right of way, but by reading intentions.
The horn is rarely a signal of danger — more often a tool of communication.

Here, traffic functions as social interaction rather than a system of rules.

A person is not “breaking the rules” — they are reading the context.
They sense the thin line between order and chaos and drive exactly along it.

An algorithm cannot sense that line.
It is either inside the rules — or outside them.
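
If one wanted to make that contrast concrete, a minimal sketch might look like this. Nothing below comes from any real driving stack; the threshold, the cues, and the weights are invented for illustration. The point is only the shape of the two answers: the rule returns a verdict, the reading returns a degree.

```python
# Purely illustrative sketch, not code from any real autonomous-driving system.
# The threshold, cues, and weights below are invented for the contrast only.

def rule_based_decision(gap_to_pedestrian_m: float) -> str:
    """A rule is binary: the situation is either inside it or outside it."""
    MIN_GAP_M = 5.0  # hypothetical safety threshold
    return "proceed" if gap_to_pedestrian_m >= MIN_GAP_M else "stop"

def context_reading(gap_to_pedestrian_m: float,
                    eye_contact: bool,
                    pedestrian_hesitating: bool) -> float:
    """A human-style reading: a continuous sense of how safe this moment is,
    not a yes/no verdict. The weights are made up for illustration."""
    confidence = min(gap_to_pedestrian_m / 10.0, 1.0)  # more space, more confidence
    if eye_contact:
        confidence += 0.2   # mutual acknowledgement lowers perceived risk
    if pedestrian_hesitating:
        confidence -= 0.3   # hesitation is a hint, not an event
    return max(0.0, min(confidence, 1.0))

# The rule answers with a verdict; the reading answers with a degree.
print(rule_based_decision(4.0))                      # "stop"
print(context_reading(4.0, eye_contact=True,
                      pedestrian_hesitating=False))  # 0.6
```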


Can an autonomous vehicle survive in a culture of improvisation?

Autonomous systems struggle most in informal traffic environments, where behavior isn’t standardized.

Where traffic doesn't always rely on signals, and the driver is constantly micro-adjusting.
Where priority belongs not to the one who is right, but to the one who has already moved.
Where rules apply — except when they don’t.

For humans, this isn’t chaos.
It’s skill.

For an algorithm, it’s noise.


Can instinct be programmed?

Machine learning can absorb millions of scenarios.
But “improvisation” is not a scenario — it’s an attitude toward incomplete information.

A human doesn’t react to events, but to hints of events.
To half-movements, hesitation, a glance that lingers half a second too long.
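
A caricature of that difference in code might look like the sketch below: weak cues accumulated into a belief before anything has actually happened. The cue names, the prior, and the weights are all hypothetical; the only point is that the output is a shifting degree of belief rather than a triggered reaction.

```python
# Illustrative only: a caricature of "reacting to hints of events".
# The cue names, prior, and weights are hypothetical, not from any real system.

from dataclasses import dataclass

@dataclass
class Cues:
    glance_lingers: bool      # a look held half a second too long
    half_step_off_curb: bool  # a half-movement, not yet a crossing
    hesitating: bool          # a visible pause

def crossing_belief(cues: Cues, prior: float = 0.2) -> float:
    """Accumulate weak evidence into a belief that the pedestrian will cross.
    Nothing has 'happened' yet; the belief shifts anyway."""
    belief = prior
    if cues.half_step_off_curb:
        belief += 0.25
    if cues.glance_lingers:
        belief += 0.15
    if cues.hesitating:
        belief -= 0.10
    return max(0.0, min(belief, 1.0))

print(crossing_belief(Cues(glance_lingers=True,
                           half_step_off_curb=True,
                           hesitating=False)))  # 0.6
```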

Research in human–machine interaction shows that people make decisions based on intersubjective signals that machines still cannot reliably interpret.

This brings us to a set of questions:

• When should a machine intervene to assist the driver?
• How should it intervene, and to what degree?
• What effect will that intervention have on the driver?
• And finally, who carries responsibility for the act of driving?

🔗 https://www.sciencedirect.com/science/article/pii/S1474667016426078

Instinct is not speed.
Instinct is flexibility without a formula.


Global rules and the morality of the local street

The ethics of autonomous systems rest on universal principles:
safety, predictability, equality.

But the local street doesn’t ask for equality — it asks for sensitivity to difference.

Is it morally “correct” to stop and block an intersection because the rule demands it?
Or is it more moral to bend the rule slightly so that the flow can continue?

It turns out that what is globally correct is not always locally just.


The mirror technology holds up

Perhaps the most uncomfortable question is this:

What if autonomous vehicles don’t reveal the weakness of technology —
but the inconsistency of our culture?

What if the “instinct” we take pride in is not wisdom, but simply a survival habit in a system we don’t fully trust?

Studies show that societies with low trust in institutions tend to tolerate informal rules in traffic more easily:

🔗 https://defipp.unamur.be/wp/defipp_wp_2021_3.pdf#:~:text=However%2C%20low%20compliance%20with%20rules%20may%20also,others%20(Peltzman%201975%2C%20Bj%C3%B6rklund%20and%20%C3%85berg%202005)

Maybe that’s why code feels like a stranger.
Not because it is cold — but because it is consistent.


The road continues

When code enters the street, the road stops being a technical matter.
It becomes cultural.
Ethical.
Personal.

Technology must learn culture.
And culture must look at itself in the mirror technology holds up.


Where does driving end and society begin?

As long as we hold the steering wheel, responsibility feels clear.
We know who made the mistake, who accelerated, who failed to stop.

But what happens when decisions are no longer made by a person — but by a system?

When code chooses the speed, the distance, the moment to brake.
When comfort replaces responsibility.
When error can no longer be traced to a single name.

That’s where the question stops being technical and becomes social.

🔹 What does technological progress mean without a legal response to match it?
🔹 Who is responsible when the system fails — the driver, the manufacturer, the programmer?
🔹 Should those who write decision-making code carry moral responsibility as well?
🔹 Where is the line between the right to control and the comfort of machine choice?

When decisions are delegated, responsibility becomes diffuse.
And society is left without a clear answer to a simple question:

Who is actually behind the wheel?

In the next chapter of The Road Without Signs, we explore the moment when driving stops being an individual act and becomes a mirror of the society that still hasn’t decided what to do with its own power.

The road goes on.
Without signs.