
State Machines

Time: 155 min

Learning Objectives

By the end of this lab you will be able to:

  • Identify when Boolean flags create unmaintainable code
  • Design a finite state machine (FSM) for robot behavior
  • Implement state machines using Python dictionaries or classes
  • Add new behaviors without modifying existing state logic
  • Use hardware interrupts to eliminate blocking from sensor reads
  • Measure and compare loop jitter across different scheduling approaches
  • Recognize state machines in real-world embedded systems

You will first experience how flag-based logic becomes messy, then replace it with a state machine that is readable and extensible. You will also discover how interrupts eliminate the last source of jitter in your control loop.

Lab Setup

Connect your Pico via USB, then from the picobot/ directory:

mpremote run clean_pico.py          # optional: wipe old files
mpremote cp -r lib :               # upload libraries

Verify in REPL: from picobot import Robot; Robot() → "Robot ready!" First time? See the full setup guide.


Introduction

Individual robot behaviors (line following, turning, obstacle detection) are now working. You learned ultrasonic sensing and non-blocking timing in Ultrasonic & Timing, and now it's time to combine everything into a complete mission: "Follow line → stop at junction → turn left → continue to destination."

This lab explores two approaches:

1. Boolean flags - the intuitive first attempt that becomes unmanageable
2. State machines - a structured pattern that scales cleanly

Understanding why flags fail helps you appreciate the state machine pattern used in production embedded systems.


Why This Lab Matters

As behavior grows, complexity explodes. Flags scale poorly because you end up tracking combinations, while state machines keep behavior explicit and testable. This is the point where your robot stops being a script and becomes a system.



Background: Why Flags Explode

When behavior grows complex, developers add boolean flags: is_turning, saw_obstacle, on_line, was_stopped... With \(n\) flags, there are \(2^n\) possible combinations — 5 flags create 32 states, most of which are bugs. A state machine replaces all flags with a single variable: the current state. Transitions are explicit, impossible states are eliminated by design, and adding new behaviors doesn't break existing ones.
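The combinatorics can be checked in a few lines. This sketch is illustrative only (the state names are made up for this example); the point is that a single state variable can only ever hold one of the states you define, while flags silently admit every combination:

```python
# Illustrative only: counting flag combinations vs. explicit states.
flags = ["started", "at_junction", "turned", "delivered", "returning"]
combinations = 2 ** len(flags)
print(combinations)  # 32 combinations from 5 flags

# With one state variable there are exactly as many states as you define:
STATES = ("IDLE", "FOLLOW", "AT_JUNCTION", "TURNING", "DELIVERING")
state = "FOLLOW"
assert state in STATES  # impossible combinations simply cannot be represented
```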

→ Deep dive: State Machine Reference

Part 1: Try Without State Machines (15 min)

The First Attempt

Here's how most students start:

from picobot import Robot
import time

robot = Robot()
robot.imu.calibrate()

# Flags to track what we've done
started = False
at_first_junction = False
turned_at_junction = False
delivered = False

print("Press Enter to start")
input()
started = True

while not delivered:
    if started and not at_first_junction:
        # Follow line until junction
        robot.line_follow_step(80, 30)

        if robot.at_junction():
            robot.stop()
            at_first_junction = True
            print("At junction!")

    elif at_first_junction and not turned_at_junction:
        # Turn left at junction
        robot.turn_degrees(90)
        turned_at_junction = True
        print("Turned!")

    elif turned_at_junction and not delivered:
        # Follow to destination
        robot.line_follow_step(80, 30)

        if robot.at_junction():  # Second junction = destination
            robot.stop()
            delivered = True
            print("Delivered!")

    time.sleep(0.02)

print("Done!")
robot.set_leds((0, 255, 0))

Run this. It works! (for the basic case)

Now Add "Return Home"

Your instructor says: "Great! Now make it return to the warehouse after delivery."

Try it yourself. Add the logic to:

1. After delivery, turn around (180°)
2. Follow line back
3. Turn at junction
4. Follow to warehouse
5. Stop

Attempt it before looking below!


The Nightmare Begins

Here's what happens when you try to add return logic:

# Your code now looks like this...
started = False
at_first_junction = False
turned_at_junction = False
delivered = False
returning = False           # NEW
turned_for_return = False   # NEW
at_return_junction = False  # NEW
home = False                # NEW

while not home:
    if started and not at_first_junction:
        robot.line_follow_step(80, 30)
        if robot.at_junction():
            robot.stop()
            at_first_junction = True

    elif at_first_junction and not turned_at_junction:
        robot.turn_degrees(90)
        turned_at_junction = True

    elif turned_at_junction and not delivered:
        robot.line_follow_step(80, 30)
        if robot.at_junction():
            robot.stop()
            delivered = True

    elif delivered and not returning:
        # Turn around
        robot.turn_degrees(180)
        returning = True

    elif returning and not at_return_junction:
        robot.line_follow_step(80, 30)
        if robot.at_junction():
            robot.stop()
            at_return_junction = True

    elif at_return_junction and not turned_for_return:
        robot.turn_degrees(90)  # Wait... left or right?!
        turned_for_return = True

    elif turned_for_return and not home:
        robot.line_follow_step(80, 30)
        if robot.at_junction():  # Is this home? Or did we miss?
            robot.stop()
            home = True

    time.sleep(0.02)

What Went Wrong?

Count the problems:

  • [ ] 8 boolean flags to track
  • [ ] Each new feature adds 2-3 more flags
  • [ ] Hard to see what the robot is "currently doing"
  • [ ] Can you tell at a glance if turned_for_return means going TO or FROM?
  • [ ] What if a flag gets set wrong? Debugging nightmare!
  • [ ] Adding obstacle avoidance? Double the flags again!

The Real Failure

This code technically works, but:

  • You can't read it
  • You can't debug it
  • You can't extend it
  • You'll forget what you wrote by next week

This is called spaghetti code - everything is tangled together.

Measure the Problem

# How many states is your robot actually in?
#
# NOT delivered, NOT at junction, NOT turned = FOLLOWING TO JUNCTION
# at junction, NOT turned = AT FIRST JUNCTION
# turned, NOT delivered = FOLLOWING TO DESTINATION
# delivered, NOT returning = JUST DELIVERED
# returning, NOT at return junction = FOLLOWING BACK
# ... etc
#
# There are really just ~8 STATES.
# But we're tracking them with 8 BOOLEAN FLAGS.
# That's 2^8 = 256 possible combinations!
# Most of those combinations are INVALID.

???+ abstract "Math Connection: Why Booleans Explode"
     With $n$ boolean flags, you have $2^n$ possible combinations. 8 flags = 256 possible states, but your robot only has ~8 *meaningful* states! Most combinations like `(delivered=True, started=False)` are logically impossible.

     This is called *state space explosion* and it's a real problem in software verification. Tools that check if code is correct must examine every possible state: just 20 flags means $2^{20}$ states, over a million, to check. Using explicit state machines instead of flags keeps the state space small and manageable.

     📚 [State Space Explosion (Wikipedia)](https://en.wikipedia.org/wiki/State_space_explosion) · [Combinatorics](https://en.wikipedia.org/wiki/Combinatorics)

The insight: Your robot is always doing ONE thing. But booleans can represent impossible combinations.

Checkpoint — Flag Pain Felt

You should now have experienced the mess of 8 boolean flags. If your return-home code worked on the first try, you're either very talented or very lucky. The point is that this approach doesn't scale.

Design Principle: Event Generation vs Event Handling

A common mistake when first building state machines is mixing up detecting events with reacting to them. These are two separate responsibilities:

  • Event generation answers: "Did something happen?" — a sensor threshold was crossed, a timer expired, a button was pressed. This is pure detection, with no knowledge of what the system should do about it.
  • Event handling answers: "Given that this happened, what should I do?" — and the answer depends entirely on which state the system is in.

Keeping these separate makes your code far easier to test and extend. You can test event detection independently ("does the sensor correctly report 'obstacle near'?") and test state logic independently ("when FOLLOWING and obstacle_near, does it transition to STOPPED?"). When you later add new states, the event detection code does not change — only the state machine's transition table grows.
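The separation can be sketched in a few lines. The names here (detect_events, TRANSITIONS, next_state) are hypothetical, not part of picobot; they just show how detection and handling stay independent:

```python
# Hypothetical sketch: event generation kept separate from event handling.

def detect_events(sensors):
    """Event generation: pure detection, no knowledge of robot state."""
    events = set()
    if sensors["distance_cm"] < 15:
        events.add("obstacle_near")
    if sensors["all_black"]:
        events.add("junction")
    return events

# Event handling: (current state, event) -> next state
TRANSITIONS = {
    ("FOLLOW", "obstacle_near"): "STOPPED",
    ("FOLLOW", "junction"): "AT_JUNCTION",
}

def next_state(state, events):
    for event in events:
        if (state, event) in TRANSITIONS:
            return TRANSITIONS[(state, event)]
    return state  # no matching transition: stay put

# Each half is testable on its own:
print(detect_events({"distance_cm": 10, "all_black": False}))  # {'obstacle_near'}
print(next_state("FOLLOW", {"junction"}))                      # AT_JUNCTION
```

Adding a new state later only grows TRANSITIONS; detect_events never changes.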


Part 2: The Solution - State Machines (10 min)

What is a state machine?

A formal model where the system is always in exactly one state, with defined transitions between states. → State Machine Patterns

What Is a State?

A state is "what the robot is currently doing":

  • IDLE - waiting for start
  • FOLLOW_LINE - following the track
  • AT_JUNCTION - stopped at intersection
  • TURNING - executing a turn
  • DELIVERING - arrived at destination

The robot is always in exactly one state.

State machine basics - states and transitions

Try the Demo

from picobot import Robot
import time

robot = Robot()

# State machine variables
state = "IDLE"
state_start = time.ticks_ms()

def enter_state(new_state):
    global state, state_start
    print(f"{state} → {new_state}")
    state = new_state
    state_start = time.ticks_ms()
    robot.leds.show_state(state)

def time_in_state():
    return time.ticks_diff(time.ticks_ms(), state_start)

# Demo: automatic state transitions
print("State machine demo - watch states change")

enter_state("IDLE")

while True:
    if state == "IDLE":
        robot.stop()
        if time_in_state() > 2000:
            enter_state("FOLLOW")

    elif state == "FOLLOW":
        robot.forward(80)
        if time_in_state() > 3000:
            enter_state("TURNING")

    elif state == "TURNING":
        robot.set_motors(-80, 80)
        if time_in_state() > 500:
            enter_state("IDLE")

    time.sleep(0.02)

Expected output:

IDLE → FOLLOW
FOLLOW → TURNING
TURNING → IDLE
IDLE → FOLLOW

Observe: Each state has:

  • Actions — what to do while in this state
  • Transitions — when to change to another state

Checkpoint — State Machine Running

The robot should cycle through IDLE → FOLLOW → TURNING → IDLE. The console output shows clean transitions. Compare this clarity with the 8-flag mess from Part 1.

Math Connection: Finite State Machines in Theory

What you're building has a formal mathematical definition: a Finite State Machine (FSM). Computer scientists define it as \(M = (Q, \Sigma, \delta, q_0, F)\)—states, inputs, transition function, start state, and accepting states. The same formalism describes regular expressions, network protocols, and digital circuits.

Compilers use FSMs to recognize keywords in code. Network protocols use them to track connection states. Even the CPU in your robot is built from millions of tiny hardware state machines. It's one of the most widely applicable abstractions in computing.

📚 Finite State Machine (Wikipedia) · Automata Theory
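The Part 2 demo maps directly onto that formal tuple. Writing it out makes the correspondence concrete (accepting states F are omitted because the robot loops forever):

```python
# The Part 2 demo as the formal tuple M = (Q, Σ, δ, q0):
Q = {"IDLE", "FOLLOW", "TURNING"}      # states
SIGMA = {"timeout"}                     # input alphabet (events)
DELTA = {                               # transition function δ
    ("IDLE", "timeout"): "FOLLOW",
    ("FOLLOW", "timeout"): "TURNING",
    ("TURNING", "timeout"): "IDLE",
}
q0 = "IDLE"                             # start state

state = q0
for _ in range(3):                      # three timeouts = one full cycle
    state = DELTA[(state, "timeout")]
print(state)  # IDLE
```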

State transitions in action


Part 3: Line Following with Junctions (20 min)

Detecting Junctions

A junction = all sensors see black:

from picobot import Robot
import time

robot = Robot()

print("Push robot over track, watch for junctions")

while True:
    if robot.sensors.line.at_junction():
        print("JUNCTION!")
        robot.set_leds((0, 0, 255))
    elif robot.sensors.line.line_lost():
        print("LINE LOST")
        robot.set_leds((255, 0, 0))
    else:
        robot.set_leds((0, 255, 0))

    time.sleep(0.1)

The Full Mission

Now put junction detection inside the state machine and run the complete mission:

from picobot import Robot
import time

robot = Robot()

# Calibrate gyro for turns
print("Calibrating...")
robot.imu.calibrate()

# ═══════════════════════════════════════════════════
# STATE MACHINE
# ═══════════════════════════════════════════════════
state = "IDLE"
state_start = time.ticks_ms()
junction_count = 0

def enter_state(new_state):
    global state, state_start
    duration = time_in_state()
    print(f"[{time.ticks_ms():6d}] {state} ({duration}ms) → {new_state}")
    state = new_state
    state_start = time.ticks_ms()
    robot.leds.show_state(state)

def time_in_state():
    return time.ticks_diff(time.ticks_ms(), state_start)

# ═══════════════════════════════════════════════════
# MAIN LOOP
# ═══════════════════════════════════════════════════
print("\nPress Enter to start navigation")
input()
enter_state("FOLLOW")

try:
    while True:
        # ─────────────────────────────────────────
        # IDLE: Wait for command
        # ─────────────────────────────────────────
        if state == "IDLE":
            robot.stop()

        # ─────────────────────────────────────────
        # FOLLOW: Line following
        # ─────────────────────────────────────────
        elif state == "FOLLOW":
            robot.line_follow_step(speed=80, kp=30)

            if robot.at_junction():
                robot.stop()
                junction_count += 1
                print(f"Junction #{junction_count}")
                enter_state("AT_JUNCTION")

        # ─────────────────────────────────────────
        # AT_JUNCTION: Decide what to do
        # ─────────────────────────────────────────
        elif state == "AT_JUNCTION":
            robot.stop()

            if time_in_state() > 500:
                if junction_count == 1:
                    enter_state("TURN_LEFT")
                elif junction_count == 2:
                    enter_state("DELIVERING")
                else:
                    enter_state("FOLLOW")

        # ─────────────────────────────────────────
        # TURN_LEFT: Execute 90° turn
        # ─────────────────────────────────────────
        elif state == "TURN_LEFT":
            robot.turn_degrees(90)  # Blocking, but OK for turns
            enter_state("FOLLOW")

        # ─────────────────────────────────────────
        # DELIVERING: Flash LEDs, then return
        # ─────────────────────────────────────────
        elif state == "DELIVERING":
            # Flash LEDs
            if (time_in_state() // 200) % 2 == 0:
                robot.set_leds((0, 255, 0))
            else:
                robot.leds_off()

            if time_in_state() > 2000:
                print("Delivery complete!")
                enter_state("IDLE")

        time.sleep(0.02)

except KeyboardInterrupt:
    robot.stop()
    robot.leds_off()
    print("\nStopped")
Checkpoint — Navigation Working

The robot should follow the line, stop at junctions, turn, and continue. The console log should show transitions like FOLLOW → AT_JUNCTION → TURN_LEFT → FOLLOW. If the robot doesn't detect junctions, check that the track has proper intersection markings.

Stuck?
  • Robot never enters AT_JUNCTION: The junction detection needs all sensors to see black. Make sure your track has wide intersections (≥19mm tape crossing).
  • Robot gets stuck in a state: Add print(f"State: {state}") inside the loop to see where it's stuck. Check transition conditions.
  • Turn is wrong angle: Recalibrate the gyro before running. Temperature changes affect calibration.

Part 4: Using the StateMachine Helper (10 min)

The picobot library has a helper class that separates state definition from the main loop:

from picobot import Robot, StateMachine

robot = Robot()
robot.imu.calibrate()

sm = StateMachine(robot)

# Define states with actions and optional entry/exit callbacks
sm.add_state("IDLE",
    action=robot.stop,
    on_enter=lambda: robot.set_leds((0, 0, 50)))   # Dim blue

sm.add_state("FOLLOW",
    action=lambda: robot.line_follow_step(80, 30),
    on_enter=lambda: robot.set_leds((0, 255, 0)))   # Green

sm.add_state("JUNCTION",
    action=robot.stop,
    on_enter=lambda: robot.set_leds((0, 0, 255)))   # Blue

sm.add_state("TURN",
    action=lambda: robot.turn_degrees(90),
    on_enter=lambda: robot.set_leds((255, 255, 0)),  # Yellow
    on_exit=lambda: print("Turn complete"))

# Define transitions
sm.add_transition("IDLE", "FOLLOW",
    condition=lambda: True)  # Immediate

sm.add_transition("FOLLOW", "JUNCTION",
    condition=robot.at_junction)

sm.add_transition("JUNCTION", "TURN",
    condition=lambda: sm.time_in_state() > 500)

sm.add_transition("TURN", "FOLLOW",
    condition=lambda: True)  # After turn completes

# Run!
sm.run()

The on_enter callback runs once when entering a state (e.g., set LEDs, print a message). The on_exit callback runs once when leaving. The action runs every loop iteration while in the state. This separation keeps each state self-contained.
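To see what such a helper has to do internally, here is a minimal sketch. It is illustrative only; the real picobot StateMachine may differ in details (it uses MicroPython's ticks_ms rather than time.monotonic, for instance):

```python
import time

class MiniStateMachine:
    """Minimal sketch of a state-machine helper; picobot's real class may differ."""

    def __init__(self):
        self._states = {}        # name -> (action, on_enter, on_exit)
        self._transitions = []   # (src, dst, condition)
        self._state = None
        self._entered = 0.0

    def add_state(self, name, action=None, on_enter=None, on_exit=None):
        self._states[name] = (action, on_enter, on_exit)
        if self._state is None:          # first state added becomes the start state
            self._enter(name)

    def add_transition(self, src, dst, condition):
        self._transitions.append((src, dst, condition))

    def time_in_state(self):
        return (time.monotonic() - self._entered) * 1000   # ms

    def _enter(self, name):
        if self._state is not None:
            on_exit = self._states[self._state][2]
            if on_exit:
                on_exit()                # runs once, on the way out
        self._state = name
        self._entered = time.monotonic()
        on_enter = self._states[name][1]
        if on_enter:
            on_enter()                   # runs once, on the way in

    def step(self):
        """One loop iteration: run the action, then check transitions."""
        action = self._states[self._state][0]
        if action:
            action()                     # runs every iteration
        for src, dst, condition in self._transitions:
            if src == self._state and condition():
                self._enter(dst)
                break
```

A run() method would simply call step() in a while True loop with a short sleep, which is why the main loop in your own code disappears.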


Part 5: The Blocking Problem in Real Loops (15 min)

Before adding obstacle avoidance to our state machine, let's see why it needs careful integration. The obvious approach — just add read_distance() to the loop — breaks everything.

Naive Obstacle Detection + Line Following

from picobot import Robot
import time

robot = Robot()

SPEED = 80
KP = 30
OBSTACLE_DISTANCE = 15  # cm - stop if closer than this

print("Line following with obstacle detection")

try:
    while True:
        # Check for obstacles
        distance = robot.read_distance()

        if distance < OBSTACLE_DISTANCE:
            robot.stop()
            robot.set_leds((255, 0, 0))
            print(f"OBSTACLE at {distance:.0f}cm!")
            time.sleep(0.5)
            continue

        # Follow the line
        error = robot.sensors.line.get_error()

        if error is not None:
            correction = KP * error
            robot.set_motors(int(SPEED + correction), int(SPEED - correction))
        else:
            robot.stop()

        time.sleep(0.02)

except KeyboardInterrupt:
    robot.stop()

✅ Task — Test the Combined Code

Put the robot on the track and run this code. What do you observe?

  • [ ] Line following is smooth (like before)
  • [ ] Line following is jerky/terrible
  • [ ] Robot wobbles back and forth
  • [ ] Robot loses the line easily

The line following broke! But we barely changed anything—just added one line to read the distance. What happened?

Checkpoint — Observed the Problem

You should see the robot wobbling, losing the line, or moving jerkily. This is expected! The ultrasonic sensor is blocking the control loop.

Measure the Impact

When something goes wrong in embedded systems, the first step is to measure:

from picobot import Robot
import time

robot = Robot()

# Test 1: Loop WITHOUT ultrasonic
print("Measuring loop time WITHOUT ultrasonic")
times = []

for i in range(100):
    loop_start = time.ticks_us()

    # Just line following - no ultrasonic
    error = robot.sensors.line.get_error()
    if error is not None:
        correction = 30 * error
        robot.set_motors(int(80 + correction), int(80 - correction))

    loop_time = time.ticks_diff(time.ticks_us(), loop_start)
    times.append(loop_time)

robot.stop()

avg = sum(times) / len(times)
print(f"Loop time: {avg:.0f} µs average")
print(f"That's {1000000/avg:.0f} loops per second")
# Test 2: Loop WITH ultrasonic
from picobot import Robot
import time

robot = Robot()

print("Measuring loop time WITH ultrasonic")
times = []

for i in range(100):
    loop_start = time.ticks_us()

    # THE SLOW PART
    distance = robot.read_distance()

    # Line following
    error = robot.sensors.line.get_error()
    if error is not None:
        correction = 30 * error
        robot.set_motors(int(80 + correction), int(80 - correction))

    loop_time = time.ticks_diff(time.ticks_us(), loop_start)
    times.append(loop_time)

robot.stop()

avg = sum(times) / len(times)
print(f"Loop time: {avg:.0f} µs average")
print(f"That's {1000000/avg:.0f} loops per second")

Compare your results:

| Metric       | Without Ultrasonic | With Ultrasonic |
|--------------|--------------------|-----------------|
| Loop time    | ~500 µs            | ~25,000 µs      |
| Loops/second | ~2000              | ~40             |

The ultrasonic sensor takes ~25 ms per reading — your loop slowed by ~50×. During those 25 ms, line sensors aren't being read, motors aren't being updated, and the robot is essentially driving blind.
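The ~25 ms figure follows directly from the physics: the sensor blocks until the echo returns (or a timeout expires), and sound is slow on CPU timescales:

```python
# Round-trip time for sound at the HC-SR04's ~4 m maximum range
speed_of_sound = 343.0            # m/s in air at ~20 °C
max_range_m = 4.0
round_trip_ms = 2 * max_range_m / speed_of_sound * 1000
print(f"{round_trip_ms:.1f} ms")  # 23.3 ms; trigger overhead and timeout margin bring it to ~25 ms
```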

This is exactly why the state machine's obstacle avoidance (Part 6) uses the non-blocking pattern you learned in Ultrasonic & Timing.


Part 6: Add Obstacle Avoidance (10 min)

Extend the State Machine

Adding obstacle avoidance requires three new states with entry/exit actions that make each state's purpose clear:

from picobot import Robot, StateMachine, Timer

robot = Robot()
robot.imu.calibrate()

# Non-blocking distance check
distance_timer = Timer(100)  # 10 Hz
cached_distance = 100

def check_distance():
    global cached_distance
    if distance_timer.ready():
        cached_distance = robot.read_distance()
    return cached_distance

def obstacle_near():
    return check_distance() < 20

def obstacle_clear():
    return check_distance() > 30

sm = StateMachine(robot)

# States with entry/exit actions
sm.add_state("FOLLOW",
    action=lambda: robot.line_follow_step(80, 30),
    on_enter=lambda: robot.set_leds((0, 255, 0)))   # Green = normal

sm.add_state("STOP_FOR_OBSTACLE",
    action=robot.stop,
    on_enter=lambda: (robot.set_leds((255, 0, 0)),   # Red = danger
                      robot.beep(880, 100),
                      print("Obstacle detected!")))

sm.add_state("BACK_UP",
    action=lambda: robot.set_motors(-60, -60),
    on_enter=lambda: robot.set_leds((255, 165, 0)),  # Orange = reversing
    on_exit=lambda: robot.stop())                     # Stop before turning

sm.add_state("TURN_AWAY",
    action=lambda: robot.set_motors(-60, 60),
    on_enter=lambda: robot.set_leds((255, 255, 0)),  # Yellow = turning
    on_exit=lambda: robot.stop())                     # Stop before resuming

# Transitions
sm.add_transition("FOLLOW", "STOP_FOR_OBSTACLE",
    condition=obstacle_near)

sm.add_transition("STOP_FOR_OBSTACLE", "BACK_UP",
    condition=lambda: sm.time_in_state() > 300)       # Brief pause

sm.add_transition("BACK_UP", "TURN_AWAY",
    condition=lambda: sm.time_in_state() > 500)       # Back up 500ms

sm.add_transition("TURN_AWAY", "FOLLOW",
    condition=lambda: sm.time_in_state() > 400 and obstacle_clear())

sm.run()

Notice how on_enter and on_exit handle one-time setup (LEDs, beeps, stopping motors) while action handles continuous behavior. This prevents the common bug of playing a beep on every loop iteration.
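The two different thresholds (stop below 20 cm, resume only above 30 cm) are also deliberate: they form a hysteresis band so a reading hovering around a single threshold can't make the robot flip-flop between states. A standalone sketch of the same idea:

```python
def make_obstacle_detector(near_cm=20, clear_cm=30):
    """Trips below near_cm, resets only above clear_cm (hysteresis band)."""
    state = {"blocked": False}
    def update(distance):
        if distance < near_cm:
            state["blocked"] = True
        elif distance > clear_cm:
            state["blocked"] = False
        # Between the thresholds: keep the previous answer
        return state["blocked"]
    return update

detect = make_obstacle_detector()
print([detect(d) for d in [40, 25, 19, 25, 29, 31]])
# [False, False, True, True, True, False]: 25 cm means "still clear" on the
# way in, but "still blocked" on the way out
```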

Checkpoint — Obstacle Avoidance Working

Place your hand in front of the robot while it follows the line. It should stop (red LEDs), back up (orange), turn away (yellow), then resume following (green). The console log shows the full state sequence.

Stuck?
  • Robot doesn't detect obstacles: Check cached_distance with a print — is it updating? The Timer(100) must tick.
  • Robot avoids but never finds the line again: Increase the turn time or add a SEARCH state that spins until the line is found.
  • States change too fast to see: Increase the time_in_state() thresholds (e.g., 500 → 1000ms) for debugging.

Part 7: Interrupt-Driven Ultrasonic (20 min)

The Timer(100) pattern from Part 6 works — but it has a hidden cost. Let's measure it.

Measure the Jitter

✅ Task — Loop Time Histogram

Run your Part 6 state machine with loop timing added:

from picobot import Robot, Timer
import time

robot = Robot()

SPEED = 80
KP = 30

distance_timer = Timer(100)  # 10 Hz
cached_distance = 100

def check_distance():
    global cached_distance
    if distance_timer.ready():
        cached_distance = robot.read_distance()
    return cached_distance

# Collect loop times
loop_times = []

try:
    for _ in range(500):
        loop_start = time.ticks_us()

        # Line following
        error = robot.sensors.line.get_error()
        if error is not None:
            correction = KP * error
            robot.set_motors(int(SPEED + correction), int(SPEED - correction))

        # Obstacle check (non-blocking — or is it?)
        check_distance()

        loop_time = time.ticks_diff(time.ticks_us(), loop_start)
        loop_times.append(loop_time)

except KeyboardInterrupt:
    pass

robot.stop()

# Analyze
fast = [t for t in loop_times if t < 2000]
slow = [t for t in loop_times if t >= 2000]

print(f"Total loops: {len(loop_times)}")
print(f"Fast loops (<2ms): {len(fast)}, avg {sum(fast)//max(len(fast),1)} µs")
print(f"Slow loops (>2ms): {len(slow)}, avg {sum(slow)//max(len(slow),1)} µs")
print(f"Worst case: {max(loop_times)} µs")

Your results:

| Metric             | Value      |
|--------------------|------------|
| Fast loops (count) | ______     |
| Fast loop avg      | ~500 µs    |
| Slow loops (count) | ~5         |
| Slow loop avg      | ~25,000 µs |
| Worst case         | ______ µs  |

Most loops are fast (~500 µs), but every 100 ms there's a ~25 ms spike where the control loop freezes. At higher speeds, that spike causes a visible wobble — the robot drives blind for 25 ms every time the ultrasonic fires.

Why This Matters

The non-blocking Timer pattern reduced the frequency of blocking — from every loop to every 100 ms. But it didn't eliminate blocking. Each read_distance() call still freezes the CPU for up to 25 ms.

For a 50 Hz control loop (20 ms period), a 25 ms spike means you miss an entire control cycle. At speed, that's enough to lose the line.

The Idea: Let Hardware Do the Waiting

What if the CPU didn't have to wait for the echo at all?

The HC-SR04 echo pin goes HIGH when the burst leaves and LOW when the echo returns. These are two events — exactly what interrupts are designed for:

Polling (current):
  CPU: [send trigger] [wait... wait... wait... 25ms] [got echo!]
        └── blocked the whole time ──────────────────┘

Interrupt-driven (new):
  CPU: [send trigger] [line follow] [line follow] [line follow]...
  IRQ:                 ↑ echo HIGH                  ↑ echo LOW → calculate!
        └── CPU is free the whole time ─────────────┘

Build It

✅ Task — Interrupt-Based Distance Reader

from machine import Pin, Timer as HWTimer
import time

class UltrasonicIRQ:
    """Non-blocking ultrasonic using echo pin interrupt."""

    def __init__(self, trig_pin=14, echo_pin=15):
        self._trig = Pin(trig_pin, Pin.OUT)
        self._echo = Pin(echo_pin, Pin.IN)
        self._trig.off()

        # State
        self._rise_time = 0      # ticks_us when echo went HIGH
        self._distance = 100.0   # last valid distance (cm)
        self._measuring = False

        # Attach interrupt to echo pin — fires on both edges
        self._echo.irq(
            trigger=Pin.IRQ_RISING | Pin.IRQ_FALLING,
            handler=self._echo_handler
        )

        # Hardware timer triggers measurements at fixed rate
        self._timer = HWTimer(-1)
        self._timer.init(period=100, mode=HWTimer.PERIODIC,
                         callback=self._start_measurement)

    def _start_measurement(self, timer):
        """Send trigger pulse (called by hardware timer)."""
        if self._measuring:
            return  # Previous measurement still in progress

        self._measuring = True
        self._trig.off()
        time.sleep_us(2)
        self._trig.on()
        time.sleep_us(10)
        self._trig.off()

    def _echo_handler(self, pin):
        """ISR: called on echo pin rising AND falling edge."""
        if pin.value() == 1:
            # Rising edge — echo started
            self._rise_time = time.ticks_us()
        else:
            # Falling edge — echo ended, calculate distance
            pulse_us = time.ticks_diff(time.ticks_us(), self._rise_time)
            if 100 < pulse_us < 25000:  # Valid range
                self._distance = pulse_us / 58.0
            self._measuring = False

    @property
    def distance(self):
        """Get last measured distance (instant — no blocking!)."""
        return self._distance

    def stop(self):
        """Clean up timer and IRQ."""
        self._timer.deinit()
        self._echo.irq(handler=None)
ISR Rules

The _echo_handler runs as an interrupt service routine. It must be fast:

  • No print(), no I2C, no UART
  • No memory allocation (no lists, no string formatting)
  • Just read the time and do arithmetic

If the ISR takes too long, it delays everything else — including other interrupts. This is why we only do ticks_diff() and a division, then get out.

→ Interrupts Reference
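One common way to honor these rules is a preallocated ring buffer: the ISR only writes an integer into a fixed-size array, and the main loop does all the slow work (formatting, printing, math on floats). A sketch of the pattern, testable outside MicroPython:

```python
# Preallocated ring buffer shared between ISR and main loop.
BUF_SIZE = 8
buf = [0] * BUF_SIZE   # allocated once, at setup — never inside the ISR
head = 0               # written by the ISR
tail = 0               # written by the main loop

def isr_record(pulse_us):
    """Called from interrupt context: one index write, no allocation."""
    global head
    buf[head % BUF_SIZE] = pulse_us
    head += 1

def main_loop_drain():
    """Called from the main loop: safe to allocate, format, print."""
    global tail
    out = []
    while tail < head:
        out.append(buf[tail % BUF_SIZE] / 58.0)  # µs -> cm
        tail += 1
    return out
```

The ISR side touches only preallocated storage and integers; everything slow happens in main_loop_drain(), called at your leisure from the normal loop. (MicroPython also offers micropython.schedule() to defer work out of an ISR.)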

✅ Task — Compare: Timer Polling vs Interrupt-Driven

Now replace check_distance() with the interrupt version and re-run the same measurement:

from picobot import Robot
import time

robot = Robot()

SPEED = 80
KP = 30

# Replace Timer-based polling with interrupt-driven
us = UltrasonicIRQ()

loop_times = []

try:
    for _ in range(500):
        loop_start = time.ticks_us()

        # Line following
        error = robot.sensors.line.get_error()
        if error is not None:
            correction = KP * error
            robot.set_motors(int(SPEED + correction), int(SPEED - correction))

        # Distance check — INSTANT, no blocking ever
        distance = us.distance

        loop_time = time.ticks_diff(time.ticks_us(), loop_start)
        loop_times.append(loop_time)

except KeyboardInterrupt:
    pass

robot.stop()
us.stop()

fast = [t for t in loop_times if t < 2000]
slow = [t for t in loop_times if t >= 2000]

print(f"Total loops: {len(loop_times)}")
print(f"Fast loops (<2ms): {len(fast)}, avg {sum(fast)//max(len(fast),1)} µs")
print(f"Slow loops (>2ms): {len(slow)}, avg {sum(slow)//max(len(slow),1)} µs")
print(f"Worst case: {max(loop_times)} µs")

Compare your results:

| Metric     | Timer Polling | Interrupt-Driven |
|------------|---------------|------------------|
| Fast loops | ~495 / 500    | ~500 / 500       |
| Slow loops | ~5 / 500      | 0                |
| Worst case | ~25,000 µs    | ~600 µs          |

Checkpoint — No More Spikes

With interrupts, every loop is fast. No more 25 ms spikes. The hardware timer sends the trigger, the echo pin ISR captures the result, and your main loop just reads us.distance — an instant property access.

Why This Matters for Architecture

You've now seen three approaches to the same problem:

| Approach                   | Blocking           | Jitter          | Complexity |
|----------------------------|--------------------|-----------------|------------|
| read_distance() every loop | 25 ms every loop   | Huge            | Simple     |
| Timer(100) + cached        | 25 ms every 100 ms | Periodic spikes | Medium     |
| Interrupt-driven           | 0 ms               | None            | More code  |

This is a recurring pattern in embedded systems: moving work from the main loop into hardware/ISRs makes the main loop more predictable. Professional systems take this further with DMA (hardware copies data without CPU involvement) and RTOS task scheduling.

When to Use Each Pattern

Timer polling is fine when occasional jitter doesn't matter — obstacle detection at low speed, battery monitoring, temperature checks. Most of your robot code will use this pattern.

Interrupts matter when you need deterministic loop timing — high-speed control loops, encoder counting, precise sensor fusion. The extra complexity is only worth it when you can measure the improvement.

The skill is knowing which pattern fits the problem — and being able to measure the difference rather than guessing.


Part 8: Debug Your State Machine (5 min)

State History

# After stopping, print what happened:
sm.print_history()

Output:

State History:
--------------------------------------------------
  IDLE            → FOLLOW          (1523ms)
  FOLLOW          → JUNCTION        (4521ms)
  JUNCTION        → TURN            (502ms)
  TURN            → FOLLOW          (890ms)
  ...

This tells you exactly what the robot did!
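Internally, a history feature like this only needs a list that every transition appends to. A sketch of how it could work (the real picobot helper may record more detail):

```python
# Sketch: record (old_state, new_state, ms_spent_in_old_state) per transition.
history = []

def record_transition(old, new, duration_ms):
    history.append((old, new, duration_ms))

def print_history():
    print("State History:")
    print("-" * 50)
    for old, new, ms in history:
        print(f"  {old:<15} → {new:<15} ({ms}ms)")

record_transition("IDLE", "FOLLOW", 1523)
record_transition("FOLLOW", "JUNCTION", 4521)
print_history()
```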


Part 9: Ultrasonic Scanning — Where Am I? (20 min)

So far you've used ultrasonic for obstacle detection — "is something close?" Now let's use it for spatial awareness — "where am I in this space?"

The Concept: Spin and Scan

If the robot spins 360° and takes distance measurements at regular intervals, it builds a simple distance profile of its surroundings:

          0° (front)
     315°   │   45°
        ╲   │   ╱
         ╲  │  ╱
  270° ────[R]──── 90°
         ╱  │  ╲
        ╱   │   ╲
     225°   │   135°
         180° (back)

Different positions produce different profiles:

| Position         | Distance Profile                                |
|------------------|-------------------------------------------------|
| Center of room   | Similar distances in all directions             |
| Against a wall   | Short distance in one direction, long in others |
| In a corner      | Short distances in two adjacent directions      |
| Near doorway/gap | One direction has much longer distance          |

The Math: Angles and Steps

To scan in N directions around 360°:

\[\text{angle\_step} = \frac{360°}{N}\]

For 8 measurements: step = 45°. For 12 measurements: step = 30°.

The gyroscope tracks cumulative rotation, so you turn by angle_step degrees N times.

Math Connection: Seeing in Polar Coordinates

When you spin and measure distances, you're naturally working in polar coordinates: each reading is \((r, \theta)\) where \(r\) is distance and \(\theta\) is angle. This is how LIDAR sensors on self-driving cars work—they spin and collect thousands of \((r, \theta)\) points per second.

To plot this on a map or combine scans from different positions, you convert to Cartesian: \(x = r \cos\theta\), \(y = r \sin\theta\). This is the foundation of SLAM (Simultaneous Localization and Mapping)—the robot builds a map while figuring out where it is in that map.

📚 Polar Coordinates (Wikipedia) · SLAM Introduction
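
The conversion is only a couple of lines. A small desktop-runnable sketch (the example distances are made up):

```python
import math

def polar_to_cartesian(r, theta_deg):
    """Convert one scan reading (distance r, angle theta in degrees)
    to Cartesian (x, y) coordinates."""
    theta = math.radians(theta_deg)
    return r * math.cos(theta), r * math.sin(theta)

# Convert a full 8-direction scan into map points
distances = [50, 30, 25, 40, 60, 55, 45, 35]   # example readings, cm
points = [polar_to_cartesian(d, i * 45) for i, d in enumerate(distances)]
for (x, y), angle in zip(points, range(0, 360, 45)):
    print(f"{angle:3d}°: ({x:6.1f}, {y:6.1f}) cm")
```

Combining points from scans taken at different positions is exactly where SLAM begins.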

⚡ Task 6 — Basic Distance Scan

Place the robot in a box or enclosed space. This code spins and collects distances:

from picobot import Robot
import time

robot = Robot()
robot.imu.calibrate()

NUM_READINGS = 8  # 8 directions = every 45°
ANGLE_STEP = 360 // NUM_READINGS
TURN_SPEED = 60

distances = []

print(f"Scanning {NUM_READINGS} directions...")
print()

for i in range(NUM_READINGS):
    # Take distance reading
    d = robot.read_distance()
    distances.append(d)

    angle = i * ANGLE_STEP
    print(f"{angle:3d}°: {d:5.1f} cm")

    # Turn to next position (except after last reading)
    if i < NUM_READINGS - 1:
        robot.turn_degrees(ANGLE_STEP)
        time.sleep(0.2)

robot.stop()

print()
print("Scan complete!")
print(f"Distances: {distances}")
print(f"Min: {min(distances):.1f} cm at {distances.index(min(distances)) * ANGLE_STEP}°")
print(f"Max: {max(distances):.1f} cm at {distances.index(max(distances)) * ANGLE_STEP}°")

✅ Record Your Scan

| Angle | Distance (cm) |
| ----- | ------------- |
| 0°    |               |
| 45°   |               |
| 90°   |               |
| 135°  |               |
| 180°  |               |
| 225°  |               |
| 270°  |               |
| 315°  |               |

Questions:

- Which direction has the shortest distance?
- Which has the longest?
- What does this tell you about your position?

Checkpoint — Scan Complete

Your table should show 8 distance values. The shortest distance should correspond to the nearest wall, and the longest to the most open direction. If the robot didn't complete a full rotation, recalibrate the IMU with robot.imu.calibrate().

Position Classification

With the distance profile, you can classify position using simple rules:

def classify_position(distances):
    """
    Classify robot position based on distance scan.

    Returns: 'corner', 'wall', 'center', or 'open'
    """
    min_dist = min(distances)
    max_dist = max(distances)

    # Find how many directions are "close" (< 30cm)
    CLOSE_THRESHOLD = 30
    close_count = sum(1 for d in distances if d < CLOSE_THRESHOLD)

    # Find index of minimum distance
    min_idx = distances.index(min_dist)

    # Check if adjacent readings are also close (corner detection)
    def adjacent_close(idx):
        n = len(distances)
        prev_idx = (idx - 1) % n
        next_idx = (idx + 1) % n
        return (distances[prev_idx] < CLOSE_THRESHOLD or
                distances[next_idx] < CLOSE_THRESHOLD)

    # Classification logic
    if close_count >= 2 and adjacent_close(min_idx):
        return 'corner'
    elif close_count >= 1 and min_dist < 20:
        return 'wall'
    elif min_dist > 40 and (max_dist - min_dist) < 30:
        return 'center'
    else:
        return 'open'

# Use it:
position = classify_position(distances)
print(f"Position: {position}")

The Logic Explained

| Condition                    | Interpretation                          |
| ---------------------------- | --------------------------------------- |
| 2+ adjacent close readings   | Walls on two sides = corner             |
| 1 close reading, others far  | One wall nearby                         |
| All readings similar and far | Roughly equidistant from walls = center |
| Mix of close and far         | Open area with some obstacles           |

These thresholds (30cm, 40cm) depend on your space — adjust based on your test environment.
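
Because classify_position is pure Python with no robot calls, you can sanity-check it on any computer with hand-made scans before trusting it on the robot. This sketch repeats the lab's logic so it runs stand-alone; the synthetic profiles are invented examples:

```python
def classify_position(distances):
    """Classify robot position from an 8-direction distance scan.
    Returns 'corner', 'wall', 'center', or 'open'."""
    min_dist = min(distances)
    max_dist = max(distances)
    CLOSE_THRESHOLD = 30
    close_count = sum(1 for d in distances if d < CLOSE_THRESHOLD)
    min_idx = distances.index(min_dist)

    def adjacent_close(idx):
        # A reading next to the minimum is also close -> two nearby walls
        n = len(distances)
        return (distances[(idx - 1) % n] < CLOSE_THRESHOLD or
                distances[(idx + 1) % n] < CLOSE_THRESHOLD)

    if close_count >= 2 and adjacent_close(min_idx):
        return 'corner'
    elif close_count >= 1 and min_dist < 20:
        return 'wall'
    elif min_dist > 40 and (max_dist - min_dist) < 30:
        return 'center'
    else:
        return 'open'

# Synthetic scans: two adjacent walls, one wall, all far, mixed
print(classify_position([15, 18, 80, 90, 95, 90, 85, 80]))   # corner
print(classify_position([12, 50, 80, 90, 95, 90, 85, 60]))   # wall
print(classify_position([55, 60, 58, 62, 57, 61, 59, 63]))   # center
print(classify_position([25, 70, 120, 80, 60, 50, 45, 90]))  # open
```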

⚡ Task 7 — Test Position Detection

Test the classification in different positions:

| Test Position    | Expected | Actual | Correct? |
| ---------------- | -------- | ------ | -------- |
| Center of box    | center   |        |          |
| Against one wall | wall     |        |          |
| In a corner      | corner   |        |          |
| Near the opening | open     |        |          |

If classification is wrong:

- Adjust CLOSE_THRESHOLD based on your box size
- Check if gyro drift is causing scan misalignment
- Try more readings (12 or 16) for finer resolution

Visualizing the Scan

For debugging, a visual representation helps:

def print_scan_visual(distances):
    """Print a simple radar-style visualization."""
    n = len(distances)
    max_d = max(distances)

    # Normalize distances to 0-10 scale for display
    normalized = [int(10 * d / max_d) for d in distances]

    # Create simple visual
    print("\n  Scan Visualization:")
    print("  " + "─" * 25)

    for i, (d, norm) in enumerate(zip(distances, normalized)):
        angle = i * (360 // n)
        bar = "█" * norm + "░" * (10 - norm)
        # Compass labels assume an 8-direction scan
        direction = ["Front", "F-R", "Right", "B-R",
                     "Back", "B-L", "Left", "F-L"][i % 8]
        print(f"  {angle:3d}° {direction:5s} |{bar}| {d:.0f}cm")

    print("  " + "─" * 25)

print_scan_visual(distances)

Scanning results can feed directly into state machine decisions — for example, a SCAN state that measures surroundings, then transitions to TURN_TO_EXIT if an opening is found, or BACK_UP if cornered.


What You Discovered

Why State Machines?

Software architecture patterns

State machines are one of many patterns for organizing code. Others include event-driven, pub-sub, and layered architectures. → Software Architecture Overview

| Without                   | With                              |
| ------------------------- | --------------------------------- |
| Hidden state in variables | Explicit state name               |
| Nested if-else mess       | Clear state blocks                |
| Hard to debug             | Log shows exactly what happened   |
| Hard to extend            | Add states without touching others |

The Pattern

# Every state machine has:

state = "INITIAL"

while True:
    if state == "STATE_A":
        # Actions while in this state
        do_something()

        # Transitions to other states
        if condition:
            state = "STATE_B"

    elif state == "STATE_B":
        # ...
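
The same pattern can also be written as a dictionary mapping each state name to a handler function that returns the next state (the dictionary-based variant mentioned in this lab's objectives). A robot-free sketch with a fake sensor condition:

```python
def follow(ctx):
    # Fake sensor: a junction appears after 3 loop ticks
    ctx["ticks"] += 1
    return "JUNCTION" if ctx["ticks"] >= 3 else "FOLLOW"

def junction(ctx):
    return "TURN"

def turn(ctx):
    return "DONE"

HANDLERS = {"FOLLOW": follow, "JUNCTION": junction, "TURN": turn}

state = "FOLLOW"
ctx = {"ticks": 0}
trace = [state]
while state != "DONE":
    state = HANDLERS[state](ctx)   # each handler returns the next state
    trace.append(state)

print(" → ".join(trace))
# FOLLOW → FOLLOW → FOLLOW → JUNCTION → TURN → DONE
```

Adding a behavior means adding one function and one dictionary entry, with no changes to existing handlers.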

Recap

Flags hide complexity while states expose it. State machines make behavior readable and testable, and most embedded systems use this pattern for reliability.


Challenges (Optional)

Challenge: Return Home

Add states to make the robot return to the start after delivering.

Challenge: Error Recovery

Add a "LOST" state for when the line is lost. Spin slowly to find it again.

Challenge: Multiple Destinations

Make the robot deliver to A, return, deliver to B, return, repeat.

Challenge: Find the Exit

Modify the scan to find the direction with maximum distance (likely the opening/exit). Have the robot turn to face that direction and drive toward it.

Hint: After scanning, use distances.index(max(distances)) to find the exit direction, then turn to that heading.

Challenge: Room Shape Detection

Can you distinguish a rectangular room from an irregular space? In a rectangle, opposite directions should have similar distances. Calculate the difference between 0°/180° and 90°/270° pairs.

Challenge: Continuous Mapping

Instead of stopping to scan, can you build a map while moving? Take distance readings at regular time intervals while rotating slowly. How does robot speed affect map accuracy?


← Precise Turns | Labs Overview | Next: Inside the Box →