Linux Input Events
Time estimate: ~45 minutes
Prerequisites: SSH Login, DSI Display (for touch) or USB keyboard/mouse
Learning Objectives
By the end of this tutorial you will be able to:
- Explain how Linux delivers hardware input to applications
- Discover input devices and read raw events from /dev/input/eventN
- Decode input_event structs in C (type, code, value)
- Handle touch events using the multi-touch protocol
- Build a swipe gesture detector from raw events
- Understand how SDL2's event system maps to the kernel's input layer
The Input Pipeline
Every keypress, mouse movement, and finger touch follows the same path:

hardware → device driver → kernel input core → evdev → /dev/input/eventN → userspace (SDL2, X11, libinput, your app)
A USB keyboard and a capacitive touchscreen both produce input_event structs with the same format. The only difference is which event types and codes they report. Understanding this single abstraction lets you handle any input device — including ones that don't exist yet.
See also: Input Subsystem Reference for the full specification.
1. Discover Input Devices
Every input device gets a /dev/input/eventN file. The kernel creates these automatically when a driver registers an input device.
List them with:

cat /proc/bus/input/devices

You'll see entries like:
I: Bus=0018 Vendor=0000 Product=0000 Version=0000
N: Name="Goodix Capacitive TouchScreen"
P: Phys=input/ts
S: Sysfs=/devices/platform/soc/fe804000.i2c/i2c-1/1-0014/input/input2
U: Uniq=
H: Handlers=event2
B: EV=b
B: KEY=400 0 0 0 0 0
B: ABS=2618000 0
The key fields:
| Field | Meaning |
|---|---|
| N: Name= | Human-readable device name |
| H: Handlers=event2 | Device file is /dev/input/event2 |
| B: EV=b | Bitmask of supported event types |
| B: ABS= | Bitmask of supported absolute axes (touchscreens) |
| B: KEY= | Bitmask of supported keys |
Tip
Shortcut: ls -la /dev/input/by-id/ and /dev/input/by-path/ provide stable symlinks that don't change when devices are plugged in a different order.
2. Read Events with evtest
evtest is the standard tool for watching raw input events:
sudo evtest

Select your device (e.g., the touchscreen), then touch the screen or press keys. You'll see:
Event: time 1710332456.789012, type 3 (EV_ABS), code 53 (ABS_MT_POSITION_X), value 412
Event: time 1710332456.789012, type 3 (EV_ABS), code 54 (ABS_MT_POSITION_Y), value 287
Event: time 1710332456.789012, type 3 (EV_ABS), code 57 (ABS_MT_TRACKING_ID), value 45
Event: time 1710332456.789012, type 0 (EV_SYN), code 0 (SYN_REPORT), value 0
Checkpoint
Run evtest and touch the screen. You should see ABS_MT_POSITION_X and ABS_MT_POSITION_Y events with coordinates changing as you move your finger.
3. The input_event Struct
Every event from /dev/input/eventN is exactly this 24-byte struct (on 64-bit systems):
#include <linux/input.h>
struct input_event {
struct timeval time; /* when the event happened */
__u16 type; /* what kind of event */
__u16 code; /* which axis, key, or button */
__s32 value; /* the value (position, pressed/released, etc.) */
};
The three fields that matter:
| Field | Example | Meaning |
|---|---|---|
| type | EV_KEY (1) | Key/button event |
| | EV_REL (2) | Relative axis (mouse movement) |
| | EV_ABS (3) | Absolute axis (touchscreen position) |
| | EV_SYN (0) | Synchronization (end of event group) |
| code | KEY_A (30) | Which key |
| | ABS_MT_POSITION_X (53) | Touch X coordinate |
| | REL_X (0) | Mouse X delta |
| value | 1 / 0 | Pressed / released (for keys) |
| | 0–799 | Pixel coordinate (for touch) |
| | -5 | Mouse moved 5 pixels left |
Events come in groups terminated by EV_SYN / SYN_REPORT. A single touch produces multiple events (X, Y, tracking ID, pressure) before the SYN_REPORT marks "all fields for this instant are now reported."
4. Read Raw Events in C
Create input_reader.c:
/* input_reader.c — Read raw input events from /dev/input/eventN */
#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>
static const char *ev_type_name(int type)
{
switch (type) {
case EV_SYN: return "SYN";
case EV_KEY: return "KEY";
case EV_REL: return "REL";
case EV_ABS: return "ABS";
default: return "???";
}
}
int main(int argc, char *argv[])
{
if (argc < 2) {
fprintf(stderr, "Usage: %s /dev/input/eventN\n", argv[0]);
return 1;
}
int fd = open(argv[1], O_RDONLY);
if (fd < 0) {
perror("open");
return 1;
}
printf("Reading events from %s (Ctrl+C to stop)...\n\n", argv[1]);
struct input_event ev;
while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
printf("[%ld.%06ld] type=%s(%d) code=%d value=%d\n",
ev.time.tv_sec, ev.time.tv_usec,
ev_type_name(ev.type), ev.type,
ev.code, ev.value);
}
close(fd);
return 0;
}
Build and run:
gcc -Wall -O2 -o input_reader input_reader.c
sudo ./input_reader /dev/input/event2 # adjust device number
Checkpoint
Touch the screen and see raw events printed. Note how each touch produces a burst of ABS events followed by a single SYN event.
5. Touch Event Protocol
Linux uses multi-touch protocol type B for modern touchscreens. Each finger gets a slot and a tracking ID:
ABS_MT_SLOT → which finger slot (0, 1, 2, ...)
ABS_MT_TRACKING_ID → unique ID for this contact (new touch = new ID, lift = -1)
ABS_MT_POSITION_X → X coordinate
ABS_MT_POSITION_Y → Y coordinate
SYN_REPORT → end of frame
Finger down: new tracking ID assigned to a slot
Finger move: position updates in the same slot
Finger up: tracking ID set to -1
# Finger touches at (200, 300):
ABS_MT_SLOT value=0 ← slot 0
ABS_MT_TRACKING_ID value=45 ← new contact
ABS_MT_POSITION_X value=200
ABS_MT_POSITION_Y value=300
SYN_REPORT
# Finger moves to (250, 310):
ABS_MT_POSITION_X value=250 ← slot 0 still active
ABS_MT_POSITION_Y value=310
SYN_REPORT
# Finger lifts:
ABS_MT_TRACKING_ID value=-1 ← contact ended
SYN_REPORT
Info
The kernel only sends changed fields. If only X changes, you won't see a Y event. Your code must remember the last known Y value for that slot.
6. Build a Touch Event Handler
Create touch_monitor.c — a program that tracks multi-touch contacts and prints their state:
/* touch_monitor.c — Track multi-touch contacts from raw events */
#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>
#define MAX_SLOTS 10
typedef struct {
int active; /* tracking_id != -1 */
int x, y;
int id; /* tracking ID */
} touch_slot_t;
int main(int argc, char *argv[])
{
if (argc < 2) {
fprintf(stderr, "Usage: %s /dev/input/eventN\n", argv[0]);
return 1;
}
int fd = open(argv[1], O_RDONLY);
if (fd < 0) { perror("open"); return 1; }
touch_slot_t slots[MAX_SLOTS] = {0};
int cur_slot = 0;
printf("Touch monitor — touch the screen...\n\n");
struct input_event ev;
while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
switch (ev.type) {
case EV_ABS:
switch (ev.code) {
case ABS_MT_SLOT:
cur_slot = ev.value;
if (cur_slot >= MAX_SLOTS) cur_slot = 0;
break;
case ABS_MT_TRACKING_ID:
if (ev.value == -1) {
printf(" Slot %d: LIFTED\n", cur_slot);
slots[cur_slot].active = 0;
} else {
slots[cur_slot].active = 1;
slots[cur_slot].id = ev.value;
}
break;
case ABS_MT_POSITION_X:
slots[cur_slot].x = ev.value;
break;
case ABS_MT_POSITION_Y:
slots[cur_slot].y = ev.value;
break;
}
break;
case EV_SYN:
if (ev.code == SYN_REPORT) {
/* Print all active contacts */
int n = 0;
for (int i = 0; i < MAX_SLOTS; i++) {
if (slots[i].active) {
printf(" Slot %d: (%4d, %4d) id=%d\n",
i, slots[i].x, slots[i].y, slots[i].id);
n++;
}
}
if (n > 0) printf(" --- %d finger(s) ---\n", n);
}
break;
}
}
close(fd);
return 0;
}
Checkpoint
Touch with one finger, see position updates. Touch with two fingers — you'll see two slots with independent coordinates. Lift one finger — that slot shows "LIFTED" while the other continues tracking.
7. Detect a Swipe Gesture
Now let's build something useful: a swipe-up detector that could be used as an "exit" gesture (like we use in the audio visualizer and other SDL2 apps). The logic:
- Finger down in bottom 10% of screen → start tracking
- Finger moves past the middle of the screen → swipe detected
- Finger up → reset
Create swipe_detect.c:
/* swipe_detect.c — Detect swipe-up from bottom edge */
#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>
#include <linux/input-event-codes.h>
#include <sys/ioctl.h>
int main(int argc, char *argv[])
{
if (argc < 2) {
fprintf(stderr, "Usage: %s /dev/input/eventN\n", argv[0]);
return 1;
}
int fd = open(argv[1], O_RDONLY);
if (fd < 0) { perror("open"); return 1; }
/* Query actual axis range from the device */
struct input_absinfo abs_y;
if (ioctl(fd, EVIOCGABS(ABS_MT_POSITION_Y), &abs_y) < 0) {
perror("ioctl EVIOCGABS");
close(fd);
return 1;
}
int y_max = abs_y.maximum;
printf("Y axis range: 0 — %d\n", y_max);
int threshold_start = y_max * 90 / 100; /* bottom 10% */
int threshold_end = y_max * 50 / 100; /* above middle */
printf("Swipe zone: start > %d, end < %d\n", threshold_start, threshold_end);
printf("Swipe up from the bottom edge...\n\n");
int tracking = 0; /* actively tracking a swipe candidate */
int cur_slot = 0;
int track_slot = -1; /* which slot we're following */
int cur_y = 0;
struct input_event ev;
while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
if (ev.type == EV_ABS) {
switch (ev.code) {
case ABS_MT_SLOT:
cur_slot = ev.value;
break;
case ABS_MT_TRACKING_ID:
if (ev.value == -1 && cur_slot == track_slot) {
/* Finger lifted — cancel */
tracking = 0;
track_slot = -1;
}
break;
case ABS_MT_POSITION_Y:
if (cur_slot == track_slot || !tracking)
cur_y = ev.value;
/* Start: finger down in bottom zone */
if (!tracking && cur_y > threshold_start) {
tracking = 1;
track_slot = cur_slot;
printf(" Swipe started (y=%d)\n", cur_y);
}
/* Complete: finger moved above middle */
if (tracking && cur_slot == track_slot &&
cur_y < threshold_end) {
printf(" >>> SWIPE UP DETECTED! <<<\n\n");
tracking = 0;
track_slot = -1;
}
break;
}
}
}
close(fd);
return 0;
}
Checkpoint
Start a touch at the very bottom of the screen and drag upward. When your finger passes the middle, you should see "SWIPE UP DETECTED!" This is the same gesture used by the Qt app launcher's child apps to return to the home screen.
8. How SDL2 Wraps the Input Layer
SDL2 reads from /dev/input/eventN internally and translates kernel events into its own event types:
| Kernel Event | SDL2 Event | Notes |
|---|---|---|
| EV_KEY + KEY_A | SDL_KEYDOWN / SDL_KEYUP | Key press/release |
| EV_REL + REL_X/Y | SDL_MOUSEMOTION | Mouse delta |
| EV_KEY + BTN_LEFT | SDL_MOUSEBUTTONDOWN | Mouse click |
| EV_ABS + ABS_MT_* | SDL_FINGERDOWN / SDL_FINGERMOTION / SDL_FINGERUP | Touch |
| EV_ABS + ABS_MT_* | SDL_MOUSEMOTION (also) | SDL2 generates mouse events from touch too |
Key differences from raw events:
- Normalized coordinates — SDL2 touch coordinates are 0.0–1.0 (float), not raw pixel values
- No slots — SDL2 uses fingerId instead of slot numbers
- Automatic SYN batching — you never see EV_SYN; events arrive pre-batched
- Touch-to-mouse emulation — SDL2 sends mouse events for the first touch, so apps that only handle mouse input still work on touchscreens
This is the swipe gesture from our SDL2 apps, using SDL2's abstraction:
/* SDL2 swipe-up exit — same logic, higher-level API */
int swipe_active = 0;
while (SDL_PollEvent(&ev)) {
/* Finger touches bottom 10% of screen */
if (ev.type == SDL_FINGERDOWN && ev.tfinger.y > 0.9f)
swipe_active = 1;
/* Finger released — cancel */
if (ev.type == SDL_FINGERUP)
swipe_active = 0;
/* Finger dragged past middle — exit */
if (ev.type == SDL_FINGERMOTION && swipe_active &&
ev.tfinger.y < 0.5f)
running = 0;
}
Compare this with the raw evdev version from Section 7 — the logic is identical, but SDL2 handles device discovery, coordinate normalization, and event batching for you.
When to Use Raw evdev vs SDL2
| Use raw evdev when... | Use SDL2 when... |
|---|---|
| No display (headless input processing) | Building a graphical application |
| Custom input device (barcode scanner, RFID) | Standard keyboard/mouse/touch |
| Need exclusive device access (EVIOCGRAB) | Multi-platform compatibility matters |
| Implementing a virtual input device (uinput) | Want touch-to-mouse emulation for free |
| Kernel driver development or debugging | Rapid prototyping |
9. Practical Exercises
Tip
Challenge 1: Key Logger — Modify input_reader.c to only print key events (EV_KEY) and translate key codes to characters using linux/input-event-codes.h. Count words per minute.
Tip
Challenge 2: Touch Heatmap — Modify touch_monitor.c to accumulate touch positions into a 2D grid (e.g., 80×48). Print an ASCII heatmap showing which screen regions are touched most. Which corners are hardest to reach?
Tip
Challenge 3: Gesture Library — Extend swipe_detect.c to detect four swipe directions (up, down, left, right) and pinch-to-zoom (two fingers moving closer or apart). Print the detected gesture name.
Tip
Challenge 4: Measure Touch Latency — Timestamp when ABS_MT_TRACKING_ID arrives (kernel timestamp) vs when you receive it in userspace (clock_gettime). What's the kernel-to-userspace delay? How does it compare to the total pipeline latency shown by the audio visualizer?
What Just Happened?
You explored the Linux input subsystem from bottom to top:
- Discovery — found devices in /proc/bus/input/devices
- Raw events — read struct input_event directly from device files
- Multi-touch protocol — tracked finger slots, positions, and lift events
- Gesture detection — built a swipe detector from raw events
- SDL2 mapping — understood how SDL2 translates kernel events to its API
The same read() + struct input_event pattern works for any input device Linux supports. Whether you're reading a touchscreen, a USB gamepad, a barcode scanner, or a rotary encoder — the kernel presents them all through the same interface.
Next: SDL2 Touch Paint — use SDL2's input abstraction to build an interactive drawing app and measure touch responsiveness across display interfaces.
Reference: Linux Input Subsystem — complete specification including uinput virtual devices, exclusive access, and kernel driver internals.