Assumption: gestures are performed by clients (via raw events) and also by a global gesture recognizer on the compositor's side.

protocol/wayland.xml:

compositor/evdev.c:
- basic filtering to convert all input device events into a unique stream, i.e. the "type B" slot protocol; straightforwardly accomplished with the mtdev library
- single-touch and multi-touch work simultaneously using the same driver
- at initialization, we have to register whether the device is direct (wl_input_touch) or indirect (wl_input_pointer); we have the touchpad flag now for that. For wl_input_pointer we create a sprite.
- forward ABS_MT_* events only to the compositor, using notify_touch()
- in parallel with the touch stream there is also the raw device stream, which forwards raw events from the input device all the way down to clients. It is a totally different path.

compositor/compositor.c:
- each touch point is recorded separately here (using ABS_MT_SLOT), and the compositor maintains this kind of information (struct touch_point)
- notify_touch() pretty much takes care of the touch life cycle, or "frame": everything that happens from the moment one or more touches go down until they all get released
- in practice, it creates a touch point or finds an existing one, processes it according to the event type and eventually creates the event to be delivered to client surfaces
- namely, the events are touch_down, touch_motion and touch_up; touch_frame might not be required thanks to ABS_MT_SLOT; touch_cancel might not be required if we are forwarding raw device events as well
- compositor-side gesture recognition will happen at this point as well, and an axis event (zoom, pinch, etc.) will be created for delivery too
- for each touch_point, the surface is picked and the events just generated are delivered
- an (implicit) grab may happen as well: an implicit grab starts depending on BTN_TOUCH (and probably ABS_PRESSURE as well), and multiple grabs may happen at the same time on a single surface
- the grab triggers on the first touch event, and all following touch events go to that surface, until all touch points get released

clients/*-mt.c:
- extend wl_input_device_listener with the corresponding multitouch events
- the touch_{down, move, up} hooks could all centralize the events in one place and eventually trigger the (client-side) gesture recognizer
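The compositor-side life cycle above (per-slot touch points, notify_touch() driving the frame, the implicit grab lasting until all touches are released) can be sketched roughly as follows. This is a minimal illustration, not the real Weston code: touch_point, notify_touch and the MAX_SLOTS bound are all hypothetical names for this sketch.

```c
/* Hypothetical sketch of the compositor-side touch life cycle.
 * Names (touch_point, notify_touch, MAX_SLOTS) are illustrative,
 * not the actual Wayland/Weston API. */
#include <stdbool.h>

#define MAX_SLOTS 16

enum touch_type { TOUCH_DOWN, TOUCH_MOTION, TOUCH_UP };

struct touch_point {
	bool active;	/* this slot currently tracks a finger */
	int x, y;	/* last reported position */
};

static struct touch_point points[MAX_SLOTS];	/* indexed by ABS_MT_SLOT */
static int active_count;			/* touches currently down */

/* One slot-indexed event enters the compositor: create the touch point
 * on down, update it on motion, release it on up.  Returns the number
 * of still-active touch points, so the caller can tell when the frame
 * (and with it the implicit grab) is over. */
static int notify_touch(int slot, enum touch_type type, int x, int y)
{
	struct touch_point *tp = &points[slot];

	switch (type) {
	case TOUCH_DOWN:
		if (!tp->active) {
			tp->active = true;
			active_count++;
		}
		tp->x = x;
		tp->y = y;
		break;
	case TOUCH_MOTION:
		tp->x = x;
		tp->y = y;
		break;
	case TOUCH_UP:
		if (tp->active) {
			tp->active = false;
			active_count--;
		}
		break;
	}

	/* here the real compositor would pick the grabbed surface, run
	 * the gesture recognizer, and deliver touch_down/motion/up
	 * events to the client */
	return active_count;
}
```

Note how the return value models the grab: the first TOUCH_DOWN takes it from 0 to 1 (grab starts), and only when every slot has seen TOUCH_UP does it drop back to 0 (grab ends).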
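On the client side, the idea that the touch_{down, move, up} hooks all funnel into one place feeding a gesture recognizer could look like the sketch below. Every name here (gesture_recognizer_feed, the mt_event enum, is_pinch_candidate, the hook signatures) is an assumption for illustration; the real listener signatures would come from the extended wl_input_device_listener.

```c
/* Hypothetical client-side sketch: the three multitouch hooks that
 * would extend wl_input_device_listener all forward to one central
 * dispatcher, which feeds a toy (client-side) gesture recognizer.
 * All names are illustrative, not the real protocol API. */
#include <stdbool.h>

enum mt_event { MT_DOWN, MT_MOVE, MT_UP };

static int active_fingers;	/* toy recognizer state: finger count */

/* the single place where all touch events converge */
static void gesture_recognizer_feed(enum mt_event ev, int id, int x, int y)
{
	(void)id; (void)x; (void)y;

	if (ev == MT_DOWN)
		active_fingers++;
	else if (ev == MT_UP)
		active_fingers--;

	/* a real recognizer would track per-id positions here and emit
	 * pinch/zoom/rotate gestures from the motion deltas */
}

/* the listener hooks themselves stay thin and just forward */
static void touch_handle_down(void *data, int id, int x, int y)
{
	(void)data;
	gesture_recognizer_feed(MT_DOWN, id, x, y);
}

static void touch_handle_move(void *data, int id, int x, int y)
{
	(void)data;
	gesture_recognizer_feed(MT_MOVE, id, x, y);
}

static void touch_handle_up(void *data, int id)
{
	(void)data;
	gesture_recognizer_feed(MT_UP, id, 0, 0);
}

/* e.g. exactly two active fingers make a pinch/zoom candidate */
static bool is_pinch_candidate(void)
{
	return active_fingers == 2;
}
```

Centralizing the events this way means the gesture recognizer never needs to know which hook fired, only the normalized event stream.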