Create TouchEvent as a subclass of InputEvent.
Generate the events as a result of touchscreen actions. Each event carries a touch point (representing one pressed finger) and references to all other touch points. This design allows each finger to be handled and consumed separately, while still making it possible to encapsulate handling of more complex multi-touch gestures in which not all touch points need to be over the handling node.
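As a sketch of the per-finger model (assuming the handler properties and touch point accessors described later in this section), a handler can consume the event for the finger over its node without affecting the sibling events created for the other touch points in the same set:

    import javafx.scene.Node;
    import javafx.scene.input.TouchPoint;

    void install(Node node) {
        node.setOnTouchMoved(event -> {
            // The "main" touch point of this event is the finger over this node.
            TouchPoint tp = event.getTouchPoint();
            System.out.println("finger " + tp.getId() + " at "
                    + tp.getSceneX() + "," + tp.getSceneY());
            // Consuming stops only this finger's event; the sibling events
            // created for the other touch points in the set are unaffected.
            event.consume();
        });
    }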
At any moment of a multi-touch action, there is a set of touch points. For each of those touch points, create one touch event, and mark each event of the set with a common eventSetId number. This number is sequential and unique within the scope of a gesture (when all fingers are released, the counter is reset). All touch events from the set carry the same list of touch points, each of them carrying a different one as its "main" touch point.
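Because every event of a set carries the same id and the same touch point list, a handler that receives several events of one set can react to just one of them. A minimal sketch, in which the lastSetId field and the processGesture helper are hypothetical:

    import java.util.List;
    import javafx.scene.Node;
    import javafx.scene.input.TouchPoint;

    class OncePerSetHandler {
        private int lastSetId;  // hypothetical bookkeeping field

        void install(Node node) {
            node.setOnTouchMoved(event -> {
                if (event.getEventSetId() == lastSetId) {
                    return;  // another event of the same set, already handled
                }
                lastSetId = event.getEventSetId();
                processGesture(event.getTouchPoints());
            });
        }

        void processGesture(List<TouchPoint> points) { /* hypothetical */ }
    }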
Provide event types TOUCH_PRESSED, TOUCH_MOVED, TOUCH_STATIONARY and TOUCH_RELEASED, with the usual convenience handler properties (setOnTouchPressed, setOnTouchMoved, etc.).
Provide the following fields: touchCount (the current number of touch points), modifiers, the earlier mentioned eventSetId, touchPoint (this event's main touch point) and touchPoints (the list of all touch points, sorted in the order they were pressed).
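Putting the types and fields together, a handler might read them as follows. This is a sketch assuming the usual JavaFX getter naming (e.g. getTouchCount() for touchCount and isShiftDown() for one of the modifiers), not final signatures:

    import javafx.scene.Node;
    import javafx.scene.input.TouchPoint;

    void install(Node node) {
        node.setOnTouchPressed(event -> {
            System.out.println("set " + event.getEventSetId()
                    + ", fingers " + event.getTouchCount()
                    + ", shift " + event.isShiftDown());
            TouchPoint main = event.getTouchPoint();
            // All touch points, sorted in the order they were pressed.
            for (TouchPoint tp : event.getTouchPoints()) {
                System.out.println("  point " + tp.getId()
                        + (tp == main ? " (main)" : ""));
            }
        });
    }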
For each touch point provide: state (PRESSED, MOVED, STATIONARY or RELEASED, corresponding to the event's type), coordinates, id and target (the node to which this touch point's event is delivered). The id is a sequential number unique within the scope of a gesture: the first touched finger gets id=1, each subsequently touched finger gets the next ordinal number, and when all fingers are released the counter is reset. The touch point will also provide the method belongsTo(node), which tests whether this touch point is delivered to the given target node (including bubbling); this is useful for examining the other touch points in an event.
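For example, belongsTo(node) allows a node to recognize a gesture in which a second finger rests on a different node. In this sketch the anchor parameter and the activateSecondaryAction helper are hypothetical:

    import javafx.scene.Node;
    import javafx.scene.input.TouchPoint;

    void install(Node node, Node anchor) {
        node.setOnTouchPressed(event -> {
            for (TouchPoint tp : event.getTouchPoints()) {
                // Skip this event's own point; check whether any other finger
                // is being delivered to the anchor node (including bubbling).
                if (tp != event.getTouchPoint() && tp.belongsTo(anchor)) {
                    activateSecondaryAction(node);  // hypothetical helper
                    event.consume();
                    break;
                }
            }
        });
    }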
By default, each touch point will be delivered to a single node during its whole trajectory, similarly to mouse dragging: the node is picked at the moment the touch point is pressed and used as the target no matter how far the touch point moves. This behavior is great for dragging nodes (and is consistent with mouse events), but sometimes different behavior is needed; for example, when sliding a finger over a list that always selects the row under the finger, the events need to be delivered to the node currently picked under it. In this case the behavior can be modified using a grabbing API. The touch point will provide the methods ungrab() (from this call on, the touch point is always delivered to the node picked under it), grab(node) (from this call on, the touch point is always delivered to the given node) and grab() (from this call on, the touch point is grabbed by the current source, i.e. the node whose handler calls it). The default behavior means that each newly pressed touch point is automatically grabbed by the node picked for it.
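A sketch of the list-sliding case using the grabbing API; the select helper is hypothetical, and each list cell is assumed to install these handlers:

    import javafx.scene.Node;

    void installSlideSelection(Node listCell) {
        listCell.setOnTouchPressed(event -> {
            // Release the default grab so that subsequent events for this
            // touch point follow the finger instead of sticking to this cell.
            event.getTouchPoint().ungrab();
            select(listCell);  // hypothetical selection helper
            event.consume();
        });
        // With the point ungrabbed, each cell the finger slides over receives
        // the moved events and can select itself.
        listCell.setOnTouchMoved(event -> {
            select(listCell);
            event.consume();
        });
    }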
Touch events need to be automatically translated to mouse events, so when the user touches the screen and moves a finger, the application gets not only the appropriate touch events but also mouse-drag events; a tap results in a mouse click, and so on. Utilize the platform's translation from touch events to mouse events where present; implement it at the Glass level for the other platforms.
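Since every touch action also produces these synthesized mouse events, a node that already handles touch input may want to ignore them. A sketch, assuming the synthesized events are flagged on MouseEvent (such as an isSynthesized() query) and a hypothetical handleRealMouseClick helper:

    import javafx.scene.Node;

    void install(Node node) {
        node.setOnMouseClicked(event -> {
            if (event.isSynthesized()) {
                return;  // this click was translated from a touch tap;
                         // the touch handlers already covered it
            }
            handleRealMouseClick(node);  // hypothetical helper
        });
    }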