The Gamepad API currently supports three types of inputs: buttons, axes, and touch surfaces. The current proposal would fire events when buttons or axes change, but not when touch inputs change. If this is intentional, consider calling out touch surface inputs as explicitly out of scope in the non-goals section. If touch surfaces are out of scope, then I think the event should be renamed so it's clear it only fires for button and axis changes.
Alternatively, the proposal could be extended to fire rawgamepadinputchange when any input changes, including touch inputs. For touch surfaces, I think it would make sense to add three arrays to the event listing the touchIds of updated touch points: a "changed" array lists all touchId values for updated touch points, a "down" array lists touchId values for new touch points, and an "up" array lists touchId values for touch points that ended. For example:
```js
// Touch ID 0 ends, touch ID 1 moves, touch ID 2 starts.
touchesChanged: [0, 1, 2],
// Touch ID 2 starts.
touchesDown: [2],
// Touch ID 0 ends.
touchesUp: [0]
```
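For context, here's a rough sketch of how a page might consume those arrays in a rawgamepadinputchange handler. The event target and the `e.gamepad` attribute are assumptions on my part (the explainer may dispatch the event somewhere other than window), and touchesDown/touchesChanged/touchesUp are the names proposed above, not anything in the current explainer.

```js
// Hypothetical handler; the event target and array names are assumed, not spec'd.
window.addEventListener('rawgamepadinputchange', (e) => {
  for (const touchId of e.touchesDown) {
    console.log(`gamepad ${e.gamepad.index}: touch ${touchId} started`);
  }
  for (const touchId of e.touchesChanged) {
    console.log(`gamepad ${e.gamepad.index}: touch ${touchId} updated`);
  }
  for (const touchId of e.touchesUp) {
    console.log(`gamepad ${e.gamepad.index}: touch ${touchId} ended`);
  }
});
```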
GamepadButton has a touched member that indicates when the button is detecting touch. It's possible for touched to change while the pressed and value members remain unchanged. If that happens, should rawgamepadinputchange be fired? I think it makes the most sense to fire the event for any change to any Gamepad attribute, including touch.
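For what it's worth, a touched-only change is already observable today with polling, which is another reason it seems odd to exclude it from the event. A minimal polling sketch (not part of the explainer) that flags changes to touched where pressed and value are unchanged:

```js
// Illustrative polling sketch: logs a change to `touched` that is not
// accompanied by a `pressed` or `value` change, the case discussed above.
let previousButtons = [];

function pollGamepad() {
  const pad = navigator.getGamepads()[0];
  if (pad) {
    for (let i = 0; i < pad.buttons.length; i++) {
      const current = pad.buttons[i];
      const prev = previousButtons[i];
      if (prev &&
          current.touched !== prev.touched &&
          current.pressed === prev.pressed &&
          current.value === prev.value) {
        console.log(`button ${i}: touched-only change (touched=${current.touched})`);
      }
    }
    // Snapshot the current button state for comparison on the next frame.
    previousButtons = Array.from(pad.buttons, (b) => ({
      pressed: b.pressed,
      touched: b.touched,
      value: b.value,
    }));
  }
  requestAnimationFrame(pollGamepad);
}

requestAnimationFrame(pollGamepad);
```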
https://github.com/MicrosoftEdge/MSEdgeExplainers/blob/main/GamepadEventDrivenInputAPI/explainer.md