Pointing device gesture
In computing, a pointing device gesture or mouse gesture (or simply gesture) is a way of combining pointing device or finger movements and clicks that the software recognizes as a specific command and responds to accordingly. Gestures can be useful for people who have difficulty typing on a keyboard. For example, in a web browser, a user can navigate to the previously viewed page by pressing the right pointing device button, moving the pointing device briefly to the left, then releasing the button.
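The mechanics behind this kind of recognition can be sketched in a few lines of code. The following Python sketch is toolkit-independent; the handler names, the stroke threshold, and the gesture table are illustrative assumptions, not any browser's actual implementation. It reduces pointer movement, while a gesture button is held, to a string of compass-direction strokes and looks that string up in a table, so a single leftward stroke maps to "back" as in the example above.

```python
# A minimal sketch of stroke-based mouse gesture recognition (assumed names,
# not a real toolkit API). While the gesture button is held, pointer deltas
# are reduced to compass-direction strokes ("L", "R", "U", "D"); on release,
# the resulting direction string is looked up in a gesture table.

MIN_STROKE = 20  # pixels a movement must cover before it counts as a stroke

GESTURES = {
    "L": "history_back",      # press, move left, release -> back
    "R": "history_forward",
    "DR": "close_tab",        # down then right, a common "close" gesture
}

class GestureRecognizer:
    def __init__(self):
        self.active = False
        self.last = (0, 0)
        self.strokes = []

    def button_down(self, x, y):
        """Called when the gesture (e.g. right) button is pressed."""
        self.active, self.last, self.strokes = True, (x, y), []

    def motion(self, x, y):
        """Called on pointer movement; y grows downward (screen coordinates)."""
        if not self.active:
            return
        dx, dy = x - self.last[0], y - self.last[1]
        if abs(dx) < MIN_STROKE and abs(dy) < MIN_STROKE:
            return  # movement too small; keep accumulating
        stroke = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) \
                 else ("D" if dy > 0 else "U")
        if not self.strokes or self.strokes[-1] != stroke:
            self.strokes.append(stroke)  # record each change of direction
        self.last = (x, y)

    def button_up(self):
        """Called on release; returns the matched command, or None."""
        self.active = False
        return GESTURES.get("".join(self.strokes))
```

A real implementation would tune `MIN_STROKE` and the matching tolerance, which is exactly the kind of customization discussed in the section on current use below.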
History
The first[1] pointing device gesture, the "drag", was introduced by Apple to replace a dedicated "move" button on mice shipped with its Macintosh and Lisa computers. Dragging involves holding down a pointing device button while moving the pointing device; the software interprets this as an action distinct from separate clicking and moving behaviors. Unlike most pointing device gestures, it does not involve the tracing of any particular shape. Although the "drag" behavior has been adopted in a huge variety of software packages, few other gestures have been as successful.
Current use
As of 2005, most programs do not support gestures other than the drag operation. Each program that recognizes pointing device gestures does so in its own way: some allow very short movement distances to be recognized as gestures, while others require precise emulation of a certain movement pattern (e.g. a circle). Some implementations allow users to customize these factors.
Some video games have used gestures. For example, in the Myth real-time tactics series, originally created by Bungie, players use them to order battlefield units to face in a desired direction. Another game using gestures is Lionhead's Black & White. The game Arx Fatalis uses mouse gestures for drawing runes in the air to cast spells. Several Nintendo Wii games take advantage of such a system. Ōkami uses a system similar to mouse gestures; the player can enter a drawing mode in which the shape they create (circle, lightning bolt, line, etc.) performs a function in the game such as creating a bomb or changing the time from night to day. Other examples of computer games that use mouse gestures are Die by the Sword and Silver, in which basic mouse gestures map directly to attack moves in real-time combat, along with MX vs. ATV: Reflex, which has a control scheme that implements its titular rider "reflex" system with mouse gestures.[2]
The Opera web browser has recognized gestures since version 5.10 (April 2001), although this feature was disabled by default. Opera also supports mouse chording, which serves a similar function but does not require mouse movement. The first browser to use advanced mouse gestures (in 2002) was Maxthon, in which a highly customizable interface allowed the assignment of almost every action to one of 52 mouse gestures and a few mouse chords. Several mouse gesture extensions are also available for the Mozilla Firefox browser. These extensions use gestures almost identical to Opera's.
Some tools provide mouse gesture support in any application for Microsoft Windows. K Desktop Environment 3 has included universal mouse gesture support since version 3.2.
Windows Aero provides three mouse gestures called Aero Peek, Aero Shake and Aero Snap. See the corresponding article for a description.
Touchpad and touchscreen gestures
Touchscreens of tablet-type devices, such as the iPad, utilize multi-touch technology, with gestures acting as the main form of user interface. Many touchpads, which in laptops replace the traditional mouse, have similar gesture support. For example, a common gesture is to use two fingers in a downwards or upwards motion to scroll the currently active page. The rising popularity of touchscreen interfaces has led to gestures becoming a more standard feature in computing. Windows 7 introduced touchscreen support and touchpad gestures.[3] Its successor, Windows 8, is designed to run on both traditional desktops and mobile devices, and gestures are therefore enabled by default where the hardware allows it.[citation needed]
Related to gestures are touchpad hotspots, where a particular region of the touchpad has additional functionality. For example, a common hotspot feature is the far right side of the touchpad, which will scroll the active page if a finger is dragged down or up it.
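To make the distinction between hotspot scrolling and gesture scrolling concrete, here is a minimal Python sketch of both mechanisms. The coordinate normalization, the zone threshold, and the function names are illustrative assumptions, not any touchpad driver's real API.

```python
# A hedged sketch of the two touchpad scrolling mechanisms described above.
# Coordinates are normalized to [0, 1], with y growing downward; thresholds
# and function names are assumptions for illustration only.

EDGE_ZONE = 0.9  # rightmost 10% of the pad acts as a scroll hotspot

def edge_scroll_delta(x, y, prev_y):
    """Hotspot scrolling: a single finger dragged along the far right edge."""
    if x >= EDGE_ZONE:
        return y - prev_y    # vertical travel becomes scroll distance
    return 0.0

def two_finger_scroll_delta(t1_y, t2_y, prev_t1_y, prev_t2_y):
    """Gesture scrolling: two fingers moving together, anywhere on the pad."""
    d1, d2 = t1_y - prev_t1_y, t2_y - prev_t2_y
    if d1 * d2 > 0:          # both fingers moving in the same direction
        return (d1 + d2) / 2  # average the two deltas
    return 0.0
```

The key design difference is that the hotspot depends on *where* the finger is, while the two-finger gesture depends only on *how* the fingers move.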
Multi-touch touchscreen gestures are predefined motions used to interact with multi-touch devices. An increasing number of products like smartphones, tablets, laptops or desktop computers have functions that are triggered by multi-touch gestures. Common touchscreen gestures include:
- Tap
- Double Tap
- Long Press
- Scroll, Swipe
- Pan
- Flick
- Two Finger Tap
- Two Finger Scroll
- Pinch
- Zoom
- Rotate
Other gestures involving more than two fingers on screen have also been developed, such as Sticky Tools.[4] These techniques are often developed for 3D applications and are not considered standard.
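As an illustration of how the two-finger gestures above are commonly derived, the following Python sketch computes a pinch/zoom scale factor and a rotation angle from the positions of two touch points. The function names and example coordinates are illustrative, not any framework's API.

```python
# A small sketch of pinch and rotate gesture math: zoom is the ratio of the
# current finger separation to the initial one, and rotation is the change in
# the angle of the line joining the two touches.
import math

def pinch_scale(p1, p2, q1, q2):
    """Scale factor from start points (p1, p2) to current points (q1, q2)."""
    return math.dist(q1, q2) / math.dist(p1, p2)

def rotation_angle(p1, p2, q1, q2):
    """Signed rotation, in radians, of the line joining the two touches."""
    a0 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a1 = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
    return a1 - a0

# Example: the fingers move apart (zoom in) and the line between them turns.
scale = pinch_scale((100, 100), (200, 100), (80, 100), (220, 110))   # ~1.40
angle = rotation_angle((100, 100), (200, 100), (80, 100), (220, 110))  # ~0.07 rad
```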
Drawbacks
A major drawback of current gesture interaction solutions is the lack of support for two necessary user interface design principles: feedback and visibility (or affordance). Feedback is required to indicate whether a gesture has been entered correctly, by showing which gesture was recognized and which command was activated; Sensiva approaches this to some extent by providing voice notification. Visibility means giving the user some means of learning the available gestures and the contexts in which they can be used. Both Mouse Gestures for Internet Explorer and ALToolbar Mouse Gestures display colored tracers that show the motion the user is making, providing a visual cue. Pie menus and marking menus have been proposed as solutions to both problems, since they support learning of the available options but can also be operated with quick gestures. Recent versions of Opera (11 and above) use an on-screen pie menu to display simply and instructively which mouse gestures are available and how to activate them, providing both feedback and visibility.[5]
One limitation of gesture interaction is the scope in which gestures can be used: each gesture has only one corresponding command within a given application window.
Holding down buttons while moving the mouse can be awkward and requires some practice, since pressing down increases friction against the horizontal motion. An optical mouse is less susceptible to this than a ball mouse, because its sensor does not rely on mechanical contact to detect movement; a touchpad adds no friction at all, since its buttons can be held down with a thumb while a finger moves across the pad. However, it has also been argued that the muscular tension from holding down buttons can be exploited in user interface design, as it gives constant feedback that the user is in a temporary state, or mode (Buxton, 1995).
References
[edit]- ^ "A Quick History of Drag and Drop – A GoPhore Article". 365Trucking.com. Archived from the original on 2019-07-02. Retrieved 2019-07-02.
- ^ "MX vs. ATV: Reflex PC UK Manual" (PDF). p. 3. Archived (PDF) from the original on 13 February 2022. Retrieved 13 February 2022.
- ^ "Windows 7 Hardware: Touch Finally Arrives". 2009-09-28. Archived from the original on 2012-11-07. Retrieved 2012-11-19.
- ^ Hancock, Mark; ten Cate, Thomas; Carpendale, Sheelagh (2009). "Sticky tools". Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces - ITS '09. New York, New York, USA: ACM Press. p. 133. doi:10.1145/1731903.1731930. ISBN 978-1-60558-733-2.
- ^ "Opera Tutorials - Gestures". Archived from the original on 7 September 2012. Retrieved 3 August 2012.
- Buxton, W. A. (1995). "Chunking and phrasing and the design of human-computer dialogues" in Human-Computer interaction: Toward the Year 2000, R. M. Baecker, J. Grudin, W. A. Buxton, and S. Greenberg, Eds. Morgan Kaufmann Publishers, San Francisco, CA, 494-499.
- The Unknown History of Pen Computing. Archived 2017-04-18 at the Wayback Machine. Contains a history of pen computing, including touch and gesture technology, from approximately 1917 to 1992.
- Annotated bibliography of references to handwriting recognition, gesture user interfaces, and pen computing.
- Welbourn, L. K.; Whitrow, R. J. (1988). "A gesture based text editor". In Proceedings of the Fourth Conference of the British Computer Society on People and Computers IV, D. M. Jones and R. Winder (Eds.). Cambridge University Press, New York, pp. 363–371. ISBN 0-521-36553-8.
- Myers, Brad A. (1998). "A Brief History of Human Computer Interaction Technology". Archived 2019-06-18 at the Wayback Machine. ACM Interactions, Vol. 5, no. 2, March 1998, pp. 44–54.
- Notes on the History of Pen-based Computing (YouTube). Archived 2021-11-10 at the Wayback Machine.