Apple was recently granted a patent by the US Patent and Trademark Office (USPTO) for a system describing a virtual input tool, such as a trackpad. The tool interacts with on-screen objects, allowing users to navigate through programs and documents more easily.
The “Virtual input tool” patent is a simple but effective tweak to the UI found in modern computer operating systems. It describes a system that gives users direct control over on-screen movement. Unlike a classic input device such as a mouse, which translates data from the hardware into the movement of a cursor, the patented system offers direct contact with on-screen elements.
The patent first describes a physical input device, such as a multitouch trackpad similar to the one found in Apple’s MacBook Pro. The trackpad is displayed on screen, and its on-screen representation is mapped to the coordinates of the actual device. Along with the virtual trackpad, the patent also shows interactive elements that can be selected and manipulated directly, without the help of a cursor. All of these elements are directly connected to the physical device; according to the patent filing, they can be application windows, directories, or even media files.
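The patent contains no code, but the coordinate mapping it describes can be sketched roughly as follows. The function name, the linear scaling, and the rectangle format are all assumptions made for illustration, not details from the filing.

```python
# Hypothetical sketch of the patent's coordinate mapping: a touch on the
# physical trackpad is scaled into the rectangle where the virtual
# trackpad is drawn on screen. All names here are illustrative.

def map_touch(touch_x, touch_y, pad_w, pad_h, rect):
    """Scale a physical touch (touch_x, touch_y) on a pad of size
    pad_w x pad_h into the on-screen rectangle rect = (x, y, w, h)."""
    rx, ry, rw, rh = rect
    return (rx + touch_x / pad_w * rw,
            ry + touch_y / pad_h * rh)

# A touch at the centre of a 100x60 pad lands at the centre of a
# virtual trackpad drawn at (200, 500) with size 400x240.
print(map_touch(50, 30, 100, 60, (200, 500, 400, 240)))  # (400.0, 620.0)
```

With a mapping like this, the same physical gesture always lands on the same on-screen spot, which is what lets interactive elements be hit directly rather than steered to with a cursor.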
The patent summary describes a “virtual representation of an input device that can be a two-dimensional area that increases an amount of data,” such as a virtual representation of objects presented at a particular time. With this method, the entire trackpad can be used to navigate the environment. At the same time, the trackpad surface can act as a multitouch system supporting gestures such as a one-finger tap-and-hold.
The virtual input tool combines six different engines: an identification engine that detects the physical device, a render engine that presents the virtual interface tools, a mapping engine that connects the physical coordinates to the virtual ones, a preference engine for customizing the virtual device, a presentation engine that displays the interactive objects, and an interactivity engine that translates the on-screen interactions.
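One way to picture how these six engines could fit together is as a simple composition, where each engine is a pluggable component. This is purely a reading of the patent's list, not Apple's implementation; every class, parameter, and method name below is an assumption.

```python
# Illustrative composition of the six engines the patent names.
# None of these names come from the patent text itself.

class VirtualInputTool:
    """Hypothetical pipeline wiring the six engines together."""

    def __init__(self, identify, render, map_coords,
                 preferences, present, interact):
        self.identify = identify        # identification engine: detect the physical device
        self.render = render            # render engine: draw the virtual trackpad
        self.map_coords = map_coords    # mapping engine: physical -> virtual coordinates
        self.preferences = preferences  # preference engine: user customization
        self.present = present          # presentation engine: show interactive objects
        self.interact = interact        # interactivity engine: act on on-screen objects

    def handle_touch(self, x, y):
        # Map the physical touch into virtual coordinates, then let the
        # interactivity engine act on whatever object sits at that point.
        return self.interact(self.map_coords(x, y))

tool = VirtualInputTool(
    identify=lambda: "multitouch trackpad",
    render=lambda: None,
    map_coords=lambda x, y: (x * 4, y * 4),   # toy 4x scale mapping
    preferences={},
    present=lambda: None,
    interact=lambda pos: f"activate object at {pos}",
)
print(tool.handle_touch(10, 5))  # activate object at (40, 20)
```

Keeping each engine as a separate component mirrors the patent's division of labour: the mapping engine knows nothing about drawing, and the interactivity engine only sees already-mapped coordinates.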
As for the interactivity system, the engine receives the signal the user sends to the trackpad. From this information, the tapped area is mapped onto the virtual trackpad, and the corresponding interactive object is activated. At the same time, the interactivity engine also recognizes swipe gestures, which can be used to control the size and operation of the virtual display. Once the user lifts their finger, however, the interaction is cancelled.
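The three rules above (tap activates, swipe resizes, lifting the finger cancels) can be sketched as a small event handler. The event shape and state fields are invented for this sketch and do not appear in the patent.

```python
# Sketch (not Apple's code) of the interaction rules described above:
# a tap activates the object under the mapped point, a swipe resizes
# the virtual display, and lifting the finger cancels the interaction.

def handle_event(event, state):
    """event is a dict with a 'type' key; state tracks the virtual display."""
    if event["type"] == "tap":
        state["active_object"] = event["target"]    # activate the tapped object
    elif event["type"] == "swipe":
        state["scale"] *= event.get("factor", 1.0)  # resize the virtual display
    elif event["type"] == "lift":
        state["active_object"] = None               # finger up: interaction ends
    return state

state = {"active_object": None, "scale": 1.0}
state = handle_event({"type": "tap", "target": "window"}, state)
print(state["active_object"])  # window
state = handle_event({"type": "swipe", "factor": 2.0}, state)
state = handle_event({"type": "lift"}, state)
print(state["active_object"], state["scale"])  # None 2.0
```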
For now, it is not certain whether Apple will use the virtual input tool for UI interactivity on OS X devices or in any future product. In the case of a laptop with a touchscreen display, however, it would be more than useful. The patent was filed in 2009 and credits John O. Louch as its inventor.