EP1425653A2 - Desktop manager - Google Patents
Desktop manager
Info
- Publication number
- EP1425653A2 (application EP02777079A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user interface
- input device
- virtual window
- freedom
- enlargement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04805—Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- The present invention relates to a method for managing user interfaces, to a computer software program for implementing such a method, and to the use of a force/torque sensor for such a method.
- The general background of the present invention is the management of graphical user interfaces on which symbols are arranged, the arrangement generally being freely selectable by the user.
- “Desktop” is the designation for the visible work surface of the graphical user interface of, for example, Microsoft Windows or OS/2.
- A desktop is normally a work area on the screen that contains icons and menus in order to simulate the surface of a desk.
- A desktop is characteristic of window-oriented programs such as Microsoft Windows, for example. The purpose of such a desktop is the intuitive operation of a computer, since the user can move pictorial representations of objects and start and end tasks in much the same way as he is used to at a real desk.
- Since a force/torque sensor is used as an input device for such a desktop program, the state of the art with regard to force/torque sensors will first be explained.
- DE 199 52 560 A1 discloses a method for adjusting and/or setting a seat of a motor vehicle using a multifunctional, manually operated input device with a force/torque sensor.
- Such a force/torque sensor is shown in FIG. 6 of DE 199 52 560 A1.
- For the technical details of such a sensor, reference is made to this figure and the associated description of DE 199 52 560 A1.
- The input device has a user interface on which a number of areas are provided for entering at least one pressure pulse.
- The input device also has a device for evaluating and recognizing a pressure pulse detected by means of the force/torque sensor and converted into a force and torque vector pair.
- The selected device can then be controlled linearly by means of an analog signal from the force/torque sensor.
- The selection of a function and its subsequent activation are thus separated into two processes that are spaced apart in time.
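- This two-phase scheme (a pressure pulse selects a function, an analog signal then controls it) can be illustrated with the following minimal Python sketch; the area names and helper functions are purely hypothetical and are not taken from DE 199 52 560 A1:

```python
# Illustrative sketch of the prior-art two-phase input scheme: a short
# pressure pulse on one area of the input surface selects a function,
# after which the analog sensor signal controls that function linearly.
AREAS = {(0, 0): "seat height", (0, 1): "backrest angle"}  # hypothetical layout

def select_function(pulse_position):
    """Phase 1: map a detected pressure pulse to the function of that area."""
    return AREAS.get(pulse_position)

def control_function(selected, analog_value):
    """Phase 2: drive the previously selected function proportionally."""
    if selected is not None:
        print(f"adjusting {selected} by {analog_value:+.2f}")

chosen = select_function((0, 1))   # pressure pulse on the "backrest" area
control_function(chosen, 0.4)      # later: linear control via the analog signal
```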
- The central finding of the invention is that a user of a real desk arranges various documents on the desk surface in accordance with an intuitive, user-specific working behavior. This aspect has already been taken into account in classic desktop technology, i.e. translated into the world of the graphical user interface.
- According to the invention, it is now possible for the first time to navigate a virtual window - similar to microfiche technology (microfilm with microcopies arranged in rows) - relative to a user interface.
- The user interface under the virtual window can thus be moved in three dimensions, for example.
- The user interface can therefore, depending on the embodiment, be larger than the desktop. In this case, the entire user interface is not displayed on the monitor at once. However, it is also possible to make the size of the desktop equal to that of the entire user interface.
- A further finding of the present invention is that the user first takes a certain distance (“leans back”) in order to obtain an overview of the workplace. After identifying the desired documents etc. by means of this overview, the eyes of the viewer are gradually directed towards certain screen objects (working documents, icons, etc.). This is reproduced by the invention in that the enlargement/reduction factor of a virtual window can be changed, which essentially corresponds to a zoom effect with regard to the objects located within the window.
- More precisely, this effect is achieved according to the invention in that objects are first arranged on a user interface, for example by the user.
- The user can add, delete or move objects and scale the display size of the objects.
- This step corresponds to arranging documents on a desk, for example.
- A virtual window with an adjustable enlargement/reduction factor can then be navigated with respect to the user interface, which corresponds to a focus that can be changed in terms of position and viewing angle.
- For this purpose, an input device is provided which supplies control signals in at least three mutually independent degrees of freedom. It is thus possible to navigate three-dimensionally with respect to the user interface, with control signals in two degrees of freedom being used for positioning the virtual window and a control signal in a third degree of freedom being used for setting the enlargement/reduction factor.
- A method for managing objects on a graphical user interface is thus provided. First, objects are arranged on the user interface by the user. Finally, a virtual window can be navigated with respect to the overall user interface configured in this way, the content of the window being displayed on the screen in each case.
- Control signals in two degrees of freedom are used for positioning the virtual window with respect to the user interface, and a control signal in a third degree of freedom is used for setting the enlargement/reduction factor of the virtual window.
- The input device can provide control signals in at least three translational and/or rotational degrees of freedom.
- This input device can in particular be a force/torque sensor.
- Alternatively, an input device designed for two-dimensional navigation (for example a computer mouse) can be used, to which an element for generating a control signal in a third degree of freedom is physically assigned.
- This element can be, for example, an additional switch, a rotary wheel or a button.
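- The following minimal sketch (in Python, with illustrative names such as VirtualWindow and apply_input that do not appear in the patent) shows one possible way of mapping two degrees of freedom to the position of the virtual window and a third degree of freedom to its enlargement/reduction factor:

```python
from dataclasses import dataclass

@dataclass
class VirtualWindow:
    """Viewport onto a user interface that may be larger than the screen."""
    x: float = 0.0       # window position on the user interface
    y: float = 0.0
    zoom: float = 1.0    # enlargement/reduction factor

    def apply_input(self, dx: float, dy: float, dz: float,
                    pan_speed: float = 10.0, zoom_speed: float = 0.05) -> None:
        # Two degrees of freedom position the window on the user interface ...
        self.x += dx * pan_speed
        self.y += dy * pan_speed
        # ... while the third degree of freedom adjusts the zoom factor.
        self.zoom = max(0.1, self.zoom * (1.0 + dz * zoom_speed))

window = VirtualWindow()
window.apply_input(dx=1.0, dy=0.0, dz=0.5)   # pan right and zoom in
window.apply_input(dx=0.0, dy=0.0, dz=-1.0)  # "lean back": zoom out again
print(window)
```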
- The virtual window can correspond to the entire display area of a screen.
- If the zoom function is then carried out, the size of all objects on the total user interface changes to the same extent.
- Alternatively, the virtual window can be defined as only part of the total display area of the screen. If the entire user interface is then displayed on the display area of the screen, the input device can be used to navigate the virtual window as a kind of “magnifying glass” with an adjustable magnification factor with respect to the user interface, so that the user interface can be moved under the “magnifying glass”, so to speak.
- The software programs to be managed can in particular be office applications, such as word processing or spreadsheets.
- The objects on the user interface can be windows of files whose display size can be changed. These files can be displayed in an active, i.e. immediately callable and executable, state. It is therefore not necessary to start an application program after activating such an object.
- The objects can also be displayed on the user interface in a pseudo-3D view.
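- As a rough illustration of such an object model (all names are hypothetical and chosen only for this sketch), each object could carry its own position, display scale and active document content:

```python
from dataclasses import dataclass, field

@dataclass
class DesktopObject:
    """An active document window placed freely on the user interface."""
    name: str
    x: float
    y: float
    scale: float = 1.0   # per-object display scale, freely selectable
    content: str = ""    # the document itself, not merely an icon

@dataclass
class UserInterface:
    objects: list[DesktopObject] = field(default_factory=list)

    def add(self, obj: DesktopObject) -> None:  # add, move, delete, rescale ...
        self.objects.append(obj)

ui = UserInterface()
ui.add(DesktopObject("report.doc", x=120, y=80, scale=0.75, content="Q3 figures ..."))
ui.add(DesktopObject("budget.xls", x=400, y=200, scale=1.25))
```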
- The invention further provides a computer software program which implements a method of the type mentioned above when it runs on a computer.
- Finally, the invention proposes the use of a force/torque sensor for a method according to one of the above-mentioned types.
- FIG. 1 shows a system having a 3D input device and a computer with a desktop surface
- FIG. 2 shows a modification of the arrangement of FIG. 1, in which a computer mouse is used as the input device
- FIGS. 3 to 5 show a further embodiment in which the virtual window has been defined as the entire screen
- FIG. 6 shows a schematic flow diagram of a sequence for carrying out the present invention
- FIG. 7 shows the evaluation step S3 from FIG. 6 in detail.
- As shown in FIG. 1, a PC 4 is used to implement the invention.
- This PC 4 has a monitor 6 on which a desktop 3, that is to say a section of the user interface, is displayed.
- Several graphic objects 5, 10 are arranged on this displayed section of the user interface.
- A 3D input device 1 has an operating part 7 which can be manipulated by the fingers or the hand of a user and which is movably mounted, for example, in three mutually independent rotational and three translational degrees of freedom with respect to a base part 8. A relative movement between the operating part 7 and the base part 8 is evaluated, and the result of the evaluation is transmitted to the computer 4 in the form of control signals.
- The input device 1 can of course also output control signals in further degrees of freedom if further rotary dials, buttons or switches are physically assigned to it on the operating part 7 or on the base part 8.
- The input device 1 can be used to navigate a virtual window of adjustable size with respect to the total area of the user interface.
- In a particularly advantageous embodiment, the display scale of objects within the virtual window can additionally be selected within certain limits by means of the input device 1.
- Control signals in two degrees of freedom of the input device 1 are used for navigating the virtual window with respect to the user interface 3 (up/down and left/right).
- A control signal in a third degree of freedom of the input device 1 is used - if this option is provided - for real-time setting of an enlargement/reduction factor for the objects lying within the virtual window.
- This enlargement/reduction factor can be changed continuously (with corresponding pixel scaling) or discretely, for example in defined font-size steps.
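- A minimal sketch of this distinction (the function name and step values are illustrative, not from the patent):

```python
# Continuous scaling multiplies the factor directly; discrete scaling snaps
# the result to predefined steps, e.g. font-size levels.
FONT_SIZE_STEPS = [0.5, 0.75, 1.0, 1.25, 1.5, 2.0]   # hypothetical levels

def next_zoom(current: float, dz: float, discrete: bool = False) -> float:
    proposed = current * (1.0 + 0.1 * dz)
    if not discrete:
        return proposed                                # continuous pixel scaling
    return min(FONT_SIZE_STEPS, key=lambda s: abs(s - proposed))  # snap to a step

print(next_zoom(1.0, +2.0))                 # continuous: 1.2
print(next_zoom(1.0, +2.0, discrete=True))  # discrete:   1.25
```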
- Increasing the enlargement/reduction factor within the virtual window can be understood as approaching the screen objects: the screen objects are shown larger while the section of the user interface 3 shown on the screen becomes smaller.
- Such a virtual window is designated by the reference symbol 2 in FIG. 1.
- The size of this window 2 is set such that it occupies only part of the display area of the screen 6. Accordingly, it can be navigated selectively, for example, as shown, over the object 10, so that the object 10 lies within the window area. If the enlargement/reduction factor of the virtual window 2 is now increased by means of the input device 1, which can be done in steps or continuously, this results in the enlarged representation 10′ of the object 10, which is shown schematically in FIG. 2.
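- One possible (purely illustrative) way of rendering such a partial-screen window is to test which objects fall inside the window rectangle and draw only those with the window's magnification factor:

```python
# Hypothetical "magnifying glass" pass: objects inside the window rectangle
# are drawn with the window's magnification factor, all others keep their scale.
def render(objects, win_x, win_y, win_w, win_h, factor):
    for obj in objects:
        inside = (win_x <= obj["x"] <= win_x + win_w and
                  win_y <= obj["y"] <= win_y + win_h)
        scale = obj["scale"] * factor if inside else obj["scale"]
        print(f"draw {obj['name']} at ({obj['x']}, {obj['y']}) with scale {scale:.2f}")

objects = [{"name": "object 10", "x": 150, "y": 100, "scale": 1.0},
           {"name": "object 5",  "x": 600, "y": 400, "scale": 1.0}]
render(objects, win_x=100, win_y=50, win_w=200, win_h=150, factor=2.0)
```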
- FIGS. 3 to 5 show the case in which the virtual window 2 is set such that it corresponds to the entire display area of the screen 6.
- The user interface 3 is thus moved with respect to the desktop.
- The display size of all objects represented on the display area changes when the enlargement/reduction factor changes. If the user has arranged a group 11 on the user interface 3, he can enlarge its display continuously (pixel scaling) or step by step until, for example (see FIG. 5), only the document 12 from this group 11 is legibly displayed. This corresponds to zooming in on the user interface 3.
- A computer mouse 1′ is shown symbolically in FIG. 2 as an alternative input device.
- This computer mouse 1′, which as such can only provide control signals in two degrees of freedom (x and y axes), is physically assigned a further element 9 which can generate a control signal in at least one further degree of freedom.
- In the example shown, this further element is a rotary wheel 9 arranged on the top of the computer mouse 1′. By rotating this wheel 9 forwards, the display size of an individual screen object 10, 10′ can be enlarged (selective focus) or all screen objects 5, 10 can be shown enlarged (general focus).
- The reduction function can accordingly be triggered by rotating the wheel 9 in the opposite direction (with the three-dimensional input device, by pressing or tilting the operating part 7 backwards), which intuitively corresponds to the user leaning back in order to get a better overview of the objects 5, 10 on the user interface 3.
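- A hypothetical wheel handler along these lines (names and scaling constants are illustrative only) might look as follows:

```python
# Rotating the wheel forwards (positive delta) enlarges, rotating it backwards
# reduces, which corresponds to the user "leaning back" for an overview.
def on_wheel(delta: float, objects, focused=None):
    factor = 1.0 + 0.1 * delta
    targets = [focused] if focused is not None else objects  # selective vs. general focus
    for obj in targets:
        obj["scale"] = max(0.1, obj["scale"] * factor)

docs = [{"name": "document", "scale": 1.0}, {"name": "spreadsheet", "scale": 1.0}]
on_wheel(+2, docs)                   # general focus: all objects grow
on_wheel(-1, docs, focused=docs[0])  # selective reduction of a single object
```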
- The objects 5, 10 on the user interface 3 can display files of application programs, such as word-processing documents or spreadsheets.
- These file objects can be displayed in an active state.
- If the corresponding object is enlarged/reduced, not only is an icon enlarged or reduced as a symbol for the corresponding application program; rather, the document/spreadsheet itself is enlarged or reduced.
- Several screen objects can be actively displayed on the user interface 3 at the same time, their respective display scale being freely selectable. The user can thus, for example, arrange documents of any size at any position on the screen surface 3.
- FIG. 6 shows schematically the sequence in the implementation of the present invention.
- Output signals of the force/torque sensor are generated in a step S1. These are then fed (step S2) to the data input of an EDP (electronic data processing) system.
- This can be done for example by means of a so-called USB interface.
- USB (Universal Serial Bus) is an interface for connecting peripheral devices such as a mouse, modem, printer, keyboard or scanner; the transfer rate of USB version 1.1 is already 12 Mbit/s.
- In a step S3, the signals supplied by the force/torque sensor are evaluated. This step S3 is explained in detail below with reference to FIG. 7. Depending on the evaluation in step S3, the graphical user interface (GUI) is then controlled in a step S4 before data from the force/torque sensor are evaluated again.
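- The overall sequence of FIG. 6 can be summarised, purely as an illustrative sketch with hypothetical function names, as a simple polling loop:

```python
import time

def read_sensor():
    """Steps S1/S2 stub: obtain a (fx, fy, fz) signal tuple from the
    force/torque sensor, e.g. delivered via a USB interface."""
    return (0.0, 0.0, 0.0)   # placeholder values

def evaluate(fx, fy, fz, state):
    """Step S3: turn the raw signals into window position and zoom changes."""
    state["x"] += fx
    state["y"] += fy
    state["zoom"] = max(0.1, state["zoom"] * (1.0 + 0.05 * fz))

def control_gui(state):
    """Step S4: redraw the section of the user interface under the window."""
    print(f"window at ({state['x']:.1f}, {state['y']:.1f}), zoom {state['zoom']:.2f}")

def run(cycles: int = 3) -> None:
    state = {"x": 0.0, "y": 0.0, "zoom": 1.0}
    for _ in range(cycles):
        fx, fy, fz = read_sensor()    # S1: sensor output, S2: data input
        evaluate(fx, fy, fz, state)   # S3: evaluation
        control_gui(state)            # S4: GUI control
        time.sleep(0.01)

run()
```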
- Step S3 of the sequence of FIG. 6 will now be explained in more detail with reference to FIG. 7.
- Data in three different degrees of freedom x, y and z are evaluated, for example to determine whether the corresponding signal is in the positive or negative range.
- In the degree of freedom “z”, a positive signal can be used to enlarge, and a negative signal to reduce, the virtual window with respect to the entirety of the graphical user interface.
- In the degree of freedom “x”, a positive signal can shift the virtual window to the left and a negative signal can shift it to the right (always with respect to the entirety of the graphical user interface).
- The virtual window can therefore also be designed as a fixed marking bar, “under” which the user interface is navigated. Objects that come under the virtual window are automatically marked (“highlighted”) and preselected for a possible subsequent click or other activation.
- This procedure is particularly advantageous if a directory structure (directory tree) is navigated under the fixed window: subdirectories located in the window can be selected automatically, so that in principle arbitrarily large structures can be navigated without the user's hand having to leave the input device.
- A “grasping” operation to change the image section, which known techniques require as soon as the cursor reaches the edge of the screen, is no longer necessary.
- In the degree of freedom “y”, a positive signal can move the window upwards and a negative signal can move it downwards. This can also be seen, analogously, as an inverse movement of the user interface “under” the virtual window.
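- The sign-based evaluation of step S3 could then be sketched as follows (the axis assignment simply follows the example above; all names are illustrative):

```python
# Only the sign of each degree of freedom is inspected and mapped to an action:
# x = left/right, y = up/down, z = enlarge/reduce.
def evaluate_s3(x: float, y: float, z: float) -> list[str]:
    actions = []
    if x > 0:   actions.append("shift window left")
    elif x < 0: actions.append("shift window right")
    if y > 0:   actions.append("move window up")
    elif y < 0: actions.append("move window down")
    if z > 0:   actions.append("enlarge (zoom in)")
    elif z < 0: actions.append("reduce (zoom out)")
    return actions

print(evaluate_s3(0.3, 0.0, -0.8))  # ['shift window left', 'reduce (zoom out)']
```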
- The display size or the document size on the user interface can be freely selected.
- Navigation and scaling can be carried out with a single device, such as a 3D input device or a 2D input device with additional elements.
- The arrangement and the size of the screen objects on the desktop surface can be freely selected.
- The recognition value of freely arranged areas is significantly higher, since visual recognition features, and not just pure memorization, come into play here.
- Real, intuitive working behavior is thereby largely reproduced.
- Real working behavior usually means that the user works at the workplace making use of the visually perceptible sector.
- Focusing on a working document and leaning back to gain an overview are a natural part of handling real objects. The present invention now makes it possible for the first time to transfer such intuitive behavior to virtual objects, namely objects that are displayed on a user interface.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE10145185 | 2001-09-13 | ||
DE10145185 | 2001-09-13 | ||
DE10155030A DE10155030A1 (en) | 2001-09-13 | 2001-11-09 | desktop Manager |
DE10155030 | 2001-11-09 | ||
PCT/EP2002/010246 WO2003023592A2 (en) | 2001-09-13 | 2002-09-12 | Desktop manager |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1425653A2 true EP1425653A2 (en) | 2004-06-09 |
Family
ID=26010126
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP02777079A Withdrawn EP1425653A2 (en) | 2001-09-13 | 2002-09-12 | Desktop manager |
Country Status (3)
Country | Link |
---|---|
US (1) | US20040046799A1 (en) |
EP (1) | EP1425653A2 (en) |
WO (1) | WO2003023592A2 (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005070898A (en) * | 2003-08-20 | 2005-03-17 | Toshiba Corp | Information processor and display control method |
US7899756B2 (en) * | 2004-12-01 | 2011-03-01 | Xerox Corporation | Critical parameter/requirements management process and environment |
US8819569B2 (en) | 2005-02-18 | 2014-08-26 | Zumobi, Inc | Single-handed approach for navigation of application tiles using panning and zooming |
JP4653561B2 (en) * | 2005-05-31 | 2011-03-16 | 株式会社東芝 | Information processing apparatus and display control method |
US20070268317A1 (en) * | 2006-05-18 | 2007-11-22 | Dan Banay | User interface system and method for selectively displaying a portion of a display screen |
US8914786B2 (en) | 2007-03-23 | 2014-12-16 | Zumobi, Inc. | Systems and methods for controlling application updates across a wireless interface |
US8595642B1 (en) * | 2007-10-04 | 2013-11-26 | Great Northern Research, LLC | Multiple shell multi faceted graphical user interface |
BR112012021347A2 (en) | 2008-02-26 | 2019-09-24 | Jenavalve Tecnology Inc | stent for positioning and anchoring a valve prosthesis at an implantation site in a patient's heart |
US8289288B2 (en) * | 2009-01-15 | 2012-10-16 | Microsoft Corporation | Virtual object adjustment via physical object detection |
US9443257B2 (en) * | 2010-10-21 | 2016-09-13 | Yahoo! Inc. | Securing expandable display advertisements in a display advertising environment |
US10417018B2 (en) | 2011-05-27 | 2019-09-17 | Microsoft Technology Licensing, Llc | Navigation of immersive and desktop shells |
US9843665B2 (en) * | 2011-05-27 | 2017-12-12 | Microsoft Technology Licensing, Llc | Display of immersive and desktop shells |
US9965038B2 (en) | 2014-03-21 | 2018-05-08 | Dell Products L.P. | Context adaptable projected information handling system input environment |
US20150268739A1 (en) * | 2014-03-21 | 2015-09-24 | Dell Products L.P. | Projected Information Handling System Input Environment with Object Initiated Responses |
US9304599B2 (en) | 2014-03-21 | 2016-04-05 | Dell Products L.P. | Gesture controlled adaptive projected information handling system input and output devices |
US10133355B2 (en) | 2014-03-21 | 2018-11-20 | Dell Products L.P. | Interactive projected information handling system support input and output devices |
US20160196013A1 (en) * | 2015-01-07 | 2016-07-07 | Blackberry Limited | Electronic device and method of controlling display of information |
CN105867754B (en) * | 2015-01-22 | 2019-11-26 | 阿里巴巴集团控股有限公司 | Application interface processing method and processing device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5341466A (en) * | 1991-05-09 | 1994-08-23 | New York University | Fractal computer user centerface with zooming capability |
US5670984A (en) * | 1993-10-26 | 1997-09-23 | Xerox Corporation | Image lens |
JP2813728B2 (en) * | 1993-11-01 | 1998-10-22 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Personal communication device with zoom / pan function |
EP0693852A3 (en) * | 1994-07-22 | 1997-05-28 | Eastman Kodak Co | Method and apparatus for applying a function to a localized area of a digital image using a window |
US6037939A (en) * | 1995-09-27 | 2000-03-14 | Sharp Kabushiki Kaisha | Method for enabling interactive manipulation of data retained in computer system, and a computer system for implementing the method |
US5999169A (en) * | 1996-08-30 | 1999-12-07 | International Business Machines Corporation | Computer graphical user interface method and system for supporting multiple two-dimensional movement inputs |
US6097393A (en) * | 1996-09-03 | 2000-08-01 | The Takshele Corporation | Computer-executed, three-dimensional graphical resource management process and system |
US6128006A (en) * | 1998-03-26 | 2000-10-03 | Immersion Corporation | Force feedback mouse wheel and other control wheels |
US6275232B1 (en) * | 1998-12-14 | 2001-08-14 | Sony Corporation | Polymorphic event handling for zooming graphical user interface |
US20020060691A1 (en) * | 1999-11-16 | 2002-05-23 | Pixel Kinetix, Inc. | Method for increasing multimedia data accessibility |
-
2002
- 2002-09-12 EP EP02777079A patent/EP1425653A2/en not_active Withdrawn
- 2002-09-12 WO PCT/EP2002/010246 patent/WO2003023592A2/en not_active Application Discontinuation
- 2002-09-12 US US10/433,514 patent/US20040046799A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See references of WO03023592A2 * |
Also Published As
Publication number | Publication date |
---|---|
WO2003023592A3 (en) | 2004-02-12 |
WO2003023592B1 (en) | 2004-03-25 |
WO2003023592A2 (en) | 2003-03-20 |
US20040046799A1 (en) | 2004-03-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1425653A2 (en) | Desktop manager | |
DE60024655T2 (en) | METHOD OF USING KEYS ASSOCIATED WITH A DISPLAYING DEVICE FOR ACCESSING AND PERFORMING ASSOCIATED FUNCTIONS | |
DE69429237T2 (en) | User interface device for computer system | |
DE102012109058B4 (en) | Control method and electronic device | |
DE69130773T2 (en) | ELECTRONIC DISPLAY AND DATA PROCESSING DEVICE | |
DE3881331T2 (en) | Flexible window management on a computer display. | |
EP1272921B1 (en) | Method for navigating between windows in a display area | |
DE69426548T2 (en) | User interface with click-through tools that can be put together with other tools | |
EP1513056B1 (en) | Three-dimensional integrated touch screen input apparatus | |
DE19744861A1 (en) | Computer control method using 3-dimensional mouse | |
DE102012020607B4 (en) | A motor vehicle with a gesture control device and method for controlling a selection element | |
DE102012014098A1 (en) | Method for imitating the touch screen control by a mouse | |
DE69221204T2 (en) | Data processing device for window position control | |
DE102012014603A1 (en) | Synchronized operating system for operating e.g. portable touch smartphone for communication, has key board and cursor device enabling to perform file duplication, writing, deleting and processing between host and touch device with window | |
DE102019106684A1 (en) | System for haptic interaction with virtual objects for applications in virtual reality | |
DE102012220062A1 (en) | SETTING MULTIPLE USER INPUT PARAMETERS | |
DE10140874A1 (en) | Touch screen type graphical user interface in which the sizing of screen areas is automatically adjusted to simplify user input and make control of devices, especially communications devices more user friendly | |
WO2017144298A1 (en) | User interface comprising a plurality of display units, and method for positioning contents on a plurality of display units | |
DE102013203918A1 (en) | A method of operating a device in a sterile environment | |
EP2877910B1 (en) | Input device having a lowerable touch-sensitive surface | |
DE10084249T5 (en) | Additional LCD panel with touch screen | |
DE69320288T2 (en) | Method and device for interacting with a user interface of a computer system using a stylus | |
DE10155030A1 (en) | desktop Manager | |
WO2003042802A2 (en) | Input device, webcam and screen having a voice input function | |
DE102009003995A1 (en) | Method for enlarging a display area on a presentation device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20040202 |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR |
| AX | Request for extension of the european patent | Extension state: AL LT LV MK RO SI |
| 17Q | First examination report despatched | Effective date: 20070817 |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: 8566 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20071228 |