
EP1473624A2 - System, method and user interface for active reading of electronic content - Google Patents

System, method and user interface for active reading of electronic content

Info

Publication number
EP1473624A2
Authority
EP
European Patent Office
Prior art keywords
window
list
display portion
computer
menu options
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP04018712A
Other languages
German (de)
French (fr)
Other versions
EP1473624B1 (en)
EP1473624A3 (en)
Inventor
Marco A. Demello
Vikram Madan
Leroy B. Keely
David M. Silver
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Corp
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Publication of EP1473624A2 (publication) Critical
Publication of EP1473624A3 (publication) Critical
Application granted (grant) Critical
Publication of EP1473624B1 (publication) Critical
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • the disclosure generally relates to the electronic display of documents. More particularly, the disclosure relates to a user interface for use with viewing electronically displayed documents.
  • Paper books have a simple user interface. One opens a book and begins to read it. If a user wants to do something to the book (for example, add a textual annotation), he may use a pen or pencil to add a notation in the margin next to a relevant passage of the book.
  • the objects presented to the user include the paper page of the book with two regions (the margin and the text on the page itself) and the writing implement.
  • Other types of actions are also available including bookmarking the page (for example, by folding over the top corner of the page), adding a drawing (using the same pen or pencil discussed above), and highlighting a passage (with a pen or pencil of a different color).
  • a user is able to interact quickly and easily with the pages of the book, creating an environment of active reading with a transparent user interface.
  • a user concentrates on the text, not on the highlighter or pen in her hand when actively reading and annotating (adding a note or highlighting, for instance) the text.
  • the user intends to select the drawing tool, but instead selects the highlighting tool.
  • the user then has to move back to the tool bar (step 5), select the correct tool (step 6), move back to the text area (step 7), then select the object to be annotated (step 8).
  • the distance the cursor must be made to travel is distracting.
  • this large distance translates into significant hand or arm movement that requires the user to change the focus of his attention from the text to be annotated to precise navigation over an extended distance. Performed many times, this change in attention may become a significant distraction and eventually force the user to refrain from actively reading a document or book.
  • the present invention provides a technique for allowing a user to interact with the electronic display of documents with a simple user interface.
  • a user experiences immediate gratification in response to selection of an object or objects.
  • a selection of an object results in a display of a list of menu items relevant to the selected object.
  • the user interface for the object is determined based on the type of object selected. Instead of displaying a general user interface for all potential selectable objects, the user interface is tightly associated with the selected object.
  • the user interface is quick, simple, and unobtrusive.
  • the goal associated with the user interface includes being functionally transparent to the user.
  • the present application also relates to the mechanism underlying the functionality of the display and operation of the user interface.
  • a "document” or “book” encompasses all forms of electronically displayable information including but not limited to books, manuals, reference materials, picture books, etc. Further, the documents or books may include catalogs, e-commerce publications, articles, web pages, and the like.
  • Object as used herein encompasses all displayed information. With reference to looking up information regarding the object, the object may be a word or a group of words, symbols, icons, reference point on a page, page number, equation, margin, title, title bar, corner of the screen, and the like.
  • annotations are generally related to textual annotations.
  • other annotations that may be used include highlighting, drawings (as one would expect to do with a pencil or pen to a paper book), and bookmarks. While the annotations are to be displayed in conjunction with the document, the underlying document is not modified.
  • Related annotations and techniques for creating them are described in the following disclosures:
  • the present invention relates to an improved user interface for use with the electronic display and active reading of documents or books.
  • program modules include routines, programs, objects, scripts, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • program modules may be located in both local and remote memory storage devices.
  • the present invention may also be practiced in personal computers (PCs), hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like.
  • FIG. 1 is a schematic diagram of a computing environment in which the present invention may be implemented.
  • the present invention may be implemented within a general purpose computing device in the form of a conventional personal computer 200, including a processing unit 210, a system memory 220, and a system bus 230 that couples various system components including the system memory to the processing unit 210.
  • the system bus 230 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the system memory includes read only memory (ROM) 240 and random access memory (RAM) 250.
  • ROM read only memory
  • RAM random access memory
  • a basic input/output system 260 (BIOS), containing the basic routines that help to transfer information between elements within the personal computer 200, such as during start-up, is stored in ROM 240.
  • the personal computer 200 further includes a hard disk drive 270 for reading from and writing to a hard disk, not shown, a magnetic disk drive 280 for reading from or writing to a removable magnetic disk 290, and an optical disk drive 291 for reading from or writing to a removable optical disk 292 such as a CD ROM or other optical media.
  • the hard disk drive 270, magnetic disk drive 280, and optical disk drive 291 are connected to the system bus 230 by a hard disk drive interface 292, a magnetic disk drive interface 293, and an optical disk drive interface 294, respectively.
  • the drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 200.
  • exemplary environment described herein employs a hard disk, a removable magnetic disk 290 and a removable optical disk 292, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the exemplary operating environment.
  • RAMs random access memories
  • ROMs read only memories
  • a number of program modules may be stored on the hard disk, magnetic disk 290, optical disk 292, ROM 240 or RAM 250, including an operating system 295, one or more application programs 296, other program modules 297, and program data 298.
  • a user may enter commands and information into the personal computer 200 through input devices such as a keyboard 201 and pointing device 202.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are often connected to the processing unit 210 through a serial port interface 206 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB).
  • a monitor 207 or other type of display device is also connected to the system bus 230 via an interface, such as a video adapter 208.
  • personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
  • the personal computer 200 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 209.
  • the remote computer 209 may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the personal computer 200, although only a memory storage device 211 has been illustrated in Figure 1.
  • the logical connections depicted in Figure 1 include a local area network (LAN) 212 and a wide area network (WAN) 213.
  • LAN local area network
  • WAN wide area network
  • When used in a LAN networking environment, the personal computer 200 is connected to the local network 212 through a network interface or adapter 214. When used in a WAN networking environment, the personal computer 200 typically includes a modem 215 or other means for establishing communications over the wide area network 213, such as the Internet.
  • the modem 215, which may be internal or external, is connected to the system bus 230 via the serial port interface 206.
  • program modules depicted relative to the personal computer 200, or portions thereof may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • handheld computers and purpose-built devices may support the invention as well.
  • handheld computers and purpose-built devices are similar in structure to the system of Figure 1 but may be limited to a display (which may be touch-sensitive to a human finger or stylus), memory (including RAM and ROM), and a synchronization/modem port for connecting the handheld computer and purpose-built devices to another computer or a network (including the Internet) to download and/or upload documents or download and/or upload annotations.
  • a display which may be touch-sensitive to a human finger or stylus
  • memory including RAM and ROM
  • a synchronization/modem port for connecting the handheld computer and purpose-built devices to another computer or a network (including the Internet) to download and/or upload documents or download and/or upload annotations.
  • the description of handheld computers and purpose-built devices is known in the art and is omitted for simplicity.
  • the invention may be practiced using C. Also, it is appreciated that other languages may be used including C++, assembly language, and the like.
  • Figure 2 shows a displayed document on a computer screen in accordance with embodiments of the present invention.
  • the document is displayed in a form that closely resembles the appearance of a paper equivalent of the e-book and, in this case, a paper novel.
  • the document reader window 101 may comprise a variety of portions including a title bar 101A listing the title of the document and a body 102. In the body 102 of the display window, various portions of a document may be displayed.
  • Figure 2 shows an example where a title 104, a chapter number 105, a chapter title 106, and the text of the chapter 107 are displayed. Similar to an actual book, margins 108, 109, 110, and 111 appear around the displayed text.
  • the displayed elements may be independently referenced.
  • object 103 "sleepy" has a drawing annotation, placed there by the user, in the form of a box around it.
  • the presence of icon 112 indicates that a text annotation is present in the line of text adjacent to the icon 112.
  • While the icon 112 is represented as the letter "T" in a circle, it will be appreciated by those of skill in the art that other representations may be used to designate the presence of an annotation.
  • other letters may be used (for instance, the letter "A" for annotation) or other symbols may be used (for instance, a symbol of an open book) or any other representation that indicates that an annotation exists.
  • Tapping on (or otherwise activating) the icon results in the display of a stored annotation.
  • the selection of the object provides an environment where the user physically interacts with the displayed image by actually tapping the image of the object itself.
  • Figure 3A shows a displayed window after an object has been selected.
  • the object "beginning" 301 in the first line of text 107 was selected.
  • the object may be selected through tapping it with a stylus or a user's finger.
  • a user may position a cursor over the object then select the object (by clicking a mouse button or by operating a designation source).
  • the object may be inverted, meaning that the pixels that make up the object and surrounding pixels are inverted.
  • Alternative embodiments include changing the color of the pixels surrounding the object or highlighting the object in some other way as is known in the art.
  • a user used a stylus to tap on a display screen.
  • the tap point (the point receiving the tap from the user's stylus) in this example was the center of the second "g" of the selected object 301 (here, the word "beginning").
  • window 302 is displayed close to the physical location of object 301.
  • Window 302 contains actions operable on the selected object.
  • window 302 is shown close to object 301.
  • the user interface minimizes the distance a user needs to move a cursor or pointing device (including stylus or finger tip) from the object to the menu items provided in window 302.
  • a variety of distances and configurations of window 302 may be used.
  • One example shows the menu options with the centers of adjacent menu options no more than 0.2 inches apart from each other.
  • the first of the menu options may be displayed no more than 0.3 inches away from the tap point used to select object 301.
  • the distances shown above are merely exemplary and do not limit the scope of the claims. Other examples of values include 0.5 and 1.0 inches, respectively. Again, these values are merely used for example.
  • Figure 3B shows an embodiment where window 302 overlies selected object 301. Overlaying object 301 with the window 302 provides a further minimization of the distance that needs to be traveled by the cursor or pointing device. As shown in Figure 3B, window 302 is opaque. Selected object 301 is completely hidden. Alternatively, window 302 of Figure 3B may be semi-transparent and allow a ghost or grayed out image of selected object 301 to show through window 302.
  • Other techniques for projecting the identity of object 301 through window 302 without diminishing the display of menu items in window 302 include projecting selected object with thin lines, italicized font formats, different colors, active shifting colors (for example, the outline of each letter or character of the object being traced with alternating bouncing black and white or shifting colored lines) and the like.
  • An advantage of overlying window 302 over selected object 301 is that the actual distance needed to move between the selected object and the menu items of window 302 is reduced.
  • window 302 as shown in Figures 3A and 3B may be modified to present the user with the most recently selected menu item juxtaposed to the selected object.
  • a user may have previously added a bookmark to the displayed text 107.
  • the system presents window 302 on selection of object 301 with the menu option of "Add Bookmark" closest to the insertion point.
  • the previous action may have been to add a highlight to the displayed content.
  • the position of the window 302 is modified to position the menu option of "Add Highlight" closest to the insertion point.
  • the window 302 may be positioned so that the text of the last selected menu item (in Figure 3B, "Add Highlight") does not directly overlie the ghosted image of the selected object 301 to improve readability of the selected object 301.
  • the events as shown in Figures 3A and 3B are based on the primary input function including a left mouse click or a single tap of a stylus, among other things.
  • the primary input function becomes the initial operational interface device revealed to the user. So, when a user first initiates a document active reading system as described herein, the options available through window 302 will become readily apparent with the first selection of an object 301. Moreover, the same movement (i.e., tapping object 301) controls both the pointing to and selection of the object. Visual feedback occurs immediately at or near the selection.
  • window 302 may also include options that may affect the display of the content as well.
  • window 302 may include menu options that allow for switching from one book to another.
  • An advantage of displaying more information to the user may include a net reduction in the number of navigation steps required to perform a process. For example, by providing a menu option to allow one to change which book is currently being displayed, a user may switch between books with a few navigational commands. However, the total number of options available to a user at any given time may be substantial. Displaying all of them would overcrowd window 302 and decrease the intuitive nature of adding an annotation to a page.
  • the number of options available to the user is limited.
  • the menu quickly becomes tailored to the intention of the user.
  • the desire is to provide pure functionality to the user without cluttering the user's reading space.
  • the size of the font as displayed in text 107 may be different from that displayed in window 302.
  • the size of the font used for text 107 may be a 14-point font.
  • the size of the font used for the menu items in window 302 may be an 18-point font.
  • in Fitts' Law, movement time = a + b(ID), where a and b are constants; more information regarding Fitts' Law may be found at http://psych.hanover.edu/classes/hfnotes2/sId041.html.
  • the present invention improves the user interface presented to the user by minimizing the distance traversed to select a menu option and by increasing the size of the target area (for example, by increasing the font size) of the menu options.
  • Figures 4 and 5 show further examples of the present invention with respect to display portions.
  • Figure 4 shows a reference window 402 having been opened based on the selection of object 401 (here, the word "natural") and the "Lookup ..." option (from Figure 3, window 302).
  • Reference window 402 displays the results of looking up a definition for the word "natural.”
  • only one reference document was consulted as only one document may have been installed in conjunction with the lookup functionality.
  • the system may skip directly to the sole reference, rather than displaying to the user a choice of only one item.
  • a user may also be given the option of editing the form of the selected object (for lookup purposes).
  • Figure 4 also shows the page number "i" as 403.
  • the page number is always displayed in order to provide the user with a standard window appearance for reference window 402.
  • the page number 403 may be omitted where there is only enough information to fill one reference window 402 and included where there is more information than space available in a single window 402.
  • Figure 5 shows an example of a second page of uncovered reference information relating to the object "natural.”
  • the second page of reference information is shown as reference window 404 with page number 405. It will be appreciated that changing from reference window 402 to reference window 404 may or may not involve navigating to a new window. If multiple windows are employed, each reference window (402, 404) may overlie one another. Alternatively, they may be cascaded allowing a user to jump pages by selecting new windows. Further, there may only be a single reference window 402 with different content displayed therein with the content being altered only to display additional information regarding the selected object or on navigation to display new information for a newly selected object.
  • the menu choices available to users include a number of annotation features. These annotation features can add value to the book. For example, while a textbook alone may not have much value, the textbook with annotations from Albert Einstein or Stephen Hawking may be extremely valuable. However, if one were to purchase a book with annotations, one would not readily want to modify the purchased book (at least for copyright concerns) based on his own annotations or other active reading activities. At least one aspect of the present invention allows users the freedom to read actively a displayed text without the burden of contemplating how one is modifying the underlying document. Here, the user may be shielded from modifying the underlying document by having all annotations added to separate document or to a modifiable portion of the document apart from the displayed, non-modifiable portion. Figures 6A and 6B described how annotations may be captured and stored.
  • Figures 6A and 6B show various storage techniques for storing annotations in accordance with embodiments of the present invention.
  • Figure 6A shows a reference B 701 as having been annotated.
  • the file structure of Figure 6A has modifiable (703-706) and non-modifiable (702) portions. Files of this type include Infotext file formats as are known in the art.
  • Annotations 706 may be stored in combination with the non-modifiable content 702.
  • An annotation 706 may be stored in a file with header 703 and body 706.
  • the header 703 includes, for example, the file position 704 of the object with which the annotation 706 is associated. It may also include an indication of the type of annotation 706 in file portion 705.
  • the annotation 706 may include a highlight, a bookmark, a drawing to be overlaid over the object, or a text annotation.
  • Figure 6B shows the non-modifiable content of reference B 702 as a separate file apart from the annotation file 707.
  • the annotation file 707 of Figure 6B has similar constituent elements to that of annotation file 707 of Figure 6A.
  • Annotation file 707 may include a file portion 708 that indicates to which non-modifiable document (here, 702) it is linked.
  • one file may store all annotations for a user with the non-modifiable content portions 702 being stored separately. This approach has the advantage of being able to quickly scan all annotations at one time rather than accessing all documents 701 (as including non-modifiable portions 707 of Figure 6A) to obtain all annotations stored therein.
  • Figure 7 shows a method for controlling the display of a user interface in accordance with embodiments of the present invention.
  • Figure 7 is described with reference to the use of a stylus as an input device for tapping on a computer screen.
  • a "tap" may be defined as a contact with the surface of the display screen and a subsequent release that is near the initial contact point in both time and distance across the surface of the display screen.
  • a mouse, with the operation of the primary mouse button, may be used as well.
  • in step 801, a tap is received.
  • the system selects the object under the tap point of the computer screen as represented by step 802.
  • Menu option window 302 is rendered with options relevant to selected object 301, as shown in step 803.
  • if a timeout interval is in operation, the system determines whether a subsequent tap was received within a given time interval T from the initial tap of step 801. If no subsequent tap was received (or one was received after time interval T), then the menu option window 302 is released (step 805) and the system waits for the next tap (step 801). If the subsequent tap was received within time interval T, the system determines whether the subsequent tap was inside the menu option window 302 or outside the menu option window 302 (step 806). If inside the window, the system executes the selected option (step 807). If outside menu option window 302, the menu option window 302 is released and the system waits for the next tap (step 801).
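  • The flow just described can be expressed as a small event loop. The following C sketch is illustrative only: the types and host functions (wait_for_tap, object_under, render_menu_for, and so on) are assumed names rather than part of the disclosure, and the timeout parameter stands in for the interval T.

```c
#include <stdbool.h>

/* Illustrative types; the actual object and menu structures are not
 * specified by the disclosure. */
typedef struct { int x, y; } Tap;
typedef struct { int id; } Object;
typedef struct { int left, top, right, bottom; } Rect;

/* Hypothetical platform hooks (assumed names). */
Tap     wait_for_tap(void);                          /* blocks for the next tap (step 801) */
bool    wait_for_tap_within(int timeout_ms, Tap *t); /* false if interval T elapses        */
Object *object_under(Tap t);                         /* hit-test under the tap (step 802)  */
Rect    render_menu_for(Object *obj, Tap t);         /* show window 302 (step 803)         */
void    release_menu(void);                          /* dismiss window 302 (step 805)      */
void    execute_option_at(Tap t);                    /* run the tapped option (step 807)   */

static bool inside(Rect r, Tap t) {
    return t.x >= r.left && t.x <= r.right && t.y >= r.top && t.y <= r.bottom;
}

void active_reading_loop(int timeout_ms) {
    for (;;) {
        Tap first = wait_for_tap();               /* step 801 */
        Object *obj = object_under(first);        /* step 802 */
        if (!obj)
            continue;
        Rect menu = render_menu_for(obj, first);  /* step 803 */

        Tap second;
        if (!wait_for_tap_within(timeout_ms, &second)) {
            release_menu();                       /* timed out: step 805 */
            continue;
        }
        if (inside(menu, second))                 /* step 806 */
            execute_option_at(second);            /* step 807 */
        else
            release_menu();                       /* step 805 */
    }
}
```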

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Nitrogen And Oxygen Or Sulfur-Condensed Heterocyclic Ring Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

A system, method, and user interface for interacting with electronically displayed content is disclosed. In response to a selection of an object, a list of menu options is displayed close to the selected object. In one embodiment, the list of menu options is displayed in a font larger than that used to display the selected object. Through the use of the invention, a user is provided with a technique for actively reading displayed content with minimal distractions from the user interface.

Description

    1. Related Applications
  • This application is related to the following applications:
  • U.S. Serial No. (BW 03797.80027), filed December , 1999, entitled "Bookmarking and Placemarking a Displayed Document in a Computer System;"
  • U.S. Serial No. (BW 03797.84619) filed , "Method and Apparatus for Installing and Using Reference Materials In Conjunction With Reading Electronic Content;"
  • U.S. Serial No. (BW 03797.78802), filed December , 1999, entitled "System and Method for Annotating an Electronic Document Independently of Its Content;"
  • U.S. Serial No. (BW 03797.84617), filed December , 1999, entitled "Method and Apparatus For Capturing and Rendering Annotations For Non-Modifiable Electronic Content;" and,
  • U.S. Serial No. (BW 03797.84618), filed December , 1999, entitled "Method and Apparatus for Capturing and Rendering Text Annotations For Non-Modifiable Electronic Content."
  • 2. Background
  • A. Technical Field
  • The disclosure generally relates to the electronic display of documents. More particularly, the disclosure relates to a user interface for use with viewing electronically displayed documents.
  • B. Related Art
  • Many factors today drive the development of computers and computer software. One of these factors is the desire to provide accessibility to information virtually anytime and anywhere. The proliferation of notebook computers, personal digital assistants (PDAs), and other personal electronic devices reflects the fact that users want to be able to access information wherever they may be, whenever they want. In order to facilitate greater levels of information accessibility, the presentation of information must be made as familiar and comfortable as possible.
  • In this vein, one way to foster success of electronic presentations of information will be to allow users to handle information in a familiar manner. Stated another way, the use and manipulation of electronically-presented information may mimic those paradigms that users are most familiar with, e.g., printed documents, as an initial invitation to their use. As a result, greater familiarity between users and their "machines" will be engendered, thereby fostering greater accessibility, even if the machines have greater capabilities and provide more content to the user beyond the user's expectations. Once users feel comfortable with new electronic presentations, they will be more likely to take advantage of an entire spectrum of available functionality.
  • Paper books have a simple user interface. One opens a book and begins to read it. If a user wants to do something to the book (for example, add a textual annotation), he may use a pen or pencil to add a notation in the margin next to a relevant passage of the book. Here, the objects presented to the user include the paper page of the book with two regions (the margin and the text on the page itself) and the writing implement. Other types of actions are also available including bookmarking the page (for example, by folding over the top corner of the page), adding a drawing (using the same pen or pencil discussed above), and highlighting a passage (with a pen or pencil of a different color). With these simple objects and tools, a user is able to interact quickly and easily with the pages of the book, creating an environment of active reading with a transparent user interface. Here, for example, a user concentrates on the text, not on the highlighter or pen in her hand when actively reading and annotating (adding a note or highlighting, for instance) the text.
  • The transition of active reading from the paper environment to the electronic book environment has not projected the same transparent user interface. Rather, electronic documents commonly provide a user interface where one selects an action from a tool bar located at the top of a display (for example, turns on a highlighting tool) then selects the object. This sequence of actions can become cumbersome when switching between actions. For instance, switching from adding a text annotation to adding a drawing requires moving a user's hand (or other cursor designating device) from the text area to the tool bar (step 1), selecting the drawing tool (step 2), moving the user's hand back to the text area (step 3), then selecting the object to be annotated (step 4). This number of steps can be compounded if a mistake is made. For example, the user intends to select the drawing tool, but instead selects the highlighting tool. The user then has to move back to the tool bar (step 5), select the correct tool (step 6), move back to the text area (step 7), then select the object to be annotated (step 8). For large displays, the distance the cursor must be made to travel is distracting. Importantly, for stylus-controlled input devices, this large distance translates into significant hand or arm movement that requires the user to change the focus of his attention from the text to be annotated to precise navigation over an extended distance. Performed many times, this change in attention may become a significant distraction and eventually force the user to refrain from actively reading a document or book.
  • While some electronic document editors include the option to display a list of menu items based on a right mouse click over selected text, this option is not readily apparent to all users. One needs to become familiar with a windowing operating environment prior to learning about this feature. In short, because the right mouse button is a lesser known interface, any listing of options based on the operation of the right mouse button is not intuitive. To most users then, the use of this interface is not known and all advantages that may be potentially available to the users will remain a mystery. To this end, despite the programmers' efforts to the contrary, all benefits that remain shrouded behind the right mouse click are as if they never existed.
  • 3. Summary
  • The present invention provides a technique for allowing a user to interact with the electronic display of documents with a simple user interface. Through use of the interface, a user experiences immediate gratification in response to selection of an object or objects. In a first embodiment, a selection of an object results in a display of a list of menu items relevant to the selected object. The user interface for the object is determined based on the type of object selected. Instead of displaying a general user interface for all potential selectable objects, the user interface is tightly associated with the selected object. Through being customized for the selected object (word, margin, page number, title, icon, equation, or the like), the user interface is quick, simple, and unobtrusive. In one embodiment, the goal associated with the user interface includes being functionally transparent to the user. The present application also relates to the mechanism underlying the functionality of the display and operation of the user interface. In the context of the present invention, a "document" or "book" encompasses all forms of electronically displayable information including but not limited to books, manuals, reference materials, picture books, etc. Further, the documents or books may include catalogs, e-commerce publications, articles, web pages, and the like.
  • "Object" as used herein encompasses all displayed information. With reference to looking up information regarding the object, the object may be a word or a group of words, symbols, icons, reference point on a page, page number, equation, margin, title, title bar, corner of the screen, and the like.
  • For the purpose of this disclosure, annotations are generally related to textual annotations. However, other annotations that may be used include highlighting, drawings (as one would expect to do with a pencil or pen to a paper book), and bookmarks. While the annotations are to be displayed in conjunction with the document, the underlying document is not modified. Related annotations and techniques for creating them are described in the following disclosures:
  • U.S. Serial No. (BW 03797.80027), filed December , 1999, entitled "Bookmarking and Placemarking a Displayed Document in a Computer System;"
  • U.S. Serial No. (BW 03797.84619) filed , entitled "Method and Apparatus for Installing and Using Reference Materials In Conjunction With Reading Electronic Content;"
  • U.S. Serial No. (BW 03797.78802), filed December , 1999, entitled "System and Method for Annotating an Electronic Document Independently of Its Content;"
  • U.S. Serial No. (BW 03797.84618), filed December , 1999, entitled "Method and Apparatus for Capturing and Rendering Text Annotations For Non-Modifiable Electronic Content;" and,
  • U.S. Serial No. (BW 03797.84617), filed December , 1999, entitled "Method and Apparatus For Capturing and Rendering Annotations For Non-Modifiable Electronic Content"
  •    which are incorporated herein by reference in their entireties for any enabling disclosure.
  • These and other novel advantages, details, embodiments, features and objects of the present invention will be apparent to those skilled in the art from following the detailed description of the invention, the attached claims and accompanying drawings, listed herein, which are useful in explaining the invention.
  • 4. Brief Description of Drawings
  • Figure 1 shows a general purpose computer supporting the display and annotation of an electronic document in accordance with embodiments of the present invention.
  • Figure 2 shows a displayed document on a computer screen in accordance with embodiments of the present invention.
  • Figures 3A and 3B show a displayed document with an object selected in accordance with embodiments of the present invention.
  • Figure 4 shows a displayed document with a first reference window in accordance with embodiments of the present invention.
  • Figure 5 shows a displayed document with a second reference window in accordance with embodiments of the present invention.
  • Figures 6A and 6B show two file formats for annotations in accordance with embodiments of the present invention.
  • Figure 7 shows a method for operating a user interface in accordance with embodiments of the present invention.
  • 5. Detailed Description
  • The present invention relates to an improved user interface for use with the electronic display and active reading of documents or books.
  • Although not required, the invention will be described in the general context of computer-executable instructions, such as program modules. Generally, program modules include routines, programs, objects, scripts, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with any number of computer system configurations including, but not limited to, distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices. The present invention may also be practiced in personal computers (PCs), hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like.
  • Figure 1 is a schematic diagram of a computing environment in which the present invention may be implemented. The present invention may be implemented within a general purpose computing device in the form of a conventional personal computer 200, including a processing unit 210, a system memory 220, and a system bus 230 that couples various system components including the system memory to the processing unit 210. The system bus 230 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory includes read only memory (ROM) 240 and random access memory (RAM) 250.
  • A basic input/output system 260 (BIOS), containing the basic routines that help to transfer information between elements within the personal computer 200, such as during start-up, is stored in ROM 240. The personal computer 200 further includes a hard disk drive 270 for reading from and writing to a hard disk, not shown, a magnetic disk drive 280 for reading from or writing to a removable magnetic disk 290, and an optical disk drive 291 for reading from or writing to a removable optical disk 292 such as a CD ROM or other optical media. The hard disk drive 270, magnetic disk drive 280, and optical disk drive 291 are connected to the system bus 230 by a hard disk drive interface 292, a magnetic disk drive interface 293, and an optical disk drive interface 294, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the personal computer 200.
  • Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 290 and a removable optical disk 292, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, Bernoulli cartridges, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the exemplary operating environment.
  • A number of program modules may be stored on the hard disk, magnetic disk 290, optical disk 292, ROM 240 or RAM 250, including an operating system 295, one or more application programs 296, other program modules 297, and program data 298. A user may enter commands and information into the personal computer 200 through input devices such as a keyboard 201 and pointing device 202. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 210 through a serial port interface 206 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB). A monitor 207 or other type of display device is also connected to the system bus 230 via an interface, such as a video adapter 208. In addition to the monitor, personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
  • The personal computer 200 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 209. The remote computer 209 may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the personal computer 200, although only a memory storage device 211 has been illustrated in Figure 1. The logical connections depicted in Figure 1 include a local area network (LAN) 212 and a wide area network (WAN) 213. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
  • When used in a LAN networking environment, the personal computer 200 is connected to the local network 212 through a network interface or adapter 214. When used in a WAN networking environment, the personal computer 200 typically includes a modem 215 or other means for establishing communications over the wide area network 213, such as the Internet. The modem 215, which may be internal or external, is connected to the system bus 230 via the serial port interface 206. In a networked environment, program modules depicted relative to the personal computer 200, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • In addition to the system described in relation to Figure 1, the invention may be practiced on a handheld computer. Further, purpose-built devices may support the invention as well. In short, handheld computers and purpose-built devices are similar in structure to the system of Figure 1 but may be limited to a display (which may be touch-sensitive to a human finger or stylus), memory (including RAM and ROM), and a synchronization/modem port for connecting the handheld computer and purpose-built devices to another computer or a network (including the Internet) to download and/or upload documents or download and/or upload annotations. The description of handheld computers and purpose-built devices is known in the art and is omitted for simplicity. The invention may be practiced using C. Also, it is appreciated that other languages may be used including C++, assembly language, and the like.
  • Figure 2 shows a displayed document on a computer screen in accordance with embodiments of the present invention. As preferred, the document is displayed in a form that closely resembles the appearance of a paper equivalent of the e-book and, in this case, a paper novel. The document reader window 101 may comprise a variety of portions including a title bar 101A listing the title of the document and a body 102. In the body 102 of the display window, various portions of a document may be displayed. Figure 2 shows an example where a title 104, a chapter number 105, a chapter title 106, and the text of the chapter 107 are displayed. Similar to an actual book, margins 108, 109, 110, and 111 appear around the displayed text. As referred to herein, the displayed elements may be independently referenced. Here, for example, object 103 "sleepy" has a drawing annotation, placed there by the user, in the form of a box around it. The presence of icon 112 indicates that a text annotation is present in the line of text adjacent to the icon 112. While the icon 112 is represented as the letter "T" in a circle, it will be appreciated by those of skill in the art that other representations may be used to designate the presence of an annotation. For example, other letters may be used (for instance, the letter "A" for annotation) or other symbols may be used (for instance, a symbol of an open book) or any other representation that indicates that an annotation exists. Tapping on (or otherwise activating) the icon (or other designation) results in the display of a stored annotation. In an environment where a stylus is used to tap directly on a displayed image, the selection of the object provides an environment where the user physically interacts with the displayed image by actually tapping the image of the object itself.
  • Figure 3A shows a displayed window after an object has been selected. Here, for example, the object "beginning" 301 in the first line of text 107 was selected. The object may be selected through tapping it with a stylus or a user's finger. Alternatively, a user may position a cursor over the object then select the object (by clicking a mouse button or by operating a designation source). As shown in Figure 3A, the object may be inverted, meaning that the pixels that make up the object and surrounding pixels are inverted. Alternative embodiments include changing the color of the pixels surrounding the object or highlighting the object in some other way as is known in the art.
  • In this example of Figure 3A, a user used a stylus to tap on a display screen. The tap point (the point receiving the tap from the user's stylus) in this example was the center of the second "g" of the selected object 301 (here, the word "beginning").
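  • As a rough illustration of the selection feedback described above, the following C sketch hit-tests a tap point against per-word bounding boxes and inverts the pixels of the selected word. The Word and Rect types, the 32-bit framebuffer layout, and the function names are assumptions made for illustration, not details taken from the disclosure.

```c
#include <stdint.h>
#include <stddef.h>

typedef struct { int left, top, right, bottom; } Rect;

/* Hypothetical layout record: one bounding box per displayed word. */
typedef struct { const char *text; Rect box; } Word;

/* Return the word whose bounding box contains the tap point, or NULL. */
const Word *word_at_tap(const Word *words, size_t count, int x, int y) {
    for (size_t i = 0; i < count; ++i) {
        const Rect *r = &words[i].box;
        if (x >= r->left && x <= r->right && y >= r->top && y <= r->bottom)
            return &words[i];
    }
    return NULL;
}

/* Invert the pixels of the selected word's bounding rectangle.  Assumes a
 * 32-bit-per-pixel framebuffer with 'stride' pixels per row; the alpha
 * channel (top byte) is left untouched. */
void invert_region(uint32_t *framebuffer, int stride, Rect r) {
    for (int y = r.top; y < r.bottom; ++y) {
        uint32_t *row = framebuffer + (size_t)y * stride;
        for (int x = r.left; x < r.right; ++x)
            row[x] ^= 0x00FFFFFFu;   /* flip RGB, keep alpha */
    }
}
```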
  • After an object has been selected, window 302 is displayed close to the physical location of object 301. Window 302 contains actions operable on the selected object. In one embodiment, as shown in Figure 3A, window 302 is shown close to object 301. By having window 302 close to selected object 301, the user interface minimizes the distance a user needs to move a cursor or pointing device (including stylus or finger tip) from the object to the menu items provided in window 302. A variety of distances and configurations of window 302 may be used. One example shows the menu options with the centers of adjacent menu options no more than 0.2 inches apart from each other. Also, the first of the menu options may be displayed no more than 0.3 inches away from the tap point used to select object 301. The distances shown above are merely exemplary and do not limit the scope of the claims. Other examples of values include 0.5 and 1.0 inches, respectively. Again, these values are merely used for example.
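  • The exemplary spacings above (0.2 inches between the centers of adjacent options, 0.3 inches from the tap point to the first option) only become pixel offsets once a display resolution is assumed. The C sketch below uses an assumed 96-dpi display and illustrative type and constant names; it is one possible layout, not the disclosed implementation.

```c
typedef struct { int x, y; } Point;

#define DISPLAY_DPI            96     /* assumed display resolution            */
#define ITEM_CENTER_SPACING_IN 0.2    /* exemplary value from the description  */
#define FIRST_ITEM_OFFSET_IN   0.3    /* exemplary value from the description  */

/* Fill 'centers' with the on-screen center of each menu option, laid out as
 * a vertical column starting near the tap point. */
void layout_menu(Point tap, int option_count, Point *centers) {
    int first_offset = (int)(FIRST_ITEM_OFFSET_IN   * DISPLAY_DPI + 0.5);
    int spacing      = (int)(ITEM_CENTER_SPACING_IN * DISPLAY_DPI + 0.5);
    for (int i = 0; i < option_count; ++i) {
        centers[i].x = tap.x + first_offset;   /* just beside the tap point     */
        centers[i].y = tap.y + i * spacing;    /* 0.2" between adjacent centers */
    }
}
```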
  • Figure 3B shows an embodiment where window 302 overlies selected object 301. Overlaying object 301 with the window 302 provides a further minimization of the distance that needs to be traveled by the cursor or pointing device. As shown in Figure 3B, window 302 is opaque. Selected object 301 is completely hidden. Alternatively, window 302 of Figure 3B may be semi-transparent and allow a ghost or grayed out image of selected object 301 to show through window 302. Other techniques for projecting the identity of object 301 through window 302 without diminishing the display of menu items in window 302 include projecting selected object with thin lines, italicized font formats, different colors, active shifting colors (for example, the outline of each letter or character of the object being traced with alternating bouncing black and white or shifting colored lines) and the like. An advantage of overlying window 302 over selected object 301 is that the actual distance needed to move between the selected object and the menu items of window 302 is reduced.
  • The location of window 302 as shown in Figures 3A and 3B may be modified to present the user with the most recently selected menu item juxtaposed to the selected object. For example, in Figure 3A, a user may have previously added a bookmark to the displayed text 107. Remembering this action, the system presents window 302 on selection of object 301 with the menu option of "Add Bookmark" closest to the insertion point. With respect to Figure 3B, the previous action may have been to add a highlight to the displayed content. Accordingly, the position of the window 302 is modified to position the menu option of "Add Highlight" closest to the insertion point. It is readily appreciated that the window 302 may be positioned so that the text of the last selected menu item (in Figure 3B, "Add Highlight") does not directly overlie the ghosted image of the selected object 301 to improve readability of the selected object 301.
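  • Continuing the sketch above, one way to place the most recently used option nearest the insertion point is simply to shift the whole menu column by that option's current offset from the tap point. The function and field names remain illustrative assumptions, not part of the disclosure.

```c
/* Shift the menu column so that the most recently used option (index
 * 'last_used') is the one whose center falls nearest the tap point.
 * 'centers' is the layout produced by layout_menu() above. */
void bias_toward_last_used(Point tap, int option_count, int last_used,
                           Point *centers) {
    if (last_used < 0 || last_used >= option_count)
        return;
    int shift = centers[last_used].y - tap.y;  /* distance of that item from the tap */
    for (int i = 0; i < option_count; ++i)
        centers[i].y -= shift;                 /* move the whole column up or down   */
}
```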
  • Here, the events as shown in Figures 3A and 3B are based on the primary input function including a left mouse click or a single tap of a stylus, among other things. The primary input function becomes the initial operational interface device revealed to the user. So, when a user first initiates a document active reading system as described herein, the options available through window 302 will become readily apparent with the first selection of an object 301. Moreover, the same movement (i.e., tapping object 301) controls both the pointing to and selection of the object. Visual feedback occurs immediately at or near the selection.
  • As represented in Figures 3A and 3B, the following options are displayed:
  • Add Bookmark
  • Add Highlight
  • Add Note
  • Add Drawing
  • Find ...
  • Copy Text
  • Lookup ...
  • These options as provided in window 302, in conjunction with object 301, present two types of feedback in response to a user selection. Here, both highlighting of the object 301 and presentation of the list of choices (the menu in window 302) occur based on the single user selection (tapping or clicking) of object 301. Various aspects of the actions associated with these menu options are treated in greater detail in the following disclosures,
       U.S. Serial No. (BW 03797.80027), filed December , 1999, entitled "Bookmarking and Placemarking a Displayed Document in a Computer System,"
       U.S. Serial No. (BW 03797.84619) filed , entitled "Method and Apparatus for Installing and Using Reference Materials In Conjunction With Reading Electronic Content,"
       U.S. Serial No. (BW 03797.78802), filed December , 1999, entitled "System and Method for Annotating an Electronic Document Independently of Its Content,"
       U.S. Serial No. (BW 03797.84618), filed December , 1999, entitled "Method and Apparatus for Capturing and Rendering Text Annotations For Non-Modifiable Electronic Content," and,
       U.S. Serial No. (BW 03797.84617), filed December , 1999, entitled "Method and Apparatus For Capturing and Rendering Annotations For Non-Modifiable Electronic Content,"
       which are incorporated herein by reference for any essential disclosure.
  • Alternatively, window 302 may also include options that may affect the display of the content as well. For example, window 302 may include menu options that allow for switching from one book to another. An advantage of displaying more information to the user may include a net reduction in the number of navigation steps required to perform a process. For example, by providing a menu option to allow one to change which book is currently being displayed, a user may switch between books with a few navigational commands. However, the total number of options available to a user at any given time may be substantial. Displaying all of them would overcrowd window 302 and decrease the intuitive nature of adding an annotation to a page.
  • As shown in Figures 3A and 3B, the number of options available to the user is limited. By providing a limited number of options, the menu quickly becomes tailored to the intention of the user. By minimizing the actions displayed to the user, the desire is to provide pure functionality to the user without cluttering the user's reading space.
  • Further, as shown in Figures 3A and 3B, the size of the font as displayed in text 107 may be different from that displayed in window 302. For example, the size of the font used for text 107 may be a 14-point font. The size of the font used for the menu items in window 302 may be an 18-point font. Thus, based on user selection of an object 301, the choices available (menu options in window 302) are displayed in a larger font and close to the selection 301. The resulting user interface will be easier to operate based on an analysis of the interface using Fitts' Law. Fitts' Law defines an index of difficulty for human responses as ID = log2(2A/W), where A is the amplitude or size of a movement and W is the width of a target. Fitts' Law indicates that movement time = a + b(ID), where a and b are constants. Here, the smaller the movement and the larger the target, the smaller the index of difficulty. More information regarding Fitts' Law may be found at http://psych.hanover.edu/classes/hfnotes2/sId041.html.
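  • A short numerical illustration of this Fitts' Law argument: with ID = log2(2A/W), shrinking the amplitude A (the menu rendered next to the tap point) and widening the target W (an 18-point menu font versus a 14-point body font) both reduce ID and therefore the predicted movement time. The constants a and b and the sample distances in the C example below are illustrative values only, not measurements from the disclosure.

```c
#include <math.h>
#include <stdio.h>

/* Fitts' index of difficulty: ID = log2(2A / W), where A is the movement
 * amplitude (distance to the target) and W is the target width.  Predicted
 * movement time is MT = a + b * ID, with a and b empirically fitted. */
static double index_of_difficulty(double amplitude, double width) {
    return log2(2.0 * amplitude / width);
}

static double movement_time(double a, double b, double amplitude, double width) {
    return a + b * index_of_difficulty(amplitude, width);
}

int main(void) {
    const double a = 0.1, b = 0.15;  /* illustrative constants, in seconds */

    /* Toolbar at the top of a large display: long reach, ordinary-size target. */
    double toolbar = movement_time(a, b, /*A=*/8.0, /*W=*/0.25);
    /* Menu rendered beside the tap point in a larger font: short reach, wider target. */
    double nearby  = movement_time(a, b, /*A=*/0.3, /*W=*/0.25 * 18.0 / 14.0);

    printf("toolbar: %.2f s, nearby menu: %.2f s\n", toolbar, nearby);
    return 0;
}
```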
  • At least in one embodiment, the present invention improves the user interface presented to the user by minimizing the distance traversed to select a menu option and by increasing the size of the target area (for example, by increasing the font size) of the menu options.
  • Figures 4 and 5 show further examples of the present invention with respect to display portions. Figure 4 shows a reference window 402 having been opened based on the selection of object 401 (here, the word "natural") and the "Lookup ..." option (from Figure 3, window 302). Reference window 402 displays the results of looking up a definition for the word "natural." In the example of Figure 4, only one reference document was consulted, as only one document may have been installed in conjunction with the lookup functionality. As shown here, where only one reference exists, the system may skip directly to the sole reference, rather than displaying to the user a choice of only one item. If multiple reference documents have been installed with the lookup functionality, then multiple choices relating to the documents installed may be available for the user for selection. In an alternative embodiment, a user may also be given the option of editing the form of the selected object (for lookup purposes).
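  • The lookup branch described above, in which a single installed reference is opened directly while several installed references first present a chooser, can be sketched as follows. The Reference type and the host functions are assumed names introduced for illustration.

```c
#include <stddef.h>

/* Hypothetical descriptor for an installed reference work (dictionary,
 * encyclopedia, and so on). */
typedef struct Reference Reference;

Reference **installed_references(size_t *count);           /* assumed host call */
void        open_reference_window(Reference *r, const char *term);
Reference  *choose_reference(Reference **refs, size_t n);  /* show a chooser    */

/* "Lookup ..." behaviour: with exactly one installed reference, skip the
 * chooser and open reference window 402 directly; with several, let the
 * user pick one first. */
void lookup(const char *selected_word) {
    size_t n = 0;
    Reference **refs = installed_references(&n);
    if (n == 0)
        return;                                   /* nothing installed */
    Reference *target = (n == 1) ? refs[0] : choose_reference(refs, n);
    if (target)
        open_reference_window(target, selected_word);
}
```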
  • Figure 4 also shows the page number "i" as 403. In one embodiment, the page number is always displayed in order to provide the user with a standard window appearance for reference window 402. In another embodiment, the page number 403 may be omitted where there is only enough information to fill one reference window 402, and included where there is more information than space available in a single window 402.
  • Figure 5 shows an example of a second page of uncovered reference information relating to the object "natural." The second page of reference information is shown as reference window 404 with page number 405. It will be appreciated that changing from reference window 402 to reference window 404 may or may not involve navigating to a new window. If multiple windows are employed, the reference windows (402, 404) may overlie one another. Alternatively, they may be cascaded, allowing a user to jump pages by selecting new windows. Further, there may be only a single reference window 402 in which different content is displayed, with the content being altered to show additional information regarding the selected object or, upon navigation, new information for a newly selected object.
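As a further illustrative sketch (all names and sizes are assumed), the paging behavior of reference windows 402 and 404 could amount to splitting the lookup result into window-sized pages and labeling them with roman numerals only when more than one page exists:

    # Hypothetical sketch: split lookup text into pages that fit one reference
    # window, and show roman-numeral page numbers (i, ii, ...) only when there
    # is more than one page.
    def paginate(text, chars_per_window=200):
        return [text[i:i + chars_per_window]
                for i in range(0, len(text), chars_per_window)]

    def to_roman(n):
        # Sufficient for this sketch (up to ten pages).
        numerals = ["i", "ii", "iii", "iv", "v", "vi", "vii", "viii", "ix", "x"]
        return numerals[n - 1]

    definition = "natural: existing in or derived from nature; " * 12
    pages = paginate(definition)
    for number, page in enumerate(pages, start=1):
        label = to_roman(number) if len(pages) > 1 else ""
        print(f"[reference window page {label}] {page[:40]}...")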
  • The menu choices available to users include a number of annotation features. These annotation features can add value to a book. For example, while a textbook alone may not have much value, the same textbook with annotations from Albert Einstein or Stephen Hawking may be extremely valuable. However, if one were to purchase a book with annotations, one would not readily want to modify the purchased book (at least for copyright concerns) with one's own annotations or other active reading activities. At least one aspect of the present invention allows users the freedom to actively read a displayed text without the burden of contemplating how one is modifying the underlying document. Here, the user may be shielded from modifying the underlying document by having all annotations added to a separate document, or to a modifiable portion of the document apart from the displayed, non-modifiable portion. Figures 6A and 6B describe how annotations may be captured and stored.
  • Figures 6A and 6B show various storage techniques for storing annotations in accordance with embodiments of the present invention. Figure 6A shows a reference B 701 as having been annotated. The file structure of Figure 6A has modifiable (703-706) and non-modifiable (702) portions. Files of this type include Infotext file formats as are known in the art. Annotations 706 may be stored in combination with the non-modifiable content 702. An annotation 706 may be stored in a file with header 703 and body 706. The header 703 includes, for example, the file position 704 of the object with which the annotation 706 is associated. It may also include an indication of the type of annotation 706 in file portion 705. As discussed above, the annotation 706 may include a highlight, a bookmark, a drawing to be overlaid over the object, or a text annotation.
  • Figure 6B shows the non-modifiable content of reference B 702 as a separate file apart from the annotation file 707. The annotation file 707 of Figure 6B has constituent elements similar to those of the annotation structure of Figure 6A. Annotation file 707 may include a file portion 708 that indicates to which non-modifiable document (here, 702) it is linked. Using the approach set forth in Figure 6B, one file may store all annotations for a user, with the non-modifiable content portions 702 being stored separately. This approach has the advantage of being able to quickly scan all annotations at one time, rather than accessing all documents 701 of Figure 6A (which include the non-modifiable portions) to obtain the annotations stored therein. Greater detail on how to create and store annotations is disclosed in U.S. Serial No. (BW 03797.84617), filed December , 1999, entitled "Method and Apparatus For Capturing and Rendering Annotations For Non-Modifiable Electronic Content," whose contents are incorporated by reference for any essential disclosure.
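A minimal sketch of the two storage layouts of Figures 6A and 6B might look as follows; the field names are assumptions chosen to mirror the reference numerals (file position 704, annotation type 705, body 706, document link 708) rather than an actual file format:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Annotation:
        file_position: int        # 704: position of the annotated object
        annotation_type: str      # 705: e.g. "highlight", "bookmark", "drawing", "text"
        body: str                 # 706: the annotation content itself

    # Figure 6A style: annotations stored together with the non-modifiable content.
    @dataclass
    class AnnotatedDocument:
        non_modifiable_content: str                     # 702
        annotations: List[Annotation] = field(default_factory=list)

    # Figure 6B style: a separate annotation file that records which
    # non-modifiable document it is linked to.
    @dataclass
    class AnnotationFile:
        linked_document_id: str                         # 708
        annotations: List[Annotation] = field(default_factory=list)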
  • Figure 7 shows a method for controlling the display of a user interface in accordance with embodiments of the present invention. Figure 7 is described with reference to the use of a stylus as an input device for tapping on a computer screen. For the purposes of Figure 7, a "tap" may be defined as a contact with the surface of the display screen and a subsequent release that is near the initial contact point in both time and distance across the surface of the display screen. It will be readily appreciated that a mouse, with operation of the primary mouse button, may be used as well. Starting with step 801, a tap is received. Next, the system selects the object under the tap point of the computer screen, as represented by step 802. Menu option window 302 is rendered with options relevant to the selected object 301, as shown in step 803. If a timeout interval is in operation, the system determines whether a subsequent tap was received within a given time interval T from the initial tap of step 801. If no subsequent tap was received (or one was received after time interval T), then the menu option window 302 is released (step 805) and the system waits for the next tap (step 801). If the subsequent tap was received within time interval T, the system determines whether the subsequent tap was inside or outside the menu option window 302 (step 806). If inside the window, the system executes the selected option (step 807). If outside the menu option window 302, the menu option window 302 is released and the system waits for the next tap (step 801).
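The flow of Figure 7 can be summarized as a small event loop. The sketch below is an assumed, simplified rendering of steps 801-807; the helper callables, the menu object's contains method, and the timeout value are hypothetical and would be supplied by the surrounding system:

    def run_tap_loop(wait_for_tap, object_at, show_menu, hide_menu,
                     execute_option, timeout_seconds=3.0):
        # Steps 801-803: receive a tap, select the object under it,
        # and render the menu option window for that object.
        while True:
            tap = wait_for_tap()                       # step 801
            selected = object_at(tap)                  # step 802
            menu = show_menu(selected)                 # step 803

            # Step 804: wait for a follow-up tap within the timeout interval.
            next_tap = wait_for_tap(timeout=timeout_seconds)
            if next_tap is None:
                hide_menu(menu)                        # step 805: release the window
                continue                               # back to step 801
            if menu.contains(next_tap):
                execute_option(menu, next_tap)         # step 807: execute selection
            else:
                hide_menu(menu)                        # release and wait for next tap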
  • In the foregoing specification, the present invention has been described with reference to specific exemplary embodiments thereof. Although the invention has been described in terms of various embodiments, those skilled in the art will recognize that various modifications, embodiments or variations of the invention can be practiced within the spirit and scope of the invention as set forth in the appended claims. All are considered within the sphere, spirit, and scope of the invention. The specification and drawings are, therefore, to be regarded in an illustrative rather than restrictive sense. Accordingly, it is not intended that the invention be limited except as may be necessary in view of the appended claims.
  • The following is a list of further preferred embodiments of the invention:
  • Embodiment 1. A computer-implemented method for providing a list of menu options comprising the steps of:
  • receiving a tap from a stylus at a tap point;
  • identifying an object displayed below the tap point;
  • highlighting the object and displaying a window containing a list of menu options related to the object.
  • Embodiment 2. The computer-implemented method according to embodiment 1, wherein said displaying step further comprises the step of:
  • displaying the list of menu options adjacent to said tap point.
  • Embodiment 3. The computer-implemented method according to embodiment 1, wherein said displaying step further comprises the step of:
  • displaying the list of menu options on top of said point.
  • Embodiment 4. The computer-implemented method according to embodiment 1, further comprising the steps of:
  • receiving another tap from said stylus outside the window; and,
  • closing the window.
  • Embodiment 5. The computer-implemented method according to embodiment 1, further comprising the steps of:
  • receiving another tap from said stylus designating one of said menu options inside the window; and,
  • executing said designated one of said menu options.
  • Embodiment 6. A computer-implemented method for providing a list of menu options in a system displaying a non-modifiable electronic book comprising the steps of:
  • receiving a designation at a point;
  • identifying an object displayed below the point;
  • highlighting the object and displaying a window containing a list of menu options related to the object.
  • Embodiment 7. The computer-implemented method according to embodiment 6, wherein said displaying step further comprises the step of:
  • displaying the list of menu options adjacent to said tap point.
  • Embodiment 8. The computer-implemented method according to embodiment 6, wherein said displaying step further comprises the step of:
  • displaying the list of menu options on top of said point.
  • Embodiment 9. The computer-implemented method according to embodiment 6, further comprising the steps of:
  • receiving another tap from said stylus outside the window; and,
  • closing the window.
  • Embodiment 10. The computer-implemented method according to embodiment 6, further comprising the steps of:
  • receiving another tap from said stylus designating one of said menu options inside the window; and,
  • executing said designated one of said menu options.
  • Embodiment 11. The computer-implemented method according to embodiment 6, wherein said designation is determined by a stylus tapping a screen of a computer and said point is a tap point designated by said stylus.
  • Embodiment 12. A user interface for providing a list of menu options in response to a designation of an object comprising:
  • a first display portion having objects in a non-modifiable portion of a document;
  • a second display portion having a selection of at least one of said objects;
  • a third display portion having a list of menu options relevant to said selection, wherein said second and said third display portions are displayed in response to a user interaction with said first display portion.
  • Embodiment 13. The user interface according to embodiment 12, wherein said second and third display portions are removed upon selection of said first display portion outside the third display portion.
  • Embodiment 14. A user interface for providing a list of menu options in response to a designation of an object comprising:
  • a first display portion having objects in a non-modifiable portion of a document;
  • a second display portion having a list of menu options,
  •    wherein said second display portion is displayed in response to a user interaction with said first display portion and selection of at least one of said objects, and
       wherein said second display portion overlies said selection.

Claims (12)

  1. A computer-implemented method for providing a list of menu options in a system displaying a non-modifiable electronic book comprising the steps of:
    receiving a first designation at a first point;
    highlighting an object displayed below the first point and displaying a first window containing a first list of menu options related to the object, executing one of the menu options in said first list, and storing an indication of the executed menu option;
    receiving a second designation at a second point; and
    displaying a second window having a second list of menu options including said executed menu option from said first window.
  2. The computer-implemented method according to claim 1, wherein said displaying step further comprises the step of:
    displaying said executed menu option adjacent to said second point.
  3. The computer-implemented method according to claim 1, wherein said displaying step further comprises the step of:
    displaying the second list of menu options on top of said second point.
  4. The computer-implemented method according to claim 1, further comprising the steps of:
    receiving a third designation outside the second window; and,
    closing the second window.
  5. The computer-implemented method according to claim 1, further comprising the step of displaying said second window so that the executed menu option is the nearest of the second list of menu options adjacent to said second point.
  6. The computer-implemented method according to claim 1, wherein at least one of the first designation and the second designation is determined by a stylus tapping a screen of a computer and at least one of said first point and the second point is a tap point designated by said stylus.
  7. A user interface for providing a list of menu options in response to a designation of an object in an electronic book comprising:
    a first display portion having objects in a non-modifiable portion of a document;
    a second display portion having a selection of at least one of said objects;
    a third display portion having a list of menu options relevant to said selection, wherein said second and said third display portions are displayed in response to a user interaction with said first display portion, wherein the third display portion includes a previously selected menu option within a predetermined distance to said selection of at least one of said objects.
  8. The user interface according to claim 7, wherein said second and third display portions are removed upon selection of said first display portion outside the third display portion.
  9. The user interface according to claim 7, wherein said predetermined distance ranges from 0.20 inch to 1.0 inch.
  10. The user interface according to claim 9, wherein said predetermined distance ranges from 0.30 inch to 0.50 inch.
  11. A user interface for providing a list of menu options in response to a designation of an object comprising:
    a first display portion having objects in a non-modifiable portion of a document;
    a second display portion having a list of menu options, wherein said second display portion is displayed in response to a user interaction with said first display portion and selection of at least one of said objects, and
     wherein said second display portion includes a recently selected menu option juxtaposed to said selection.
  12. The user interface according to claim 11, wherein said object has a first font size and said menu options have a second font size, said first font size being different from the second font size.
EP04018712A 1999-12-07 2000-12-07 System, method and user interface for active reading of electronic content Expired - Lifetime EP1473624B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US455808 1999-12-07
US09/455,808 US6714214B1 (en) 1999-12-07 1999-12-07 System method and user interface for active reading of electronic content
EP00983977A EP1236081B1 (en) 1999-12-07 2000-12-07 System, method and user interface for active reading of electronic content

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
EP00983977A Division EP1236081B1 (en) 1999-12-07 2000-12-07 System, method and user interface for active reading of electronic content
EP00983977.0 Division 2000-12-07

Publications (3)

Publication Number Publication Date
EP1473624A2 true EP1473624A2 (en) 2004-11-03
EP1473624A3 EP1473624A3 (en) 2005-03-02
EP1473624B1 EP1473624B1 (en) 2010-08-04

Family

ID=23810360

Family Applications (2)

Application Number Title Priority Date Filing Date
EP04018712A Expired - Lifetime EP1473624B1 (en) 1999-12-07 2000-12-07 System, method and user interface for active reading of electronic content
EP00983977A Expired - Lifetime EP1236081B1 (en) 1999-12-07 2000-12-07 System, method and user interface for active reading of electronic content

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP00983977A Expired - Lifetime EP1236081B1 (en) 1999-12-07 2000-12-07 System, method and user interface for active reading of electronic content

Country Status (6)

Country Link
US (2) US6714214B1 (en)
EP (2) EP1473624B1 (en)
AT (2) ATE273534T1 (en)
AU (1) AU2066001A (en)
DE (2) DE60044787D1 (en)
WO (1) WO2001042899A1 (en)

Families Citing this family (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7028267B1 (en) * 1999-12-07 2006-04-11 Microsoft Corporation Method and apparatus for capturing and rendering text annotations for non-modifiable electronic content
US7337389B1 (en) 1999-12-07 2008-02-26 Microsoft Corporation System and method for annotating an electronic document independently of its content
US9424240B2 (en) 1999-12-07 2016-08-23 Microsoft Technology Licensing, Llc Annotations for electronic content
US7243299B1 (en) * 2000-04-21 2007-07-10 Microsoft Corporation Methods and apparatus for displaying multiple contexts in electronic documents
US7234108B1 (en) * 2000-06-29 2007-06-19 Microsoft Corporation Ink thickness rendering for electronic annotations
US20020059343A1 (en) * 2000-11-10 2002-05-16 Masahiro Kurishima Client apparatus and recording medium that records a program thereof
US20020099552A1 (en) * 2001-01-25 2002-07-25 Darryl Rubin Annotating electronic information with audio clips
US20060161622A1 (en) * 2001-04-13 2006-07-20 Elaine Montgomery Methods and apparatuses for selectively sharing a portion of a display for application based screen sampling using direct draw applications
US20060161623A1 (en) * 2001-04-13 2006-07-20 Elaine Montgomery Methods and apparatuses for selectively sharing a portion of a display for application based screen sampling
US20060161624A1 (en) * 2001-04-13 2006-07-20 Elaine Montgomery Methods and apparatuses for dynamically sharing a portion of a display for application based screen sampling
US20020188630A1 (en) * 2001-05-21 2002-12-12 Autodesk, Inc. Method and apparatus for annotating a sequence of frames
US7246118B2 (en) * 2001-07-06 2007-07-17 International Business Machines Corporation Method and system for automated collaboration using electronic book highlights and notations
US7103848B2 (en) 2001-09-13 2006-09-05 International Business Machines Corporation Handheld electronic book reader with annotation and usage tracking capabilities
US20040205632A1 (en) * 2001-12-21 2004-10-14 Tung-Liang Li Electronic book
US20040072212A1 (en) * 2002-06-07 2004-04-15 Rokita Steven E. Recognition-driven alkylation of biopolymers
US7058902B2 (en) * 2002-07-30 2006-06-06 Microsoft Corporation Enhanced on-object context menus
US7137076B2 (en) * 2002-07-30 2006-11-14 Microsoft Corporation Correcting recognition results associated with user input
US8261184B2 (en) * 2002-08-02 2012-09-04 Ignatius Xavier Haase Apparatus and method for encoding and displaying documents
US7269787B2 (en) * 2003-04-28 2007-09-11 International Business Machines Coporation Multi-document context aware annotation system
JP4366149B2 (en) * 2003-09-03 2009-11-18 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
FI20031859A (en) * 2003-12-18 2005-06-19 Nokia Corp Creating a message from information displayed on the screen
US7890526B1 (en) * 2003-12-30 2011-02-15 Microsoft Corporation Incremental query refinement
JP4756876B2 (en) 2004-06-09 2011-08-24 キヤノン株式会社 Image display control device, image display control method, program, and storage medium
US8321786B2 (en) * 2004-06-17 2012-11-27 Apple Inc. Routine and interface for correcting electronic text
US7173619B2 (en) * 2004-07-08 2007-02-06 Microsoft Corporation Matching digital information flow to a human perception system
US20060047770A1 (en) * 2004-09-02 2006-03-02 International Business Machines Corporation Direct information copy and transfer between real-time messaging applications
US20060190826A1 (en) * 2005-02-22 2006-08-24 Elaine Montgomery Methods and apparatuses for dynamically sharing a portion of a display during a collaboration session
US8117560B1 (en) * 2005-02-22 2012-02-14 Cisco Technology, Inc. Methods and apparatuses for selectively removing sensitive information during a collaboration session
US7672512B2 (en) * 2005-03-18 2010-03-02 Searete Llc Forms for completion with an electronic writing device
US20070234232A1 (en) * 2006-03-29 2007-10-04 Gheorghe Adrian Citu Dynamic image display
US7631013B2 (en) * 2005-04-06 2009-12-08 Sierra Interactive Systems, Inc. System and method for publishing, distributing, and reading electronic interactive books
US20060277482A1 (en) * 2005-06-07 2006-12-07 Ilighter Corp. Method and apparatus for automatically storing and retrieving selected document sections and user-generated notes
US20070050701A1 (en) * 2005-08-31 2007-03-01 Khaled El Emam Method, system and computer program product for medical form creation
US7882565B2 (en) * 2005-09-02 2011-02-01 Microsoft Corporation Controlled access to objects or areas in an electronic document
US7606856B2 (en) * 2005-11-09 2009-10-20 Scenera Technologies, Llc Methods, systems, and computer program products for presenting topical information referenced during a communication
WO2007069104A1 (en) * 2005-12-12 2007-06-21 Koninklijke Philips Electronics N.V. System and method for opening web links in a browser application
US20070162849A1 (en) * 2006-01-09 2007-07-12 Elizabeth Marciano Interactive kitchen recipe workstation
US20070205992A1 (en) * 2006-03-06 2007-09-06 Samsung Electronics Co., Ltd. Touch sensitive scrolling system and method
US20070205990A1 (en) * 2006-03-06 2007-09-06 Samsung Electronics Co., Ltd. System and method for text entry with touch sensitive keypad
US20070205989A1 (en) * 2006-03-06 2007-09-06 Samsung Electronics Co., Ltd. Camera with a touch sensitive keypad
US20070205991A1 (en) * 2006-03-06 2007-09-06 Samsung Electronics Co., Ltd. System and method for number dialing with touch sensitive keypad
US20070205993A1 (en) * 2006-03-06 2007-09-06 Samsung Electronics Co., Ltd. Mobile device having a keypad with directional controls
US20080022224A1 (en) * 2006-03-07 2008-01-24 Marengo Intellectual Property Ltd. Pushed and pulled information display on a computing device
US20080028324A1 (en) * 2006-03-07 2008-01-31 Marengo Intellectual Property Ltd. Multi-applicaton bulletin board
US20070214430A1 (en) * 2006-03-07 2007-09-13 Coutts Daryl D Textpane for pushed and pulled information on a computing device
US8607149B2 (en) * 2006-03-23 2013-12-10 International Business Machines Corporation Highlighting related user interface controls
US8782133B2 (en) * 2006-07-12 2014-07-15 Daryl David Coutts Multi-conversation instant messaging
US8185605B2 (en) * 2006-07-18 2012-05-22 Cisco Technology, Inc. Methods and apparatuses for accessing an application on a remote device
US20080141160A1 (en) * 2006-12-07 2008-06-12 Nokia Corporation Systems, methods, devices, and computer program products for adding chapters to continuous media while recording
US8219374B1 (en) 2007-02-21 2012-07-10 University Of Central Florida Research Foundation, Inc. Symbolic switch/linear circuit simulator systems and methods
US10078414B2 (en) * 2007-03-29 2018-09-18 Apple Inc. Cursor for presenting information regarding target
US20080256114A1 (en) * 2007-04-10 2008-10-16 Microsoft Corporation Techniques to display associated information between application programs
US8473850B2 (en) * 2007-05-24 2013-06-25 Cisco Technology, Inc. Methods and apparatuses for displaying and managing content during a collaboration session
US8751947B2 (en) 2008-02-29 2014-06-10 Adobe Systems Incorporated Visual and functional transform
JP2009284468A (en) * 2008-04-23 2009-12-03 Sharp Corp Personal digital assistant, computer readable program and recording medium
JP4577428B2 (en) * 2008-08-11 2010-11-10 ソニー株式会社 Display device, display method, and program
US8621390B1 (en) * 2008-10-21 2013-12-31 Amazon Technologies, Inc. Table of contents menu over electronic book content on an electronic paper display
JP5500818B2 (en) * 2008-11-18 2014-05-21 シャープ株式会社 Display control apparatus and display control method
US20100146459A1 (en) * 2008-12-08 2010-06-10 Mikko Repka Apparatus and Method for Influencing Application Window Functionality Based on Characteristics of Touch Initiated User Interface Manipulations
US8839154B2 (en) 2008-12-31 2014-09-16 Nokia Corporation Enhanced zooming functionality
US20100164878A1 (en) * 2008-12-31 2010-07-01 Nokia Corporation Touch-click keypad
US20100225809A1 (en) * 2009-03-09 2010-09-09 Sony Corporation And Sony Electronics Inc. Electronic book with enhanced features
US20100240019A1 (en) * 2009-03-17 2010-09-23 Sathy Kumar R Instructional aids and methods
US8484027B1 (en) 2009-06-12 2013-07-09 Skyreader Media Inc. Method for live remote narration of a digital book
US20140154657A1 (en) * 2012-11-02 2014-06-05 Coursesmart Llc System and method for assessing a user's engagement with digital resources
AU2016202713B2 (en) * 2010-01-11 2018-01-25 Apple Inc. Electronic text manipulation and display
CN108629033B (en) 2010-01-11 2022-07-08 苹果公司 Manipulation and display of electronic text
USD660862S1 (en) * 2010-01-27 2012-05-29 Apple Inc. Display screen or portion thereof with graphical user interface
US20110191692A1 (en) * 2010-02-03 2011-08-04 Oto Technologies, Llc System and method for e-book contextual communication
US20110231388A1 (en) * 2010-03-19 2011-09-22 I/O Interconnect, Ltd. E-book read apparatus and operation thereof
TW201203193A (en) * 2010-07-13 2012-01-16 Pegatron Corp Electronic book and control method thereof
JP2012069065A (en) * 2010-09-27 2012-04-05 Nintendo Co Ltd Information processing program, and information processing device and method
US9678572B2 (en) 2010-10-01 2017-06-13 Samsung Electronics Co., Ltd. Apparatus and method for turning e-book pages in portable terminal
KR101743632B1 (en) 2010-10-01 2017-06-07 삼성전자주식회사 Apparatus and method for turning e-book pages in portable terminal
EP2437153A3 (en) * 2010-10-01 2016-10-05 Samsung Electronics Co., Ltd. Apparatus and method for turning e-book pages in portable terminal
EP2437151B1 (en) 2010-10-01 2020-07-08 Samsung Electronics Co., Ltd. Apparatus and method for turning e-book pages in portable terminal
US8478757B2 (en) * 2010-11-25 2013-07-02 Kobo Inc. Systems and methods for managing profiles
US20120159373A1 (en) * 2010-12-15 2012-06-21 Verizon Patent And Licensing, Inc. System for and method of generating dog ear bookmarks on a touch screen device
US9645986B2 (en) 2011-02-24 2017-05-09 Google Inc. Method, medium, and system for creating an electronic book with an umbrella policy
USD761840S1 (en) 2011-06-28 2016-07-19 Google Inc. Display screen or portion thereof with an animated graphical user interface of a programmed computer system
US8755058B1 (en) 2011-08-26 2014-06-17 Selfpublish Corporation System and method for self-publication
US9141404B2 (en) 2011-10-24 2015-09-22 Google Inc. Extensible framework for ereader tools
EP2587482A3 (en) * 2011-10-25 2013-06-26 Samsung Electronics Co., Ltd Method for applying supplementary attribute information to e-book content and mobile device adapted thereto
US9031493B2 (en) 2011-11-18 2015-05-12 Google Inc. Custom narration of electronic books
US20140074648A1 (en) * 2012-09-11 2014-03-13 Google Inc. Portion recommendation for electronic books
US11763070B2 (en) * 2013-03-15 2023-09-19 PowerNotes LLC Method and system for labeling and organizing data for summarizing and referencing content via a communication network
US9430141B1 (en) * 2014-07-01 2016-08-30 Amazon Technologies, Inc. Adaptive annotations
US10417309B2 (en) * 2014-10-16 2019-09-17 Liquidtext, Inc Facilitating active reading of digital documents
EP3259680A4 (en) 2015-02-20 2018-10-17 Hewlett-Packard Development Company, L.P. Citation explanations
US10552514B1 (en) * 2015-02-25 2020-02-04 Amazon Technologies, Inc. Process for contextualizing position
US10255701B2 (en) 2016-09-21 2019-04-09 International Business Machines Corporation System, method and computer program product for electronic document display
US10261987B1 (en) * 2017-12-20 2019-04-16 International Business Machines Corporation Pre-processing E-book in scanned format
US11693676B2 (en) 2019-10-11 2023-07-04 Kahana Group Inc. Computer based unitary workspace leveraging multiple file-type toggling for dynamic content creation
US11397844B2 (en) 2019-10-11 2022-07-26 Kahana Group Inc. Computer based unitary workspace leveraging multiple file-type toggling for dynamic content creation

Family Cites Families (158)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1987001481A1 (en) 1985-08-30 1987-03-12 Norbert Joseph Stein Electronic book device
US5337438A (en) * 1992-05-04 1994-08-16 The Babcock & Wilcox Company Method and apparatus for constant progression of a cleaning jet across heated surfaces
WO1989011693A1 (en) 1988-05-27 1989-11-30 Wang Laboratories, Inc. Document annotation and manipulation in a data processing system
US5146552A (en) 1990-02-28 1992-09-08 International Business Machines Corporation Method for associating annotation with electronically published material
USRE34476E (en) * 1990-05-14 1993-12-14 Norwood Donald D Hybrid information management system for handwriting and text
JPH07104765B2 (en) * 1990-08-24 1995-11-13 ゼロックス コーポレイション Electronic documentation as a user interface to computer-resident software systems
US5239466A (en) 1990-10-04 1993-08-24 Motorola, Inc. System for selectively routing and merging independent annotations to a document at remote locations
US5347295A (en) 1990-10-31 1994-09-13 Go Corporation Control of a computer through a position-sensed stylus
CA2048039A1 (en) 1991-07-19 1993-01-20 Steven Derose Data processing system and method for generating a representation for and random access rendering of electronic documents
US5666113A (en) * 1991-07-31 1997-09-09 Microtouch Systems, Inc. System for using a touchpad input device for cursor control and keyboard emulation
US5632022A (en) 1991-11-13 1997-05-20 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Encyclopedia of software components
WO1994008310A1 (en) 1992-10-01 1994-04-14 Quark, Inc. Publication system management and coordination
US5434965A (en) * 1992-12-23 1995-07-18 Taligent, Inc. Balloon help system
US5920694A (en) 1993-03-19 1999-07-06 Ncr Corporation Annotation of computer video displays
US5649104A (en) * 1993-03-19 1997-07-15 Ncr Corporation System for allowing user of any computer to draw image over that generated by the host computer and replicating the drawn image to other computers
US5559942A (en) 1993-05-10 1996-09-24 Apple Computer, Inc. Method and apparatus for providing a note for an application program
US5471568A (en) * 1993-06-30 1995-11-28 Taligent, Inc. Object-oriented apparatus and method for scan line conversion of graphic edges
CN1052129C (en) * 1993-07-16 2000-05-03 索尼公司 Image pickup parameter recorder, image pickup method, and editing system
US5390138A (en) * 1993-09-13 1995-02-14 Taligent, Inc. Object-oriented audio system
US6243071B1 (en) 1993-11-03 2001-06-05 Apple Computer, Inc. Tool set for navigating through an electronic book
US5806079A (en) 1993-11-19 1998-09-08 Smartpatents, Inc. System, method, and computer program product for using intelligent notes to organize, link, and manipulate disparate data objects
US5623679A (en) 1993-11-19 1997-04-22 Waverley Holdings, Inc. System and method for creating and manipulating notes each containing multiple sub-notes, and linking the sub-notes to portions of data objects
JP2521035B2 (en) * 1993-12-03 1996-07-31 インターナショナル・ビジネス・マシーンズ・コーポレイション Placing annotations on the display
US5729687A (en) 1993-12-20 1998-03-17 Intel Corporation System for sending differences between joining meeting information and public meeting information between participants in computer conference upon comparing annotations of joining and public meeting information
JP3546337B2 (en) * 1993-12-21 2004-07-28 ゼロックス コーポレイション User interface device for computing system and method of using graphic keyboard
DE4446139C2 (en) 1993-12-30 2000-08-17 Intel Corp Method and device for highlighting objects in a conference system
US5822720A (en) * 1994-02-16 1998-10-13 Sentius Corporation System amd method for linking streams of multimedia data for reference material for display
SE502627C2 (en) * 1994-04-25 1995-11-27 Sandvik Ab Device for protection against over tightening of bolts, screws and the like
US5630125A (en) * 1994-05-23 1997-05-13 Zellweger; Paul Method and apparatus for information management using an open hierarchical data structure
US5948040A (en) 1994-06-24 1999-09-07 Delorme Publishing Co. Travel reservation information and planning system
US5434929A (en) * 1994-07-12 1995-07-18 Apple Computer, Inc. Method and apparatus for setting character style preferences in a pen-based computer system
US5621871A (en) * 1994-08-31 1997-04-15 Jaremko; Mark Automated system and method for annotation using callouts
US5831615A (en) 1994-09-30 1998-11-03 Intel Corporation Method and apparatus for redrawing transparent windows
US5801687A (en) 1994-09-30 1998-09-01 Apple Computer, Inc. Authoring tool comprising nested state machines for use in a computer system
FR2728894A1 (en) 1994-12-29 1996-07-05 Inst Francais Du Petrole PARAXYLENE SEPARATION PROCESS CONTAINING AT LEAST TWO HIGH TEMPERATURE CRYSTALLIZATION STAGES
US5760773A (en) 1995-01-06 1998-06-02 Microsoft Corporation Methods and apparatus for interacting with data objects using action handles
US5734883A (en) 1995-04-27 1998-03-31 Michael Umen & Co., Inc. Drug document production system
US5719595A (en) * 1995-05-09 1998-02-17 Apple Computer, Inc. Method and apparauts for generating a text image on a display with anti-aliasing effect
US6018342A (en) * 1995-07-03 2000-01-25 Sun Microsystems, Inc. Automatically generated content-based history mechanism
TW387181B (en) 1995-07-10 2000-04-11 Hitachi Ltd Electronic press information dispatching system
US6199082B1 (en) * 1995-07-17 2001-03-06 Microsoft Corporation Method for delivering separate design and content in a multimedia publishing system
US5845240A (en) * 1996-07-24 1998-12-01 Fielder; Mark Selective recall and preservation of continuously recorded data
US5687331A (en) 1995-08-03 1997-11-11 Microsoft Corporation Method and system for displaying an animated focus item
US5682439A (en) * 1995-08-07 1997-10-28 Apple Computer, Inc. Boxed input correction system and method for pen based computer systems
US5826025A (en) 1995-09-08 1998-10-20 Sun Microsystems, Inc. System for annotation overlay proxy configured to retrieve associated overlays associated with a document request from annotation directory created from list of overlay groups
US6486895B1 (en) * 1995-09-08 2002-11-26 Xerox Corporation Display system for displaying lists of linked documents
US5717860A (en) 1995-09-20 1998-02-10 Infonautics Corporation Method and apparatus for tracking the navigation path of a user on the world wide web
WO1997012328A1 (en) 1995-09-25 1997-04-03 Adobe Systems Incorporated Optimum access to electronic documents
US5737599A (en) 1995-09-25 1998-04-07 Rowe; Edward R. Method and apparatus for downloading multi-page electronic documents with hint information
US5572643A (en) 1995-10-19 1996-11-05 Judson; David H. Web browser with dynamic display of information objects during linking
US6405221B1 (en) 1995-10-20 2002-06-11 Sun Microsystems, Inc. Method and apparatus for creating the appearance of multiple embedded pages of information in a single web browser display
US5717879A (en) 1995-11-03 1998-02-10 Xerox Corporation System for the capture and replay of temporal data representing collaborative activities
US5838313A (en) * 1995-11-20 1998-11-17 Siemens Corporate Research, Inc. Multimedia-based reporting system with recording and playback of dynamic annotation
JPH09153059A (en) 1995-11-30 1997-06-10 Matsushita Electric Ind Co Ltd History display method
US5761485A (en) 1995-12-01 1998-06-02 Munyan; Daniel E. Personal electronic book system
AU1569797A (en) 1995-12-14 1997-07-03 Motorola, Inc. Electronic book and method of annotation therefor
DE19548532A1 (en) * 1995-12-22 1997-06-26 Giesecke & Devrient Gmbh Method for the detection of an electrically conductive element in a document
US5821925A (en) 1996-01-26 1998-10-13 Silicon Graphics, Inc. Collaborative work environment supporting three-dimensional objects and multiple remote participants
US6081829A (en) 1996-01-31 2000-06-27 Silicon Graphics, Inc. General purpose web annotations without modifying browser
US5832263A (en) 1996-03-15 1998-11-03 Digidox, Inc. System and method for in-place modification of information recorded in read-only storage using modifiable non-volatile storage associated with an agent
US6035330A (en) 1996-03-29 2000-03-07 British Telecommunications World wide web navigational mapping system and method
US5801685A (en) * 1996-04-08 1998-09-01 Tektronix, Inc. Automatic editing of recorded video elements sychronized with a script text read or displayed
US6012055A (en) 1996-04-09 2000-01-04 Silicon Graphics, Inc. Mechanism for integrated information search and retrieval from diverse sources using multiple navigation methods
US5835092A (en) 1996-04-09 1998-11-10 Silicon Graphics, Inc. Mechanism for non-linear browsing of diverse information sources
US5784058A (en) * 1996-05-28 1998-07-21 Sun Microsystems, Inc. User-controllable persistent browser display pages
JPH09322058A (en) 1996-05-29 1997-12-12 Zekuu:Kk Video signal editing device
US6122649A (en) 1996-05-30 2000-09-19 Microsoft Corporation Method and system for user defined and linked properties
US5727129A (en) * 1996-06-04 1998-03-10 International Business Machines Corporation Network system for profiling and actively facilitating user activities
US5918236A (en) 1996-06-28 1999-06-29 Oracle Corporation Point of view gists and generic gists in a document browsing system
US5854630A (en) 1996-07-01 1998-12-29 Sun Microsystems, Inc. Prospective view for web backtrack
US6054990A (en) 1996-07-05 2000-04-25 Tran; Bao Q. Computer system with handwriting annotation
AU3825197A (en) 1996-08-05 1998-02-25 Motorola, Inc. Book-like interface for browsing on-line documents and methods therefor
TW362057B (en) 1996-08-05 1999-06-21 Hh Patent As Method for the deburring of items
US5931912A (en) 1996-08-09 1999-08-03 International Business Machines Corporation Traversal path-based approach to understanding user-oriented hypertext object usage
US5956034A (en) 1996-08-13 1999-09-21 Softbook Press, Inc. Method and apparatus for viewing electronic reading materials
WO1998009446A2 (en) 1996-08-26 1998-03-05 Seng Beng Ho A browsing system and method for computer information
US6064384A (en) * 1996-08-26 2000-05-16 E-Brook Systems Pte Ltd Computer user interface system and method having book image features
GB2317090B (en) 1996-09-06 2001-04-04 Quantel Ltd An electronic graphic system
US5745116A (en) * 1996-09-09 1998-04-28 Motorola, Inc. Intuitive gesture-based graphical user interface
US5940080A (en) * 1996-09-12 1999-08-17 Macromedia, Inc. Method and apparatus for displaying anti-aliased text
US5890172A (en) 1996-10-08 1999-03-30 Tenretni Dynamics, Inc. Method and apparatus for retrieving data from a network using location identifiers
US6049812A (en) 1996-11-18 2000-04-11 International Business Machines Corp. Browser and plural active URL manager for network computers
US6011537A (en) 1997-01-27 2000-01-04 Slotznick; Benjamin System for delivering and simultaneously displaying primary and secondary information, and for displaying only the secondary information during interstitial space
US5933139A (en) * 1997-01-31 1999-08-03 Microsoft Corporation Method and apparatus for creating help functions
US6018334A (en) 1997-02-20 2000-01-25 Eckerberg; Mark Computer pointing device
US6091930A (en) 1997-03-04 2000-07-18 Case Western Reserve University Customizable interactive textbook
US6279005B1 (en) 1997-03-04 2001-08-21 Paul Zellweger Method and apparatus for generating paths in an open hierarchical data structure
US6195694B1 (en) * 1997-03-13 2001-02-27 International Business Machines Corporation Server for reconfiguring control of a subset of devices on one or more kiosks
US5937416A (en) 1997-03-25 1999-08-10 Bennethum Computer Systems Method for preserving data in an electronic document
US5978818A (en) 1997-04-29 1999-11-02 Oracle Corporation Automated hypertext outline generation for documents
US5877757A (en) 1997-05-23 1999-03-02 International Business Machines Corporation Method and system for providing user help information in network applications
US5956885A (en) 1997-06-16 1999-09-28 Zirbes; Michael L. Fishing reel cover
US5933140A (en) 1997-06-30 1999-08-03 Sun Microsystems, Inc. Child window containing context-based help and a miniaturized web page
US6573907B1 (en) * 1997-07-03 2003-06-03 Obvious Technology Network distribution and management of interactive video and multi-media containers
US6016492A (en) * 1997-07-15 2000-01-18 Microsoft Corporation Forward extensible property modifiers for formatting information in a program module
JP3420472B2 (en) * 1997-07-22 2003-06-23 富士通株式会社 System and recording medium used for certification of electronic publication
US6301590B1 (en) 1997-08-11 2001-10-09 Viador Method and apparatus for formatting and displaying data from the internet
JP3853034B2 (en) * 1997-08-13 2006-12-06 シスメックス株式会社 Object boundary determination method and apparatus, and recording medium recording object boundary determination program
US5877766A (en) 1997-08-15 1999-03-02 International Business Machines Corporation Multi-node user interface component and method thereof for use in accessing a plurality of linked records
US5987482A (en) * 1997-09-08 1999-11-16 International Business Machines Corporation Computer system and method of displaying hypertext documents with internal hypertext link definitions
US6279014B1 (en) 1997-09-15 2001-08-21 Xerox Corporation Method and system for organizing documents based upon annotations in context
US5956048A (en) * 1997-11-10 1999-09-21 Kerry R. Gaston Electronic book system
US6157381A (en) 1997-11-18 2000-12-05 International Business Machines Corporation Computer system, user interface component and method utilizing non-linear scroll bar
US6243091B1 (en) 1997-11-21 2001-06-05 International Business Machines Corporation Global history view
US6037934A (en) 1997-11-21 2000-03-14 International Business Machines Corporation Named bookmark sets
US6571211B1 (en) * 1997-11-21 2003-05-27 Dictaphone Corporation Voice file header data in portable digital audio recorder
US6321244B1 (en) 1997-12-04 2001-11-20 Siemens Corporate Research, Inc. Style specifications for systematically creating card-based hypermedia manuals
US6055538A (en) * 1997-12-22 2000-04-25 Hewlett Packard Company Methods and system for using web browser to search large collections of documents
US6195679B1 (en) 1998-01-06 2001-02-27 Netscape Communications Corporation Browsing session recording playback and editing system for generating user defined paths and allowing users to mark the priority of items in the paths
US6163778A (en) * 1998-02-06 2000-12-19 Sun Microsystems, Inc. Probabilistic web link viability marker and web page ratings
US6421065B1 (en) 1998-02-09 2002-07-16 Microsoft Corporation Access of online information featuring automatic hide/show function
US6144991A (en) * 1998-02-19 2000-11-07 Telcordia Technologies, Inc. System and method for managing interactions between users in a browser-based telecommunications network
US6038598A (en) * 1998-02-23 2000-03-14 Intel Corporation Method of providing one of a plurality of web pages mapped to a single uniform resource locator (URL) based on evaluation of a condition
US6105055A (en) * 1998-03-13 2000-08-15 Siemens Corporate Research, Inc. Method and apparatus for asynchronous multimedia collaboration
US6181344B1 (en) * 1998-03-20 2001-01-30 Nuvomedia, Inc. Drag-and-release method for configuring user-definable function key of hand-held computing device
US6331867B1 (en) * 1998-03-20 2001-12-18 Nuvomedia, Inc. Electronic book with automated look-up of terms of within reference titles
US6356287B1 (en) * 1998-03-20 2002-03-12 Nuvomedia, Inc. Citation selection and routing feature for hand-held content display device
IE980959A1 (en) 1998-03-31 1999-10-20 Datapage Ireland Ltd Document Production
US6272484B1 (en) * 1998-05-27 2001-08-07 Scansoft, Inc. Electronic document manager
US6154771A (en) 1998-06-01 2000-11-28 Mediastra, Inc. Real-time receipt, decompression and play of compressed streaming video/hypervideo; with thumbnail display of past scenes and with replay, hyperlinking and/or recording permissively intiated retrospectively
US6535294B1 (en) 1998-06-23 2003-03-18 Discount Labels, Inc. System and method for preparing customized printed products over a communications network
CA2241836A1 (en) * 1998-06-29 1999-12-29 Object Technology International Inc. Natural language transformations for propagating hypertext label changes
US6018742A (en) 1998-07-07 2000-01-25 Perigis Corporation Constructing a bifurcated database of context-dependent and context-independent data items
SG85661A1 (en) * 1998-08-12 2002-01-15 Nippon Telegraph & Telephone Recording medium with a signed hypertext recorded thereon, signed hypertext generating method and apparatus, and signed hypertext verifying method and apparatus
US6710790B1 (en) * 1998-08-13 2004-03-23 Symantec Corporation Methods and apparatus for tracking the active window of a host computer in a remote computer display window
US6226618B1 (en) * 1998-08-13 2001-05-01 International Business Machines Corporation Electronic content delivery system
US6144375A (en) * 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6230171B1 (en) 1998-08-29 2001-05-08 International Business Machines Corporation Markup system for shared HTML documents
US6377983B1 (en) 1998-08-31 2002-04-23 International Business Machines Corporation Method and system for converting expertise based on document usage
US6289362B1 (en) 1998-09-01 2001-09-11 Aidministrator Nederland B.V. System and method for generating, transferring and using an annotated universal address
JP3763678B2 (en) * 1998-09-01 2006-04-05 三菱電機株式会社 Reflective liquid crystal display
US6184886B1 (en) * 1998-09-04 2001-02-06 International Business Machines Corporation Apparatus and method for staging bookmarks
US6369811B1 (en) 1998-09-09 2002-04-09 Ricoh Company Limited Automatic adaptive document help for paper documents
WO2000016541A1 (en) * 1998-09-15 2000-03-23 Microsoft Corporation Annotation creation and notification via electronic mail
US6271840B1 (en) 1998-09-24 2001-08-07 James Lee Finseth Graphical search engine visual index
US6076917A (en) 1998-09-30 2000-06-20 Eastman Kodak Company Ink jet printing of color image and annotations
WO2000020770A1 (en) 1998-10-08 2000-04-13 Imo Industries, Inc. Universal joint for vehicle steering systems
US6320577B1 (en) * 1998-11-03 2001-11-20 Agilent Technologies, Inc. System and method for graphically annotating a waveform display in a signal-measurement system
US6539370B1 (en) 1998-11-13 2003-03-25 International Business Machines Corporation Dynamically generated HTML formatted reports
US6393422B1 (en) 1998-11-13 2002-05-21 International Business Machines Corporation Navigation method for dynamically generated HTML pages
US6034589A (en) 1998-12-17 2000-03-07 Aem, Inc. Multi-layer and multi-element monolithic surface mount fuse and method of making the same
US20020194260A1 (en) * 1999-01-22 2002-12-19 Kent Lawrence Headley Method and apparatus for creating multimedia playlists for audio-visual systems
US6529920B1 (en) * 1999-03-05 2003-03-04 Audiovelocity, Inc. Multimedia linking device and method
US6687878B1 (en) * 1999-03-15 2004-02-03 Real Time Image Ltd. Synchronizing/updating local client notes with annotations previously made by other clients in a notes database
US6425525B1 (en) * 1999-03-19 2002-07-30 Accenture Llp System and method for inputting, retrieving, organizing and analyzing data
US6446110B1 (en) 1999-04-05 2002-09-03 International Business Machines Corporation Method and apparatus for representing host datastream screen image information using markup languages
US6636238B1 (en) * 1999-04-20 2003-10-21 International Business Machines Corporation System and method for linking an audio stream with accompanying text material
US6549220B1 (en) * 1999-06-10 2003-04-15 International Business Machines Corporation Method, system, and program for providing pages of information with navigation and content areas
US6647534B1 (en) * 1999-06-30 2003-11-11 Ricoh Company Limited Method and system for organizing document information in a non-directed arrangement of documents
US6276005B1 (en) 1999-07-02 2001-08-21 Mark G. Sanders Water recycling device
US6760884B1 (en) * 1999-08-09 2004-07-06 Internal Research Corporation Interactive memory archive
US6397264B1 (en) 1999-11-01 2002-05-28 Rstar Corporation Multi-browser client architecture for managing multiple applications having a history list
US7403888B1 (en) * 1999-11-05 2008-07-22 Microsoft Corporation Language input user interface
US6600497B1 (en) 1999-11-15 2003-07-29 Elliot A. Gottfurcht Apparatus and method to navigate interactive television using unique inputs with a remote control
WO2001041452A2 (en) * 1999-12-03 2001-06-07 Mti Film, Llc System and method for identifying inconsistencies in duplicate digital videos
US6831912B1 (en) 2000-03-09 2004-12-14 Raytheon Company Effective protocol for high-rate, long-latency, asymmetric, and bit-error prone data links
US6580821B1 (en) 2000-03-30 2003-06-17 Nec Corporation Method for computing the location and orientation of an object in three dimensional space
US20020099552A1 (en) 2001-01-25 2002-07-25 Darryl Rubin Annotating electronic information with audio clips

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0342838A2 (en) * 1988-05-20 1989-11-23 International Business Machines Corporation User interface for a data input
US5893132A (en) * 1995-12-14 1999-04-06 Motorola, Inc. Method and system for encoding a book for reading using an electronic book
US5982370A (en) * 1997-07-18 1999-11-09 International Business Machines Corporation Highlighting tool for search specification in a user interface of a computer system
WO1999049383A1 (en) * 1998-03-20 1999-09-30 Nuvomedia, Inc. Electronic book system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010056483A1 (en) * 2008-11-13 2010-05-20 Qualcomm Incorporated Method and system for context dependent pop-up menus
US8321802B2 (en) 2008-11-13 2012-11-27 Qualcomm Incorporated Method and system for context dependent pop-up menus
CN102203711B (en) * 2008-11-13 2014-07-16 高通股份有限公司 Method and system for context dependent pop-up menus
US9058092B2 (en) 2008-11-13 2015-06-16 Qualcomm Incorporated Method and system for context dependent pop-up menus
EP2444884A3 (en) * 2010-10-14 2012-05-09 LG Electronics Inc. Electronic device and method for providing menu using the same
US9043709B2 (en) 2010-10-14 2015-05-26 Lg Electronics Inc. Electronic device and method for providing menu using the same

Also Published As

Publication number Publication date
ATE273534T1 (en) 2004-08-15
US20030206189A1 (en) 2003-11-06
AU2066001A (en) 2001-06-18
EP1236081B1 (en) 2004-08-11
WO2001042899A1 (en) 2001-06-14
DE60044787D1 (en) 2010-09-16
DE60012971D1 (en) 2004-09-16
EP1473624B1 (en) 2010-08-04
ATE476698T1 (en) 2010-08-15
EP1236081A1 (en) 2002-09-04
US7260781B2 (en) 2007-08-21
US6714214B1 (en) 2004-03-30
EP1473624A3 (en) 2005-03-02
DE60012971T2 (en) 2005-01-05

Similar Documents

Publication Publication Date Title
US7260781B2 (en) System, method and user interface for active reading of electronic content
US7028267B1 (en) Method and apparatus for capturing and rendering text annotations for non-modifiable electronic content
US5488685A (en) Method and apparatus for providing visual cues in a graphic user interface
US5825355A (en) Method and apparatus for providing a help based window system using multiple access methods
US8627197B2 (en) System and method for annotating an electronic document independently of its content
US6529215B2 (en) Method and apparatus for annotating widgets
US7458014B1 (en) Computer user interface architecture wherein both content and user interface are composed of documents with links
US5371844A (en) Palette manager in a graphical user interface computer system
US7185274B1 (en) Computer user interface architecture wherein users interact with both content and user interface by activating links
US5828374A (en) Method and apparatus for selecting characters along a scroll bar with a slider
US7496830B2 (en) Computer user interface architecture that saves a user's non-linear navigation history and intelligently maintains that history
US5469540A (en) Method and apparatus for generating and displaying multiple simultaneously-active windows
EP1285330B1 (en) Zeroclick
US5461710A (en) Method for providing a readily distinguishable template and means of duplication thereof in a computer system graphical user interface
US9424240B2 (en) Annotations for electronic content
US20020167534A1 (en) Reading aid for electronic text and displays
US7437683B1 (en) Method and apparatus for fostering immersive reading of electronic documents
WO1995007510A1 (en) Method and system for an electronic forms generation user interface
KR100881479B1 (en) Information processing method and apparatus thereof
Walkenbach et al. Office 2010 Library: Excel 2010 Bible, Access 2010 Bible, PowerPoint 2010 Bible, Word 2010 Bible
WO1994017469A1 (en) Graphical user interface for a help system

Legal Events

Code   Title / Description

PUAI   Public reference made under article 153(3) EPC to a published international application that has entered the European phase
       Free format text: ORIGINAL CODE: 0009012

17P    Request for examination filed
       Effective date: 20040806

AC     Divisional application: reference to earlier application
       Ref document number: 1236081; Country of ref document: EP; Kind code of ref document: P

AK     Designated contracting states
       Kind code of ref document: A2; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

RIN1   Information on inventor provided before grant (corrected)
       Inventor names: SILVER, DAVID M.; KEELY, LEROY B.; MADAN, VIKRAM; DEMELLO, MARCO A.

PUAL   Search report despatched
       Free format text: ORIGINAL CODE: 0009013

AK     Designated contracting states
       Kind code of ref document: A3; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

AKX    Designation fees paid
       Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

GRAP   Despatch of communication of intention to grant a patent
       Free format text: ORIGINAL CODE: EPIDOSNIGR1

RIC1   Information provided on IPC code assigned before grant
       Ipc: G06F 3/048 20060101AFI20100127BHEP

GRAS   Grant fee paid
       Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA   (Expected) grant
       Free format text: ORIGINAL CODE: 0009210

AC     Divisional application: reference to earlier application
       Ref document number: 1236081; Country of ref document: EP; Kind code of ref document: P

AK     Designated contracting states
       Kind code of ref document: B1; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

REG    Reference to a national code
       Ref country code: GB; Ref legal event code: FG4D

REG    Reference to a national code
       Ref country code: CH; Ref legal event code: EP

REG    Reference to a national code
       Ref country code: IE; Ref legal event code: FG4D

REF    Corresponds to:
       Ref document number: 60044787; Country of ref document: DE; Date of ref document: 20100916; Kind code of ref document: P

REG    Reference to a national code
       Ref country code: NL; Ref legal event code: VDEP; Effective date: 20100804

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: FI; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804
       Ref country code: AT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804
       Ref country code: NL; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: PT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20101206
       Ref country code: CY; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: BE; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804
       Ref country code: GR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20101105
       Ref country code: SE; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804

PGFP   Annual fee paid to national office [announced via postgrant information from national office to EPO]
       Ref country code: GB; Payment date: 20101201; Year of fee payment: 11

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: DK; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: IT; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804

PLBE   No opposition filed within time limit
       Free format text: ORIGINAL CODE: 0009261

STAA   Information on the status of an EP patent application or granted EP patent
       Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: ES; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20101115

26N    No opposition filed
       Effective date: 20110506

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: MC; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20101231

REG    Reference to a national code
       Ref country code: CH; Ref legal event code: PL

REG    Reference to a national code
       Ref country code: DE; Ref legal event code: R097; Ref document number: 60044787; Country of ref document: DE; Effective date: 20110506

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: CH; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20101231
       Ref country code: IE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20101207
       Ref country code: LI; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20101231

PGFP   Annual fee paid to national office [announced via postgrant information from national office to EPO]
       Ref country code: FR; Payment date: 20111219; Year of fee payment: 12

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: LU; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20101207

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: TR; Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; Effective date: 20100804

GBPC   GB: European patent ceased through non-payment of renewal fee
       Effective date: 20121207

REG    Reference to a national code
       Ref country code: FR; Ref legal event code: ST; Effective date: 20130830

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: FR; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20130102
       Ref country code: GB; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20121207

REG    Reference to a national code
       Ref country code: DE; Ref legal event code: R082; Ref document number: 60044787; Country of ref document: DE; Representative's name: GRUENECKER, KINKELDEY, STOCKMAIR & SCHWANHAEUS, DE

REG    Reference to a national code
       Ref country code: GB; Ref legal event code: 732E; Free format text: REGISTERED BETWEEN 20150115 AND 20150121

REG    Reference to a national code
       Ref country code: DE; Ref legal event code: R082; Ref document number: 60044787; Country of ref document: DE; Representative's name: GRUENECKER PATENT- UND RECHTSANWAELTE PARTG MB, DE; Effective date: 20150126
       Ref country code: DE; Ref legal event code: R081; Ref document number: 60044787; Country of ref document: DE; Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, REDMOND, US; Free format text: FORMER OWNER: MICROSOFT CORP., REDMOND, WASH., US; Effective date: 20150126

PGFP   Annual fee paid to national office [announced via postgrant information from national office to EPO]
       Ref country code: DE; Payment date: 20171129; Year of fee payment: 18

REG    Reference to a national code
       Ref country code: DE; Ref legal event code: R119; Ref document number: 60044787; Country of ref document: DE

PG25   Lapsed in a contracting state [announced via postgrant information from national office to EPO]
       Ref country code: DE; Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES; Effective date: 20190702