US20060028674A1 - Printer with user ID sensor - Google Patents
Printer with user ID sensor
- Publication number
- US20060028674A1 (application US11/193,479)
- Authority
- US
- United States
- Prior art keywords
- printer
- user
- token
- reader
- computer network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/06—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the phase of light
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
- G06F3/0321—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the present invention relates to printing systems, and in particular printing systems involving interactive paper, computer publishing, computer applications, human-computer interfaces, and information appliances.
- documents to be printed are typically sent via local computer networks to one of a number of printers connected to the network.
- the nominated printer is usually the most convenient to the user but unless the user goes to collect the document immediately after sending the print job, the printed document waits in the collection tray. If the document is sensitive, there is a risk that its contents will be disclosed to others passing the printer.
- the present invention provides a computer network for a plurality of users, the computer network comprising:
- Print jobs can be collected from the most convenient printer regardless of a user's current location in the office.
- the Netpage system is comprehensively described in the cross referenced documents as well as the Detailed Description below.
- This system uses a paper- and pen-based interface to computer-based and typically network-based information and applications.
- the user can request print jobs by ‘clicking’ an interactive element on a Netpage document with a Netpage pen and therefore may be remote from any of the networked printers or even the office when print jobs are requested.
- Accordingly, the invention is particularly suited to the Netpage system and will be described with particular reference to its operation within this environment. However, it will be appreciated that the invention has much broader application than Netpage and is not limited or restricted to printing Netpage documents.
- the network comprises a plurality of said printers, each printer associated with one of the printer identifiers respectively;
- each of the network user identifiers is a token and each of the printer identifiers has a token reader such that the user presents their token to the token reader associated with one of the printers to request actual printing of their queued printouts via that printer.
- each token is a short-range RFID tag, a smartcard or a magnetic stripe card.
- the token reader notifies a walk-up-handling application on the server of the user's proximity to the associated printer which in turn initiates printing.
- each of the printer identifiers is a token and each of the network user identifiers has a token reader associated with the user.
- the token reader is an electronic stylus with an optical sensor.
- each token is a surface of the respective printer with coded data disposed on it, the coded data being readable by the optical sensor of each user's electronic stylus.
- the pending printouts are maintained in a queue by the server and each pending printout has a priority such that higher-priority printouts are printed before earlier-queued but lower-priority printouts.
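The queue behaviour described above (higher-priority printouts released before earlier-queued but lower-priority ones) can be sketched with a standard priority queue. This is an illustrative sketch, not the patent's implementation; all class and job names are invented.

```python
import heapq
import itertools

class PrintoutQueue:
    """Pending-printout queue: higher-priority printouts are released
    before earlier-queued but lower-priority ones (names are
    illustrative, not from the patent)."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker preserves FIFO order

    def enqueue(self, printout_id, priority=0):
        # heapq is a min-heap, so negate priority to pop the highest first
        heapq.heappush(self._heap, (-priority, next(self._seq), printout_id))

    def dequeue(self):
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

q = PrintoutQueue()
q.enqueue("job-a", priority=1)
q.enqueue("job-b", priority=5)   # queued later but higher priority
q.enqueue("job-c", priority=1)
assert q.dequeue() == "job-b"    # highest priority first
assert q.dequeue() == "job-a"    # then FIFO among equal priorities
```

The sequence counter guarantees that printouts of equal priority are released in the order they were queued.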
- the token readers are associated with respective printers such that when the user presents their token to the reader it reads the token and identifies both the user and the printer to the server.
- the token identifies the user explicitly.
- the token has a token identifier and the server performs a database lookup to translate the token identifier into a user identity.
- the token reader identifies the printer explicitly.
- the reader has a reader identifier and the server performs a database lookup to translate the reader identifier into a printer identity.
- the token reader and the printer are separate devices which have an electrical connection.
- the token reader is physically built into the printer.
- the reader informs the printer that the user has presented a token and the printer then explicitly retrieves the user's pending printouts for printing.
- the token is a security access or identification badge or card.
- FIG. 1 shows the data flow between Netpage publishers and applications, Netpage services, and Netpage devices
- FIG. 2 is a diagram of the range of content type within a Netpage document
- FIG. 3 shows a Netpage document with a physical structure consisting of a sequence of numbered pages
- FIG. 4 shows a printout consisting of a series of impressions
- FIG. 5 is a diagram showing a user with a pen and default printer
- FIG. 6 shows the pen events recorded in a digital ink stream
- FIG. 7 shows the form data submitted to an application
- FIG. 8 shows a dynamic element for use as a document element
- FIG. 9 shows a dynamic object linked to an existing impression
- FIG. 10 shows the relationship between the document, printout and digital ink stores
- FIG. 11 shows the fundamental flow of data in the Netpage system in greater detail than FIG. 1 ;
- FIG. 12 shows the data flow associated with reprinting impressions
- FIG. 13 shows the data flow associated with printing
- FIG. 14 shows a bifurcated general printing data flow
- FIG. 15 shows the data flow associated with walk-up printing
- FIG. 16 shows the data flow associated with the establishment of a printout queue
- FIG. 17 shows the different levels of network distribution and access possible within the Netpage system
- FIG. 18 shows the data flow if the user has a token read by a reader associated with the printer
- FIG. 19 shows the data flow if the user has a reader for reading the token associated with the printer
- FIG. 20 shows the data flow if the user has a reader that reads the printer token but then uses the printer reader to connect to the Netpage server;
- FIG. 21 shows the data flow between a privately hosted network and a publicly hosted network
- FIG. 22 shows a PC or device hosted Netpage system
- FIG. 23 shows the structure of a complete tag
- FIG. 24 shows a symbol unit cell
- FIG. 25 shows nine symbol unit cells
- FIG. 26 shows the bit ordering in a symbol
- FIG. 27 shows a tag with all bits set
- FIG. 28 shows a tag group made up of four tag types
- FIG. 29 shows the continuous tiling of tag groups
- FIG. 30 shows the interleaving of codewords A, B, C & D with a tag
- FIG. 31 shows a codeword layout
- FIG. 32 shows a tag and its eight immediate neighbours labelled with its corresponding bit index.
- the invention is well suited for incorporation in the Assignee's Netpage system.
- the invention has been described as a component of a broader Netpage architecture.
- the invention is also applicable to other computer networks.
- FIG. 1 shows the interaction between Netpage publishers, applications, services and devices.
- the Netpage document service 1 accepts a document 2 from a Netpage publisher 3 or other Netpage application 4 , and produces a printout 5 via a Netpage printer 6 .
- a printout 5 consists of a series of impressions on either or both sides of a series of paper sheets.
- the printer 6 also lays down a coordinate grid in the form of an array of invisible millimetre-scale tags 7 (see U.S. Ser. No. 10/309,358 cross referenced above). Each tag encodes the two-dimensional coordinates of its location on the impression as well as the impression's unique identifier.
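The tag payload described above carries two things: the impression's unique identifier and the tag's own two-dimensional coordinates. A minimal bit-packing sketch follows; the field widths are illustrative assumptions, not the actual Netpage encoding (which is specified in the cross-referenced applications).

```python
# A simplified sketch of a Netpage-style tag payload: each tag carries
# the impression's unique identifier plus the tag's (x, y) coordinates
# on the impression. Field widths here are illustrative only.
IMPRESSION_BITS = 64
COORD_BITS = 16  # millimetre-scale grid positions

def encode_tag(impression_id, x, y):
    assert impression_id < (1 << IMPRESSION_BITS)
    assert x < (1 << COORD_BITS) and y < (1 << COORD_BITS)
    return (impression_id << (2 * COORD_BITS)) | (x << COORD_BITS) | y

def decode_tag(payload):
    y = payload & ((1 << COORD_BITS) - 1)
    x = (payload >> COORD_BITS) & ((1 << COORD_BITS) - 1)
    impression_id = payload >> (2 * COORD_BITS)
    return impression_id, x, y

# Round-trip: a pen imaging this tag recovers both the impression
# identity and its own position relative to the impression.
assert decode_tag(encode_tag(0xABCDEF, 120, 45)) == (0xABCDEF, 120, 45)
```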
- When a tag is optically imaged by a Netpage pen 8 (see below and U.S. Ser. No. 10/815,636 cross referenced above) the pen is able to identify the corresponding impression as well as its own position relative to the impression.
- When the user of the pen 8 moves the pen relative to the coordinate grid 7, the pen generates a stream of positions. This stream is referred to as digital ink 9.
- a digital ink stream also records when the pen makes contact with a surface and when it loses contact with a surface, and each pair of these so-called pen down and pen up events delineates a stroke drawn by the user using the pen.
- the Netpage tag pattern 7 is typically printed using an invisible infrared ink while visible graphic content is printed using colored inks which are transparent in the infrared part of the spectrum.
- the Netpage pen 8 incorporates a conventional marking nib which utilises an infrared-transparent ink so as not to obscure the tag pattern 7 .
- the document 2 may include an input description 11 which defines command and form data 12 .
- the commands are instructions that may be activated by the user and the forms have designated fields that may be filled in by the user. Both commands and form fields have active zones, i.e. areas of the page where they capture user input.
- the Netpage digital ink service 13 accepts digital ink 9 from a Netpage pen 8 . Since the pen typically only has a short-range communications capability, it forwards the digital ink 9 to the Netpage digital ink service 13 via a Netpage relay 14 which has a longer-range communications capability.
- Typical relays include mobile phones, PDAs and personal computers.
- the digital ink service 13 uses the impression identifier 7 in the digital ink 9 to retrieve the corresponding impression and input description 11 from the document service 1 , and attempts to assign each individual digital ink stroke to a form of the input description 11 . Once it detects that the user of the pen 8 has designated a form submission command, it interprets the digital ink 9 assigned to the form and submits the resultant form data 12 to the application associated with the command.
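The stroke-assignment step above (matching each stroke against the active zones of the input description) amounts to a hit test. The sketch below assumes rectangular zones and invented element names; the actual service may use richer geometry.

```python
# Hypothetical hit-test assigning a digital ink stroke to the input
# element whose active zone contains it (names are illustrative).
def assign_stroke(stroke_points, input_elements):
    """stroke_points: list of (x, y); input_elements: list of dicts
    with a rectangular 'zone' given as (x0, y0, x1, y1)."""
    for element in input_elements:
        x0, y0, x1, y1 = element["zone"]
        if all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in stroke_points):
            return element["name"]
    return None  # stroke is a free annotation, not form input

elements = [
    {"name": "signature_field", "zone": (10, 10, 200, 50)},
    {"name": "submit_command", "zone": (10, 60, 80, 90)},
]
assert assign_stroke([(20, 20), (150, 40)], elements) == "signature_field"
assert assign_stroke([(300, 300)], elements) is None
```

A stroke falling outside every zone is retained as an annotation rather than form input, consistent with the service keeping all digital ink for later searching.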
- the document service 1 keeps a copy of every input description 11 it prints.
- the digital ink service 13 In order to allow a user to fill in a form over an arbitrarily long time, the digital ink service 13 retains a copy of all digital ink 9 it receives, at least until the digital ink is interpreted and submitted to an application 4 .
- the digital ink service 13 optionally retains all digital ink 9 indefinitely, to allow digital ink searching of both form content and document annotations.
- the Netpage pen 8 may be incorporated directly into a hand-held device such as a mobile phone or PDA. Conversely, the pen may incorporate a long-range communications capability and not need a separate relay.
- the digital ink service 13 may identify the interactive display 15 to a target application 4 to allow the application to communicate directly with the interactive display, thus allowing an interaction initiated via paper and pen to lead to a richer screen-based interaction, and generally allowing the development of hybrid paper- and screen-based applications which make the most of both media.
- a pen 8 may use a name service to resolve the network address of a target digital ink service, based on pen identifier and possibly impression identifier.
- a digital ink service 13 uses a name service to resolve the network address of a document service, based on impression identifier.
- the digital ink service also supports streaming delivery of digital ink to an application. This allows an application to be more directly responsive to pen input.
- In streaming mode, the digital ink service delivers both stroke digital ink and intervening “hover” digital ink to allow the application to provide real-time positional feedback to the user via a display.
- the object model is a logical model relating to the external interfaces of the Netpage services. It is not intended as an implementation model.
- FIG. 2 is a class diagram showing a document 2 comprising a visual description 16 and an input description 11 . For a given document, either description may be empty. Each document 2 is uniquely identified 18 .
- the visual description 16 is a collection of visual elements 20 representing static 22 and dynamic elements 24 .
- Static elements represent textflows 26 , images 28 , graphics 30 etc.
- Dynamic elements 24 are described below.
- the input description 11 is a collection of forms 32 , each of which consists of a collection of input elements 34 representing commands 36 and fields 38 .
- Forms 32 may overlap both physically and logically, and the same input element 34 may participate in multiple forms.
- Each input element 34 has a zone 40 which defines the area within which it captures input.
- Each form 32 is associated with a target application 42 .
- the application 42 receives submissions of the form 32 .
- the application 42 is identified by an address 44 .
- the impression 58 is associated with both the printer 6 on which it was printed and the user 62 who requested it, if known.
- a pen 8 is owned by a single user 62 but a user may own any number of pens 8 . Accordingly, the user 62 is assigned a user ID and other user details 68 , and likewise, each pen 8 and printer 6 has a pen ID and details 70 , and printer ID and details 72 . A user 62 optionally has a default printer 6 .
- the class diagram in FIG. 7 shows that form data 12 submitted to an application consists of a collection of field values 90.
- the form data 12 is associated with a unique form instance 92 appearing in a printout 5 .
- An application may specify a transaction identifier when the form instance 92 is first created (as part of a printout).
- the transaction identifier 94 is submitted together with the form data 12 , allowing the target application to use it to index a unique transaction context.
- the digital ink service 13 (see FIG. 1 ) supports a form lifecycle wherein a form may only be submitted once, may expire, may become frozen after being signed, and may be voided.
- the form instance reflects the status of the form with respect to the form lifecycle.
- a document 2 may also include dynamic elements 24 .
- Each dynamic element has an associated dynamic object 96 , which in turn has associated object data 98 and a (typically type-specific) object application 99 .
- a dynamic element 24 may be activated in place using a device such as a Netpage viewer (see U.S. Ser. No. 09/722,175 cross referenced above), or may be activated on an arbitrary interactive display, such as the interactive display 15 associated with the relay 14 (see FIG. 1 ), or may be activated via the Netpage Explorer (described below).
- Examples of dynamic objects and their related applications include an audio clip and an audio player, a video clip and a video player, a photo and a photo viewer, a URL and a Web browser, an editable document and a word processor, to name just a few.
- a dynamic object 96 may also be dynamically linked to an arbitrary location on an existing impression, e.g. by being “pasted” onto a virtual view of the impression or onto the impression itself.
- FIG. 10 shows the relationships between the three stores nominally maintained by the Netpage document service 1 and the Netpage digital ink service 13 (see FIG. 1 ), with navigational qualifiers.
- the Netpage services may have additional stores for registered users 62 , pens 8 and printers 6 , identifier allocation, and service address resolution (not shown).
- FIG. 11 shows the fundamental flow of data in the Netpage System in more detail than FIG. 1 .
- the document service 1 allows an application 4 to lodge a document 2 and to separately transmit a print request 106 to print the document 2. It retains a copy of each lodged document, together with the document's input description if any, in the document store 100. When it prints a document 2 to a specified printer 6, it records the printout 5 in the printout store 102.
- the digital ink service 13 accepts digital ink 9 from a pen 8 via a relay 14 , and retains a copy of received digital ink in the digital ink store 104 . It uses the impression identifier 60 in the digital ink 9 to retrieve the corresponding impression 58 and input description from the document service 1 . It then assigns each individual digital ink stroke to an element of the input description such as a command or a form field, according to the position and extent of the stroke and the active zone of the input element. Once it detects that the user of the pen 8 has designated a form submission command, the digital ink 9 assigned to each field is interpreted 108 according to field type, and the resultant form data 12 is submitted to the application 4 associated with the command.
- the digital ink service 13 interprets a mark in a checkbox as a check mark; it converts handwritten text in a text field into a string of text characters using intelligent character recognition; and it compares a handwritten signature in a signature field with the recorded signature of the user of the pen, and, if the signatures match, digitally signs the form data on behalf of the user.
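The per-field-type interpretation described above can be sketched as a dispatch on field type. The recogniser, signature-matching and signing routines below are deliberately trivial stubs; real intelligent character recognition and signature verification are far more involved, and none of these names come from the patent.

```python
# Stub implementations so the sketch runs; real ICR and signature
# verification are far more involved (names are hypothetical).
def recognize_characters(ink):
    return "".join(stroke.get("char", "?") for stroke in ink)

def matches_recorded_signature(ink, profile):
    return len(ink) >= profile.get("min_signature_strokes", 1)

def sign_on_behalf_of(profile):
    return f"signed-by:{profile['user_id']}"

def interpret_field(field_type, digital_ink, user_profile):
    if field_type == "checkbox":
        # any mark inside the zone counts as a check mark
        return bool(digital_ink)
    if field_type == "text":
        # handwritten text -> character string via recognition
        return recognize_characters(digital_ink)
    if field_type == "signature":
        # compare against the recorded signature; digitally sign on match
        if matches_recorded_signature(digital_ink, user_profile):
            return sign_on_behalf_of(user_profile)
        return None
    raise ValueError(f"unknown field type: {field_type}")

assert interpret_field("checkbox", [{"char": "x"}], {}) is True
assert interpret_field("text", [{"char": "h"}, {"char": "i"}], {}) == "hi"
```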
- FIG. 12 illustrates the flow of data in response to a reprint request 110 from an application 4 .
- When the document service 1 reprints a set of impressions 58, it optionally includes any drawings and handwriting captured via those impressions, retrieving the corresponding digital ink from the digital ink store 104 in the digital ink service 13 (subject to visibility and access). It records a new printout to record the impression identifiers assigned to the reprinted impressions 112.
- FIG. 13 illustrates the flow of data in response to a general printing request from a non-Netpage-aware application 114 .
- a Netpage-aware printer driver 116 converts platform-specific drawing commands 118 into a Netpage-compatible document 2 which it lodges with the document service 1 , and then sends a print request 106 for the document service 1 to print the document 2 via a specified printer 6 .
- FIG. 14 illustrates the corresponding flow of data when the printer is not accessible by the document service 1 .
- the printer driver 116 still lodges the document 2 with the document service 1 and records the printout 5 in the printout store 102, but actually prints the document 2 directly via the specified printer 6.
- FIG. 15 shows the flow of data in a walk-up environment. All print (and re-print) requests 120 from the Netpage application 4 are typically deferred. In response to a deferred print request 120 , the document service 1 records a printout 5 in the printout store 102 to capture impression-related information, and places the printout in a printout pending queue 122 for the requesting user.
- each printer 6 has an associated token reader 124 , and the user presents a token 126 to the token reader to request actual printing of queued printouts via the printer 6 .
- the token 126 may be a short-range RFID tag, a smartcard, a magnetic stripe card, etc.
- the token reader 124 notifies a walk-up-handling application 128 of the user's proximity to the printer which in turn initiates printing via the document service 1 .
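The walk-up flow described above — deferred print requests queued per user, then released when the token reader notifies the walk-up-handling application of the user's proximity — can be sketched as follows. Class and method names are invented for illustration.

```python
# Sketch of the walk-up flow: the token reader notifies a
# walk-up-handling application of a (user, printer) event, which
# releases that user's pending printouts to the presented printer.
# All names here are assumptions, not the patent's API.
class WalkUpHandler:
    def __init__(self):
        self.pending = {}  # user_id -> list of queued printout ids

    def defer_print(self, user_id, printout_id):
        """Record a deferred print request in the user's pending queue."""
        self.pending.setdefault(user_id, []).append(printout_id)

    def on_walk_up(self, user_id, printer_id):
        """Called on the token reader's notification; returns the
        printouts released to the presented printer."""
        released = self.pending.pop(user_id, [])
        return [(printout_id, printer_id) for printout_id in released]

handler = WalkUpHandler()
handler.defer_print("alice", "doc-1")
handler.defer_print("alice", "doc-2")
assert handler.on_walk_up("alice", "printer-3") == [
    ("doc-1", "printer-3"), ("doc-2", "printer-3")]
assert handler.on_walk_up("alice", "printer-3") == []  # queue now empty
```

Because the queue is keyed by user rather than by printer, the same pending printouts can be collected at whichever printer the user walks up to.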
- the document service can be used to provide walk-up printing for documents which are not encoded with Netpage tags and retained.
- the token 126 may be any of a number of passive, semi-passive or active devices, including a surface or object bearing a Netpage tag pattern, linear barcode or two-dimensional barcode; a magnetic stripe card; a smart card or contact-less smart card; or a radio-frequency identification (RFID) tag.
- the reader 124 may be any reader matched to the type of the token 126 , such as an optical reader utilising a scanning laser or a two-dimensional image sensor, as in conventional barcode readers or a Netpage sensing device; a magnetic stripe reader; a smart card reader; or an RFID reader.
- the token reader 124 is associated with the printer 6 and the user presents the token 126 to the reader.
- the reader 124 reads the token 126 and communicates the walk-up event to the Netpage server 1 .
- the walk-up event identifies both the user 62 and the printer 6 .
- the token 126 and hence the walk-up event may identify the user 62 explicitly, or the server may be required to perform a database lookup to translate the token identifier into a user identifier.
- the reader and hence the walk-up event may identify the printer 6 explicitly, or the server 1 may be required to perform a database lookup to translate the reader identifier into a printer identifier.
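The two identification paths above (explicit identifiers in the walk-up event, or server-side database lookups translating token and reader identifiers) can be sketched like this; plain dictionaries stand in for the server's database tables, and all identifiers are invented.

```python
# Sketch of resolving a walk-up event: identifiers may be explicit, or
# the server may translate a token/reader identifier through a
# database lookup. Dicts stand in for the server's database tables.
TOKEN_TO_USER = {"rfid-7f3a": "user-62"}
READER_TO_PRINTER = {"reader-124": "printer-6"}

def resolve_walk_up(event):
    # prefer an explicit identifier; otherwise fall back to a lookup
    user = event.get("user_id") or TOKEN_TO_USER[event["token_id"]]
    printer = event.get("printer_id") or READER_TO_PRINTER[event["reader_id"]]
    return user, printer

# Explicit identification:
assert resolve_walk_up(
    {"user_id": "user-62", "printer_id": "printer-6"}) == ("user-62", "printer-6")
# Lookup-based identification:
assert resolve_walk_up(
    {"token_id": "rfid-7f3a", "reader_id": "reader-124"}) == ("user-62", "printer-6")
```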
- FIG. 18 shows the reader 124 and the printer 6 as separate devices which are physically associated.
- the reader 124 may be physically built into the printer 6 . It may also be electrically connected to the printer, with the printer delivering the walk-up event to the server. Alternatively and equivalently, the printer 6 may interpret the walk-up event itself, and explicitly retrieve the user's pending printouts for printing.
- the user token 126 may be attached to or built into a portable device which the user 62 carries, such as a mobile phone, pen, electronic pen (such as a Netpage pen 8 ), wallet, security access card or token, or identification badge or card. It may also be stand-alone and purpose-specific.
- the printer reader 124 may provide a receptacle for receiving the pen, whereby the pen makes electrical contact and establishes a wired communication link (e.g. USB) with the reader to communicate the user identifier to the reader.
- the token reader 124 is associated with the user 62 and the user presents the reader to the token 126 .
- the reader 124 reads the token 126 and communicates the walk-up event to the Netpage server 1 .
- the walk-up event identifies both the user 62 and the printer 6 .
- the token 126 and hence the walk-up event may identify the printer 6 explicitly, or the server 1 may be required to perform a database lookup to translate the token identifier into a printer identifier.
- the reader 124 and hence the walk-up event may identify the user 62 explicitly, or the server 1 may be required to perform a database lookup to translate the reader identifier into a user identifier.
- the user 62 presents the reader 125 to the token 127 .
- the reader 125 reads the token 127 . From the token it determines a short-range communication link to the printer 6 .
- This may be a personal-area network (PAN) wireless link such as Bluetooth, wireless USB or ZigBee, or a local-area network (LAN) wireless link such as IEEE 802.11 (WiFi). It may also be a short-range optical link such as IrDA.
- the token supplies the target address.
- the tag pattern encodes the target address instead of an impression ID, x-y location, etc., and flags it as such.
- the token 127 merely signals the user's token reader 125 to communicate a user identifier to the printer's token reader 124.
- the tag pattern flags the command to communicate the user identifier to the printer reader 124. If a range of communication link types is supported, then the token 127 (e.g.
- the user 62 may key a user identifier or job identifier into a keypad associated with the printer 6 , with an optional password.
- the user 62 may also use a display-based input device associated with the printer to select their identity or their pending printout(s) from a list of users or jobs.
- the Netpage system acts as a virtual filing cabinet for any printed document.
- the Netpage system therefore provides users with a screen-based browser—the Netpage Explorer—for browsing and searching collections of printouts maintained by a document service, and for viewing individual printouts on-screen, including their digital ink.
- the Netpage Explorer also supports real-time display of streaming digital ink, and so provides a basis for remote conferencing.
- the Netpage System supports the embedding of dynamic objects in documents, and the dynamic linking of dynamic objects to locations on printed impressions.
- the Netpage Explorer supports viewing of, and interaction with, such objects via the virtual view it provides of printed impressions, as well as the dynamic linking of such objects.
- FIG. 17 shows a system using public Netpage services 134 running on a distributed set of servers on the public Internet 133 , and serving applications 4 and users on the public Internet 133 .
- FIG. 21 shows a private Netpage system with services 136 (e.g. private Netpage document and digital ink services) running on one or more servers on a private intranet 138 , and serving applications 4 and users on the private intranet.
- FIG. 22 shows a personal Netpage system with services 142 running on a single personal computer or other personal device 140 .
- pre-printed Netpage content such as magazine adverts, catalogues, brochures, and product item Hyperlabels is typically hosted by public Netpage document services running on the Internet.
- a private document service may also act as a caching proxy for a public document service.
- a Netpage pen (or its relay) may therefore have knowledge of both a private and a public digital ink service, and may route digital ink pertaining to private impressions to the former and digital ink pertaining to public impressions to the latter. Even when a given pen's digital ink relates to a public impression and is nominally accessible on a public server, this need not imply that the owner of the impression or other users of the impression automatically gain access to that digital ink.
- the Netpage system uses a surface coding to imbue otherwise passive surfaces with interactivity in conjunction with Netpage sensing devices such as the Netpage pen and the Netpage viewer.
- When interacting with a Netpage coded surface, a Netpage sensing device generates a digital ink stream which indicates both the identity of the surface region relative to which the sensing device is moving, and the absolute path of the sensing device within the region.
- This section defines optional authentication features of the Netpage surface coding, and associated authentication protocols.
- the Netpage surface coding consists of a dense planar tiling of tags. Each tag encodes its own location in the plane. Each tag also encodes, in conjunction with adjacent tags, an identifier of the region containing the tag. This region ID is unique among all regions. In the Netpage system the region typically corresponds to the entire extent of the tagged surface, such as one side of a sheet of paper. In the Hyperlabel system the region typically corresponds to the surface of an entire product item, and the region ID corresponds to the unique item ID. For clarity in the following discussion, references to items and item IDs (or simply IDs), correspond to the region ID.
- the surface coding is designed so that an acquisition field of view large enough to guarantee acquisition of an entire tag is large enough to guarantee acquisition of the ID of the region containing the tag. Acquisition of the tag itself guarantees acquisition of the tag's two-dimensional position within the region, as well as other tag-specific data.
- the surface coding therefore allows a sensing device to acquire a region ID and a tag position during a purely local interaction with a coded surface, e.g. during a “click” or tap on a coded surface with a pen.
- Cryptography is used to protect sensitive information, both in storage and in transit, and to authenticate parties to a transaction.
- the Netpage and Hyperlabel systems use both classes of cryptography.
- Secret-key cryptography, also referred to as symmetric cryptography, uses the same key to encrypt and decrypt a message. Two parties wishing to exchange messages must first arrange to securely exchange the secret key.
- Public-key cryptography, also referred to as asymmetric cryptography, uses two encryption keys.
- the two keys are mathematically related in such a way that any message encrypted using one key can only be decrypted using the other key.
- One of these keys is then published, while the other is kept private. They are referred to as the public and private key respectively.
- the public key is used to encrypt any message intended for the holder of the private key. Once encrypted using the public key, a message can only be decrypted using the private key.
- In this way, two parties can securely exchange messages without first having to exchange a secret key. To ensure that the private key is secure, it is normal for the holder of the private key to generate the public-private key pair.
- Public-key cryptography can be used to create a digital signature. If the holder of the private key creates a known hash of a message and then encrypts the hash using the private key, then anyone can verify that the encrypted hash constitutes the “signature” of the holder of the private key with respect to that particular message, simply by decrypting the encrypted hash using the public key and verifying the hash against the message. If the signature is appended to the message, then the recipient of the message can verify both that the message is genuine and that it has not been altered in transit.
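The sign-then-verify flow described above can be sketched in Python. The key pair, message, and modular hash reduction below are purely illustrative assumptions: real signature keys are thousands of bits long, and production systems use standardised padding schemes rather than this textbook construction.

```python
import hashlib

# Toy RSA key pair for illustration only; real keys are far larger.
p, q = 1_000_003, 999_983
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def known_hash(message: bytes) -> int:
    # A known hash of the message, reduced modulo n to fit the toy key.
    return int.from_bytes(hashlib.sha256(message).digest(), 'big') % n

def sign(message: bytes) -> int:
    # "Encrypt" the hash using the private key to form the signature.
    return pow(known_hash(message), d, n)

def verify(message: bytes, signature: int) -> bool:
    # "Decrypt" the signature using the public key and check the hash.
    return pow(signature, e, n) == known_hash(message)

signature = sign(b"region ID 1234")
assert verify(b"region ID 1234", signature)
assert not verify(b"altered in transit", signature)
```

Appending the signature to the message lets a recipient check both authorship and integrity, exactly as described above.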
- Secret-key cryptography can also be used to create a digital signature, but it has the disadvantage that signature verification can also be performed by any party privy to the secret key.
- a certificate authority is a trusted third party which authenticates the association between a public key and a person's or other entity's identity.
- the certificate authority verifies the identity by examining identity documents etc., and then creates and signs a digital certificate containing the identity details and public key.
- Anyone who trusts the certificate authority can use the public key in the certificate with a high degree of certainty that it is genuine. They just have to verify that the certificate has indeed been signed by the certificate authority, whose public key is well-known.
- For a given level of security, public-key cryptography utilises key lengths an order of magnitude larger than secret-key cryptography, i.e. a few thousand bits compared with a few hundred bits.
- Netpage surface coding security has two corresponding purposes:
- If a user is able to determine the authenticity of the surface coding of an item, then the user may be able to make an informed decision about the likely authenticity of the item.
- the only tractable way of forging an item with an authentic surface coding is to duplicate the surface coding of an existing item (and hence its ID). If the user is able to determine by other means that the ID of an item is likely to be unique, then the user may assume that the item is authentic.
- Since the Netpage surface coding allows meaningful interaction between a sensing device and a coded surface during a purely local interaction, it is desirable for the surface coding to support authentication during a similarly local interaction, i.e. without requiring an increase in the size of the sensing device field of view.
- authentication relies on verifying the correspondence between data and a signature of that data.
- the item ID is unique and therefore provides a basis for a signature. If online authentication access is assumed, then the signature may simply be a random number associated with the item ID in an authentication database accessible to the trusted online authenticator.
- the random number may be generated by any suitable method, such as via a deterministic (pseudo-random) algorithm, or via a stochastic physical process.
- a keyed hash or encrypted hash may be preferable to a random number since it requires no additional space in the authentication database.
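A keyed-hash signature of this kind can be sketched with Python's standard `hmac` module. The secret key value, the truncation to 160 bits, and the function names are illustrative assumptions; in the scheme above the key would be held only by the trusted online authenticator.

```python
import hmac
import hashlib

# Hypothetical secret key, held only by the trusted online authenticator.
SECRET_KEY = bytes.fromhex('000102030405060708090a0b0c0d0e0f')

def signature_for(item_id: bytes) -> bytes:
    # Keyed hash of the item ID, truncated to 160 bits; unlike a random
    # signature, nothing extra needs to be stored in the database.
    return hmac.new(SECRET_KEY, item_id, hashlib.sha256).digest()[:20]

def authenticate(item_id: bytes, presented_signature: bytes) -> bool:
    # Constant-time comparison of the recomputed and presented signatures.
    return hmac.compare_digest(signature_for(item_id), presented_signature)

item_id = (1234).to_bytes(12, 'big')     # a 96-bit region ID
assert authenticate(item_id, signature_for(item_id))
assert not authenticate(item_id, b'\x00' * 20)
```

Because the signature is recomputable from the ID and key alone, the authenticator needs no per-ID signature storage, at the cost of holding a secret that must never leak.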
- To prevent forgery of a signature for an unsighted ID, the signature must be large enough to make exhaustive search via repeated accesses to the online authenticator intractable. If the signature is generated using a key rather than randomly, then its length must also be large enough to prevent the forger from deducing the key from known ID-signature pairs. Signatures of a few hundred bits are considered secure, whether generated using private or secret keys.
- Fragment verification requires fragment identification. Fragments may be explicitly numbered, or may more economically be identified by the two-dimensional coordinate of their tag, modulo the repetition of the signature across a continuous tiling of tags.
- the ID itself introduces a further vulnerability. Ideally it should be at least a few hundred bits. In the Netpage and Hyperlabel surface coding schemes it is 96 bits or less. To overcome this, the ID may be padded. For this to be effective the padding must be variable, i.e. it must vary from one ID to the next. Ideally the padding is simply a random number, and must then be stored in the authentication database indexed by ID. If the padding is deterministically generated from the ID then it is worthless.
- Offline authentication of secret-key signatures requires the use of a trusted offline authentication device.
- the QA chip (see U.S. Pat. No. 6,374,354, issued 16 Apr. 2002) provides the basis for such a device, although of limited capacity.
- the QA chip can be programmed to verify a signature using a secret key securely held in its internal memory. In this scenario, however, it is impractical to support per-ID padding, and it is impractical even to support more than a very few secret keys.
- a QA chip programmed in this manner is susceptible to a chosen-message attack. These constraints limit the applicability of a QA-chip-based trusted offline authentication device to niche applications.
- offline authentication of public-key signatures, i.e. signatures generated using the corresponding private keys, requires access only to the corresponding public keys.
- An offline authentication device utilising public keys can trivially hold any number of public keys, and may be designed to retrieve additional public keys on demand, via a transient online connection, when it encounters an ID for which it knows it has no corresponding public signature key.
- Untrusted offline authentication is likely to be attractive to most creators of secure items, since they are able to retain exclusive control of their private signature keys.
- a disadvantage of offline authentication of a public-key signature is that the entire signature must be acquired from the coding, violating our desire to support authentication with a minimal field of view.
- a corresponding advantage of offline authentication of a public-key signature is that access to the ID padding is no longer required, since decryption of the signature using the public signature key generates both the ID and its padding, and the padding can then be ignored.
- Any random or linear swipe of a hand-held sensing device across a coded surface allows it to quickly acquire all of the fragments of the signature.
- the sensing device can easily be programmed to signal the user when it has acquired a full set of fragments and has completed authentication.
- a scanning laser can also easily acquire all of the fragments of the signature. Both kinds of devices may be programmed to only perform authentication when the tags indicate the presence of a signature.
- a public-key signature may be authenticated online via any of its fragments in the same way as any signature, whether generated randomly or using a secret key.
- the trusted online authenticator may generate the signature on demand using the private key and ID padding, or may store the signature explicitly in the authentication database. The latter approach obviates the need to store the ID padding.
- signature-based authentication may be used in place of fragment-based authentication even when online access to a trusted authenticator is available.
- This section defines a surface coding used by the Netpage system (described above in ‘Netpage Architecture’) to imbue otherwise passive surfaces with interactivity in conjunction with Netpage sensing devices such as the Netpage pen and the Netpage viewer.
- When interacting with a Netpage coded surface, a Netpage sensing device generates a digital ink stream which indicates both the identity of the surface region relative to which the sensing device is moving, and the absolute path of the sensing device within the region.
- the Netpage surface coding consists of a dense planar tiling of tags. Each tag encodes its own location in the plane. Each tag also encodes, in conjunction with adjacent tags, an identifier of the region containing the tag. In the Netpage system, the region typically corresponds to the entire extent of the tagged surface, such as one side of a sheet of paper.
- Each tag is represented by a pattern which contains two kinds of elements.
- the first kind of element is a target.
- Targets allow a tag to be located in an image of a coded surface, and allow the perspective distortion of the tag to be inferred.
- the second kind of element is a macrodot. Each macrodot encodes the value of a bit by its presence or absence.
- the pattern is represented on the coded surface in such a way as to allow it to be acquired by an optical imaging system, and in particular by an optical system with a narrowband response in the near-infrared.
- the pattern is typically printed onto the surface using a narrowband near-infrared ink.
- FIG. 23 shows the structure of a complete tag 200 .
- Each of the four black circles 202 is a target.
- The tag 200, and the overall pattern, has four-fold rotational symmetry at the physical level.
- Each square region represents a symbol 204 , and each symbol represents four bits of information.
- Each symbol 204 shown in the tag structure has a unique label 216 .
- Each label 216 has an alphabetic prefix and a numeric suffix.
- FIG. 24 shows the structure of a symbol 204 . It contains four macrodots 206 , each of which represents the value of one bit by its presence (one) or absence (zero).
- the macrodot 206 spacing is specified by the parameter s throughout this specification. It has a nominal value of 143 μm, based on 9 dots printed at a pitch of 1600 dots per inch. However, it is allowed to vary within defined bounds according to the capabilities of the device used to produce the pattern.
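The stated nominal value can be checked directly from the print parameters given above (9 dots at a pitch of 1600 dots per inch), together with the 12-macrodots-per-tag figure quoted later for the nominal tag size:

```python
# Derivation of the nominal macrodot spacing from the stated print
# parameters: 9 printed dots at 1600 dots per inch.
DPI = 1600
DOTS_PER_MACRODOT = 9
MM_PER_INCH = 25.4

spacing_mm = DOTS_PER_MACRODOT / DPI * MM_PER_INCH
assert round(spacing_mm * 1000) == 143       # nominal spacing: 143 micrometres

# With 12 macrodots per tag, this reproduces the nominal tag size of
# 1.7145 mm quoted later in the specification.
tag_size_mm = 12 * spacing_mm
assert abs(tag_size_mm - 1.7145) < 1e-9
```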
- FIG. 25 shows an array 208 of nine adjacent symbols 204 .
- the macrodot 206 spacing is uniform both within and between symbols 204 .
- FIG. 26 shows the ordering of the bits within a symbol 204 .
- Bit zero 210 is the least significant within a symbol 204 ; bit three 212 is the most significant. Note that this ordering is relative to the orientation of the symbol 204 .
- the orientation of a particular symbol 204 within the tag 200 is indicated by the orientation of the label 216 of the symbol in the tag diagrams (see for example FIG. 23 ). In general, the orientation of all symbols 204 within a particular segment of the tag 200 is the same, consistent with the bottom of the symbol being closest to the centre of the tag.
- FIG. 27 shows the actual pattern of a tag 200 with every bit 206 set. Note that, in practice, every bit 206 of a tag 200 can never be set.
- a macrodot 206 is nominally circular with a nominal diameter of (5/9)s. However, it is allowed to vary in size by ±10% according to the capabilities of the device used to produce the pattern.
- a target 202 is nominally circular with a nominal diameter of (17/9)s. However, it is allowed to vary in size by ±10% according to the capabilities of the device used to produce the pattern.
- the tag pattern is allowed to vary in scale by up to ±10% according to the capabilities of the device used to produce the pattern. Any deviation from the nominal scale is recorded in the tag data to allow accurate generation of position samples.
- Tags 200 are arranged into tag groups 218 . Each tag group contains four tags arranged in a square. Each tag 200 has one of four possible tag types, each of which is labelled according to its location within the tag group 218 .
- the tag type labels 220 are 00, 10, 01 and 11, as shown in FIG. 28 .
- FIG. 29 shows how tag groups are repeated in a continuous tiling of tags, or tag pattern 222 .
- the tiling guarantees that any set of four adjacent tags 200 contains one tag of each type 220 .
- the tag contains four complete codewords.
- the layout of the four codewords is shown in FIG. 30 .
- Each codeword is of a punctured 2⁴-ary (8, 5) Reed-Solomon code.
- the codewords are labelled A, B, C and D. Fragments of each codeword are distributed throughout the tag 200 .
- Two of the codewords are unique to the tag 200 . These are referred to as local codewords 224 and are labelled A and B.
- the tag 200 therefore encodes up to 40 bits of information unique to the tag.
- the remaining two codewords are unique to a tag type, but common to all tags of the same type within a contiguous tiling of tags 222 . These are referred to as global codewords 226 and are labelled C and D, subscripted by tag type.
- a tag group 218 therefore encodes up to 160 bits of information common to all tag groups within a contiguous tiling of tags.
- Codewords are encoded using a punctured 2⁴-ary (8, 5) Reed-Solomon code.
- a 2⁴-ary (8, 5) code encodes 20 data bits (i.e. five 4-bit symbols) and 12 redundancy bits (i.e. three 4-bit symbols) in each codeword. Its error-detecting capacity is three symbols. Its error-correcting capacity is one symbol.
- FIG. 31 shows a codeword 228 of eight symbols 204 , with five symbols encoding data coordinates 230 and three symbols encoding redundancy coordinates 232 .
- the codeword coordinates are indexed in coefficient order, and the data bit ordering follows the codeword bit ordering.
- a punctured 2⁴-ary (8, 5) Reed-Solomon code is a 2⁴-ary (15, 5) Reed-Solomon code with seven redundancy coordinates removed. The removed coordinates are the most significant redundancy coordinates.
- For a detailed description of Reed-Solomon codes, refer to Wicker, S. B. and V. K. Bhargava, eds., Reed-Solomon Codes and Their Applications, IEEE Press, 1994, the contents of which are incorporated herein by reference.
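As a concrete sketch of the codeword arithmetic, the following Python encodes five 4-bit symbols as a systematic 2⁴-ary (15, 5) Reed-Solomon codeword and then punctures the seven most significant redundancy coordinates, leaving the (8, 5) codeword described above. The primitive polynomial (x⁴ + x + 1) and the generator roots (α¹ through α¹⁰) are illustrative assumptions; this excerpt does not state which are actually used.

```python
# GF(16) exp/log tables over the primitive polynomial x^4 + x + 1 (assumed).
EXP, LOG = [0] * 30, [0] * 16
value = 1
for i in range(15):
    EXP[i] = value
    LOG[value] = i
    value <<= 1
    if value & 0x10:
        value ^= 0x13                     # reduce modulo x^4 + x + 1
for i in range(15, 30):
    EXP[i] = EXP[i - 15]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

# Generator polynomial with roots alpha^1..alpha^10 (degree 10),
# coefficients stored lowest degree first.
g = [1]
for i in range(1, 11):
    g = [gf_mul(c, EXP[i]) ^ (g[j - 1] if j else 0)
         for j, c in enumerate(g + [0])]

def encode(msg):
    """Systematically encode five 4-bit symbols as a (15, 5) codeword,
    then puncture the seven most significant redundancy coordinates,
    leaving an (8, 5) codeword of eight 4-bit symbols."""
    assert len(msg) == 5 and all(0 <= s < 16 for s in msg)
    work = [0] * 10 + list(msg)           # msg(x) * x^10
    for i in range(14, 9, -1):            # polynomial long division by g(x)
        coef = work[i]
        if coef:
            for j in range(11):
                work[i - 10 + j] ^= gf_mul(coef, g[j])
    redundancy = work[:10]                # remainder = redundancy symbols
    full = redundancy + list(msg)         # systematic (15, 5) codeword

    # Sanity check: a valid codeword has alpha^1..alpha^10 as roots.
    def eval_poly(p, x):
        y = 0
        for c in reversed(p):
            y = gf_mul(y, x) ^ c
        return y
    assert all(eval_poly(full, EXP[i]) == 0 for i in range(1, 11))

    return redundancy[:3] + list(msg)     # punctured (8, 5) codeword

codeword = encode([1, 2, 3, 4, 5])
assert len(codeword) == 8 and codeword[3:] == [1, 2, 3, 4, 5]
```

Because the encoding is systematic, the five data symbols appear unchanged in the codeword, with three redundancy symbols retained after puncturing.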
- the tag coordinate space has two orthogonal axes labelled x and y respectively. When the positive x axis points to the right, then the positive y axis points down.
- the surface coding does not specify the location of the tag coordinate space origin on a particular tagged surface, nor the orientation of the tag coordinate space with respect to the surface. This information is application-specific.
- the application which prints the tags onto the paper may record the actual offset and orientation, and these can be used to normalise any digital ink subsequently captured in conjunction with the surface.
- the position encoded in a tag is defined in units of tags. By convention, the position is taken to be the position of the centre of the target closest to the origin.
- Table 1 defines the information fields embedded in the surface coding. Table 2 defines how these fields map to codewords.
- TABLE 1. Field definitions. Each entry gives the field, its width in bits, and a description.
- Per codeword: codeword type (2): The type of the codeword, i.e. one of A (b′00′), B (b′01′), C (b′10′) and D (b′11′).
- Per tag: tag type (2): The type¹ of the tag, i.e. one of 00 (b′00′), 01 (b′01′), 10 (b′10′) and 11 (b′11′).
- x coordinate (13): The unsigned x coordinate of the tag².
- y coordinate (13): The unsigned y coordinate of the tag².
- active area flag (1): A flag indicating whether the tag is a member of an active area. b′1′ indicates membership.
- active area map flag (1): b′1′ indicates the presence of a map (see next field). If the map is absent then the value of each map entry is derived from the active area flag (see previous field).
- active area map (8): A map³ of the active area membership of the tag's immediate neighbours. b′1′ indicates membership.
- Per tag group: encoding format (8): The format of the encoding. 0: the present encoding. Other values are TBA.
- region flags (8): Flags controlling the interpretation and routing of region-related information. 0: region ID is an EPC. 1: region is linked. 2: region is interactive. 3: region is signed. 4: region includes data. 5: region relates to mobile application. Other bits are reserved and must be zero.
- tag size adjustment (16): The difference between the actual tag size and the nominal tag size⁴, in 10 nm units, in sign-magnitude format.
- region ID (96): The ID of the region containing the tags.
- CRC (16): A CRC⁵ of the tag group data.
- Total: 320 bits.
- Notes: ¹ corresponds to the bottom two bits of the x and y coordinates of the tag. ² allows a maximum coordinate value of approximately 14 m. ³ FIG. 29 indicates the bit ordering of the map. ⁴ the nominal tag size is 1.7145 mm (based on 1600 dpi, 9 dots per macrodot, and 12 macrodots per tag). ⁵ CCITT CRC-16 [7].
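The tag group data is protected by a CCITT CRC-16. The exact parameters of the variant are not given in this excerpt, so the sketch below assumes one common form (polynomial 0x1021, initial value 0xFFFF, no bit reflection):

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    # Bitwise CRC using the CCITT polynomial x^16 + x^12 + x^5 + 1 (0x1021),
    # most-significant-bit first, initial value 0xFFFF (assumed variant).
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc

# Standard check value for this variant (CRC-16/CCITT-FALSE).
assert crc16_ccitt(b"123456789") == 0x29B1
```

A decoder would recompute the CRC over the assembled tag group data and compare it with the embedded CRC field before trusting the region ID.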
- FIG. 32 shows a tag 200 and its eight immediate neighbours, each labelled with its corresponding bit index in the active area map.
- An active area map indicates whether the corresponding tags are members of an active area.
- An active area is an area within which any captured input should be immediately forwarded to the corresponding Netpage server for interoperation. It also allows the Netpage sensing device to signal to the user that the input will have an immediate effect.
- the tag type can be moved into a global codeword to maximise local codeword utilization. This in turn can allow larger coordinates and/or 16-bit data fragments (potentially configurably in conjunction with coordinate precision). However, this reduces the independence of position decoding from region ID decoding and has not been included in the specification at this time.
- the surface coding contains embedded data.
- the data is encoded in multiple contiguous tags' data fragments, and is replicated in the surface coding as many times as it will fit.
- the embedded data is encoded in such a way that a random and partial scan of the surface coding containing the embedded data can be sufficient to retrieve the entire data.
- the scanning system reassembles the data from retrieved fragments, and reports to the user when sufficient fragments have been retrieved without error.
- a 200-bit data block encodes 160 bits of data.
- the block data is encoded in the data fragments of a contiguous group of 25 tags arranged in a 5 × 5 square.
- a tag belongs to a block whose integer coordinate is the tag's coordinate divided by 5.
- Within each block the data is arranged into tags with increasing x coordinate within increasing y coordinate.
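The tag-to-block mapping and the reassembly of block data from a random partial scan can be sketched as follows. The function names and the simulated scan are illustrative; the sketch assumes a tiling in which a single 200-bit block (25 eight-bit fragments) is replicated everywhere.

```python
import random

BLOCK_SIDE = 5        # a block spans 5 x 5 tags
FRAGMENT_BITS = 8     # one data fragment per tag

def fragment_index(x, y):
    # Position of a tag's fragment within its block: increasing x
    # coordinate within increasing y coordinate.
    return (y % BLOCK_SIDE) * BLOCK_SIDE + (x % BLOCK_SIDE)

def block_coordinate(x, y):
    # A tag belongs to the block whose integer coordinate is the
    # tag's coordinate divided by 5.
    return (x // BLOCK_SIDE, y // BLOCK_SIDE)

# Simulate a random partial scan over a tiling replicating one block.
block_data = [random.randrange(256) for _ in range(25)]   # 200-bit block
recovered = {}
while len(recovered) < 25:
    x, y = random.randrange(100), random.randrange(100)
    recovered[fragment_index(x, y)] = block_data[fragment_index(x, y)]

assert [recovered[i] for i in range(25)] == block_data
assert block_coordinate(12, 7) == (2, 1)
```

Because every copy of the block carries the same fragments, the scanner can stop and report success as soon as all 25 fragment indices have been seen, regardless of which copies supplied them.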
- a data fragment may be missing from a block where an active area map is present. However, the missing data fragment is likely to be recoverable from another copy of the block.
- Data of arbitrary size is encoded into a superblock consisting of a contiguous set of blocks arranged in a rectangle.
- the size of the superblock is encoded in each block.
- a block belongs to a superblock whose integer coordinate is the block's coordinate divided by the superblock size.
- the superblock is replicated in the surface coding as many times as it will fit, including partially along the edges of the surface coding.
- the data encoded in the superblock may include more precise type information, more precise size information, and more extensive error detection and/or correction data.
- TABLE 3. Embedded data block. Each entry gives the field, its width in bits, and a description.
- data type (8): The type of the data in the superblock. Values include: 0: type is controlled by region flags; 1: MIME. Other values are TBA.
- superblock width (8): The width of the superblock, in blocks.
- superblock height (8): The height of the superblock, in blocks.
- data (160): The block data.
- the surface coding contains a 160-bit cryptographic signature of the region ID.
- the signature is encoded in a one-block superblock.
- any signature fragment can be used, in conjunction with the region ID, to validate the signature.
- the entire signature can be recovered by reading multiple tags, and can then be validated using the corresponding public signature key. This is discussed in more detail in the Netpage Surface Coding Security section above.
- the superblock contains Multipurpose Internet Mail Extensions (MIME) data according to RFC 2045 (see Freed, N., and N. Borenstein, “Multipurpose Internet Mail Extensions (MIME)-Part One: Format of Internet Message Bodies”, RFC 2045, November 1996), RFC 2046 (see Freed, N., and N. Borenstein, “Multipurpose Internet Mail Extensions (MIME)—Part Two: Media Types”, RFC 2046, November 1996) and related RFCs.
- the MIME data consists of a header followed by a body.
- the header is encoded as a variable-length text string preceded by an 8-bit string length.
- the body is encoded as a variable-length type-specific octet stream preceded by a 16-bit size in big-endian format.
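The header and body framing just described can be sketched as a pair of Python helpers. The function names are hypothetical; only the length-prefix layout (8-bit header length, 16-bit big-endian body size) comes from the text.

```python
import struct

def encode_mime_entity(header: str, body: bytes) -> bytes:
    # Header: 8-bit length prefix, then the text string.
    # Body: 16-bit big-endian size prefix, then the octet stream.
    header_bytes = header.encode('us-ascii')
    assert len(header_bytes) < 256 and len(body) < 65536
    return (bytes([len(header_bytes)]) + header_bytes
            + struct.pack('>H', len(body)) + body)

def decode_mime_entity(blob: bytes):
    header_len = blob[0]
    header = blob[1:1 + header_len].decode('us-ascii')
    (body_len,) = struct.unpack_from('>H', blob, 1 + header_len)
    body_start = 3 + header_len
    return header, blob[body_start:body_start + body_len]

blob = encode_mime_entity('Content-Type: text/plain', b'hello')
assert decode_mime_entity(blob) == ('Content-Type: text/plain', b'hello')
```

The 16-bit size prefix caps the body at 64 KB minus one octet, which comfortably fits the embedded-data capacity of a superblock.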
- the basic top-level media types described in RFC 2046 include text, image, audio, video and application.
- RFC 2425 (see Howes, T., M. Smith and F. Dawson, “A MIME Content-Type for Directory Information”, RFC 2425, September 1998) and RFC 2426 (see Dawson, F., and T. Howes, “vCard MIME Directory Profile”, RFC 2426, September 1998) describe a text subtype for directory information suitable, for example, for encoding contact information which might appear on a business card.
- the Print Engine Controller supports the encoding of two fixed (per-page) 2⁴-ary (15, 5) Reed-Solomon codewords and six variable (per-tag) 2⁴-ary (15, 5) Reed-Solomon codewords. Furthermore, PEC supports the rendering of tags via a rectangular unit cell whose layout is constant (per page) but whose variable codeword data may vary from one unit cell to the next. PEC does not allow unit cells to overlap in the direction of page movement.
- a unit cell compatible with PEC contains a single tag group consisting of four tags. The tag group contains a single A codeword unique to the tag group but replicated four times within the tag group, and four unique B codewords.
- the tag group also contains eight fixed C and D codewords. One of these can be encoded using the remaining one of PEC's variable codewords, two more can be encoded using PEC's two fixed codewords, and the remaining five can be encoded and pre-rendered into the Tag Format Structure (TFS) supplied to PEC.
- TFS Tag Format Structure
- PEC imposes a limit of 32 unique bit addresses per TFS row. The contents of the unit cell respect this limit. PEC also imposes a limit of 384 on the width of the TFS. The contents of the unit cell respect this limit.
- the minimum imaging field of view required to guarantee acquisition of an entire tag has a diameter of 39.6s (i.e. 2 × (12 + 2) × √2 × s), allowing for arbitrary alignment between the surface coding and the field of view. Given a macrodot spacing of 143 μm, this gives a required field of view of 5.7 mm.
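The quoted figures follow directly from the stated expression, with the (12 + 2) macrodot span taken from the formula in the text:

```python
import math

TAG_SPAN = 12 + 2                 # macrodot span from the quoted expression
diameter_units = 2 * TAG_SPAN * math.sqrt(2)     # diameter in units of s
assert round(diameter_units, 1) == 39.6

MACRODOT_SPACING_MM = 0.143       # nominal macrodot spacing, 143 micrometres
fov_mm = diameter_units * MACRODOT_SPACING_MM
assert round(fov_mm, 1) == 5.7    # required field of view in millimetres
```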
- region ID decoding need not occur at the same rate as position decoding.
- decoding of a codeword can be avoided if the codeword is found to be identical to an already-known good codeword.
Abstract
A computer network for a plurality of users, the computer network comprising: a server; a printer; a network user identifier for a network user to carry on their person; and, a printer identifier associated with the printer; wherein during use, the network user identifier and the printer identifier interact such that any of the network user's pending printouts are sent to the printer for printing when the network user is proximate the printer.
Description
- The present invention relates to printing systems, and in particular printing systems involving interactive paper, computer publishing, computer applications, human-computer interfaces, and information appliances.
CO-PENDING REFERENCES NPS101US NPS108US NPS109US -
CROSS-REFERENCES 10/815621 10/815612 10/815630 10/815637 10/815638 10/815640 10/815642 10/815643 10/815644 10/815618 10/815639 10/815635 10/815647 10/815634 10/815632 10/815631 10/815648 10/815641 10/815645 10/815646 10/815617 10/815620 10/815615 10/815613 10/815633 10/815619 10/815616 10/815614 10/815636 10/815649 11/041650 11/041651 11/041652 11/041649 11/041610 11/041609 11/041626 11/041627 11/041624 11/041625 11/041556 11/041580 11/041723 11/041698 11/041648 10/815609 10/815627 10/815626 10/815610 10/815611 10/815623 10/815622 10/815629 10/815625 10/815624 10/815628 10/913375 10/913373 10/913374 10/913372 10/913377 10/913378 10/913380 10/913379 10/913376 10/913381 10/986402 IRB013US 11/172815 11/172814 10/409876 10/409848 10/409845 11/084769 11/084742 11/084806 09/575197 09/575195 09/575159 09/575132 09/575123 6825945 09/575130 09/575165 6813039 09/693415 09/575118 6824044 09/608970 09/575131 09/575116 6816274 09/575139 09/575186 6681045 6678499 6679420 09/663599 09/607852 6728000 09/693219 09/575145 09/607656 6813558 6766942 09/693515 09/663701 09/575192 6720985 09/609303 6922779 09/609596 6847883 09/693647 09/721895 09/721894 09/607843 09/693690 09/607605 09/608178 09/609553 09/609233 09/609149 09/608022 09/575181 09/722174 09/721896 10/291522 6718061 10/291523 10/291471 10/291470 6825956 10/291481 10/291509 10/291825 10/291519 10/291575 10/291557 6862105 10/291558 10/291587 10/291818 10/291576 6829387 6714678 6644545 6609653 6651879 10/291555 10/291510 10/291592 10/291542 10/291820 10/291516 6867880 10/291487 10/291520 10/291521 10/291556 10/291821 10/291525 10/291586 10/291822 10/291524 10/291553 6850931 6865570 6847961 10/685523 10/685583 10/685455 10/685584 10/757600 10/804034 10/793933 6889896 10/831232 10/884882 10/943875 10/943938 10/943874 10/943872 10/944044 10/943942 10/944043 10/949293 10/943877 10/965913 10/954170 10/981773 10/981626 10/981616 10/981627 10/974730 10/986337 10/992713 11/006536 11/020256 11/020106 11/020260 11/020321 11/020319 
11/026045 11/059696 11/051032 11/059674 NPA19NUS 11/107944 11/107941 11/082940 11/082815 11/082827 11/082829 11/082956 11/083012 11/124256 11/123136 11/154676 11/159196 NPA225US 09/575193 09/575156 09/609232 09/607844 6457883 09/693593 10/743671 11/033379 09/928055 09/927684 09/928108 09/927685 09/927809 09/575183 6789194 09/575150 6789191 10/900129 10/900127 10/913328 10/913350 10/982975 10/983029 6644642 6502614 6622999 6669385 6827116 10/933285 10/949307 6549935 NPN004US 09/575187 6727996 6591884 6439706 6760119 09/575198 09/722148 09/722146 6826547 6290349 6428155 6785016 6831682 6741871 09/722171 09/721858 09/722142 6840606 10/202021 10/291724 10/291512 10/291554 10/659027 10/659026 10/831242 10/884885 10/884883 10/901154 10/932044 10/962412 10/962510 10/962552 10/965733 10/965933 10/974742 10/982974 10/983018 10/986375 11/107817 11/148238 11/149160 09/693301 6870966 6822639 6474888 6627870 6724374 6788982 09/722141 6788293 09/722147 6737591 09/722172 09/693514 6792165 09/722088 6795593 10/291823 6768821 10/291366 10/291503 6797895 10/274817 10/782894 10/782895 10/778056 10/778058 10/778060 10/778059 10/778063 10/778062 10/778061 10/778057 10/846895 10/917468 10/917467 10/917466 10/917465 10/917356 10/948169 10/948253 10/948157 10/917436 10/943856 10/919379 10/943843 10/943878 10/943849 10/965751 11/071267 11/144840 11/155556 11/155557 09/575154 09/575129 6830196 6832717 09/721862 10/473747 10/120441 6843420 10/291718 6,789,731 10/291543 6766944 6766945 10/291715 10/291559 10/291660 10/409864 NPT019USNP 10/537159 NPT022US 10/410484 10/884884 10/853379 10/786631 10/853782 10/893372 10/893381 10/893382 10/893383 10/893384 10/971051 10/971145 10/971146 10/986403 10/986404 10/990459 11/059684 11/074802 10/492169 10/492152 10/492168 10/492161 10/492154 10/502575 10/683151 10/531229 10/683040 NPW009USNP 10/510391 10/919260 10/510392 10/919261 10/778090 09/575189 09/575162 09/575172 09/575170 09/575171 09/575161 10/291716 10/291547 10/291538 6786397 10/291827 
10/291548 10/291714 10/291544 10/291541 6839053 10/291579 10/291824 10/291713 6914593 10/291546 10/917355 10/913340 10/940668 11/020160 11/039897 11/074800 NPX044US 11/075917 11/102698 11/102843 6593166 10/428823 10/849931 11/144807 6454482 6808330 6527365 6474773 6550997 10/181496 10/274119 10/309185 10/309066 10/949288 10/962400 10/969121 UP21US UP23US 09/517539 6566858 09/112762 6331946 6246970 6442525 09/517384 09/505951 6374354 09/517608 6816968 6757832 6334190 6745331 09/517541 10/203559 10/203560 10/203564 10/636263 10/636283 10/866608 10/902889 10/902833 10/940653 10/942858 10/727181 10/727162 10/727163 10/727245 10/727204 10/727233 10/727280 10/727157 10/727178 10/727210 10/727257 10/727238 10/727251 10/727159 10/727180 10/727179 10/727192 10/727274 10/727164 10/727161 10/727198 10/727158 10/754536 10/754938 10/727227 10/727160 10/934720 10/296522 6795215 10/296535 09/575109 6805419 6859289 09/607985 6398332 6394573 6622923 6747760 6921144 10/884881 10/943941 10/949294 11/039866 11/123011 11/123010 11/144769 11/148237 10/922846 10/922845 10/854521 10/854522 10/854488 10/854487 10/854503 10/854504 10/854509 10/854510 10/854496 10/854497 10/854495 10/854498 10/854511 10/854512 10/854525 10/854526 10/854516 10/854508 10/854507 10/854515 10/854506 10/854505 10/854493 10/854494 10/854489 10/854490 10/854492 10/854491 10/854528 10/854523 10/854527 10/854524 10/854520 10/854514 10/854519 10/854513 10/854499 10/854501 10/854500 10/854502 10/854518 10/854517 10/934628 11/003786 11/003354 11/003616 11/003418 11/003334 11/003600 11/003404 11/003419 11/003700 11/003601 11/003618 11/003615 11/003337 11/003698 11/003420 11/003682 11/003699 11/071473 11/003463 11/003701 11/003683 11/003614 11/003702 11/003684 11/003619 11/003617 10/760254 10/760210 10/760202 10/760197 10/760198 10/760249 10/760263 10/760196 10/760247 10/760223 10/760264 10/760244 10/760245 10/760222 10/760248 10/760236 10/760192 10/760203 10/760204 10/760205 10/760206 10/760267 10/760270 10/760259 
10/760271 10/760275 10/760274 10/760268 10/760184 10/760195 10/760186 10/760261 10/760258 11/014764 11/014763 11/014748 11/014747 11/014761 11/014760 11/014757 11/014714 11/014713 11/014762 11/014724 11/014723 11/014756 11/014736 11/014759 11/014758 11/014725 11/014739 11/014738 11/014737 11/014726 11/014745 11/014712 11/014715 11/014751 11/014735 11/014734 11/014719 11/014750 11/014749 11/014746 11/014769 11/014729 11/014743 11/014733 11/014754 11/014755 11/014765 11/014766 11/014740 11/014720 11/014753 11/014752 11/014744 11/014741 11/014768 11/014767 11/014718 11/014717 11/014716 11/014732 11/014742 11/097268 11/097185 11/097184 10/728804 10/728952 10/728806 10/728834 10/729790 10/728884 10/728970 10/728784 10/728783 10/728925 10/728842 10/728803 10/728780 10/728779 10/773189 10/773204 10/773198 10/773199 6830318 10/773201 10/773191 10/773183 10/773195 10/773196 10/773186 10/773200 10/773185 10/773192 10/773197 10/773203 10/773187 10/773202 10/773188 10/773194 10/773193 10/773184 11/008118 11/060751 11/060805 MTB40US 11/097308 11/097309 11/097335 11/097299 11/097310 11/097213 11/097212 10/760272 10/760273 10/760187 10/760182 10/760188 10/760218 10/760217 10/760216 10/760233 10/760246 10/760212 10/760243 10/760201 10/760185 10/760253 10/760255 10/760209 10/760208 10/760194 10/760238 10/760234 10/760235 10/760183 10/760189 10/760262 10/760232 10/760231 10/760200 10/760190 10/760191 10/760227 10/760207 10/760181 10/407212 10/407207 10/683064 10/683041 6750901 6476863 6788336 6623101 6406129 6505916 6457809 6550895 6457812 10/296434 6428133 6746105 - The disclosures of these co-pending applications are incorporated herein by cross-reference. Some applications are temporarily identified by their docket number. This will be replaced by the corresponding USSN when available.
- In office environments, documents to be printed are typically sent via local computer networks to one of a number of printers connected to the network. The nominated printer is usually the most convenient to the user but unless the user goes to collect the document immediately after sending the print job, the printed document waits in the collection tray. If the document is sensitive, there is a risk that its contents will be disclosed to others passing the printer.
- According to a first aspect, the present invention provides a computer network for a plurality of users, the computer network comprising:
-
- a server;
- a printer;
- a network user identifier for a network user to carry on their person; and
- a printer identifier associated with the printer; wherein during use,
- the network user identifier and the printer identifier interact such that any of the network user's pending printouts are sent to the printer for printing when the network user is proximate the printer.
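- The claimed interaction can be sketched in outline (a minimal illustration only; names such as PrintServer and on_proximity are hypothetical and not part of the claims): the server holds each user's pending printouts, and a proximity event pairing a network user identifier with a printer identifier releases them to that printer.

```python
from collections import defaultdict

class PrintServer:
    """Minimal sketch of the claimed behaviour: print jobs stay on the
    server until the user's identifier is sensed at a printer."""

    def __init__(self):
        self.pending = defaultdict(list)  # user_id -> queued documents
        self.printed = defaultdict(list)  # printer_id -> delivered documents

    def submit(self, user_id, document):
        # A print request does not print anything yet; it is merely queued.
        self.pending[user_id].append(document)

    def on_proximity(self, user_id, printer_id):
        # The user identifier and printer identifier "interact": all of the
        # user's pending printouts are sent to the nearby printer.
        jobs, self.pending[user_id] = self.pending[user_id], []
        self.printed[printer_id].extend(jobs)
        return jobs

server = PrintServer()
server.submit("alice", "report.pdf")
server.submit("alice", "memo.pdf")
released = server.on_proximity("alice", "printer-3")
```

Because nothing is delivered until on_proximity fires, a sensitive document never waits in a collection tray.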
- By keeping print jobs on the network until the user is at a printer, the printed document does not sit in a collection tray until the user retrieves it. This reduces the risk of others seeing any sensitive documents. If all printers connected to the network have sensors for identifying individual users, then users will not need to select a printer (or designate a default printer). Print jobs can be collected from the most convenient printer regardless of a user's current location in the office.
- The Netpage system is comprehensively described in the cross-referenced documents as well as the Detailed Description below. This system uses a paper- and pen-based interface to computer-based and typically network-based information and applications. The user can request print jobs by 'clicking' an interactive element on a Netpage document with a Netpage pen, and therefore may be remote from any of the networked printers, or even the office, when print jobs are requested. Accordingly, the invention is particularly suited to the Netpage system and will be described with particular reference to its operation within this environment. However, it will be appreciated that the invention has much broader application than Netpage and is not limited to printing Netpage documents.
- Optionally, the network comprises a plurality of said printers, each printer associated with one of the printer identifiers respectively; and
-
- a plurality of said network user identifiers, each uniquely identifying different network users.
- Optionally, each of the network user identifiers is a token and each of the printer identifiers has a token reader such that the user presents their token to the token reader associated with one of the printers to request actual printing of their queued printouts via that printer.
- Optionally, each token is a short-range RFID tag, a smartcard or a magnetic stripe card.
- Optionally, the token reader notifies a walk-up-handling application on the server of the user's proximity to the associated printer which in turn initiates printing.
- Optionally, each of the printer identifiers is a token and each of the network user identifiers has a token reader associated with the user. Optionally, the token reader is an electronic stylus with an optical sensor, and each token is a surface of the respective printer with coded data disposed on it, the coded data being readable by the optical sensor of each user's electronic stylus.
- Optionally, the pending printouts are maintained in a queue by the server and each pending printout has a priority such that higher-priority printouts are printed before earlier-queued but lower-priority printouts.
- Optionally, the token readers are associated with respective printers such that when the user presents their token to the reader it reads the token and identifies both the user and the printer to the server. Optionally, the token identifies the user explicitly. Optionally, the token has a token identifier and the server performs a database lookup to translate the token identifier into a user identity. Optionally, the token reader identifies the printer explicitly.
- Optionally, the reader has a reader identifier and the server performs a database lookup to translate the reader identifier into a printer identity.
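- The two lookup options above can be illustrated with a small sketch (hypothetical table and identifier names; the actual database schema is not specified here): where a token or reader identifies its user or printer explicitly, no translation is needed; otherwise the server consults a lookup table.

```python
# Hypothetical lookup tables standing in for the server's database.
TOKEN_TO_USER = {"tok-91": "alice"}
READER_TO_PRINTER = {"rdr-7": "printer-3"}

def resolve_walkup_event(token_id, reader_id,
                         token_to_user=TOKEN_TO_USER,
                         reader_to_printer=READER_TO_PRINTER):
    """Translate the raw identifiers carried by a walk-up event into a
    (user, printer) pair. If an identifier is already explicit it passes
    through unchanged; otherwise a database lookup translates it."""
    user = token_to_user.get(token_id, token_id)
    printer = reader_to_printer.get(reader_id, reader_id)
    return user, printer
```

The same resolution step serves both configurations: only which side carries the token and which carries the reader differs.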
- Optionally, the token reader and the printer are separate devices which have an electrical connection. Optionally, the token reader is physically built into the printer. Optionally, the reader informs the printer that the user has presented a token and the printer then explicitly retrieves the user's pending printouts for printing.
- Optionally, the token is a security access or identification badge or card.
- Embodiments of the invention will now be described by way of example only with reference to the accompanying drawings in which:
- FIG. 1 shows the data flow between Netpage publishers and applications, Netpage services, and Netpage devices;
- FIG. 2 is a diagram of the range of content types within a Netpage document;
- FIG. 3 shows a Netpage document with a physical structure consisting of a sequence of numbered pages;
- FIG. 4 shows a printout consisting of a series of impressions;
- FIG. 5 is a diagram showing a user with a pen and default printer;
- FIG. 6 shows the pen events recorded in a digital ink stream;
- FIG. 7 shows the form data submitted to an application;
- FIG. 8 shows a dynamic element for use as a document element;
- FIG. 9 shows a dynamic object linked to an existing impression;
- FIG. 10 shows the relationship between the document, printout and digital ink stores;
- FIG. 11 shows the fundamental flow of data in the Netpage system in greater detail than FIG. 1;
- FIG. 12 shows the data flow associated with reprinting impressions;
- FIG. 13 shows the data flow associated with printing;
- FIG. 14 shows a bifurcated general printing data flow;
- FIG. 15 shows the data flow associated with walk-up printing;
- FIG. 16 shows the data flow associated with the establishment of a printout queue;
- FIG. 17 shows the different levels of network distribution and access possible within the Netpage system;
- FIG. 18 shows the data flow if the user has a token read by a reader associated with the printer;
- FIG. 19 shows the data flow if the user has a reader for reading the token associated with the printer;
- FIG. 20 shows the data flow if the user has a reader that reads the printer token but then uses the printer reader to connect to the Netpage server;
- FIG. 21 shows the data flow between a privately hosted network and a publicly hosted network;
- FIG. 22 shows a PC or device hosted Netpage system;
- FIG. 23 shows the structure of a complete tag;
- FIG. 24 shows a symbol unit cell;
- FIG. 25 shows nine symbol unit cells;
- FIG. 26 shows the bit ordering in a symbol;
- FIG. 27 shows a tag with all bits set;
- FIG. 28 shows a tag group made up of four tag types;
- FIG. 29 shows the continuous tiling of tag groups;
- FIG. 30 shows the interleaving of codewords A, B, C & D with a tag;
- FIG. 31 shows a codeword layout; and
- FIG. 32 shows a tag and its eight immediate neighbours, each labelled with its corresponding bit index.
- As discussed above, the invention is well suited for incorporation in the Assignee's Netpage system. In light of this, the invention has been described as a component of a broader Netpage architecture. However, it will be readily appreciated that the invention is also applicable to other computer networks.
- Overview
- FIG. 1 shows the interaction between Netpage publishers, applications, services and devices. The Netpage document service 1 accepts a document 2 from a Netpage publisher 3 or other Netpage application 4, and produces a printout 5 via a Netpage printer 6. A printout 5 consists of a series of impressions on either or both sides of a series of paper sheets. In addition to reproducing the graphic content of the document 2 on paper, the printer 6 also lays down a coordinate grid in the form of an array of invisible millimetre-scale tags 7 (see U.S. Ser. No. 10/309,358 cross-referenced above). Each tag encodes the two-dimensional coordinates of its location on the impression as well as the impression's unique identifier. When a tag is optically imaged by a Netpage pen 8 (see below and U.S. Ser. No. 10/815,636 cross-referenced above) the pen is able to identify the corresponding impression as well as its own position relative to the impression. When the user of the pen 8 moves the pen relative to the coordinate grid 7, the pen generates a stream of positions. This stream is referred to as digital ink 9. A digital ink stream also records when the pen makes contact with a surface and when it loses contact with a surface, and each pair of these so-called pen down and pen up events delineates a stroke drawn by the user of the pen.
- The Netpage tag pattern 7 is typically printed using an invisible infrared ink, while visible graphic content is printed using colored inks which are transparent in the infrared part of the spectrum. The Netpage pen 8 incorporates a conventional marking nib which utilises an infrared-transparent ink so as not to obscure the tag pattern 7.
- Because the impression identifiers (tags) manifest themselves in printed impressions, they are engineered to be unique among all Netpage systems, and therefore rely on a global allocation mechanism.
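- The role of each tag can be illustrated with a simplified sketch (the field widths and bit packing below are illustrative only; the actual Netpage tag encoding, including its error correction, is defined in the cross-referenced documents): a tag carries the impression's unique identifier together with the tag's x-y coordinates on that impression.

```python
def encode_tag(impression_id, x, y, coord_bits=16):
    """Pack an impression identifier and a tag's (x, y) position into a
    single integer payload. Field widths are illustrative, not Netpage's."""
    assert 0 <= x < (1 << coord_bits) and 0 <= y < (1 << coord_bits)
    return (impression_id << (2 * coord_bits)) | (x << coord_bits) | y

def decode_tag(payload, coord_bits=16):
    """Recover (impression_id, x, y) from a payload produced above, as a
    pen would after imaging a tag."""
    mask = (1 << coord_bits) - 1
    y = payload & mask
    x = (payload >> coord_bits) & mask
    return payload >> (2 * coord_bits), x, y
```

A pen that decodes any single tag therefore knows both which impression it is on and where on that impression it is.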
- The document 2 may include an input description 11 which defines command and form data 12. The commands are instructions that may be activated by the user and the forms have designated fields that may be filled in by the user. Both commands and form fields have active zones, i.e. areas of the page where they capture user input.
- The Netpage digital ink service 13 accepts digital ink 9 from a Netpage pen 8. Since the pen typically only has a short-range communications capability, it forwards the digital ink 9 to the Netpage digital ink service 13 via a Netpage relay 14 which has a longer-range communications capability. Typical relays include mobile phones, PDAs and personal computers.
- The digital ink service 13 uses the impression identifier 7 in the digital ink 9 to retrieve the corresponding impression and input description 11 from the document service 1, and attempts to assign each individual digital ink stroke to a form of the input description 11. Once it detects that the user of the pen 8 has designated a form submission command, it interprets the digital ink 9 assigned to the form and submits the resultant form data 12 to the application associated with the command.
- In order to allow the digital ink service to interpret pen input in relation to a particular impression, the document service 1 keeps a copy of every input description 11 it prints.
- In order to allow a user to fill in a form over an arbitrarily long time, the digital ink service 13 retains a copy of all digital ink 9 it receives, at least until the digital ink is interpreted and submitted to an application 4. The digital ink service 13 optionally retains all digital ink 9 indefinitely, to allow digital ink searching of both form content and document annotations.
- The Netpage pen 8, or a simpler Netpage pointer, may be incorporated directly into a hand-held device such as a mobile phone or PDA. Conversely, the pen may incorporate a long-range communications capability and not need a separate relay.
- Since the relay device 14 typically incorporates an interactive display 15, the digital ink service 13 may identify the interactive display 15 to a target application 4 to allow the application to communicate directly with the interactive display, thus allowing an interaction initiated via paper and pen to lead to a richer screen-based interaction, and generally allowing the development of hybrid paper- and screen-based applications which make the most of both media.
- In the presence of multiple distributed digital ink services 13, a pen 8 (or its relay 14) may use a name service to resolve the network address of a target digital ink service, based on the pen identifier and possibly the impression identifier. In the presence of multiple distributed document services 1, a digital ink service 13 uses a name service to resolve the network address of a document service, based on the impression identifier.
- Although the above description centres on a forms-based interpretation of digital ink and subsequent delivery of form data to an application, the digital ink service also supports streaming delivery of digital ink to an application. This allows an application to be more directly responsive to pen input. In streaming mode the digital ink service delivers both stroke digital ink and intervening "hover" digital ink to allow the application to provide real-time positional feedback to the user via a display.
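- The delineation of strokes by pen down and pen up events described above can be sketched as follows (the event shapes are hypothetical; real digital ink also carries timestamps and impression identifiers):

```python
def strokes_from_events(events):
    """Group a digital ink event stream into strokes: each pen down / pen up
    pair delineates one stroke; "hover" positions between strokes are
    outside any stroke and are skipped here."""
    strokes, current = [], None
    for kind, pos in events:
        if kind == "down":
            current = [pos]                 # pen touched the surface
        elif kind == "move" and current is not None:
            current.append(pos)             # position while in contact
        elif kind == "up" and current is not None:
            current.append(pos)             # pen left the surface
            strokes.append(current)
            current = None
    return strokes

events = [("move", (0, 0)),                 # hover before the stroke
          ("down", (1, 1)), ("move", (2, 2)), ("up", (3, 3)),
          ("move", (9, 9))]                 # hover after the stroke
```

In streaming mode the hover events would also be forwarded, so an application could give positional feedback even between strokes.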
- The object model is a logical model relating to the external interfaces of the Netpage services. It is not intended as an implementation model.
- Document
- FIG. 2 is a class diagram showing a document 2 comprising a visual description 16 and an input description 11. For a given document, either description may be empty. Each document 2 is uniquely identified 18.
- The visual description 16 is a collection of visual elements 20 representing static 22 and dynamic elements 24. Static elements represent textflows 26, images 28, graphics 30 etc. Dynamic elements 24 are described below. The input description 11 is a collection of forms 32, each of which consists of a collection of input elements 34 representing commands 36 and fields 38. Forms 32 may overlap both physically and logically, and the same input element 34 may participate in multiple forms. Each input element 34 has a zone 40 which defines the area within which it captures input. Each form 32 is associated with a target application 42. The application 42 receives submissions of the form 32. The application 42 is identified by an address 44.
- As illustrated in the class diagram in FIG. 3, a document 2 has a physical structure which consists of a sequence of numbered pages, each page 46 assigned a page number 54. The document elements 48 are each assigned a specific position 52 in the sequence of pages. Since a single document element 48 may span a number of pages 46, it may have a corresponding number of page elements 50, each defining the position 52 of a fragment of the document element 48.
- Printout
- Referring to the class diagram in FIG. 4, a printout 5 consists of a series of impressions 58 assigned a printout ID 56. With "N-up" printing, multiple pages 46 may appear on a single impression 58, while with "poster" printing a single page 46 may span multiple impressions 58. A page impression 64 uses a transform 66 to represent the position, scale and rotation of a page 46 on an impression 58.
- Each impression 58 is identified by a unique identifier 60 which is encoded with the impression's coordinate grid when the impression is printed.
- Once actually printed (or pending printing in the 'walk-up' scenario described below), the impression 58 is associated with both the printer 6 on which it was printed and the user 62 who requested it, if known.
- As shown in FIG. 5, a pen 8 is owned by a single user 62 but a user may own any number of pens 8. Accordingly, the user 62 is assigned a user ID and other user details 68, and likewise, each pen 8 and printer 6 has a pen ID and details 70, and a printer ID and details 72. A user 62 optionally has a default printer 6.
- The class diagram in FIG. 6 illustrates a single digital ink stream 74 associated with a pen 8, consisting of a sequence of pen events 76. Each pen event is timestamped 78 using a nib force sensor. Successive segments 80 within the stream 74 relate to different impressions 58. Each segment is assigned a number 82. Each sequence of pen events 76 is a series of pen position events 88 between a pen down event 84 and a pen up event 86. This defines a stroke drawn by the user of the pen. Often a succession of strokes relates to the same impression 58, and usually a segment boundary corresponds to a stroke boundary. However, a stroke may also traverse multiple impressions 58, and a stream 74 may contain "hover" pen events between strokes.
- The class diagram in FIG. 7 shows that form data 12 submitted to an application consists of a collection of field values 90. The form data 12 is associated with a unique form instance 92 appearing in a printout 5. An application may specify a transaction identifier when the form instance 92 is first created (as part of a printout). The transaction identifier 94 is submitted together with the form data 12, allowing the target application to use it to index a unique transaction context.
- The digital ink service 13 (see FIG. 1) supports a form lifecycle wherein a form may only be submitted once, may expire, may become frozen after being signed, and may be voided. The form instance reflects the status of the form with respect to the form lifecycle.
- As illustrated in the class diagram in FIG. 8, a document 2 may also include dynamic elements 24. Each dynamic element has an associated dynamic object 96, which in turn has associated object data 98 and a (typically type-specific) object application 99. A dynamic element 24 may be activated in place using a device such as a Netpage viewer (see U.S. Ser. No. 09/722,175 cross-referenced above), or may be activated on an arbitrary interactive display, such as the interactive display 15 associated with the relay 14 (see FIG. 1), or may be activated via the Netpage Explorer (described below).
- Examples of dynamic objects and their related applications include an audio clip and an audio player, a video clip and a video player, a photo and a photo viewer, a URL and a Web browser, and an editable document and a word processor, to name just a few.
- As illustrated in the class diagram in FIG. 9, a dynamic object 96 may also be dynamically linked to an arbitrary location on an existing impression, e.g. by being "pasted" onto a virtual view of the impression or onto the impression itself.
- FIG. 10 shows the relationships between the three stores nominally maintained by the Netpage document service 1 and the Netpage digital ink service 13 (see FIG. 1), with navigational qualifiers.
- Apart from the document store 100, the printout store 102 and the digital ink store 104, the Netpage services may have additional stores for registered users 62, pens 8 and printers 6, identifier allocation, and service address resolution (not shown).
- Functions
- The processes and stores described in the following sub-sections are meant to elucidate functionality rather than imply implementation.
- Form Input
- FIG. 11 shows the fundamental flow of data in the Netpage system in more detail than FIG. 1. The document service 1 allows an application 4 to lodge a document 2 and to separately transmit a print request 106 to print the document 2. It retains a copy of each lodged document in the document store 100, and retains a copy of the document's input description, if any, in the document store 100. When it prints a document 2 to a specified printer 6, it records the printout 5 in the printout store 102.
- The digital ink service 13 accepts digital ink 9 from a pen 8 via a relay 14, and retains a copy of received digital ink in the digital ink store 104. It uses the impression identifier 60 in the digital ink 9 to retrieve the corresponding impression 58 and input description from the document service 1. It then assigns each individual digital ink stroke to an element of the input description such as a command or a form field, according to the position and extent of the stroke and the active zone of the input element. Once it detects that the user of the pen 8 has designated a form submission command, the digital ink 9 assigned to each field is interpreted 108 according to field type, and the resultant form data 12 is submitted to the application 4 associated with the command. For example, the digital ink service 13 interprets a mark in a checkbox as a check mark; it converts handwritten text in a text field into a string of text characters using intelligent character recognition; and it compares a handwritten signature in a signature field with the recorded signature of the user of the pen, and, if the signatures match, digitally signs the form data on behalf of the user.
- Reprinting Impressions
- The Netpage system supports reprinting of previously printed impressions, with or without any drawings or handwriting captured via those impressions. It thus supports source-independent document reproduction.
FIG. 12 illustrates the flow of data in response to a reprint request 110 from an application 4. When the document service 1 reprints a set of impressions 58 it optionally includes any drawings and handwriting captured via those impressions, and retrieves the corresponding digital ink from the digital ink store 104 in the digital ink service 13 (subject to visibility and access). It records a new printout to record the impression identifiers assigned to the reprinted impressions 112.
- General Printing
- The Netpage system acts as a virtual filing cabinet for any printed document, whether produced by a Netpage-aware application or not. A Netpage-aware application has the advantage that it can include input descriptions in its documents, while a non-Netpage-aware application benefits from its printed documents supporting searchable annotations and source-independent reprinting.
- FIG. 13 illustrates the flow of data in response to a general printing request from a non-Netpage-aware application 114. A Netpage-aware printer driver 116 converts platform-specific drawing commands 118 into a Netpage-compatible document 2 which it lodges with the document service 1, and then sends a print request 106 for the document service 1 to print the document 2 via a specified printer 6.
- FIG. 14 illustrates the corresponding flow of data when the printer is not accessible by the document service 1. Here the printer driver 116 still lodges the document 2 with the document service 1 and records the printout 5 in the printout store 102, but actually prints the document 2 directly via the specified printer 6.
- Walk Up Printing
- When a user requests printing of a document via a conventional user interface, it is usually convenient to specify a target printer. In the Netpage system, however, printing often occurs in response to user input via a printed form, and it may be inconvenient to specify a target printer. In some environments, such as in a home containing a single printer, the desired target printer may be inferred. In other environments, such as in an office containing multiple networked printers, the desired target printer may not be so easily inferred. In such environments it is useful to allow a user to specify a target printer by walking up to it.
- FIG. 15 shows the flow of data in a walk-up environment. All print (and re-print) requests 120 from the Netpage application 4 are typically deferred. In response to a deferred print request 120, the document service 1 records a printout 5 in the printout store 102 to capture impression-related information, and places the printout in a printout pending queue 122 for the requesting user.
- In one possible configuration, each printer 6 has an associated token reader 124, and the user presents a token 126 to the token reader to request actual printing of queued printouts via the printer 6. The token 126 may be a short-range RFID tag, a smartcard, a magnetic stripe card, etc. The token reader 124 notifies a walk-up-handling application 128 of the user's proximity to the printer, which in turn initiates printing via the document service 1.
- In another possible configuration, the token reader 124 is associated with the user and the token 126 is associated with the printer 6. For example, the token reader 124 may be the user's Netpage pen 8, and the token 126 may be a tag pattern disposed on the printer.
- FIG. 16 shows the class diagram of the pending printout queue 122 maintained by the document service 1. Each pending printout 128 has a priority 130, allowing higher-priority printouts to be printed before earlier-queued but lower-priority printouts.
- The document service can be used to provide walk-up printing for documents which are not encoded with Netpage tags and retained.
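- The ordering rule of the pending printout queue can be sketched with a standard priority queue (an illustrative implementation only, not the document service's own): higher priority wins, and arrival order breaks ties so equal-priority printouts print first-come, first-served.

```python
import heapq
import itertools

class PendingPrintoutQueue:
    """Sketch of the pending printout queue: higher-priority printouts are
    printed before earlier-queued but lower-priority ones."""

    def __init__(self):
        self._heap = []
        self._arrival = itertools.count()   # tie-breaker: arrival order

    def enqueue(self, printout, priority=0):
        # heapq is a min-heap, so negate priority to make high values win.
        heapq.heappush(self._heap, (-priority, next(self._arrival), printout))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = PendingPrintoutQueue()
q.enqueue("weekly-report", priority=0)
q.enqueue("boarding-pass", priority=9)
q.enqueue("newsletter", priority=0)
```

Here the later-queued but higher-priority "boarding-pass" prints first, then the two equal-priority printouts in arrival order.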
- In general, the token 126 may be any of a number of passive, semi-passive or active devices, including a surface or object bearing a Netpage tag pattern, linear barcode or two-dimensional barcode; a magnetic stripe card; a smart card or contact-less smart card; or a radio-frequency identification (RFID) tag. The reader 124 may be any reader matched to the type of the token 126, such as an optical reader utilising a scanning laser or a two-dimensional image sensor, as in conventional barcode readers or a Netpage sensing device; a magnetic stripe reader; a smart card reader; or an RFID reader.
- As illustrated in FIG. 18, in a first configuration the token reader 124 is associated with the printer 6 and the user presents the token 126 to the reader. The reader 124 reads the token 126 and communicates the walk-up event to the Netpage server 1. The walk-up event identifies both the user 62 and the printer 6. The token 126, and hence the walk-up event, may identify the user 62 explicitly, or the server may be required to perform a database lookup to translate the token identifier into a user identifier. The reader, and hence the walk-up event, may identify the printer 6 explicitly, or the server 1 may be required to perform a database lookup to translate the reader identifier into a printer identifier. Once the server 1 has identified the user 62 and the printer 6, it retrieves printouts pending for the user and sends them to the printer to be printed.
- FIG. 18 shows the reader 124 and the printer 6 as separate devices which are physically associated. The reader 124 may be physically built into the printer 6. It may also be electrically connected to the printer, with the printer delivering the walk-up event to the server. Alternatively and equivalently, the printer 6 may interpret the walk-up event itself, and explicitly retrieve the user's pending printouts for printing.
- The user token 126 may be attached to or built into a portable device which the user 62 carries, such as a mobile phone, pen, electronic pen (such as a Netpage pen 8), wallet, security access card or token, or identification badge or card. It may also be stand-alone and purpose-specific.
- In the case of a Netpage pen 8, the printer reader 124 may provide a receptacle for receiving the pen, whereby the pen makes electrical contact and establishes a wired communication link (e.g. USB) with the reader to communicate the user identifier to the reader.
- As illustrated in FIG. 19, in a second configuration the token reader 124 is associated with the user 62 and the user presents the reader to the token 126. The reader 124 reads the token 126 and communicates the walk-up event to the Netpage server 1. The walk-up event identifies both the user 62 and the printer 6. The token 126, and hence the walk-up event, may identify the printer 6 explicitly, or the server 1 may be required to perform a database lookup to translate the token identifier into a printer identifier. The reader 124, and hence the walk-up event, may identify the user 62 explicitly, or the server 1 may be required to perform a database lookup to translate the reader identifier into a user identifier. Once the server 1 has identified the user 62 and the printer 6, it retrieves printouts pending for the user and sends them to the printer to be printed.
- The printer token 126 may be attached to or built into the printer 6 or a device co-located with the printer.
- As illustrated in FIG. 20, even when the user 62 presents a token reader 125, it may be more convenient for the user reader 125 to rely on the communication link between the token reader 124 on the printer (or the printer itself) and the server 1, since this communication link is guaranteed to be present. As in FIG. 19, the user 62 presents the reader 125 to the token 127. The reader 125 reads the token 127. From the token it determines a short-range communication link to the printer 6. This may be a personal-area network (PAN) wireless link such as Bluetooth, wireless USB or ZigBee, or a local-area network (LAN) wireless link such as IEEE 802.11 (WiFi). It may also be a short-range optical link such as IrDA. Where the link requires a target address (such as in the case of Bluetooth), the token supplies the target address. For example, if the token 127 on the printer 6 uses a Netpage tag pattern, then the tag pattern encodes the target address instead of an impression ID, x-y location, etc., and flags it as such. Where the link does not require a target address (such as in the case of IrDA), the token 127 merely signals the user's token reader 125 to communicate a user identifier to the printer's token reader 124. Again, if the printer token uses a Netpage tag pattern, then the tag pattern flags the command to communicate the user identifier to the printer reader 124. If a range of communication link types is supported, then the token 127 (e.g. tag pattern) can identify a particular link type. The printer reader 124 receives the user identifier from the user reader 125 and communicates the walk-up event to the Netpage server 1. Once the server has identified the user 62 and the printer 6, it retrieves printouts pending for the user and sends them to the printer to be printed.
- In the absence of a user token 126 or user reader 125, the user 62 may key a user identifier or job identifier into a keypad associated with the printer 6, with an optional password. The user 62 may also use a display-based input device associated with the printer to select their identity or their pending printout(s) from a list of users or jobs.
- Netpage Explorer
- As discussed above, the Netpage system acts as a virtual filing cabinet for any printed document. The Netpage system therefore provides users with a screen-based browser—the Netpage Explorer—for browsing and searching collections of printouts maintained by a document service, and for viewing individual printouts on-screen, including their digital ink. The Netpage Explorer also supports real-time display of streaming digital ink, and so provides a basis for remote conferencing.
- As described above, the Netpage System supports the embedding of dynamic objects in documents, and the dynamic linking of dynamic objects to locations on printed impressions. The Netpage Explorer supports viewing of, and interaction with, such objects via the virtual view it provides of printed impressions, as well as the dynamic linking of such objects.
- Product Variants
- This section describes three Netpage product variants, each reflecting a different level of network distribution and access.
FIG. 17 shows a system using public Netpage services 134 running on a distributed set of servers on the public Internet 133, and serving applications 4 and users on the public Internet 133. FIG. 21 shows a private Netpage system with services 136 (e.g. private Netpage document and digital ink services) running on one or more servers on a private intranet 138, and serving applications 4 and users on the private intranet. FIG. 22 shows a personal Netpage system with services 142 running on a single personal computer or other personal device 140. - In each case, pre-printed Netpage content such as magazine adverts, catalogues, brochures, and product item Hyperlabels is typically hosted by public Netpage document services running on the Internet.
- In private Netpage systems, security and privacy considerations may motivate the use of a private digital ink service even where the document service is public, as implied in
FIGS. 21 and 22 . A private document service may also act as a caching proxy for a public document service. - More generally, security and privacy considerations may motivate routing a user's digital ink to a constrained set of digital ink services, independent of the proliferation of document services. Some national governments may mandate that their citizens' digital ink be routed to national digital ink servers, even when interacting with international document services. A Netpage pen (or its relay) may therefore have knowledge of both a private and a public digital ink service, and may route digital ink pertaining to private impressions to the former and digital ink pertaining to public impressions to the latter. Even when a given pen's digital ink relates to a public impression and is nominally accessible on a public server, this need not imply that the owner of the impression or other users of the impression automatically gain access to that digital ink.
- Introduction
- The Netpage system uses a surface coding to imbue otherwise passive surfaces with interactivity in conjunction with Netpage sensing devices such as the Netpage pen and the Netpage viewer. When interacting with a Netpage coded surface, a Netpage sensing device generates a digital ink stream which indicates both the identity of the surface region relative to which the sensing device is moving, and the absolute path of the sensing device within the region. This section defines optional authentication features of the Netpage surface coding, and associated authentication protocols.
- Surface Coding Security
- Surface Coding Background
- The Netpage surface coding consists of a dense planar tiling of tags. Each tag encodes its own location in the plane. Each tag also encodes, in conjunction with adjacent tags, an identifier of the region containing the tag. This region ID is unique among all regions. In the Netpage system the region typically corresponds to the entire extent of the tagged surface, such as one side of a sheet of paper. In the Hyperlabel system the region typically corresponds to the surface of an entire product item, and the region ID corresponds to the unique item ID. For clarity in the following discussion, references to items and item IDs (or simply IDs) should be read as references to regions and region IDs.
- The surface coding is designed so that an acquisition field of view large enough to guarantee acquisition of an entire tag is large enough to guarantee acquisition of the ID of the region containing the tag. Acquisition of the tag itself guarantees acquisition of the tag's two-dimensional position within the region, as well as other tag-specific data. The surface coding therefore allows a sensing device to acquire a region ID and a tag position during a purely local interaction with a coded surface, e.g. during a “click” or tap on a coded surface with a pen.
- Cryptography Background
- Cryptography is used to protect sensitive information, both in storage and in transit, and to authenticate parties to a transaction. There are two classes of cryptography in widespread use: secret-key cryptography and public-key cryptography. The Netpage and Hyperlabel systems use both classes of cryptography.
- Secret-key cryptography, also referred to as symmetric cryptography, uses the same key to encrypt and decrypt a message. Two parties wishing to exchange messages must first arrange to securely exchange the secret key.
- Public-key cryptography, also referred to as asymmetric cryptography, uses two encryption keys. The two keys are mathematically related in such a way that any message encrypted using one key can only be decrypted using the other key. One of these keys is then published, while the other is kept private. They are referred to as the public and private key respectively. The public key is used to encrypt any message intended for the holder of the private key. Once encrypted using the public key, a message can only be decrypted using the private key. Thus two parties can securely exchange messages without first having to exchange a secret key. To ensure that the private key is secure, it is normal for the holder of the private key to generate the public-private key pair.
- Public-key cryptography can be used to create a digital signature. If the holder of the private key creates a known hash of a message and then encrypts the hash using the private key, then anyone can verify that the encrypted hash constitutes the “signature” of the holder of the private key with respect to that particular message, simply by decrypting the encrypted hash using the public key and verifying the hash against the message. If the signature is appended to the message, then the recipient of the message can verify both that the message is genuine and that it has not been altered in transit.
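The hash-then-sign flow described above can be sketched with textbook RSA. All parameters below are deliberately tiny, hypothetical toy values chosen only for illustration; a real deployment would use standardised padding schemes and keys thousands of bits long, as noted later in this section.

```python
import hashlib

# Toy RSA key pair (hypothetical small primes; real keys are far larger)
p, q = 65537, 65539                 # both prime
n, e = p * q, 5                     # public modulus and public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (requires Python 3.8+)

def sign(message: bytes) -> int:
    """Create a known hash of the message, then 'encrypt' it with the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Decrypt the signature with the public key and re-check the hash."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h
```

Appending sign(message) to the message lets any recipient holding the public key check both that the message is genuine and that it has not been altered, exactly as described above.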
- Secret-key cryptography can also be used to create a digital signature, but it has the disadvantage that signature verification can also be performed by any party privy to the secret key.
- To make public-key cryptography work, there has to be a way to distribute public keys which prevents impersonation. This is normally done using certificates and certificate authorities. A certificate authority is a trusted third party which authenticates the association between a public key and a person's or other entity's identity.
- The certificate authority verifies the identity by examining identity documents etc., and then creates and signs a digital certificate containing the identity details and public key. Anyone who trusts the certificate authority can use the public key in the certificate with a high degree of certainty that it is genuine. They just have to verify that the certificate has indeed been signed by the certificate authority, whose public key is well-known.
- To achieve comparable security to secret-key cryptography, public-key cryptography utilises key lengths an order of magnitude larger, i.e. a few thousand bits compared with a few hundred bits.
- For a detailed discussion of cryptographic techniques, see Schneier, B., Applied Cryptography, Second Edition, John Wiley & Sons 1996.
- Security Requirements
- We define item security to have two related purposes:
-
- to allow authentication of an item
- to prevent forgery of an item
- The greater the difficulty of forgery, the greater the trustworthiness of authentication.
- When an item is coded, Netpage surface coding security has two corresponding purposes:
-
- to allow authentication of a coded item
- to prevent forgery of a coded item with a novel item ID
- If a user is able to determine the authenticity of the surface coding of an item, then the user may be able to make an informed decision about the likely authenticity of the item.
- If it is intractable to forge the surface coding for a novel ID, then the only tractable way of forging an item with an authentic surface coding is to duplicate the surface coding of an existing item (and hence its ID). If the user is able to determine by other means that the ID of an item is likely to be unique, then the user may assume that the item is authentic.
- Since the Netpage surface coding allows meaningful interaction between a sensing device and a coded surface during a purely local interaction, it is desirable for the surface coding to support authentication during a similarly local interaction, i.e. without requiring an increase in the size of the sensing device field of view.
- Since no a priori relationship exists between creators of authentic coded items and users potentially wishing to authenticate such items, it is undesirable to require a trust relationship between creators and users. For example, it is undesirable to require that creators share secret signature keys with users.
- It is reasonable for many users to rely on online access to an authenticator trusted by a creator for the purposes of authenticating items. Conversely, it is desirable to allow authentication to take place in the absence of online access.
- Security Discussion
- As discussed above in ‘Cryptography Background’, authentication relies on verifying the correspondence between data and a signature of that data. The greater the difficulty in forging a signature, the greater the trustworthiness of signature-based authentication.
- The item ID is unique and therefore provides a basis for a signature. If online authentication access is assumed, then the signature may simply be a random number associated with the item ID in an authentication database accessible to the trusted online authenticator. The random number may be generated by any suitable method, such as via a deterministic (pseudo-random) algorithm, or via a stochastic physical process. A keyed hash or encrypted hash may be preferable to a random number since it requires no additional space in the authentication database.
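The keyed-hash option mentioned above can be sketched in a few lines. The key name and the 96-bit ID width are assumptions for illustration; HMAC-SHA1 is chosen here only because it yields a 160-bit result, matching the signature length used later in this section.

```python
import hashlib
import hmac

SECRET_KEY = b"hypothetical-authenticator-key"

def keyed_signature(item_id: int) -> int:
    """Derive a 160-bit signature as a keyed hash of the (up to 96-bit) item ID,
    so the authentication database needs no per-ID random number."""
    mac = hmac.new(SECRET_KEY, item_id.to_bytes(12, "big"), hashlib.sha1)
    return int.from_bytes(mac.digest(), "big")
```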
- In the limit case no signature is actually required, since the mere presence of the item ID in the database indicates authenticity. However, the use of a signature limits a forger to forging items he has actually sighted.
- To prevent forgery of a signature for an unsighted ID, the signature must be large enough to make exhaustive search via repeated accesses to the online authenticator intractable. If generated using a key rather than randomly, then the length of the signature must also be large enough to prevent the forger from deducing the key from known ID-signature pairs. Signatures of a few hundred bits are considered secure, whether generated using private or secret keys.
- Limited space within the surface coding tag structure makes it impractical to include a secure signature in a tag. We are therefore motivated to distribute fragments of a signature across multiple tags. If each fragment can be verified in isolation against the ID, then the goal of supporting authentication without increasing the sensing device field of view is achieved. The security of the signature still derives from the full length of the signature rather than from the length of a fragment, since a forger cannot predict which fragment a user will randomly choose to verify. Note that a trusted authenticator can always perform fragment verification, so fragment verification is always possible when online access to a trusted authenticator is available.
- Fragment verification requires fragment identification. Fragments may be explicitly numbered, or may more economically be identified by the two-dimensional coordinate of their tag, modulo the repetition of the signature across a continuous tiling of tags.
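Identification by tag coordinate, as suggested above, reduces to modular arithmetic. A minimal sketch follows; the 5×5 repetition block is an assumption, chosen to match the 25-tag embedded-data block defined later in this document.

```python
def fragment_number(x, y, block_w=5, block_h=5):
    """Map a tag's (x, y) coordinate to the number of the signature fragment
    it carries, with the signature repeated across block_w x block_h tiles."""
    return (y % block_h) * block_w + (x % block_w)
```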
- The limited length of the ID itself introduces a further vulnerability. Ideally it should be at least a few hundred bits. In the Netpage and Hyperlabel surface coding schemes it is 96 bits or less. To overcome this, the ID may be padded. For this to be effective the padding must be variable, i.e. it must vary from one ID to the next. Ideally the padding is simply a random number, and must then be stored in the authentication database indexed by ID. If the padding is deterministically generated from the ID then it is worthless.
- Offline authentication of secret-key signatures requires the use of a trusted offline authentication device. The QA chip (see U.S. Pat. No. 6,374,354, issued 16 Apr. 2002) provides the basis for such a device, although of limited capacity. The QA chip can be programmed to verify a signature using a secret key securely held in its internal memory. In this scenario, however, it is impractical to support per-ID padding, and it is impractical even to support more than a very few secret keys. Furthermore, a QA chip programmed in this manner is susceptible to a chosen-message attack. These constraints limit the applicability of a QA-chip-based trusted offline authentication device to niche applications.
- In general, despite the claimed security of any particular trusted offline authentication device, creators of secure items are likely to be reluctant to entrust their secret signature keys to such devices, and this is again likely to limit the applicability of such devices to niche applications.
- By contrast, offline authentication of public-key signatures (i.e. generated using the corresponding private keys) is highly practical. An offline authentication device utilising public keys can trivially hold any number of public keys, and may be designed to retrieve additional public keys on demand, via a transient online connection, when it encounters an ID for which it knows it has no corresponding public signature key. Untrusted offline authentication is likely to be attractive to most creators of secure items, since they are able to retain exclusive control of their private signature keys.
- A disadvantage of offline authentication of a public-key signature is that the entire signature must be acquired from the coding, violating our desire to support authentication with a minimal field of view. A corresponding advantage of offline authentication of a public-key signature is that access to the ID padding is no longer required, since decryption of the signature using the public signature key generates both the ID and its padding, and the padding can then be ignored.
- Acquisition of an entire distributed signature is not particularly onerous. Any random or linear swipe of a hand-held sensing device across a coded surface allows it to quickly acquire all of the fragments of the signature. The sensing device can easily be programmed to signal the user when it has acquired a full set of fragments and has completed authentication. A scanning laser can also easily acquire all of the fragments of the signature. Both kinds of devices may be programmed to only perform authentication when the tags indicate the presence of a signature.
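The swipe-until-complete behaviour described above amounts to collecting distinct fragments until a full set is held. A sketch, with assumed fragment width and count (40 four-bit fragments of a 160-bit signature):

```python
def assemble_signature(observations, total_fragments=40, width=4):
    """Collect (fragment_number, value) observations from a swipe until every
    fragment has been seen, then concatenate them into the full signature.
    Returns None if the stream ends before a full set is acquired."""
    seen = {}
    for number, value in observations:
        seen[number] = value
        if len(seen) == total_fragments:   # full set: signal the user
            signature = 0
            for i in range(total_fragments):
                signature |= seen[i] << (i * width)
            return signature
    return None
```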
- Note that a public-key signature may be authenticated online via any of its fragments in the same way as any signature, whether generated randomly or using a secret key. The trusted online authenticator may generate the signature on demand using the private key and ID padding, or may store the signature explicitly in the authentication database. The latter approach obviates the need to store the ID padding.
- Note also that signature-based authentication may be used in place of fragment-based authentication even when online access to a trusted authenticator is available.
- Security Specification
- Setup per ID range:
- generate public-private signature key pair
- store key pair indexed by ID range
- Setup per ID:
- generate ID padding
- retrieve private signature key by ID
- generate signature by encrypting ID and padding using private key
- store signature in database indexed by ID
- encode signature across multiple tags in repeated fashion
- Online (fragment-based) authentication (user):
- acquire ID from tags
- acquire position and signature fragment from tag
- generate fragment number from position
- look up trusted authenticator by ID
- transmit ID, fragment and fragment number to trusted authenticator
- Online (fragment-based) authentication (trusted authenticator):
- receive ID, fragment and fragment number from user
- retrieve signature from database by ID
- compare supplied fragment with signature
- report authentication result to user
- Offline (signature-based) authentication (user):
- acquire ID from tags
- acquire positions and signature fragments from tags
- generate signature from fragments
- retrieve public signature key by ID
- decrypt signature using public key
- compare acquired ID with decrypted ID
- report authentication result to user
- Introduction
- This section defines a surface coding used by the Netpage system (described above in ‘Netpage Architecture’) to imbue otherwise passive surfaces with interactivity in conjunction with Netpage sensing devices such as the Netpage pen and the Netpage viewer.
- When interacting with a Netpage coded surface, a Netpage sensing device generates a digital ink stream which indicates both the identity of the surface region relative to which the sensing device is moving, and the absolute path of the sensing device within the region.
- Surface Coding
- The Netpage surface coding consists of a dense planar tiling of tags. Each tag encodes its own location in the plane. Each tag also encodes, in conjunction with adjacent tags, an identifier of the region containing the tag. In the Netpage system, the region typically corresponds to the entire extent of the tagged surface, such as one side of a sheet of paper.
- Each tag is represented by a pattern which contains two kinds of elements. The first kind of element is a target.
- Targets allow a tag to be located in an image of a coded surface, and allow the perspective distortion of the tag to be inferred. The second kind of element is a macrodot. Each macrodot encodes the value of a bit by its presence or absence.
- The pattern is represented on the coded surface in such a way as to allow it to be acquired by an optical imaging system, and in particular by an optical system with a narrowband response in the near-infrared. The pattern is typically printed onto the surface using a narrowband near-infrared ink.
- Tag Structure
-
FIG. 23 shows the structure of a complete tag 200. Each of the four black circles 202 is a target. The tag 200, and the overall pattern, has four-fold rotational symmetry at the physical level. - Each square region represents a symbol 204, and each symbol represents four bits of information. Each symbol 204 shown in the tag structure has a unique label 216. Each label 216 has an alphabetic prefix and a numeric suffix. -
FIG. 24 shows the structure of a symbol 204. It contains four macrodots 206, each of which represents the value of one bit by its presence (one) or absence (zero). - The
macrodot 206 spacing is specified by the parameter S throughout this specification. It has a nominal value of 143 μm, based on 9 dots printed at a pitch of 1600 dots per inch. However, it is allowed to vary within defined bounds according to the capabilities of the device used to produce the pattern. -
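The nominal figures quoted here, and the nominal tag size given in the Table 1 notes later, can be cross-checked with a line of arithmetic (a sanity check, not part of the specification):

```python
# 9 printed dots at 1600 dots per inch, converted to micrometres
spacing_um = 9 / 1600 * 25.4 * 1000      # approximately 143 micrometres
# 12 macrodots per tag gives the nominal tag size
tag_size_mm = spacing_um * 12 / 1000     # 1.7145 mm
```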
FIG. 25 shows an array 208 of nine adjacent symbols 204. The macrodot 206 spacing is uniform both within and between symbols 204. -
FIG. 26 shows the ordering of the bits within a symbol 204. - Bit zero 210 is the least significant within a symbol 204; bit three 212 is the most significant. Note that this ordering is relative to the orientation of the symbol 204. The orientation of a particular symbol 204 within the tag 200 is indicated by the orientation of the label 216 of the symbol in the tag diagrams (see for example FIG. 23). In general, the orientation of all symbols 204 within a particular segment of the tag 200 is the same, consistent with the bottom of the symbol being closest to the centre of the tag. - Only the macrodots 206 are part of the representation of a symbol 204 in the pattern. The square outline 214 of a symbol 204 is used in this specification to more clearly elucidate the structure of a tag 200. FIG. 27, by way of illustration, shows the actual pattern of a tag 200 with every bit 206 set. Note that, in practice, every bit 206 of a tag 200 can never be set. - A
macrodot 206 is nominally circular with a nominal diameter of (5/9)S. However, it is allowed to vary in size by ±10% according to the capabilities of the device used to produce the pattern. - A
target 202 is nominally circular with a nominal diameter of (17/9)S. However, it is allowed to vary in size by ±10% according to the capabilities of the device used to produce the pattern. - The tag pattern is allowed to vary in scale by up to ±10% according to the capabilities of the device used to produce the pattern. Any deviation from the nominal scale is recorded in the tag data to allow accurate generation of position samples.
- Tag Groups
-
Tags 200 are arranged into tag groups 218. Each tag group contains four tags arranged in a square. Each tag 200 has one of four possible tag types, each of which is labelled according to its location within the tag group 218. The tag type labels 220 are 00, 10, 01 and 11, as shown in FIG. 28. -
FIG. 29 shows how tag groups are repeated in a continuous tiling of tags, or tag pattern 222. The tiling guarantees that any set of four adjacent tags 200 contains one tag of each type 220. - Codewords
- The tag contains four complete codewords. The layout of the four codewords is shown in FIG. 30. Each codeword is of a punctured 2⁴-ary (8, 5) Reed-Solomon code. The codewords are labelled A, B, C and D. Fragments of each codeword are distributed throughout the tag 200. - Two of the codewords are unique to the
tag 200. These are referred to as local codewords 224 and are labelled A and B. The tag 200 therefore encodes up to 40 bits of information unique to the tag. - The remaining two codewords are unique to a tag type, but common to all tags of the same type within a contiguous tiling of tags 222. These are referred to as global codewords 226 and are labelled C and D, subscripted by tag type. - A tag group 218 therefore encodes up to 160 bits of information common to all tag groups within a contiguous tiling of tags. - Codewords are encoded using a punctured 2⁴-ary (8, 5) Reed-Solomon code. A 2⁴-ary (8, 5) code encodes 20 data bits (i.e. five 4-bit symbols) and 12 redundancy bits (i.e. three 4-bit symbols) in each codeword. Its error-detecting capacity is three symbols. Its error-correcting capacity is one symbol.
-
FIG. 31 shows a codeword 228 of eight symbols 204, with five symbols encoding data coordinates 230 and three symbols encoding redundancy coordinates 232. The codeword coordinates are indexed in coefficient order, and the data bit ordering follows the codeword bit ordering. - A punctured 2⁴-ary (8, 5) Reed-Solomon code is a 2⁴-ary (15, 5) Reed-Solomon code with seven redundancy coordinates removed. The removed coordinates are the most significant redundancy coordinates. - The code has the following primitive polynomial:
p(x) = x⁴ + x + 1 (EQ 1)
- The code has the following generator polynomial:
g(x) = (x + α)(x + α²) … (x + α¹⁰) (EQ 2)
- For a detailed description of Reed-Solomon codes, refer to Wicker, S. B. and V. K. Bhargava, eds., Reed-Solomon Codes and Their Applications, IEEE Press, 1994, the contents of which are incorporated herein by reference.
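EQ 1 and EQ 2 suffice to sketch an encoder. The following illustration (hypothetical helper names) builds GF(2⁴) arithmetic from the primitive polynomial, constructs the generator polynomial, and produces a punctured 8-symbol codeword from five data symbols. Keeping the three least significant redundancy coordinates is an inference from the statement that the removed seven are the most significant; this is a sketch, not the exact bit-level layout.

```python
def gf_mul(a, b):
    """Multiply in GF(2^4) with primitive polynomial p(x) = x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:
            a ^= 0x13          # reduce by x^4 + x + 1
        b >>= 1
    return r

def gf_pow(a, n):
    r = 1
    for _ in range(n):
        r = gf_mul(r, a)
    return r

ALPHA = 2                      # primitive element alpha = x

def poly_mul(p, q):
    """Multiply polynomials over GF(2^4); coefficients lowest degree first."""
    r = [0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] ^= gf_mul(pi, qj)
    return r

# Generator polynomial g(x) = (x + a)(x + a^2) ... (x + a^10), degree 10
G = [1]
for i in range(1, 11):
    G = poly_mul(G, [gf_pow(ALPHA, i), 1])

def rs_encode_punctured(data):
    """Systematically encode five 4-bit symbols into a (15, 5) codeword, then
    puncture the seven most significant redundancy coordinates, leaving
    five data symbols plus three redundancy symbols = eight symbols."""
    assert len(data) == 5 and all(0 <= s < 16 for s in data)
    rem = [0] * 10 + list(data)        # data(x) * x^10, lowest degree first
    for i in range(14, 9, -1):         # polynomial long division by g(x)
        coef = rem[i]
        if coef:
            for j, gj in enumerate(G):
                rem[i - 10 + j] ^= gf_mul(coef, gj)
    redundancy = rem[:10]
    return redundancy[:3] + list(data)  # keep the 3 least significant
```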
- The Tag Coordinate Space
- The tag coordinate space has two orthogonal axes labelled x and y respectively. When the positive x axis points to the right, then the positive y axis points down.
- The surface coding does not specify the location of the tag coordinate space origin on a particular tagged surface, nor the orientation of the tag coordinate space with respect to the surface. This information is application-specific.
- For example, if the tagged surface is a sheet of paper, then the application which prints the tags onto the paper may record the actual offset and orientation, and these can be used to normalise any digital ink subsequently captured in conjunction with the surface.
- The position encoded in a tag is defined in units of tags. By convention, the position is taken to be the position of the centre of the target closest to the origin.
- Tag Information Content
- Table 1 defines the information fields embedded in the surface coding. Table 2 defines how these fields map to codewords.
TABLE 1. Field definitions
field | width (bits) | description
per codeword:
codeword type | 2 | The type of the codeword, i.e. one of A (b′00′), B (b′01′), C (b′10′) and D (b′11′).
per tag:
tag type | 2 | The type¹ of the tag, i.e. one of 00 (b′00′), 01 (b′01′), 10 (b′10′) and 11 (b′11′).
x coordinate | 13 | The unsigned x coordinate of the tag².
y coordinate | 13 | The unsigned y coordinate of the tag².
active area flag | 1 | A flag indicating whether the tag is a member of an active area. b′1′ indicates membership.
active area map flag | 1 | A flag indicating whether an active area map is present. b′1′ indicates the presence of a map (see next field). If the map is absent then the value of each map entry is derived from the active area flag (see previous field).
active area map | 8 | A map³ of which of the tag's immediate eight neighbours are members of an active area. b′1′ indicates membership.
data fragment | 8 | A fragment of an embedded data stream. Only present if the active area map is absent.
per tag group:
encoding format | 8 | The format of the encoding. 0: the present encoding. Other values are TBA.
region flags | 8 | Flags controlling the interpretation and routing of region-related information. 0: region ID is an EPC; 1: region is linked; 2: region is interactive; 3: region is signed; 4: region includes data; 5: region relates to mobile application. Other bits are reserved and must be zero.
tag size adjustment | 16 | The difference between the actual tag size and the nominal tag size⁴, in 10 nm units, in sign-magnitude format.
region ID | 96 | The ID of the region containing the tags.
CRC | 16 | A CRC⁵ of the tag group data.
total | 320
¹ corresponds to the bottom two bits of the x and y coordinates of the tag
² allows a maximum coordinate value of approximately 14 m
³ FIG. 32 indicates the bit ordering of the map
⁴ the nominal tag size is 1.7145 mm (based on 1600 dpi, 9 dots per macrodot, and 12 macrodots per tag)
⁵ CCITT CRC-16 [7]
-
FIG. 32 shows a tag 200 and its eight immediate neighbours, each labelled with its corresponding bit index in the active area map. An active area map indicates whether the corresponding tags are members of an active area. An active area is an area within which any captured input should be immediately forwarded to the corresponding Netpage server for interpretation. It also allows the Netpage sensing device to signal to the user that the input will have an immediate effect.
TABLE 2. Mapping of fields to codewords
codeword | codeword bits | field | field width | field bits
A (b′00′) | 1:0 | codeword type | 2 | all
A | 10:2 | x coordinate | 9 | 12:4
A | 19:11 | y coordinate | 9 | 12:4
B (b′01′) | 1:0 | codeword type | 2 | all
B | 2 | tag type | 1 | 0
B | 5:2 | x coordinate | 4 | 3:0
B | 6 | tag type | 1 | 1
B | 9:6 | y coordinate | 4 | 3:0
B | 10 | active area flag | 1 | all
B | 11 | active area map flag | 1 | all
B | 19:12 | active area map | 8 | all
B | 19:12 | data fragment | 8 | all
C00 (b′10′) | 1:0 | codeword type | 2 | all
C00 | 9:2 | encoding format | 8 | all
C00 | 17:10 | region flags | 8 | all
C00 | 19:18 | tag size adjustment | 2 | 1:0
C01 (b′10′) | 1:0 | codeword type | 2 | all
C01 | 15:2 | tag size adjustment | 14 | 15:2
C01 | 19:16 | region ID | 4 | 3:0
C10 (b′10′) | 1:0 | codeword type | 2 | all
C10 | 19:2 | region ID | 18 | 21:4
C11 (b′10′) | 1:0 | codeword type | 2 | all
C11 | 19:2 | region ID | 18 | 39:22
D00 (b′11′) | 1:0 | codeword type | 2 | all
D00 | 19:2 | region ID | 18 | 57:40
D01 (b′11′) | 1:0 | codeword type | 2 | all
D01 | 19:2 | region ID | 18 | 75:58
D10 (b′11′) | 1:0 | codeword type | 2 | all
D10 | 19:2 | region ID | 18 | 93:76
D11 (b′11′) | 1:0 | codeword type | 2 | all
D11 | 3:2 | region ID | 2 | 95:94
D11 | 19:4 | CRC | 16 | all
- Note that the tag type can be moved into a global codeword to maximise local codeword utilization. This in turn can allow larger coordinates and/or 16-bit data fragments (potentially configurably in conjunction with coordinate precision). However, this reduces the independence of position decoding from region ID decoding and has not been included in the specification at this time.
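The coordinate packing in Table 2 can be made concrete with a small sketch (hypothetical helper names), reassembling the 13-bit coordinates from the 20-bit data fields of decoded A and B codewords. Note how the tag type bits deliberately coincide with the bottom bits of the coordinates, per note 1 of Table 1.

```python
def bits(value, hi, lo):
    """Extract bits hi..lo (inclusive) of an integer codeword data field."""
    return (value >> lo) & ((1 << (hi - lo + 1)) - 1)

def tag_coordinates(cw_a, cw_b):
    """Assemble the 13-bit x and y tag coordinates from the data fields of
    the A and B codewords, following the mapping in Table 2."""
    x = (bits(cw_a, 10, 2) << 4) | bits(cw_b, 5, 2)   # x[12:4] then x[3:0]
    y = (bits(cw_a, 19, 11) << 4) | bits(cw_b, 9, 6)  # y[12:4] then y[3:0]
    return x, y
```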
- Embedded Data
- If the “region includes data” flag in the region flags is set then the surface coding contains embedded data. The data is encoded in multiple contiguous tags' data fragments, and is replicated in the surface coding as many times as it will fit.
- The embedded data is encoded in such a way that a random and partial scan of the surface coding containing the embedded data can be sufficient to retrieve the entire data. The scanning system reassembles the data from retrieved fragments, and reports to the user when sufficient fragments have been retrieved without error.
- As shown in Table 3, a 200-bit data block encodes 160 bits of data. The block data is encoded in the data fragments of a contiguous group of 25 tags arranged in a 5×5 square. A tag belongs to a block whose integer coordinate is the tag's coordinate divided by 5. Within each block the data is arranged into tags with increasing x coordinate within increasing y coordinate.
- A data fragment may be missing from a block where an active area map is present. However, the missing data fragment is likely to be recoverable from another copy of the block.
- Data of arbitrary size is encoded into a superblock consisting of a contiguous set of blocks arranged in a rectangle.
- The size of the superblock is encoded in each block. A block belongs to a superblock whose integer coordinate is the block's coordinate divided by the superblock size. Within each superblock the data is arranged into blocks with increasing x coordinate within increasing y coordinate.
- The superblock is replicated in the surface coding as many times as it will fit, including partially along the edges of the surface coding.
- The data encoded in the superblock may include more precise type information, more precise size information, and more extensive error detection and/or correction data.
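The block and superblock addressing rules above reduce to integer division and modular arithmetic; a brief sketch with assumed function names:

```python
def block_of(tag_x, tag_y):
    """A tag belongs to the block whose coordinate is the tag's divided by 5."""
    return tag_x // 5, tag_y // 5

def fragment_index(tag_x, tag_y):
    """Position of a tag's 8-bit data fragment within its 25-tag block, with
    increasing x coordinate within increasing y coordinate."""
    return (tag_y % 5) * 5 + (tag_x % 5)

def superblock_of(block_x, block_y, sb_width, sb_height):
    """A block belongs to the superblock whose coordinate is the block's
    divided by the superblock size."""
    return block_x // sb_width, block_y // sb_height
```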
TABLE 3. Embedded data block
field | width (bits) | description
data type | 8 | The type of the data in the superblock. Values include: 0: type is controlled by region flags; 1: MIME. Other values are TBA.
superblock width | 8 | The width of the superblock, in blocks.
superblock height | 8 | The height of the superblock, in blocks.
data | 160 | The block data.
CRC | 16 | A CRC⁶ of the block data.
total | 200
⁶ CCITT CRC-16 [7]
Cryptographic Signature of Region ID - If the “region is signed” flag in the region flags is set then the surface coding contains a 160-bit cryptographic signature of the region ID. The signature is encoded in a one-block superblock.
- In an online environment any signature fragment can be used, in conjunction with the region ID, to validate the signature. In an offline environment the entire signature can be recovered by reading multiple tags, and can then be validated using the corresponding public signature key. This is discussed in more detail in the Surface Coding Security section above.
- MIME Data
- If the embedded data type is “MIME” then the superblock contains Multipurpose Internet Mail Extensions (MIME) data according to RFC 2045 (see Freed, N., and N. Borenstein, “Multipurpose Internet Mail Extensions (MIME)-Part One: Format of Internet Message Bodies”, RFC 2045, November 1996), RFC 2046 (see Freed, N., and N. Borenstein, “Multipurpose Internet Mail Extensions (MIME)—Part Two: Media Types”, RFC 2046, November 1996) and related RFCs. The MIME data consists of a header followed by a body. The header is encoded as a variable-length text string preceded by an 8-bit string length. The body is encoded as a variable-length type-specific octet stream preceded by a 16-bit size in big-endian format.
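The header and body framing described above can be sketched directly. The function name is an assumption, and a real encoder would also need to handle non-ASCII header text:

```python
def encode_mime_superblock(header: str, body: bytes) -> bytes:
    """Pack MIME data per the described layout: an 8-bit header length and the
    header text, then a 16-bit big-endian body size and the body octets."""
    header_bytes = header.encode("ascii")
    assert len(header_bytes) < 256 and len(body) < 65536
    return (bytes([len(header_bytes)]) + header_bytes
            + len(body).to_bytes(2, "big") + body)
```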
- The basic top-level media types described in RFC 2046 include text, image, audio, video and application.
- RFC 2425 (see Howes, T., M. Smith and F. Dawson, “A MIME Content-Type for Directory Information”, RFC 2045, September 1998) and RFC 2426 (see Dawson, F., and T. Howes, “vCard MIME Directory Profile”, RFC 2046, September 1998) describe a text subtype for directory information suitable, for example, for encoding contact information which might appear on a business card.
- Encoding and Printing Considerations
- The Print Engine Controller (PEC) supports the encoding of two fixed (per-page) 2⁴-ary (15, 5) Reed-Solomon codewords and six variable (per-tag) 2⁴-ary (15, 5) Reed-Solomon codewords. Furthermore, PEC supports the rendering of tags via a rectangular unit cell whose layout is constant (per page) but whose variable codeword data may vary from one unit cell to the next. PEC does not allow unit cells to overlap in the direction of page movement. A unit cell compatible with PEC contains a single tag group consisting of four tags. The tag group contains a single A codeword unique to the tag group but replicated four times within the tag group, and four unique B codewords. These can be encoded using five of PEC's six supported variable codewords. The tag group also contains eight fixed C and D codewords. One of these can be encoded using the remaining one of PEC's variable codewords, two more can be encoded using PEC's two fixed codewords, and the remaining five can be encoded and pre-rendered into the Tag Format Structure (TFS) supplied to PEC.
- PEC imposes a limit of 32 unique bit addresses per TFS row. The contents of the unit cell respect this limit. PEC also imposes a limit of 384 on the width of the TFS. The contents of the unit cell respect this limit.
- Note that for a reasonable page size, the number of variable coordinate bits in the A codeword is modest, making encoding via a lookup table tractable. Encoding of the B codeword via a lookup table may also be possible. Note that since a Reed-Solomon code is systematic, only the redundancy data needs to appear in the lookup table.
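Because the 2⁴-ary (15, 5) code is systematic, a lookup table only needs to store the 10 redundancy symbols per message. The sketch below assumes GF(2⁴) with primitive polynomial x⁴ + x + 1 and generator roots α⁰…α⁹; the patent fixes the (15, 5) parameters but not these details, so treat them as illustrative. An `lru_cache` plays the role of the lookup table.

```python
from functools import lru_cache

# GF(2^4) exp/log tables, primitive polynomial x^4 + x + 1 (assumed).
EXP = [0] * 30
LOG = [0] * 16
_x = 1
for _i in range(15):
    EXP[_i] = _x
    LOG[_x] = _i
    _x <<= 1
    if _x & 0x10:
        _x ^= 0x13
for _i in range(15, 30):
    EXP[_i] = EXP[_i - 15]

def gf_mul(a: int, b: int) -> int:
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

# Generator polynomial with roots alpha^0..alpha^9 (leading coefficient first).
GEN = [1]
for _i in range(10):
    _nxt = [0] * (len(GEN) + 1)
    for _j, _c in enumerate(GEN):
        _nxt[_j] ^= _c                       # multiply by x
        _nxt[_j + 1] ^= gf_mul(_c, EXP[_i])  # add root term (add == XOR)
    GEN = _nxt

@lru_cache(maxsize=None)
def rs_redundancy(msg: tuple) -> tuple:
    """10 redundancy symbols for 5 data symbols (systematic RS(15,5)).

    Since the code is systematic, only these symbols need tabulating;
    the cache memoises them, standing in for a precomputed table.
    """
    rem = list(msg) + [0] * 10
    for i in range(5):            # polynomial division by GEN
        coef = rem[i]
        if coef:
            for j in range(1, 11):
                rem[i + j] ^= gf_mul(GEN[j], coef)
    return tuple(rem[5:])

def rs_encode(msg: tuple) -> tuple:
    """Full 15-symbol codeword: data followed by redundancy."""
    return msg + rs_redundancy(msg)

def poly_eval(p, a: int) -> int:
    """Evaluate a leading-first GF(2^4) polynomial at a (Horner)."""
    acc = 0
    for c in p:
        acc = gf_mul(acc, a) ^ c
    return acc
```

A valid codeword evaluates to zero at every generator root, which gives a quick self-check of the construction.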
- Imaging and Decoding Considerations
- The minimum imaging field of view required to guarantee acquisition of an entire tag has a diameter of 39.6s (i.e. (2×(12+2))√2 s), allowing for arbitrary alignment between the surface coding and the field of view. Given a macrodot spacing of 143 μm, this gives a required field of view of 5.7 mm.
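The arithmetic behind those figures checks out directly:

```python
import math

# Field-of-view diameter in units of the macrodot spacing s:
fov_units = 2 * (12 + 2) * math.sqrt(2)   # = 28 * sqrt(2) ~= 39.6

# At a macrodot spacing of 143 micrometres (0.143 mm):
fov_mm = fov_units * 0.143                # ~= 5.66 mm, i.e. ~5.7 mm
```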
- Table 4 gives pitch ranges achievable for the present surface coding for different sampling rates, assuming an image sensor size of 128 pixels.
TABLE 4 Pitch ranges achievable for the present surface coding for different sampling rates; dot pitch = 1600 dpi, macrodot pitch = 9 dots, viewing distance = 30 mm, nib-to-FOV separation = 1 mm, image sensor size = 128 pixels
sampling rate | pitch range
---|---
2 | −40 to +49
2.5 | −27 to +36
3 | −10 to +18
- Given the present surface coding, the corresponding decoding sequence is as follows:
-
- locate targets of complete tag
- infer perspective transform from targets
- sample and decode any one of tag's four codewords
- determine codeword type and hence tag orientation
- sample and decode required local (A and B) codewords
- codeword redundancy is only 12 bits, so only detect errors
- on decode error flag bad position sample
- determine tag x-y location, with reference to tag orientation
- infer 3D tag transform from oriented targets
- determine nib x-y location from tag x-y location and 3D transform
- determine active area status of nib location with reference to active area map
- generate local feedback based on nib active area status
- determine tag type from A codeword
- sample and decode required global (C and D) codewords (modulo window alignment, with reference to tag type)
- although codeword redundancy is only 12 bits, correct errors;
- subsequent CRC verification will detect erroneous error correction
- verify tag group data CRC
- on decode error flag bad region ID sample
- determine encoding type, and reject unknown encoding
- determine region flags
- determine region ID
- encode region ID, nib x-y location, nib active area status in digital ink
- route digital ink based on region flags
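The decoding sequence above can be sketched as a control-flow skeleton. Every helper here is a hypothetical stand-in returning canned data so the flow runs end to end; it is not the patent's implementation. The two error paths mirror the list: local (A and B) codewords are detect-only, global (C and D) codewords are corrected and then CRC-verified.

```python
# Hypothetical stand-ins for the imaging and decoding steps.
def locate_targets(img): return ["t1", "t2", "t3", "t4"]
def infer_perspective(targets): return "P"
def decode_any_codeword(img, P): return {"type": "A"}
def decode_local_codewords(img, P, orient): return {"ok": True, "xy": (3, 7)}
def infer_3d_transform(targets, orient): return "T"
def nib_location(xy, T): return (3.1, 7.2)
def in_active_area(pos): return True
def decode_global_codewords(img, P, tag_type):
    return {"crc_ok": True, "flags": 0, "region_id": 42}

def decode_tag(img):
    """Control-flow sketch of the decoding sequence listed above."""
    targets = locate_targets(img)
    P = infer_perspective(targets)
    cw = decode_any_codeword(img, P)
    orient = cw["type"]                  # codeword type fixes tag orientation
    local = decode_local_codewords(img, P, orient)
    if not local["ok"]:
        return None                      # flag bad position sample
    T = infer_3d_transform(targets, orient)
    nib = nib_location(local["xy"], T)
    active = in_active_area(nib)
    g = decode_global_codewords(img, P, cw["type"])
    if not g["crc_ok"]:
        return None                      # flag bad region ID sample
    return {"region_id": g["region_id"], "nib": nib,
            "active": active, "flags": g["flags"]}
```

Digital ink would then be generated from `region_id`, `nib` and `active`, and routed on `flags`.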
- Note that region ID decoding need not occur at the same rate as position decoding.
- Note that decoding of a codeword can be avoided if the codeword is found to be identical to an already-known good codeword.
- The above description is purely illustrative and the skilled worker in this field will readily recognize many variations and modifications that do not depart from the spirit and scope of the broad inventive concept.
Claims (17)
1. A computer network for a plurality of users, the computer network comprising:
a server;
a printer;
a network user identifier for a network user to carry on their person; and,
a printer identifier associated with the printer; wherein during use,
the network user identifier and the printer identifier interact such that any of the network user's pending printouts are sent to the printer for printing when the network user is proximate the printer.
2. A computer network according to claim 1 wherein the network has a plurality of said printers, each printer associated with one of the printer identifiers respectively; and
a plurality of said network user identifiers, each uniquely identifying different network users.
3. A computer network according to claim 2 wherein each of the network user identifiers is a token and each of the printer identifiers has a token reader such that the user presents their token to the token reader associated with one of the printers to request actual printing of their queued printouts via that printer.
4. A computer network according to claim 3 wherein the tokens are a short-range RFID tag, a smartcard or a magnetic stripe card.
5. A computer network according to claim 4 wherein the token reader notifies the server of the user's proximity to the associated printer which in turn initiates printing.
6. A computer network according to claim 2 wherein each of the printer identifiers is a token and each of the network user identifiers has a token reader associated with the user.
7. A computer network according to claim 6 wherein the token reader is an electronic stylus with an optical sensor, and the tokens are a surface on each of the printers with coded data disposed on it, the coded data being readable by the optical sensor of each user's electronic stylus.
8. A computer network according to claim 1 wherein the pending printouts are maintained in a queue by the server and each pending printout has a priority such that higher-priority printouts are printed before earlier-queued but lower-priority printouts.
9. A computer network according to claim 3 wherein the token readers identify both the user and the printer to the server when the user presents their token to the reader.
10. A computer network according to claim 9 wherein the token identifies the user explicitly.
11. A computer network according to claim 9 wherein the token has a token identifier and the server performs a database lookup to translate the token identifier into a user identity.
12. A computer network according to claim 9 wherein the token reader identifies the printer explicitly.
13. A computer network according to claim 9 wherein the reader has a reader identifier and the server performs a database lookup to translate the reader identifier into a printer identity.
14. A computer network according to claim 3 wherein the token reader and the printer are separate devices which have an electrical connection.
15. A computer network according to claim 3 wherein the token reader is physically built into the printer.
16. A computer network according to claim 3 wherein the token reader informs the printer that the user has presented a token and the printer then explicitly retrieves the user's pending printouts for printing.
17. A computer network according to claim 3 wherein the token is a security access or identification badge or card.
Applications Claiming Priority (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2004904324 | 2004-08-03 | ||
AU2004904324A AU2004904324A0 (en) | 2004-08-03 | Methods and apparatus (NPS074) | |
AU2004904325A AU2004904325A0 (en) | 2004-08-03 | Methods and apparatus (NPS075) | |
AU2004904325 | 2004-08-03 | ||
AU2004904740A AU2004904740A0 (en) | 2004-08-20 | Methods, systems and apparatus (NPS080) | |
AU2004904740 | 2004-08-20 | ||
AU2004904803 | 2004-08-24 | ||
AU2004904803A AU2004904803A0 (en) | 2004-08-24 | Methods, systems and apparatus (NPS082) | |
AU2004905413 | 2004-09-21 | ||
AU2004905413A AU2004905413A0 (en) | 2004-09-21 | Methods, systems and apparatus (NPS083) | |
AU2005900034A AU2005900034A0 (en) | 2005-01-05 | Methods, systems and apparatus (NPS083) | |
AU2005900034 | 2005-01-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060028674A1 true US20060028674A1 (en) | 2006-02-09 |
Family
ID=35756905
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/193,482 Abandoned US20060028459A1 (en) | 2004-08-03 | 2005-08-01 | Pre-loaded force sensor |
US11/193,481 Abandoned US20060028400A1 (en) | 2004-08-03 | 2005-08-01 | Head mounted display with wave front modulator |
US11/193,479 Abandoned US20060028674A1 (en) | 2004-08-03 | 2005-08-01 | Printer with user ID sensor |
US11/193,435 Expired - Fee Related US7567241B2 (en) | 2004-08-03 | 2005-08-01 | Stylus with customizable appearance |
US12/497,684 Expired - Fee Related US8308387B2 (en) | 2004-08-03 | 2009-07-05 | Force-sensing electronic pen with user-replaceable cartridge |
US12/897,758 Abandoned US20110018903A1 (en) | 2004-08-03 | 2010-10-04 | Augmented reality device for presenting virtual imagery registered to a viewed surface |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/193,482 Abandoned US20060028459A1 (en) | 2004-08-03 | 2005-08-01 | Pre-loaded force sensor |
US11/193,481 Abandoned US20060028400A1 (en) | 2004-08-03 | 2005-08-01 | Head mounted display with wave front modulator |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/193,435 Expired - Fee Related US7567241B2 (en) | 2004-08-03 | 2005-08-01 | Stylus with customizable appearance |
US12/497,684 Expired - Fee Related US8308387B2 (en) | 2004-08-03 | 2009-07-05 | Force-sensing electronic pen with user-replaceable cartridge |
US12/897,758 Abandoned US20110018903A1 (en) | 2004-08-03 | 2010-10-04 | Augmented reality device for presenting virtual imagery registered to a viewed surface |
Country Status (9)
Country | Link |
---|---|
US (6) | US20060028459A1 (en) |
EP (3) | EP1779178A4 (en) |
JP (2) | JP2008508621A (en) |
KR (2) | KR101084853B1 (en) |
CN (1) | CN1993688B (en) |
AU (3) | AU2005269255A1 (en) |
CA (3) | CA2576010C (en) |
SG (1) | SG155167A1 (en) |
WO (3) | WO2006012679A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060193522A1 (en) * | 2005-02-28 | 2006-08-31 | Fuji Xerox Co., Ltd. | Printed material having location identification function, two-dimensional coordinate identification apparatus, image-forming apparatus and the method thereof |
US20070139711A1 (en) * | 2005-12-16 | 2007-06-21 | Brother Kogyo Kabushiki Kaisha | Image forming apparatus, image forming method, and recording sheet |
EP1835714A1 (en) | 2006-03-16 | 2007-09-19 | Océ-Technologies B.V. | Printing via kickstart function |
US20080130882A1 (en) * | 2006-12-05 | 2008-06-05 | International Business Machines Corporation | Secure printing via rfid tags |
US20090080015A1 (en) * | 2007-09-21 | 2009-03-26 | Silverbrook Research Pty Ltd | Printer driver for interactive printer |
US20090307029A1 (en) * | 2008-06-09 | 2009-12-10 | Krishnan Ramanathan | System and method for discounted printing |
US20110314539A1 (en) * | 2010-06-18 | 2011-12-22 | At&T Intellectual Property I, L.P. | Proximity Based Device Security |
US20130335758A1 (en) * | 2012-06-18 | 2013-12-19 | Canon Kabushiki Kaisha | Image-forming apparatus communicating with an information-processing apparatus |
US20140114782A1 (en) * | 2012-10-22 | 2014-04-24 | NCR Corporation, Law Dept. | Techniques for retail printing |
Families Citing this family (599)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7657128B2 (en) * | 2000-05-23 | 2010-02-02 | Silverbrook Research Pty Ltd | Optical force sensor |
US20120105740A1 (en) * | 2000-06-02 | 2012-05-03 | Oakley, Inc. | Eyewear with detachable adjustable electronics module |
US6505123B1 (en) | 2000-07-24 | 2003-01-07 | Weatherbank, Inc. | Interactive weather advisory system |
US7013009B2 (en) | 2001-06-21 | 2006-03-14 | Oakley, Inc. | Eyeglasses with wireless communication features |
JP4401728B2 (en) * | 2003-09-30 | 2010-01-20 | キヤノン株式会社 | Mixed reality space image generation method and mixed reality system |
CN100458910C (en) * | 2003-10-28 | 2009-02-04 | 松下电器产业株式会社 | Image display device and image display method |
SE527257C2 (en) * | 2004-06-21 | 2006-01-31 | Totalfoersvarets Forskningsins | Device and method for presenting an external image |
WO2006017771A1 (en) * | 2004-08-06 | 2006-02-16 | University Of Washington | Variable fixation viewing distance scanned light displays |
US8066384B2 (en) | 2004-08-18 | 2011-11-29 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
JP4268191B2 (en) * | 2004-12-14 | 2009-05-27 | パナソニック株式会社 | Information presenting apparatus, information presenting method, program, and recording medium |
US20060161469A1 (en) * | 2005-01-14 | 2006-07-20 | Weatherbank, Inc. | Interactive advisory system |
US8832121B2 (en) * | 2005-02-02 | 2014-09-09 | Accuweather, Inc. | Location-based data communications system and method |
US8120309B2 (en) * | 2005-08-05 | 2012-02-21 | Varta Microbattery Gmbh | Apparatus and method for charging a first battery from a second battery |
EP1915600A4 (en) * | 2005-08-19 | 2011-06-22 | Silverbrook Res Pty Ltd | An electronic stylus with a force re-directing coupling |
US8229467B2 (en) * | 2006-01-19 | 2012-07-24 | Locator IP, L.P. | Interactive advisory system |
CN101401059B (en) * | 2006-03-10 | 2012-08-15 | 吉田健治 | System for input to information processing device |
US7884811B2 (en) * | 2006-05-22 | 2011-02-08 | Adapx Inc. | Durable digital writing and sketching instrument |
WO2008002239A1 (en) * | 2006-06-28 | 2008-01-03 | Anoto Ab | Operation control and data processing in an electronic pen |
US8284204B2 (en) * | 2006-06-30 | 2012-10-09 | Nokia Corporation | Apparatus, method and a computer program product for providing a unified graphics pipeline for stereoscopic rendering |
DE102006031799B3 (en) * | 2006-07-06 | 2008-01-31 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method for autostereoscopic display of image information with adaptation to changes in the head position of the viewer |
KR100820639B1 (en) * | 2006-07-25 | 2008-04-10 | 한국과학기술연구원 | System and method for 3-dimensional interaction based on gaze and system and method for tracking 3-dimensional gaze |
US10168801B2 (en) * | 2006-08-31 | 2019-01-01 | Semiconductor Energy Laboratory Co., Ltd. | Electronic pen and electronic pen system |
EP1895745B1 (en) * | 2006-08-31 | 2015-04-22 | Swisscom AG | Method and communication system for continuous recording of data from the environment |
US20090267958A1 (en) * | 2006-09-19 | 2009-10-29 | Koninklijke Philips Electronics N.V. | Image viewing using multiple individual settings |
TW200823595A (en) * | 2006-11-28 | 2008-06-01 | Univ Nat Taiwan | Image capture device using programmable aperture |
US10298834B2 (en) | 2006-12-01 | 2019-05-21 | Google Llc | Video refocusing |
WO2008070724A2 (en) * | 2006-12-05 | 2008-06-12 | Adapx, Inc. | Carrier for a digital pen |
WO2008076774A2 (en) | 2006-12-14 | 2008-06-26 | Oakley, Inc. | Wearable high resolution audio visual interface |
US9217868B2 (en) * | 2007-01-12 | 2015-12-22 | Kopin Corporation | Monocular display device |
US20080174659A1 (en) * | 2007-01-18 | 2008-07-24 | Mcdowall Ian | Wide field of view display device and method |
DE102007005822A1 (en) * | 2007-01-31 | 2008-08-07 | Seereal Technologies S.A. | Holographic reconstruction system with optical wave tracking |
US20080192006A1 (en) | 2007-02-08 | 2008-08-14 | Silverbrook Research Pty Ltd | System for enabling user input and cursor control |
US8634814B2 (en) * | 2007-02-23 | 2014-01-21 | Locator IP, L.P. | Interactive advisory system for prioritizing content |
JP5507797B2 (en) * | 2007-03-12 | 2014-05-28 | キヤノン株式会社 | Head-mounted imaging display device and image generation device |
US9618748B2 (en) | 2008-04-02 | 2017-04-11 | Esight Corp. | Apparatus and method for a dynamic “region of interest” in a display system |
EP2143273A4 (en) | 2007-04-02 | 2012-08-08 | Esight Corp | An apparatus and method for augmenting sight |
US7898504B2 (en) | 2007-04-06 | 2011-03-01 | Sony Corporation | Personal theater display |
US7973763B2 (en) * | 2007-04-13 | 2011-07-05 | Htc Corporation | Electronic devices with sensible orientation structures, and associated methods |
US20220360739A1 (en) * | 2007-05-14 | 2022-11-10 | BlueRadios, Inc. | Head worn wireless computer having a display suitable for use as a mobile internet device |
US8855719B2 (en) * | 2009-05-08 | 2014-10-07 | Kopin Corporation | Wireless hands-free computing headset with detachable accessories controllable by motion, body gesture and/or vocal commands |
US9116340B2 (en) * | 2007-05-14 | 2015-08-25 | Kopin Corporation | Mobile wireless display for accessing data from a host and method for controlling |
US9235262B2 (en) * | 2009-05-08 | 2016-01-12 | Kopin Corporation | Remote control of host application using motion and voice commands |
US8218211B2 (en) * | 2007-05-16 | 2012-07-10 | Seereal Technologies S.A. | Holographic display with a variable beam deflection |
WO2008138983A2 (en) * | 2007-05-16 | 2008-11-20 | Seereal Technologies S.A. | Holographic display |
US20080294278A1 (en) * | 2007-05-23 | 2008-11-27 | Blake Charles Borgeson | Determining Viewing Distance Information for an Image |
WO2008145169A1 (en) * | 2007-05-31 | 2008-12-04 | Siemens Aktiengesellschaft | Mobile device and method for virtual retinal display |
WO2008152932A1 (en) * | 2007-06-13 | 2008-12-18 | Nec Corporation | Image display device, image display method and image display program |
US20080313037A1 (en) * | 2007-06-15 | 2008-12-18 | Root Steven A | Interactive advisory system |
JP4821716B2 (en) * | 2007-06-27 | 2011-11-24 | 富士ゼロックス株式会社 | Electronic writing instruments, caps, computer systems |
US7724322B2 (en) * | 2007-09-20 | 2010-05-25 | Sharp Laboratories Of America, Inc. | Virtual solar liquid crystal window |
CN101816186B (en) * | 2007-10-05 | 2013-05-22 | 吉田健治 | Remote control device capable of reading dot patterns formed on medium and display |
US8491121B2 (en) * | 2007-10-09 | 2013-07-23 | Elbit Systems Of America, Llc | Pupil scan apparatus |
US9703369B1 (en) * | 2007-10-11 | 2017-07-11 | Jeffrey David Mullen | Augmented reality video game systems |
WO2009120984A1 (en) | 2008-03-28 | 2009-10-01 | Kopin Corporation | Handheld wireless display device having high-resolution display suitable for use as a mobile internet device |
WO2009094399A1 (en) | 2008-01-22 | 2009-07-30 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Head-mounted projection display using reflective microdisplays |
KR20100137433A (en) * | 2008-01-28 | 2010-12-30 | 아노토 아베 | Digital pens and a method for digital recording of information |
JP5130930B2 (en) * | 2008-01-31 | 2013-01-30 | 富士ゼロックス株式会社 | Electronic writing instrument |
US9064196B1 (en) * | 2008-03-13 | 2015-06-23 | Impinj, Inc. | RFID tag dynamically adjusting clock frequency |
US8193912B1 (en) * | 2008-03-13 | 2012-06-05 | Impinj, Inc. | RFID tag dynamically adjusting clock frequency |
US8189035B2 (en) * | 2008-03-28 | 2012-05-29 | Sharp Laboratories Of America, Inc. | Method and apparatus for rendering virtual see-through scenes on single or tiled displays |
US7546694B1 (en) * | 2008-04-03 | 2009-06-16 | Il Poom Jeong | Combination drawing/measuring pen |
US20100149073A1 (en) * | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
US20090309854A1 (en) * | 2008-06-13 | 2009-12-17 | Polyvision Corporation | Input devices with multiple operating modes |
KR101493748B1 (en) * | 2008-06-16 | 2015-03-02 | 삼성전자주식회사 | Apparatus for providing product, display apparatus and method for providing GUI using the same |
US8366338B2 (en) * | 2008-06-23 | 2013-02-05 | Silverbrook Research Pty Ltd | Electronic pen having fast response time |
FR2935585B1 (en) * | 2008-09-01 | 2015-04-24 | Sagem Comm | FACADE OF ELECTRONIC APPARATUS ADAPTED AGAINST RADIATION OF INFRARED RADIATION TYPE. |
US8427424B2 (en) | 2008-09-30 | 2013-04-23 | Microsoft Corporation | Using physical objects in conjunction with an interactive surface |
US7965495B2 (en) | 2008-10-13 | 2011-06-21 | Apple Inc. | Battery connector structures for electronic devices |
US8284506B2 (en) | 2008-10-21 | 2012-10-09 | Gentex Corporation | Apparatus and method for making and assembling a multi-lens optical device |
US9600067B2 (en) * | 2008-10-27 | 2017-03-21 | Sri International | System and method for generating a mixed reality environment |
US20100110368A1 (en) * | 2008-11-02 | 2010-05-06 | David Chaum | System and apparatus for eyeglass appliance platform |
KR101564387B1 (en) * | 2009-01-26 | 2015-11-06 | 토비 에이비 | Detection of gaze point assisted by optical reference signals |
US20100208033A1 (en) * | 2009-02-13 | 2010-08-19 | Microsoft Corporation | Personal Media Landscapes in Mixed Reality |
US9740341B1 (en) | 2009-02-26 | 2017-08-22 | Amazon Technologies, Inc. | Capacitive sensing with interpolating force-sensitive resistor array |
US10180746B1 (en) | 2009-02-26 | 2019-01-15 | Amazon Technologies, Inc. | Hardware enabled interpolating sensor and display |
US20100228476A1 (en) * | 2009-03-04 | 2010-09-09 | Microsoft Corporation | Path projection to facilitate engagement |
US8494215B2 (en) * | 2009-03-05 | 2013-07-23 | Microsoft Corporation | Augmenting a field of view in connection with vision-tracking |
US8513547B2 (en) * | 2009-03-23 | 2013-08-20 | Fuji Xerox Co., Ltd. | Image reading apparatus and image reading method |
FR2943901B1 (en) * | 2009-04-01 | 2017-11-17 | E(Ye)Brain | METHOD AND SYSTEM FOR DETECTING OCULOMOTRIC ANOMALIES. |
US8943420B2 (en) * | 2009-06-18 | 2015-01-27 | Microsoft Corporation | Augmenting a field of view |
US9785272B1 (en) | 2009-07-31 | 2017-10-10 | Amazon Technologies, Inc. | Touch distinction |
US9740340B1 (en) | 2009-07-31 | 2017-08-22 | Amazon Technologies, Inc. | Visually consistent arrays including conductive mesh |
DE102009037835B4 (en) * | 2009-08-18 | 2012-12-06 | Metaio Gmbh | Method for displaying virtual information in a real environment |
US8515933B2 (en) * | 2009-08-18 | 2013-08-20 | Industrial Technology Research Institute | Video search method, video search system, and method thereof for establishing video database |
US8805862B2 (en) * | 2009-08-18 | 2014-08-12 | Industrial Technology Research Institute | Video search method using motion vectors and apparatus thereof |
US20110075257A1 (en) | 2009-09-14 | 2011-03-31 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | 3-Dimensional electro-optical see-through displays |
US8810524B1 (en) | 2009-11-20 | 2014-08-19 | Amazon Technologies, Inc. | Two-sided touch sensor |
US20110156998A1 (en) * | 2009-12-28 | 2011-06-30 | Acer Incorporated | Method for switching to display three-dimensional images and digital display system |
US20110205190A1 (en) * | 2010-02-23 | 2011-08-25 | Spaulding Diana A | Keypad ring |
US8730309B2 (en) | 2010-02-23 | 2014-05-20 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction |
US8964298B2 (en) | 2010-02-28 | 2015-02-24 | Microsoft Corporation | Video display modification based on sensor input for a see-through near-to-eye display |
US9134534B2 (en) | 2010-02-28 | 2015-09-15 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including a modular image source |
US9128281B2 (en) | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
US9366862B2 (en) | 2010-02-28 | 2016-06-14 | Microsoft Technology Licensing, Llc | System and method for delivering content to a group of see-through near eye display eyepieces |
US20120194420A1 (en) * | 2010-02-28 | 2012-08-02 | Osterhout Group, Inc. | Ar glasses with event triggered user action control of ar eyepiece facility |
US20150309316A1 (en) | 2011-04-06 | 2015-10-29 | Microsoft Technology Licensing, Llc | Ar glasses with predictive control of external device based on event input |
US8488246B2 (en) | 2010-02-28 | 2013-07-16 | Osterhout Group, Inc. | See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film |
US9097891B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment |
US20120242698A1 (en) * | 2010-02-28 | 2012-09-27 | Osterhout Group, Inc. | See-through near-eye display glasses with a multi-segment processor-controlled optical layer |
US20120249797A1 (en) | 2010-02-28 | 2012-10-04 | Osterhout Group, Inc. | Head-worn adaptive display |
US10180572B2 (en) | 2010-02-28 | 2019-01-15 | Microsoft Technology Licensing, Llc | AR glasses with event and user action control of external applications |
US9285589B2 (en) | 2010-02-28 | 2016-03-15 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered control of AR eyepiece applications |
US9097890B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | Grating in a light transmissive illumination system for see-through near-eye display glasses |
US9182596B2 (en) | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US20110213664A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
JP2013521576A (en) * | 2010-02-28 | 2013-06-10 | オスターハウト グループ インコーポレイテッド | Local advertising content on interactive head-mounted eyepieces |
US9091851B2 (en) | 2010-02-28 | 2015-07-28 | Microsoft Technology Licensing, Llc | Light control in head mounted displays |
US9341843B2 (en) | 2010-02-28 | 2016-05-17 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a small scale image source |
US8482859B2 (en) | 2010-02-28 | 2013-07-09 | Osterhout Group, Inc. | See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film |
US9229227B2 (en) | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
US9223134B2 (en) | 2010-02-28 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses |
US8477425B2 (en) | 2010-02-28 | 2013-07-02 | Osterhout Group, Inc. | See-through near-eye display glasses including a partially reflective, partially transmitting optical element |
US9759917B2 (en) | 2010-02-28 | 2017-09-12 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered AR eyepiece interface to external devices |
US8472120B2 (en) | 2010-02-28 | 2013-06-25 | Osterhout Group, Inc. | See-through near-eye display glasses with a small scale image source |
US8467133B2 (en) | 2010-02-28 | 2013-06-18 | Osterhout Group, Inc. | See-through display with an optical assembly including a wedge-shaped illumination system |
US20120200601A1 (en) * | 2010-02-28 | 2012-08-09 | Osterhout Group, Inc. | Ar glasses with state triggered eye control interaction with advertising facility |
US9129295B2 (en) | 2010-02-28 | 2015-09-08 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear |
US9479759B2 (en) * | 2010-03-29 | 2016-10-25 | Forstgarten International Holding Gmbh | Optical stereo device and autofocus method therefor |
JP5743416B2 (en) * | 2010-03-29 | 2015-07-01 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
TWI411943B (en) * | 2010-04-12 | 2013-10-11 | Hon Hai Prec Ind Co Ltd | Stylus |
KR101334107B1 (en) * | 2010-04-22 | 2013-12-16 | 주식회사 굿소프트웨어랩 | Apparatus and Method of User Interface for Manipulating Multimedia Contents in Vehicle |
EP2564259B1 (en) | 2010-04-30 | 2015-01-21 | Beijing Institute Of Technology | Wide angle and high resolution tiled head-mounted display device |
WO2011160114A1 (en) * | 2010-06-18 | 2011-12-22 | Minx, Inc. | Augmented reality |
JP5499985B2 (en) * | 2010-08-09 | 2014-05-21 | ソニー株式会社 | Display assembly |
TWI408948B (en) * | 2010-08-16 | 2013-09-11 | Wistron Corp | Method for playing corresponding 3d images according to different visual angles and related image processing system |
FR2964755B1 (en) * | 2010-09-13 | 2012-08-31 | Ait Yahiathene Daniel | DEVICE FOR IMPROVED VISION BY AN EYE WITH DMLA |
US8582206B2 (en) | 2010-09-15 | 2013-11-12 | Microsoft Corporation | Laser-scanning virtual image display |
US9122307B2 (en) * | 2010-09-20 | 2015-09-01 | Kopin Corporation | Advanced remote control of host application using motion and voice commands |
US10013976B2 (en) | 2010-09-20 | 2018-07-03 | Kopin Corporation | Context sensitive overlays in voice controlled headset computer displays |
US8706170B2 (en) * | 2010-09-20 | 2014-04-22 | Kopin Corporation | Miniature communications gateway for head mounted display |
US8862186B2 (en) * | 2010-09-21 | 2014-10-14 | Kopin Corporation | Lapel microphone micro-display system incorporating mobile information access system |
US9632315B2 (en) | 2010-10-21 | 2017-04-25 | Lockheed Martin Corporation | Head-mounted display apparatus employing one or more fresnel lenses |
US10359545B2 (en) | 2010-10-21 | 2019-07-23 | Lockheed Martin Corporation | Fresnel lens with reduced draft facet visibility |
US8625200B2 (en) | 2010-10-21 | 2014-01-07 | Lockheed Martin Corporation | Head-mounted display apparatus employing one or more reflective optical surfaces |
US8781794B2 (en) | 2010-10-21 | 2014-07-15 | Lockheed Martin Corporation | Methods and systems for creating free space reflective optical surfaces |
US9292973B2 (en) | 2010-11-08 | 2016-03-22 | Microsoft Technology Licensing, Llc | Automatic variable virtual focus for augmented reality displays |
WO2012062872A1 (en) * | 2010-11-11 | 2012-05-18 | Bae Systems Plc | Image presentation method, and apparatus therefor |
EP2453290A1 (en) * | 2010-11-11 | 2012-05-16 | BAE Systems PLC | Image presentation method and apparatus therefor |
US9304319B2 (en) | 2010-11-18 | 2016-04-05 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
US8975860B2 (en) * | 2010-11-29 | 2015-03-10 | E Ink Holdings Inc. | Electromagnetic touch input pen having a USB interface |
AU2011343660A1 (en) | 2010-12-16 | 2013-07-04 | Lockheed Martin Corporation | Collimating display with pixel lenses |
JP5678643B2 (en) * | 2010-12-21 | 2015-03-04 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US9111326B1 (en) | 2010-12-21 | 2015-08-18 | Rawles Llc | Designation of zones of interest within an augmented reality environment |
US8845110B1 (en) | 2010-12-23 | 2014-09-30 | Rawles Llc | Powered augmented reality projection accessory display device |
US9134593B1 (en) | 2010-12-23 | 2015-09-15 | Amazon Technologies, Inc. | Generation and modulation of non-visible structured light for augmented reality projection system |
US8905551B1 (en) | 2010-12-23 | 2014-12-09 | Rawles Llc | Unpowered augmented reality projection accessory display device |
US8845107B1 (en) | 2010-12-23 | 2014-09-30 | Rawles Llc | Characterization of a scene with structured light |
US9721386B1 (en) * | 2010-12-27 | 2017-08-01 | Amazon Technologies, Inc. | Integrated augmented reality environment |
US9508194B1 (en) | 2010-12-30 | 2016-11-29 | Amazon Technologies, Inc. | Utilizing content output devices in an augmented reality environment |
US9607315B1 (en) | 2010-12-30 | 2017-03-28 | Amazon Technologies, Inc. | Complementing operation of display devices in an augmented reality environment |
US9179139B2 (en) * | 2011-01-10 | 2015-11-03 | Kodak Alaris Inc. | Alignment of stereo images pairs for viewing |
WO2012103323A1 (en) * | 2011-01-28 | 2012-08-02 | More/Real Llc | Stylus |
JP5810540B2 (en) * | 2011-02-04 | 2015-11-11 | セイコーエプソン株式会社 | Head-mounted display device and method for controlling head-mounted display device |
US9329469B2 (en) * | 2011-02-17 | 2016-05-03 | Microsoft Technology Licensing, Llc | Providing an interactive experience using a 3D depth camera and a 3D projector |
JP2012174208A (en) * | 2011-02-24 | 2012-09-10 | Sony Corp | Information processing apparatus, information processing method, program, and terminal device |
GB201103200D0 (en) * | 2011-02-24 | 2011-04-13 | Isis Innovation | An optical device for the visually impaired |
US9480907B2 (en) | 2011-03-02 | 2016-11-01 | Microsoft Technology Licensing, Llc | Immersive display with peripheral illusions |
TWI436285B (en) * | 2011-03-16 | 2014-05-01 | Generalplus Technology Inc | Optical identification module device and optical reader having the same |
US10455089B2 (en) | 2011-03-22 | 2019-10-22 | Fmr Llc | Augmented reality system for product selection |
US9275254B2 (en) * | 2011-03-22 | 2016-03-01 | Fmr Llc | Augmented reality system for public and private seminars |
US8644673B2 (en) | 2011-03-22 | 2014-02-04 | Fmr Llc | Augmented reality system for re-casting a seminar with private calculations |
US10114451B2 (en) | 2011-03-22 | 2018-10-30 | Fmr Llc | Augmented reality in a virtual tour through a financial portfolio |
JP5784213B2 (en) | 2011-03-29 | 2015-09-24 | クアルコム,インコーポレイテッド | Selective hand occlusion on a virtual projection onto a physical surface using skeletal tracking |
US8810598B2 (en) | 2011-04-08 | 2014-08-19 | Nant Holdings Ip, Llc | Interference based augmented reality hosting platforms |
US8988512B2 (en) * | 2011-04-14 | 2015-03-24 | Mediatek Inc. | Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof |
EP2712432A4 (en) | 2011-05-10 | 2014-10-29 | Kopin Corp | Headset computer that uses motion and voice commands to control information display and remote devices |
US9330499B2 (en) | 2011-05-20 | 2016-05-03 | Microsoft Technology Licensing, Llc | Event augmentation with real-time information |
US9597587B2 (en) | 2011-06-08 | 2017-03-21 | Microsoft Technology Licensing, Llc | Locational node device |
WO2012170023A1 (en) * | 2011-06-08 | 2012-12-13 | Empire Technology Development Llc | Two-dimensional image capture for an augmented reality representation |
JP2013012980A (en) * | 2011-06-30 | 2013-01-17 | Sony Corp | Display control circuit and projector apparatus |
US8209183B1 (en) | 2011-07-07 | 2012-06-26 | Google Inc. | Systems and methods for correction of text from different input types, sources, and contexts |
US8885882B1 (en) | 2011-07-14 | 2014-11-11 | The Research Foundation For The State University Of New York | Real time eye tracking for human computer interaction |
US8988474B2 (en) | 2011-07-18 | 2015-03-24 | Microsoft Technology Licensing, Llc | Wide field-of-view virtual image projector |
WO2013013230A2 (en) * | 2011-07-21 | 2013-01-24 | Jonathan Arnold Bell | Wearable display devices |
US20130030896A1 (en) * | 2011-07-26 | 2013-01-31 | Shlomo Mai-Tal | Method and system for generating and distributing digital content |
US8823740B1 (en) | 2011-08-15 | 2014-09-02 | Google Inc. | Display system |
CA2750287C (en) | 2011-08-29 | 2012-07-03 | Microsoft Corporation | Gaze detection in a see-through, near-eye, mixed reality display |
EP2751609B1 (en) | 2011-08-30 | 2017-08-16 | Microsoft Technology Licensing, LLC | Head mounted display with iris scan profiling |
US9323325B2 (en) | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
US8670000B2 (en) | 2011-09-12 | 2014-03-11 | Google Inc. | Optical display system and method with virtual image contrast control |
US9118782B1 (en) | 2011-09-19 | 2015-08-25 | Amazon Technologies, Inc. | Optical interference mitigation |
US8941560B2 (en) | 2011-09-21 | 2015-01-27 | Google Inc. | Wearable computer with superimposed controls and instructions for external device |
US8998414B2 (en) | 2011-09-26 | 2015-04-07 | Microsoft Technology Licensing, Llc | Integrated eye tracking and display system |
US8966656B2 (en) * | 2011-10-21 | 2015-02-24 | Blackberry Limited | Displaying private information using alternate frame sequencing |
US9165401B1 (en) | 2011-10-24 | 2015-10-20 | Disney Enterprises, Inc. | Multi-perspective stereoscopy from light fields |
US9113043B1 (en) * | 2011-10-24 | 2015-08-18 | Disney Enterprises, Inc. | Multi-perspective stereoscopy from light fields |
US10598929B2 (en) | 2011-11-09 | 2020-03-24 | Google Llc | Measurement method and system |
US10354291B1 (en) | 2011-11-09 | 2019-07-16 | Google Llc | Distributing media to displays |
US8879155B1 (en) | 2011-11-09 | 2014-11-04 | Google Inc. | Measurement method and system |
US9222809B1 (en) * | 2011-11-13 | 2015-12-29 | SeeScan, Inc. | Portable pipe inspection systems and apparatus |
US8183997B1 (en) | 2011-11-14 | 2012-05-22 | Google Inc. | Displaying sound indications on a wearable computing system |
EP2783340A4 (en) | 2011-11-21 | 2015-03-25 | Nant Holdings Ip Llc | Subscription bill service, systems and methods |
EP2786196A4 (en) * | 2011-12-02 | 2015-11-11 | Jerry G Aguren | Wide field-of-view 3d stereo vision platform with dynamic control of immersive or heads-up display operation |
US9497501B2 (en) * | 2011-12-06 | 2016-11-15 | Microsoft Technology Licensing, Llc | Augmented reality virtual monitor |
TW201331787A (en) * | 2011-12-07 | 2013-08-01 | Microsoft Corp | Displaying virtual data as printed content |
US9183807B2 (en) * | 2011-12-07 | 2015-11-10 | Microsoft Technology Licensing, Llc | Displaying virtual data as printed content |
US9182815B2 (en) * | 2011-12-07 | 2015-11-10 | Microsoft Technology Licensing, Llc | Making static printed content dynamic with virtual data |
US9229231B2 (en) * | 2011-12-07 | 2016-01-05 | Microsoft Technology Licensing, Llc | Updating printed content with personalized virtual data |
US8681179B2 (en) | 2011-12-20 | 2014-03-25 | Xerox Corporation | Method and system for coordinating collisions between augmented reality and real reality |
US8970960B2 (en) | 2011-12-22 | 2015-03-03 | Mattel, Inc. | Augmented reality head gear |
US8996729B2 (en) | 2012-04-12 | 2015-03-31 | Nokia Corporation | Method and apparatus for synchronizing tasks performed by multiple devices |
CN104137064B (en) | 2011-12-28 | 2018-04-20 | 诺基亚技术有限公司 | Using switch |
WO2013101438A1 (en) | 2011-12-29 | 2013-07-04 | Kopin Corporation | Wireless hands-free computing head mounted video eyewear for local/remote diagnosis and repair |
US8941561B1 (en) * | 2012-01-06 | 2015-01-27 | Google Inc. | Image capture |
US9197864B1 (en) | 2012-01-06 | 2015-11-24 | Google Inc. | Zoom and image capture based on features of interest |
US8955973B2 (en) | 2012-01-06 | 2015-02-17 | Google Inc. | Method and system for input detection using structured light projection |
US9213185B1 (en) * | 2012-01-06 | 2015-12-15 | Google Inc. | Display scaling based on movement of a head-mounted display |
EP2805200B1 (en) | 2012-01-24 | 2017-09-13 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Compact eye-tracked head-mounted display |
US9734633B2 (en) * | 2012-01-27 | 2017-08-15 | Microsoft Technology Licensing, Llc | Virtual environment generating system |
US9076368B2 (en) | 2012-02-06 | 2015-07-07 | Battelle Memorial Institute | Image generation systems and image generation methods |
US8982014B2 (en) | 2012-02-06 | 2015-03-17 | Battelle Memorial Institute | Image generation systems and image generation methods |
US9052414B2 (en) | 2012-02-07 | 2015-06-09 | Microsoft Technology Licensing, Llc | Virtual image device |
US9354748B2 (en) | 2012-02-13 | 2016-05-31 | Microsoft Technology Licensing, Llc | Optical stylus interaction |
US20150109191A1 (en) * | 2012-02-16 | 2015-04-23 | Google Inc. | Speech Recognition |
US9864211B2 (en) | 2012-02-17 | 2018-01-09 | Oakley, Inc. | Systems and methods for removably coupling an electronic device to eyewear |
US9001005B2 (en) | 2012-02-29 | 2015-04-07 | Recon Instruments Inc. | Modular heads-up display systems |
US9069166B2 (en) | 2012-02-29 | 2015-06-30 | Recon Instruments Inc. | Gaze detecting heads-up display systems |
US8749529B2 (en) | 2012-03-01 | 2014-06-10 | Microsoft Corporation | Sensor-in-pixel display system with near infrared filter |
US9870066B2 (en) | 2012-03-02 | 2018-01-16 | Microsoft Technology Licensing, Llc | Method of manufacturing an input device |
US9298236B2 (en) | 2012-03-02 | 2016-03-29 | Microsoft Technology Licensing, Llc | Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter |
US9460029B2 (en) | 2012-03-02 | 2016-10-04 | Microsoft Technology Licensing, Llc | Pressure sensitive keys |
US8873227B2 (en) | 2012-03-02 | 2014-10-28 | Microsoft Corporation | Flexible hinge support layer |
US9075566B2 (en) | 2012-03-02 | 2015-07-07 | Microsoft Technology Licensing, LLC | Flexible hinge spine |
CN103300966B (en) * | 2012-03-12 | 2015-09-23 | 丹尼尔·阿塔 | Apparatus for improving eyesight of senile macular degeneration patient |
US8970571B1 (en) * | 2012-03-13 | 2015-03-03 | Google Inc. | Apparatus and method for display lighting adjustment |
US20130249870A1 (en) * | 2012-03-22 | 2013-09-26 | Motorola Mobility, Inc. | Dual mode active stylus for writing both on a capacitive touchscreen and paper |
US9426430B2 (en) * | 2012-03-22 | 2016-08-23 | Bounce Imaging, Inc. | Remote surveillance sensor apparatus |
WO2013138846A1 (en) * | 2012-03-22 | 2013-09-26 | Silverbrook Research Pty Ltd | Method and system of interacting with content disposed on substrates |
US10469916B1 (en) | 2012-03-23 | 2019-11-05 | Google Llc | Providing media content to a wearable device |
EP2841991B1 (en) | 2012-04-05 | 2020-01-08 | Magic Leap, Inc. | Wide-field of view (fov) imaging devices with active foveation capability |
JP6289448B2 (en) | 2012-04-25 | 2018-03-07 | コピン コーポレーション | Instant translation system |
US8929954B2 (en) | 2012-04-25 | 2015-01-06 | Kopin Corporation | Headset computer (HSC) as auxiliary display with ASR and HT input |
US9122321B2 (en) | 2012-05-04 | 2015-09-01 | Microsoft Technology Licensing, Llc | Collaboration environment using see through displays |
US9519640B2 (en) | 2012-05-04 | 2016-12-13 | Microsoft Technology Licensing, Llc | Intelligent translations in personal see through display |
US9423870B2 (en) | 2012-05-08 | 2016-08-23 | Google Inc. | Input determination method |
US9442290B2 (en) | 2012-05-10 | 2016-09-13 | Kopin Corporation | Headset computer operation using vehicle sensor feedback for remote control vehicle |
US20130300590A1 (en) | 2012-05-14 | 2013-11-14 | Paul Henry Dietz | Audio Feedback |
US10365711B2 (en) | 2012-05-17 | 2019-07-30 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for unified scene acquisition and pose tracking in a wearable display |
US9030505B2 (en) * | 2012-05-17 | 2015-05-12 | Nokia Technologies Oy | Method and apparatus for attracting a user's gaze to information in a non-intrusive manner |
JP6023801B2 (en) * | 2012-05-25 | 2016-11-09 | Hoya株式会社 | Simulation device |
KR101387189B1 (en) * | 2012-05-30 | 2014-04-29 | 삼성전기주식회사 | A display device of assistance information for driving and a display method of assistance information for driving |
US9165381B2 (en) | 2012-05-31 | 2015-10-20 | Microsoft Technology Licensing, Llc | Augmented books in a mixed reality environment |
US9583032B2 (en) * | 2012-06-05 | 2017-02-28 | Microsoft Technology Licensing, Llc | Navigating content using a physical object |
US9403399B2 (en) | 2012-06-06 | 2016-08-02 | Milwaukee Electric Tool Corporation | Marking pen |
US10031556B2 (en) | 2012-06-08 | 2018-07-24 | Microsoft Technology Licensing, Llc | User experience adaptation |
US20130328925A1 (en) * | 2012-06-12 | 2013-12-12 | Stephen G. Latta | Object focus in a mixed reality environment |
US9019615B2 (en) | 2012-06-12 | 2015-04-28 | Microsoft Technology Licensing, Llc | Wide field-of-view virtual image projector |
US9430055B2 (en) * | 2012-06-15 | 2016-08-30 | Microsoft Technology Licensing, Llc | Depth of field control for see-thru display |
US20130342572A1 (en) * | 2012-06-26 | 2013-12-26 | Adam G. Poulos | Control of displayed content in virtual environments |
US10129524B2 (en) | 2012-06-26 | 2018-11-13 | Google Llc | Depth-assigned content for depth-enhanced virtual reality images |
US9607424B2 (en) * | 2012-06-26 | 2017-03-28 | Lytro, Inc. | Depth-assigned content for depth-enhanced pictures |
US9858649B2 (en) | 2015-09-30 | 2018-01-02 | Lytro, Inc. | Depth-based image blurring |
US10176635B2 (en) | 2012-06-28 | 2019-01-08 | Microsoft Technology Licensing, Llc | Saving augmented realities |
US9339726B2 (en) | 2012-06-29 | 2016-05-17 | Nokia Technologies Oy | Method and apparatus for modifying the presentation of information based on the visual complexity of environment information |
US20140002580A1 (en) * | 2012-06-29 | 2014-01-02 | Monkeymedia, Inc. | Portable proprioceptive peripatetic polylinear video player |
US9077973B2 (en) | 2012-06-29 | 2015-07-07 | Dri Systems Llc | Wide field-of-view stereo vision platform with dynamic control of immersive or heads-up display operation |
US11266919B2 (en) | 2012-06-29 | 2022-03-08 | Monkeymedia, Inc. | Head-mounted display for navigating virtual and augmented reality |
US20140009395A1 (en) * | 2012-07-05 | 2014-01-09 | Asustek Computer Inc. | Method and system for controlling eye tracking |
US9854328B2 (en) | 2012-07-06 | 2017-12-26 | Arris Enterprises, Inc. | Augmentation of multimedia consumption |
US9355345B2 (en) | 2012-07-23 | 2016-05-31 | Microsoft Technology Licensing, Llc | Transparent tags with encoded data |
US9779757B1 (en) * | 2012-07-30 | 2017-10-03 | Amazon Technologies, Inc. | Visual indication of an operational state |
US8754829B2 (en) * | 2012-08-04 | 2014-06-17 | Paul Lapstun | Scanning light field camera and display |
US9250445B2 (en) * | 2012-08-08 | 2016-02-02 | Carol Ann Tosaya | Multiple-pixel-beam retinal displays |
US8964379B2 (en) | 2012-08-20 | 2015-02-24 | Microsoft Corporation | Switchable magnetic lock |
US20140071163A1 (en) * | 2012-09-11 | 2014-03-13 | Peter Tobias Kinnebrew | Augmented reality information detail |
US9317746B2 (en) * | 2012-09-25 | 2016-04-19 | Intel Corporation | Techniques for occlusion accommodation |
US9720231B2 (en) | 2012-09-26 | 2017-08-01 | Dolby Laboratories Licensing Corporation | Display, imaging system and controller for eyewear display device |
US20140092006A1 (en) * | 2012-09-28 | 2014-04-03 | Joshua Boelter | Device and method for modifying rendering based on viewer focus area from eye tracking |
US11126040B2 (en) | 2012-09-30 | 2021-09-21 | Optica Amuka (A.A.) Ltd. | Electrically-tunable lenses and lens systems |
EP3483648B1 (en) | 2012-09-30 | 2024-05-15 | Optica Amuka (A.A.) Ltd. | Lenses with electrically-tunable power and alignment |
US20190272029A1 (en) * | 2012-10-05 | 2019-09-05 | Elwha Llc | Correlating user reaction with at least an aspect associated with an augmentation of an augmented view |
US9152173B2 (en) | 2012-10-09 | 2015-10-06 | Microsoft Technology Licensing, Llc | Transparent display device |
US9874760B2 (en) | 2012-10-18 | 2018-01-23 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Stereoscopic displays with addressable focus cues |
CN104685423B (en) * | 2012-10-23 | 2017-07-28 | 李阳 | Dynamic solid and holographic display device |
US9479697B2 (en) | 2012-10-23 | 2016-10-25 | Bounce Imaging, Inc. | Systems, methods and media for generating a panoramic view |
US9019174B2 (en) | 2012-10-31 | 2015-04-28 | Microsoft Technology Licensing, Llc | Wearable emotion detection and feedback system |
US9014469B2 (en) * | 2012-11-01 | 2015-04-21 | Yael Zimet-Rubner | Color-mapping wand |
US10442774B1 (en) * | 2012-11-06 | 2019-10-15 | Valve Corporation | Adaptive optical path with variable focal length |
KR101991133B1 (en) * | 2012-11-20 | 2019-06-19 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Head mounted display and the method for controlling the same |
KR101987461B1 (en) * | 2012-11-21 | 2019-06-11 | 엘지전자 주식회사 | Mobile terminal and method for controlling of the same |
US10642376B2 (en) * | 2012-11-28 | 2020-05-05 | Intel Corporation | Multi-function stylus with sensor controller |
WO2014085768A1 (en) * | 2012-11-29 | 2014-06-05 | Haddish Imran | Virtual and augmented reality instruction system |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9977492B2 (en) * | 2012-12-06 | 2018-05-22 | Microsoft Technology Licensing, Llc | Mixed reality presentation |
FR2999302B1 (en) * | 2012-12-10 | 2017-12-22 | Yahiatene Daniel Ait | DEVICE FOR ENHANCING THE VISION OF A HUMAN BEING |
US9513748B2 (en) | 2012-12-13 | 2016-12-06 | Microsoft Technology Licensing, Llc | Combined display panel circuit |
US20150262424A1 (en) * | 2013-01-31 | 2015-09-17 | Google Inc. | Depth and Focus Discrimination for a Head-mountable device using a Light-Field Display System |
US10529134B2 (en) * | 2013-02-01 | 2020-01-07 | Sony Corporation | Information processing device, client device, information processing method, and program |
US9301085B2 (en) | 2013-02-20 | 2016-03-29 | Kopin Corporation | Computer headset with detachable 4G radio |
US9368985B2 (en) * | 2013-02-25 | 2016-06-14 | Htc Corporation | Electrical system, input apparatus and charging method for input apparatus |
US9638835B2 (en) | 2013-03-05 | 2017-05-02 | Microsoft Technology Licensing, Llc | Asymmetric aberration correcting lens |
US10163049B2 (en) | 2013-03-08 | 2018-12-25 | Microsoft Technology Licensing, Llc | Inconspicuous tag for generating augmented reality experiences |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US9898866B2 (en) | 2013-03-13 | 2018-02-20 | The University Of North Carolina At Chapel Hill | Low latency stabilization for head-worn displays |
US9406253B2 (en) * | 2013-03-14 | 2016-08-02 | Broadcom Corporation | Vision corrective display |
US9041741B2 (en) | 2013-03-14 | 2015-05-26 | Qualcomm Incorporated | User interface for a head mounted display |
US9721586B1 (en) | 2013-03-14 | 2017-08-01 | Amazon Technologies, Inc. | Voice controlled assistant with light indicator |
US20140268277A1 (en) * | 2013-03-14 | 2014-09-18 | Andreas Georgiou | Image correction using reconfigurable phase mask |
US9164281B2 (en) | 2013-03-15 | 2015-10-20 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US20140280502A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Crowd and cloud enabled virtual reality distributed location network |
US20140280644A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Real time unified communications interaction of a predefined location in a virtual reality location |
US20140280506A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Virtual reality enhanced through browser connections |
US20140282113A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Personal digital assistance and virtual reality |
TWI625551B (en) * | 2013-03-15 | 2018-06-01 | 傲思丹度科技公司 | 3d light field displays and methods with improved viewing angle depth and resolution |
WO2014149631A2 (en) | 2013-03-15 | 2014-09-25 | Oakley, Inc. | Electronic ornamentation for eyewear |
US9393870B2 (en) | 2013-03-15 | 2016-07-19 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
US10339711B2 (en) | 2013-03-15 | 2019-07-02 | Honda Motor Co., Ltd. | System and method for providing augmented reality based directions based on verbal and gestural cues |
US20140280505A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | Virtual reality interaction with 3d printing |
US9838506B1 (en) | 2013-03-15 | 2017-12-05 | Sony Interactive Entertainment America Llc | Virtual reality universe representation changes viewing based upon client side parameters |
US9251715B2 (en) | 2013-03-15 | 2016-02-02 | Honda Motor Co., Ltd. | Driver training system using heads-up display augmented reality graphics elements |
US10215583B2 (en) | 2013-03-15 | 2019-02-26 | Honda Motor Co., Ltd. | Multi-level navigation monitoring and control |
US9378644B2 (en) | 2013-03-15 | 2016-06-28 | Honda Motor Co., Ltd. | System and method for warning a driver of a potential rear end collision |
US20140280503A1 (en) | 2013-03-15 | 2014-09-18 | John Cronin | System and methods for effective virtual reality visitor interface |
US9747898B2 (en) | 2013-03-15 | 2017-08-29 | Honda Motor Co., Ltd. | Interpretation of ambiguous vehicle instructions |
US9818150B2 (en) | 2013-04-05 | 2017-11-14 | Digimarc Corporation | Imagery and annotations |
US10334151B2 (en) | 2013-04-22 | 2019-06-25 | Google Llc | Phase detection autofocus using subaperture images |
JP2014219448A (en) * | 2013-05-01 | 2014-11-20 | コニカミノルタ株式会社 | Display system, display method, display terminal and display program |
US9239460B2 (en) | 2013-05-10 | 2016-01-19 | Microsoft Technology Licensing, Llc | Calibration of eye location |
US9354702B2 (en) * | 2013-06-03 | 2016-05-31 | Daqri, Llc | Manipulation of virtual object in augmented reality via thought |
US9383819B2 (en) | 2013-06-03 | 2016-07-05 | Daqri, Llc | Manipulation of virtual object in augmented reality via intent |
CN205691887U (en) | 2013-06-12 | 2016-11-16 | 奥克利有限公司 | Modular communication system and glasses communication system |
US9319665B2 (en) * | 2013-06-19 | 2016-04-19 | TrackThings LLC | Method and apparatus for a self-focusing camera and eyeglass system |
US9443355B2 (en) | 2013-06-28 | 2016-09-13 | Microsoft Technology Licensing, Llc | Reprojection OLED display for augmented reality experiences |
CN103353667B (en) | 2013-06-28 | 2015-10-21 | 北京智谷睿拓技术服务有限公司 | Imaging adjustment Apparatus and method for |
CN103353677B (en) | 2013-06-28 | 2015-03-11 | 北京智谷睿拓技术服务有限公司 | Imaging device and method thereof |
CN103353663B (en) | 2013-06-28 | 2016-08-10 | 北京智谷睿拓技术服务有限公司 | Imaging adjusting apparatus and method |
US9514571B2 (en) | 2013-07-25 | 2016-12-06 | Microsoft Technology Licensing, Llc | Late stage reprojection |
GB2516499A (en) * | 2013-07-25 | 2015-01-28 | Nokia Corp | Apparatus, methods, computer programs suitable for enabling in-shop demonstrations |
CN103424891B (en) | 2013-07-31 | 2014-12-17 | 北京智谷睿拓技术服务有限公司 | Imaging device and method |
CN103431840B (en) | 2013-07-31 | 2016-01-20 | 北京智谷睿拓技术服务有限公司 | Eye optical parameter detecting system and method |
US9335548B1 (en) | 2013-08-21 | 2016-05-10 | Google Inc. | Head-wearable display with collimated light source and beam steering mechanism |
CN103431980A (en) | 2013-08-22 | 2013-12-11 | 北京智谷睿拓技术服务有限公司 | Eyesight protection imaging system and method |
CN103439801B (en) | 2013-08-22 | 2016-10-26 | 北京智谷睿拓技术服务有限公司 | Sight protectio imaging device and method |
US20150062158A1 (en) | 2013-08-28 | 2015-03-05 | Qualcomm Incorporated | Integration of head mounted displays with public display devices |
CN103605208B (en) | 2013-08-30 | 2016-09-28 | 北京智谷睿拓技术服务有限公司 | content projection system and method |
CN103500331B (en) | 2013-08-30 | 2017-11-10 | 北京智谷睿拓技术服务有限公司 | Based reminding method and device |
US9785231B1 (en) * | 2013-09-26 | 2017-10-10 | Rockwell Collins, Inc. | Head worn display integrity monitor system and methods |
KR20150037254A (en) * | 2013-09-30 | 2015-04-08 | 엘지전자 주식회사 | Wearable display device and method of controlling layer |
US20150097759A1 (en) * | 2013-10-07 | 2015-04-09 | Allan Thomas Evans | Wearable apparatus for accessing media content in multiple operating modes and method of use thereof |
EP2860697A1 (en) * | 2013-10-09 | 2015-04-15 | Thomson Licensing | Method for displaying a content through a head mounted display device, corresponding electronic device and computer program product |
CN103558909B (en) * | 2013-10-10 | 2017-03-29 | 北京智谷睿拓技术服务有限公司 | Interaction projection display method and interaction projection display system |
US9582516B2 (en) | 2013-10-17 | 2017-02-28 | Nant Holdings Ip, Llc | Wide area augmented reality location-based services |
US9857591B2 (en) * | 2014-05-30 | 2018-01-02 | Magic Leap, Inc. | Methods and system for creating focal planes in virtual and augmented reality |
US20150169047A1 (en) * | 2013-12-16 | 2015-06-18 | Nokia Corporation | Method and apparatus for causation of capture of visual information indicative of a part of an environment |
US9690763B1 (en) | 2013-12-17 | 2017-06-27 | Bryant Christopher Lee | Display of webpage elements on a connected computer |
JP2015118578A (en) * | 2013-12-18 | 2015-06-25 | マイクロソフト コーポレーション | Augmented reality information detail |
US9551872B1 (en) | 2013-12-30 | 2017-01-24 | Google Inc. | Spatially multiplexed lens for head mounted display |
CN106464818A (en) * | 2014-01-06 | 2017-02-22 | 埃维根特公司 | Imaging a curved mirror and partially transparent plate |
US10409079B2 (en) | 2014-01-06 | 2019-09-10 | Avegant Corp. | Apparatus, system, and method for displaying an image using a plate |
US10303242B2 (en) | 2014-01-06 | 2019-05-28 | Avegant Corp. | Media chair apparatus, system, and method |
US9746942B2 (en) * | 2014-01-06 | 2017-08-29 | Delta Electronics, Inc. | Optical touch pen |
US10001645B2 (en) * | 2014-01-17 | 2018-06-19 | Sony Interactive Entertainment America Llc | Using a second screen as a private tracking heads-up display |
US9588343B2 (en) | 2014-01-25 | 2017-03-07 | Sony Interactive Entertainment America Llc | Menu navigation in a head-mounted display |
US9437159B2 (en) | 2014-01-25 | 2016-09-06 | Sony Interactive Entertainment America Llc | Environmental interrupt in a head-mounted display and utilization of non field of view real estate |
US9671612B2 (en) | 2014-01-29 | 2017-06-06 | Google Inc. | Dynamic lens for head mounted display |
US9865088B2 (en) | 2014-01-31 | 2018-01-09 | Empire Technology Development Llc | Evaluation of augmented reality skins |
EP3100098B8 (en) | 2014-01-31 | 2022-10-05 | Magic Leap, Inc. | Multi-focal display system and method |
CA2938262C (en) | 2014-01-31 | 2021-01-19 | Magic Leap, Inc. | Multi-focal display system and method |
JP6205498B2 (en) * | 2014-01-31 | 2017-09-27 | エンパイア テクノロジー ディベロップメント エルエルシー | Target person-selectable augmented reality skin |
US9990772B2 (en) | 2014-01-31 | 2018-06-05 | Empire Technology Development Llc | Augmented reality skin evaluation |
WO2015116179A1 (en) | 2014-01-31 | 2015-08-06 | Empire Technology Development, Llc | Augmented reality skin manager |
US9377626B2 (en) | 2014-02-18 | 2016-06-28 | Merge Labs, Inc. | Remote control augmented motion data capture |
US20150234188A1 (en) * | 2014-02-18 | 2015-08-20 | Aliphcom | Control of adaptive optics |
EP4016169B1 (en) | 2014-03-05 | 2023-11-22 | Arizona Board of Regents on Behalf of the University of Arizona | Wearable 3d augmented reality display with variable focus and/or object recognition |
US9404848B2 (en) * | 2014-03-11 | 2016-08-02 | The Boeing Company | Apparatuses and methods for testing adhesion of a seal to a surface |
US10120420B2 (en) | 2014-03-21 | 2018-11-06 | Microsoft Technology Licensing, Llc | Lockable display and techniques enabling use of lockable displays |
US10048647B2 (en) | 2014-03-27 | 2018-08-14 | Microsoft Technology Licensing, Llc | Optical waveguide including spatially-varying volume hologram |
JP2015194709A (en) * | 2014-03-28 | 2015-11-05 | パナソニックIpマネジメント株式会社 | image display device |
US9759918B2 (en) | 2014-05-01 | 2017-09-12 | Microsoft Technology Licensing, Llc | 3D mapping with flexible camera rig |
EP2944999A1 (en) * | 2014-05-15 | 2015-11-18 | Intral Strategy Execution S. L. | Display cap |
IL296027B2 (en) * | 2014-05-30 | 2024-08-01 | Magic Leap Inc | Methods and system for creating focal planes in virtual and augmented reality |
KR102205000B1 (en) | 2014-05-30 | 2021-01-18 | 매직 립, 인코포레이티드 | Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality |
CA2947809C (en) | 2014-06-05 | 2023-03-28 | Optica Amuka (A.A.) Ltd. | Control of dynamic lenses |
GB2527503A (en) * | 2014-06-17 | 2015-12-30 | Next Logic Pty Ltd | Generating a sequence of stereoscopic images for a head-mounted display |
US9766702B2 (en) | 2014-06-19 | 2017-09-19 | Apple Inc. | User detection by a computing device |
US10324733B2 (en) | 2014-07-30 | 2019-06-18 | Microsoft Technology Licensing, Llc | Shutdown notifications |
US9799142B2 (en) | 2014-08-15 | 2017-10-24 | Daqri, Llc | Spatial data collection |
US9799143B2 (en) | 2014-08-15 | 2017-10-24 | Daqri, Llc | Spatial data visualization |
US9830395B2 (en) * | 2014-08-15 | 2017-11-28 | Daqri, Llc | Spatial data processing |
JP2016045882A (en) * | 2014-08-26 | 2016-04-04 | 株式会社東芝 | Image processor and information processor |
KR101648446B1 (en) | 2014-10-07 | 2016-09-01 | 삼성전자주식회사 | Electronic conference system, method for controlling the electronic conference system, and digital pen |
KR102324192B1 (en) * | 2014-10-13 | 2021-11-09 | 삼성전자주식회사 | Medical imaging apparatus and control method for the same |
US10523993B2 (en) | 2014-10-16 | 2019-12-31 | Disney Enterprises, Inc. | Displaying custom positioned overlays to a viewer |
WO2016061447A1 (en) | 2014-10-17 | 2016-04-21 | Lockheed Martin Corporation | Head-wearable ultra-wide field of view display device |
WO2016073557A1 (en) | 2014-11-04 | 2016-05-12 | The University Of North Carolina At Chapel Hill | Minimal-latency tracking and display for matching real and virtual worlds |
US9900541B2 (en) | 2014-12-03 | 2018-02-20 | Vizio Inc | Augmented reality remote control |
EP3037784B1 (en) * | 2014-12-23 | 2019-05-01 | Nokia Technologies OY | Causation of display of supplemental map information |
WO2016108216A1 (en) | 2015-01-04 | 2016-07-07 | Microsoft Technology Licensing, Llc | Active stylus communication with a digitizer |
GB2534847A (en) | 2015-01-28 | 2016-08-10 | Sony Computer Entertainment Europe Ltd | Display |
US10176961B2 (en) | 2015-02-09 | 2019-01-08 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Small portable night vision system |
WO2016141054A1 (en) | 2015-03-02 | 2016-09-09 | Lockheed Martin Corporation | Wearable display system |
WO2016139850A1 (en) * | 2015-03-05 | 2016-09-09 | ソニー株式会社 | Information processing device, control method, and program |
US10606242B2 (en) * | 2015-03-12 | 2020-03-31 | Canon Kabushiki Kaisha | Print data division apparatus and program |
CA2979687A1 (en) | 2015-03-16 | 2016-09-22 | Nicole Elizabeth SAMEC | Methods and systems for diagnosing and treating health ailments |
US10853625B2 (en) | 2015-03-21 | 2020-12-01 | Mine One Gmbh | Facial signature methods, systems and software |
WO2016154123A2 (en) * | 2015-03-21 | 2016-09-29 | Mine One Gmbh | Virtual 3d methods, systems and software |
US12130430B2 (en) | 2015-03-31 | 2024-10-29 | Timothy Cummings | System for virtual display and method of use |
US9726885B2 (en) | 2015-03-31 | 2017-08-08 | Timothy A. Cummings | System for virtual display and method of use |
US9823474B2 (en) * | 2015-04-02 | 2017-11-21 | Avegant Corp. | System, apparatus, and method for displaying an image with a wider field of view |
US9995857B2 (en) | 2015-04-03 | 2018-06-12 | Avegant Corp. | System, apparatus, and method for displaying an image using focal modulation |
US20160292921A1 (en) * | 2015-04-03 | 2016-10-06 | Avegant Corporation | System, apparatus, and method for displaying an image using light of varying intensities |
US9697383B2 (en) * | 2015-04-14 | 2017-07-04 | International Business Machines Corporation | Numeric keypad encryption for augmented reality devices |
US10444931B2 (en) | 2017-05-09 | 2019-10-15 | Google Llc | Vantage generation and interactive playback |
US10469873B2 (en) | 2015-04-15 | 2019-11-05 | Google Llc | Encoding and decoding virtual reality video |
US10540818B2 (en) | 2015-04-15 | 2020-01-21 | Google Llc | Stereo image generation and interactive playback |
US10565734B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video capture, processing, calibration, computational fiber artifact removal, and light-field pipeline |
US11328446B2 (en) | 2015-04-15 | 2022-05-10 | Google Llc | Combining light-field data with active depth data for depth map generation |
US10412373B2 (en) | 2015-04-15 | 2019-09-10 | Google Llc | Image capture for virtual reality displays |
US10419737B2 (en) | 2015-04-15 | 2019-09-17 | Google Llc | Data structures and delivery methods for expediting virtual reality playback |
US10546424B2 (en) | 2015-04-15 | 2020-01-28 | Google Llc | Layered content delivery for virtual and augmented reality experiences |
US10275898B1 (en) | 2015-04-15 | 2019-04-30 | Google Llc | Wedge-based light-field video capture |
US10085005B2 (en) | 2015-04-15 | 2018-09-25 | Lytro, Inc. | Capturing light-field volume image and video data using tiled light-field cameras |
US10567464B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video compression with adaptive view-dependent lighting removal |
US10440407B2 (en) | 2017-05-09 | 2019-10-08 | Google Llc | Adaptive control for immersive experience delivery |
US10341632B2 (en) | 2015-04-15 | 2019-07-02 | Google Llc. | Spatial random access enabled video system with a three-dimensional viewing volume |
US10055888B2 (en) | 2015-04-28 | 2018-08-21 | Microsoft Technology Licensing, Llc | Producing and consuming metadata within multi-dimensional data |
WO2016175807A1 (en) | 2015-04-30 | 2016-11-03 | Hewlett-Packard Development Company, L.P. | Color changing apparatuses with solar cells |
EP3292700B1 (en) | 2015-05-05 | 2019-09-18 | Razer (Asia-Pacific) Pte. Ltd. | Methods for controlling a headset device, headset devices, computer readable media, and infrared sensors |
JP6433850B2 (en) * | 2015-05-13 | 2018-12-05 | 株式会社ソニー・インタラクティブエンタテインメント | Head mounted display, information processing apparatus, information processing system, and content data output method |
US9577697B2 (en) * | 2015-05-27 | 2017-02-21 | Otter Products, Llc | Protective case with stylus access feature |
US9977493B2 (en) | 2015-06-17 | 2018-05-22 | Microsoft Technology Licensing, Llc | Hybrid display system |
WO2016210159A1 (en) * | 2015-06-23 | 2016-12-29 | Mobius Virtual Foundry Llc | Head mounted display |
US20160378296A1 (en) * | 2015-06-25 | 2016-12-29 | Ashok Mishra | Augmented Reality Electronic Book Mechanism |
US10210844B2 (en) | 2015-06-29 | 2019-02-19 | Microsoft Technology Licensing, Llc | Holographic near-eye display |
US9588593B2 (en) | 2015-06-30 | 2017-03-07 | Ariadne's Thread (Usa), Inc. | Virtual reality system with control command gestures |
US10089790B2 (en) | 2015-06-30 | 2018-10-02 | Ariadne's Thread (Usa), Inc. | Predictive virtual reality display system with post rendering correction |
US9396588B1 (en) | 2015-06-30 | 2016-07-19 | Ariadne's Thread (Usa), Inc. (Dba Immerex) | Virtual reality virtual theater system |
US9240069B1 (en) * | 2015-06-30 | 2016-01-19 | Ariadne's Thread (Usa), Inc. | Low-latency virtual reality display system |
US9607428B2 (en) | 2015-06-30 | 2017-03-28 | Ariadne's Thread (Usa), Inc. | Variable resolution virtual reality display system |
US9588598B2 (en) | 2015-06-30 | 2017-03-07 | Ariadne's Thread (Usa), Inc. | Efficient orientation estimation system using magnetic, angular rate, and gravity sensors |
US10162583B2 (en) | 2015-07-02 | 2018-12-25 | Canon Information And Imaging Solutions, Inc. | System and method for printing |
US9979909B2 (en) | 2015-07-24 | 2018-05-22 | Lytro, Inc. | Automatic lens flare detection and correction for light-field images |
US9454010B1 (en) | 2015-08-07 | 2016-09-27 | Ariadne's Thread (Usa), Inc. | Wide field-of-view head mounted display system |
US9606362B2 (en) | 2015-08-07 | 2017-03-28 | Ariadne's Thread (Usa), Inc. | Peripheral field-of-view illumination system for a head mounted display |
US9990008B2 (en) | 2015-08-07 | 2018-06-05 | Ariadne's Thread (Usa), Inc. | Modular multi-mode virtual reality headset |
EP4198911A1 (en) * | 2015-08-18 | 2023-06-21 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
US9639945B2 (en) | 2015-08-27 | 2017-05-02 | Lytro, Inc. | Depth-based application of image effects |
US10168804B2 (en) | 2015-09-08 | 2019-01-01 | Apple Inc. | Stylus for electronic devices |
US9934594B2 (en) * | 2015-09-09 | 2018-04-03 | Spell Disain Ltd. | Textile-based augmented reality systems and methods |
US10681489B2 (en) | 2015-09-16 | 2020-06-09 | Magic Leap, Inc. | Head pose mixing of audio files |
US9736171B2 (en) * | 2015-10-12 | 2017-08-15 | Airwatch Llc | Analog security for digital data |
US10754156B2 (en) | 2015-10-20 | 2020-08-25 | Lockheed Martin Corporation | Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system |
US9805511B2 (en) * | 2015-10-21 | 2017-10-31 | International Business Machines Corporation | Interacting with data fields on a page using augmented reality |
US10338677B2 (en) * | 2015-10-28 | 2019-07-02 | Microsoft Technology Licensing, Llc | Adjusting image frames based on tracking motion of eyes |
USD792926S1 (en) | 2015-12-10 | 2017-07-25 | Milwaukee Electric Tool Corporation | Cap for a writing utensil |
US10147235B2 (en) | 2015-12-10 | 2018-12-04 | Microsoft Technology Licensing, Llc | AR display with adjustable stereo overlap zone |
JP6555120B2 (en) * | 2015-12-28 | 2019-08-07 | 富士ゼロックス株式会社 | Electronics |
WO2017114834A1 (en) | 2015-12-29 | 2017-07-06 | Koninklijke Philips N.V. | System, controller and method using virtual reality device for robotic surgery |
TWI595425B (en) * | 2015-12-30 | 2017-08-11 | 松翰科技股份有限公司 | Sensing device and optical sensing module |
US10092177B1 (en) | 2015-12-30 | 2018-10-09 | Verily Life Sciences Llc | Device, system and method for image display with a programmable phase map |
EP3400470A4 (en) | 2016-01-05 | 2019-09-04 | Saab Ab | Face plate in transparent optical projection displays |
US10643296B2 (en) | 2016-01-12 | 2020-05-05 | Qualcomm Incorporated | Systems and methods for rendering multiple levels of detail |
US10643381B2 (en) | 2016-01-12 | 2020-05-05 | Qualcomm Incorporated | Systems and methods for rendering multiple levels of detail |
US9459692B1 (en) | 2016-03-29 | 2016-10-04 | Ariadne's Thread (Usa), Inc. | Virtual reality headset with relative motion head tracker |
AU2017246901B2 (en) | 2016-04-08 | 2022-06-02 | Magic Leap, Inc. | Augmented reality systems and methods with variable focus lens elements |
EP3958048A1 (en) | 2016-04-17 | 2022-02-23 | Optica Amuka (A.A.) Ltd. | Liquid crystal lens with enhanced electrical drive |
WO2017182596A1 (en) | 2016-04-22 | 2017-10-26 | Carl Zeiss Meditec, Inc. | System and method for visual field testing |
US9995936B1 (en) | 2016-04-29 | 2018-06-12 | Lockheed Martin Corporation | Augmented reality systems having a virtual image overlaying an infrared portion of a live scene |
WO2017192467A1 (en) | 2016-05-02 | 2017-11-09 | Warner Bros. Entertainment Inc. | Geometry matching in virtual reality and augmented reality |
WO2017196879A1 (en) | 2016-05-09 | 2017-11-16 | Magic Leap, Inc. | Augmented reality systems and methods for user health analysis |
US10057511B2 (en) | 2016-05-11 | 2018-08-21 | International Business Machines Corporation | Framing enhanced reality overlays using invisible light emitters |
US10650591B1 (en) | 2016-05-24 | 2020-05-12 | Out of Sight Vision Systems LLC | Collision avoidance system for head mounted display utilized in room scale virtual reality system |
US10981060B1 (en) | 2016-05-24 | 2021-04-20 | Out of Sight Vision Systems LLC | Collision avoidance system for room scale virtual reality system |
US10146334B2 (en) | 2016-06-09 | 2018-12-04 | Microsoft Technology Licensing, Llc | Passive optical and inertial tracking in slim form-factor |
US10146335B2 (en) | 2016-06-09 | 2018-12-04 | Microsoft Technology Licensing, Llc | Modular extension of inertial controller for six DOF mixed reality input |
US10275892B2 (en) | 2016-06-09 | 2019-04-30 | Google Llc | Multi-view scene segmentation and propagation |
US11360330B2 (en) | 2016-06-16 | 2022-06-14 | Optica Amuka (A.A.) Ltd. | Tunable lenses for spectacles |
CN114296175A (en) | 2016-07-15 | 2022-04-08 | 光场实验室公司 | Energy propagation and lateral Anderson localization using two-dimensional, light-field and holographic repeaters |
KR102715030B1 (en) * | 2016-07-26 | 2024-10-10 | 삼성전자주식회사 | See-through type display apparatus |
US9858637B1 (en) * | 2016-07-29 | 2018-01-02 | Qualcomm Incorporated | Systems and methods for reducing motion-to-photon latency and memory bandwidth in a virtual reality system |
US10212414B2 (en) | 2016-08-01 | 2019-02-19 | Microsoft Technology Licensing, Llc | Dynamic realignment of stereoscopic digital content |
US10181591B2 (en) | 2016-08-23 | 2019-01-15 | Microsoft Technology Licensing, Llc | Pen battery mechanical shock reduction design |
US10108144B2 (en) | 2016-09-16 | 2018-10-23 | Microsoft Technology Licensing, Llc | Holographic wide field of view display |
KR102723376B1 (en) * | 2016-10-21 | 2024-10-28 | 매직 립, 인코포레이티드 | System and method for presenting image content on multiple depth planes by providing multiple intra-pupil parallax views |
US10712572B1 (en) * | 2016-10-28 | 2020-07-14 | Facebook Technologies, Llc | Angle sensitive pixel array including a liquid crystal layer |
US10254542B2 (en) | 2016-11-01 | 2019-04-09 | Microsoft Technology Licensing, Llc | Holographic projector for a waveguide display |
US10120337B2 (en) * | 2016-11-04 | 2018-11-06 | Microsoft Technology Licensing, Llc | Adjustable scanned beam projector |
US10757400B2 (en) * | 2016-11-10 | 2020-08-25 | Manor Financial, Inc. | Near eye wavefront emulating display |
US10095342B2 (en) | 2016-11-14 | 2018-10-09 | Google Llc | Apparatus for sensing user input |
US10679361B2 (en) | 2016-12-05 | 2020-06-09 | Google Llc | Multi-view rotoscope contour propagation |
US11164378B1 (en) | 2016-12-08 | 2021-11-02 | Out of Sight Vision Systems LLC | Virtual reality detection and projection system for use with a head mounted display |
US11222397B2 (en) | 2016-12-23 | 2022-01-11 | Qualcomm Incorporated | Foveated rendering in tiled architectures |
US11022939B2 (en) | 2017-01-03 | 2021-06-01 | Microsoft Technology Licensing, Llc | Reduced bandwidth holographic near-eye display |
US10904514B2 (en) * | 2017-02-09 | 2021-01-26 | Facebook Technologies, Llc | Polarization illumination using acousto-optic structured light in 3D depth sensing |
DE102017202517A1 (en) * | 2017-02-16 | 2018-08-16 | Siemens Healthcare Gmbh | Operating device and operating method for operating a medical device |
US10620725B2 (en) * | 2017-02-17 | 2020-04-14 | Dell Products L.P. | System and method for dynamic mode switching in an active stylus |
IL307602A (en) | 2017-02-23 | 2023-12-01 | Magic Leap Inc | Variable-focus virtual image devices based on polarization conversion |
US20180262758A1 (en) * | 2017-03-08 | 2018-09-13 | Ostendo Technologies, Inc. | Compression Methods and Systems for Near-Eye Displays |
IL269042B2 (en) | 2017-03-09 | 2024-06-01 | Univ Arizona | Head-Mounted Light Field Display with Integral Imaging and Relay Optics |
CA3055545A1 (en) | 2017-03-09 | 2018-09-13 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Head-mounted light field display with integral imaging and waveguide prism |
US10001808B1 (en) | 2017-03-29 | 2018-06-19 | Google Llc | Mobile device accessory equipped to communicate with mobile device |
US10579168B2 (en) | 2017-03-30 | 2020-03-03 | Microsoft Technology Licensing, Llc | Dual LED drive circuit |
US10594945B2 (en) | 2017-04-03 | 2020-03-17 | Google Llc | Generating dolly zoom effect using light field image data |
US10453172B2 (en) | 2017-04-04 | 2019-10-22 | International Business Machines Corporation | Sparse-data generative model for pseudo-puppet memory recast |
US10013081B1 (en) | 2017-04-04 | 2018-07-03 | Google Llc | Electronic circuit and method to account for strain gauge variation |
US10514797B2 (en) | 2017-04-18 | 2019-12-24 | Google Llc | Force-sensitive user input interface for an electronic device |
US10635255B2 (en) | 2017-04-18 | 2020-04-28 | Google Llc | Electronic device response to force-sensitive interface |
US10474227B2 (en) | 2017-05-09 | 2019-11-12 | Google Llc | Generation of virtual reality with 6 degrees of freedom from limited viewer data |
US10354399B2 (en) | 2017-05-25 | 2019-07-16 | Google Llc | Multi-view back-projection to a light-field |
US10613413B1 (en) | 2017-05-31 | 2020-04-07 | Facebook Technologies, Llc | Ultra-wide field-of-view scanning devices for depth sensing |
US10885607B2 (en) * | 2017-06-01 | 2021-01-05 | Qualcomm Incorporated | Storage for foveated rendering |
US10712567B2 (en) | 2017-06-15 | 2020-07-14 | Microsoft Technology Licensing, Llc | Holographic display system |
CN107086027A (en) * | 2017-06-23 | 2017-08-22 | 青岛海信移动通信技术股份有限公司 | Character displaying method and device, mobile terminal and storage medium |
US10181200B1 (en) | 2017-06-28 | 2019-01-15 | Facebook Technologies, Llc | Circularly polarized illumination and detection for depth sensing |
US11953764B2 (en) | 2017-07-10 | 2024-04-09 | Optica Amuka (A.A.) Ltd. | Tunable lenses with enhanced performance features |
US11747619B2 (en) | 2017-07-10 | 2023-09-05 | Optica Amuka (A.A.) Ltd. | Virtual reality and augmented reality systems with dynamic vision correction |
US10360832B2 (en) | 2017-08-14 | 2019-07-23 | Microsoft Technology Licensing, Llc | Post-rendering image transformation using parallel image transformation pipelines |
JP2019046006A (en) * | 2017-08-31 | 2019-03-22 | シャープ株式会社 | Touch pen |
US10574973B2 (en) | 2017-09-06 | 2020-02-25 | Facebook Technologies, Llc | Non-mechanical beam steering for depth sensing |
US10545215B2 (en) | 2017-09-13 | 2020-01-28 | Google Llc | 4D camera tracking and optical stabilization |
US10102659B1 (en) | 2017-09-18 | 2018-10-16 | Nicholas T. Hariton | Systems and methods for utilizing a device as a marker for augmented reality content |
US10890767B1 (en) | 2017-09-27 | 2021-01-12 | United Services Automobile Association (Usaa) | System and method for automatic vision correction in near-to-eye displays |
US10489951B2 (en) | 2017-09-29 | 2019-11-26 | Qualcomm Incorporated | Display of a live scene and auxiliary object |
US11861136B1 (en) * | 2017-09-29 | 2024-01-02 | Apple Inc. | Systems, methods, and graphical user interfaces for interacting with virtual reality environments |
US10930709B2 (en) | 2017-10-03 | 2021-02-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for image sensors |
WO2019077442A1 (en) | 2017-10-16 | 2019-04-25 | Optica Amuka (A.A.) Ltd. | Spectacles with electrically-tunable lenses controllable by an external system |
US11368670B2 (en) * | 2017-10-26 | 2022-06-21 | Yeda Research And Development Co. Ltd. | Augmented reality display system and method |
US10105601B1 (en) | 2017-10-27 | 2018-10-23 | Nicholas T. Hariton | Systems and methods for rendering a virtual content object in an augmented reality environment |
US10761625B2 (en) | 2017-10-31 | 2020-09-01 | Microsoft Technology Licensing, Llc | Stylus for operation with a digitizer |
US10510812B2 (en) | 2017-11-09 | 2019-12-17 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
IL255891B2 (en) * | 2017-11-23 | 2023-05-01 | Everysight Ltd | Site selection for display of information |
CN107861754B (en) * | 2017-11-30 | 2020-12-01 | 阿里巴巴(中国)有限公司 | Data packaging method, data processing method, data packaging device, data processing device and electronic equipment |
US10656706B2 (en) * | 2017-12-04 | 2020-05-19 | International Business Machines Corporation | Modifying a computer-based interaction based on eye gaze |
US11256093B2 (en) | 2017-12-11 | 2022-02-22 | Magic Leap, Inc. | Waveguide illuminator |
US11656466B2 (en) * | 2018-01-03 | 2023-05-23 | Sajjad A. Khan | Spatio-temporal multiplexed single panel based mutual occlusion capable head mounted display system and method |
EP3737980A4 (en) | 2018-01-14 | 2021-11-10 | Light Field Lab, Inc. | Systems and methods for transverse energy localization in energy relays using ordered structures |
US11650354B2 (en) | 2018-01-14 | 2023-05-16 | Light Field Lab, Inc. | Systems and methods for rendering data from a 3D environment |
CN112074773B (en) | 2018-01-14 | 2024-02-02 | 光场实验室公司 | Four-dimensional energy field packaging assembly |
WO2019140398A1 (en) | 2018-01-14 | 2019-07-18 | Light Field Lab, Inc. | Holographic and diffractive optical encoding systems |
US10965862B2 (en) | 2018-01-18 | 2021-03-30 | Google Llc | Multi-camera navigation interface |
US10634913B2 (en) * | 2018-01-22 | 2020-04-28 | Symbol Technologies, Llc | Systems and methods for task-based adjustable focal distance for heads-up displays |
US10652529B2 (en) | 2018-02-07 | 2020-05-12 | Lockheed Martin Corporation | In-layer Signal processing |
US11616941B2 (en) | 2018-02-07 | 2023-03-28 | Lockheed Martin Corporation | Direct camera-to-display system |
US10690910B2 (en) | 2018-02-07 | 2020-06-23 | Lockheed Martin Corporation | Plenoptic cellular vision correction |
US10979699B2 (en) | 2018-02-07 | 2021-04-13 | Lockheed Martin Corporation | Plenoptic cellular imaging system |
US10838250B2 (en) * | 2018-02-07 | 2020-11-17 | Lockheed Martin Corporation | Display assemblies with electronically emulated transparency |
US10129984B1 (en) | 2018-02-07 | 2018-11-13 | Lockheed Martin Corporation | Three-dimensional electronics distribution by geodesic faceting |
US10594951B2 (en) | 2018-02-07 | 2020-03-17 | Lockheed Martin Corporation | Distributed multi-aperture camera array |
US10951883B2 (en) | 2018-02-07 | 2021-03-16 | Lockheed Martin Corporation | Distributed multi-screen array for high density display |
US10636188B2 (en) | 2018-02-09 | 2020-04-28 | Nicholas T. Hariton | Systems and methods for utilizing a living entity as a marker for augmented reality content |
US10735649B2 (en) | 2018-02-22 | 2020-08-04 | Magic Leap, Inc. | Virtual and augmented reality systems and methods using display system control information embedded in image data |
US11099386B1 (en) | 2018-03-01 | 2021-08-24 | Apple Inc. | Display device with optical combiner |
WO2019178060A1 (en) | 2018-03-12 | 2019-09-19 | Magic Leap, Inc. | Tilting array based display |
JP7185331B2 (en) | 2018-03-22 | 2022-12-07 | アリゾナ ボード オブ リージェンツ オン ビハーフ オブ ザ ユニバーシティ オブ アリゾナ | How to render light field images for integral imaging light field displays |
US10198871B1 (en) | 2018-04-27 | 2019-02-05 | Nicholas T. Hariton | Systems and methods for generating and facilitating access to a personalized augmented rendering of a user |
KR102118737B1 (en) * | 2018-06-01 | 2020-06-03 | 한밭대학교 산학협력단 | Pen lead holding apparatus |
US10331874B1 (en) * | 2018-06-06 | 2019-06-25 | Capital One Services, Llc | Providing an augmented reality overlay to secure input data |
US10410372B1 (en) | 2018-06-14 | 2019-09-10 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer-readable media for utilizing radial distortion to estimate a pose configuration |
CN110605928A (en) * | 2018-06-14 | 2019-12-24 | 徐瀚奇 | Vibration writing pen |
SG11202100408XA (en) | 2018-07-25 | 2021-02-25 | Light Field Lab Inc | Light field display system based amusement park attraction |
KR102084321B1 (en) | 2018-08-13 | 2020-03-03 | 한밭대학교 산학협력단 | Pen lead holding apparatus with release function and electric pen using the same |
US10866413B2 (en) | 2018-12-03 | 2020-12-15 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
KR102328618B1 (en) * | 2018-12-19 | 2021-11-18 | 한국광기술원 | Apparatus and Method for Attenuating Light Reactively |
CN111404765B (en) * | 2019-01-02 | 2021-10-26 | 中国移动通信有限公司研究院 | Message processing method, device, equipment and computer readable storage medium |
US11707806B2 (en) * | 2019-02-12 | 2023-07-25 | Illinois Tool Works Inc. | Virtual markings in welding systems |
US11112865B1 (en) * | 2019-02-13 | 2021-09-07 | Facebook Technologies, Llc | Systems and methods for using a display as an illumination source for eye tracking |
US11212514B2 (en) | 2019-03-25 | 2021-12-28 | Light Field Lab, Inc. | Light field display system for cinemas |
US10698201B1 (en) | 2019-04-02 | 2020-06-30 | Lockheed Martin Corporation | Plenoptic cellular axis redirection |
WO2020209491A1 (en) | 2019-04-11 | 2020-10-15 | Samsung Electronics Co., Ltd. | Head-mounted display device and operating method of the same |
US10586396B1 (en) | 2019-04-30 | 2020-03-10 | Nicholas T. Hariton | Systems, methods, and storage media for conveying virtual content in an augmented reality environment |
WO2020226833A1 (en) | 2019-05-06 | 2020-11-12 | Apple Inc. | Device, method, and graphical user interface for composing cgr files |
DE112020002268T5 (en) | 2019-05-06 | 2022-02-10 | Apple Inc. | DEVICE, METHOD AND COMPUTER READABLE MEDIA FOR REPRESENTING COMPUTER GENERATED REALITY FILES |
KR102069745B1 (en) * | 2019-05-14 | 2020-01-23 | (주)딥스원테크 | Pentip for multi-direction recognition combinating on electronic pen for writing on pettern film and electronic pen having multi-direction recognition for writing on pattern film |
CN110446194B (en) * | 2019-07-02 | 2023-05-23 | 广州视睿电子科技有限公司 | Intelligent pen control method and intelligent pen |
US10885819B1 (en) * | 2019-08-02 | 2021-01-05 | Harman International Industries, Incorporated | In-vehicle augmented reality system |
KR20220045166A (en) | 2019-08-09 | 2022-04-12 | 라이트 필드 랩 인코포레이티드 | Digital signage system based on light field display system |
US11822083B2 (en) | 2019-08-13 | 2023-11-21 | Apple Inc. | Display system with time interleaving |
US12130955B2 (en) | 2019-09-03 | 2024-10-29 | Light Field Lab, Inc. | Light field display for mobile devices |
US10712791B1 (en) | 2019-09-13 | 2020-07-14 | Microsoft Technology Licensing, Llc | Photovoltaic powered thermal management for wearable electronic devices |
CN114514495A (en) * | 2019-11-08 | 2022-05-17 | 株式会社和冠 | Electronic pen |
US11164339B2 (en) * | 2019-11-12 | 2021-11-02 | Sony Interactive Entertainment Inc. | Fast region of interest coding using multi-segment temporal resampling |
JP2023512869A (en) | 2019-12-03 | 2023-03-30 | ライト フィールド ラボ、インコーポレイテッド | Light field display systems for video games and electronic sports |
US11864841B2 (en) | 2019-12-31 | 2024-01-09 | Carl Zeiss Meditec Ag | Method of operating a surgical microscope and surgical microscope |
US11409091B2 (en) * | 2019-12-31 | 2022-08-09 | Carl Zeiss Meditec Ag | Method of operating a surgical microscope and surgical microscope |
US11607287B2 (en) | 2019-12-31 | 2023-03-21 | Carl Zeiss Meditec Ag | Method of operating a surgical microscope and surgical microscope |
JP6814898B2 (en) * | 2020-02-07 | 2021-01-20 | 株式会社ワコム | Electronic pen and position detection system |
JP6956248B2 (en) * | 2020-02-07 | 2021-11-02 | 株式会社ワコム | Electronic pen and position detection system |
US11709363B1 (en) | 2020-02-10 | 2023-07-25 | Avegant Corp. | Waveguide illumination of a spatial light modulator |
JP1677382S (en) * | 2020-04-21 | 2021-01-25 | ||
JP1683335S (en) * | 2020-04-21 | 2021-04-12 | ||
JP1683336S (en) * | 2020-04-21 | 2021-04-12 | ||
JP1677383S (en) * | 2020-04-21 | 2021-01-25 | ||
US20210349310A1 (en) * | 2020-05-11 | 2021-11-11 | Sony Interactive Entertainment Inc. | Highly interactive display environment for gaming |
WO2021263050A1 (en) | 2020-06-26 | 2021-12-30 | Limonox Projects Llc | Devices, methods and graphical user interfaces for content applications |
US11157081B1 (en) | 2020-07-28 | 2021-10-26 | Shenzhen Yunyinggu Technology Co., Ltd. | Apparatus and method for user interfacing in display glasses |
CN112043388B (en) * | 2020-08-14 | 2022-02-01 | 武汉大学 | Touch man-machine interaction device for medical teleoperation |
JP2023543799A (en) | 2020-09-25 | 2023-10-18 | アップル インコーポレイテッド | How to navigate the user interface |
US11860366B2 (en) | 2020-09-29 | 2024-01-02 | Avegant Corp. | Architecture to illuminate a display panel |
WO2022075990A1 (en) * | 2020-10-08 | 2022-04-14 | Hewlett-Packard Development Company, L.P. | Augmented reality documents |
EP4295314A1 (en) | 2021-02-08 | 2023-12-27 | Sightful Computers Ltd | Content sharing in extended reality |
EP4288950A1 (en) | 2021-02-08 | 2023-12-13 | Sightful Computers Ltd | User interactions in extended reality |
CN112788473B (en) * | 2021-03-11 | 2023-12-26 | 维沃移动通信有限公司 | earphone |
US20240036318A1 (en) * | 2021-12-21 | 2024-02-01 | Alexander Sarris | System to superimpose information over a users field of view |
US20240017482A1 (en) * | 2022-07-15 | 2024-01-18 | General Electric Company | Additive manufacturing methods and systems |
US12079442B2 (en) | 2022-09-30 | 2024-09-03 | Sightful Computers Ltd | Presenting extended reality content in different physical environments |
WO2024107372A1 (en) * | 2022-11-18 | 2024-05-23 | Lumileds Llc | Visualization system including direct and converted polychromatic led array |
US12105873B2 (en) * | 2022-11-29 | 2024-10-01 | Pixieray Oy | Light field based eye tracking |
WO2024129662A1 (en) * | 2022-12-12 | 2024-06-20 | Lumileds Llc | Visualization system including tunnel junction based rgb die with isolated active regions |
SE2330076A1 (en) * | 2023-02-10 | 2024-08-11 | Flatfrog Lab Ab | Augmented Reality Projection Surface with Optimized Features |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6378070B1 (en) * | 1998-01-09 | 2002-04-23 | Hewlett-Packard Company | Secure printing |
US6379058B1 (en) * | 2000-03-30 | 2002-04-30 | Zih Corp. | System for RF communication between a host and a portable printer |
US6627870B1 (en) * | 1999-10-25 | 2003-09-30 | Silverbrook Research Pty Ltd | Sensing device with interchangeable nibs |
US20040004735A1 (en) * | 2002-07-03 | 2004-01-08 | Oakeson Kenneth Lee | Proximity-based print queue adjustment |
US6745234B1 (en) * | 1998-09-11 | 2004-06-01 | Digital:Convergence Corporation | Method and apparatus for accessing a remote location by scanning an optical code |
US6768821B2 (en) * | 1999-05-25 | 2004-07-27 | Silverbrook Research Pty Ltd | Sensing device with identifier |
US20050105734A1 (en) * | 2003-09-30 | 2005-05-19 | Mark Buer | Proximity authentication system |
US7312887B2 (en) * | 2003-01-03 | 2007-12-25 | Toshiba Corporation | Internet print protocol print dispatch server |
Family Cites Families (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2306669A (en) * | 1940-11-12 | 1942-12-29 | Du Pont | Vulcanization of rubber |
NL106344C (en) * | 1959-01-20 | |||
FR1250814A (en) | 1960-02-05 | 1961-01-13 | Poor visibility tracking system which can be used in particular for landing aircraft | |
US3632184A (en) * | 1970-03-02 | 1972-01-04 | Bell Telephone Labor Inc | Three-dimensional display |
JPS5892081U (en) * | 1981-12-15 | 1983-06-22 | セイコーインスツルメンツ株式会社 | stylus pen |
US4864618A (en) * | 1986-11-26 | 1989-09-05 | Wright Technologies, L.P. | Automated transaction system with modular printhead having print authentication feature |
DE3712077A1 (en) * | 1987-04-09 | 1988-10-27 | Bosch Gmbh Robert | FORCE MEASURING DEVICE |
JPH0630506B2 (en) | 1987-07-21 | 1994-04-20 | 横河電機株式会社 | Serial communication device |
US4896543A (en) * | 1988-11-15 | 1990-01-30 | Sri International, Inc. | Three-axis force measurement stylus |
JPH02146526A (en) * | 1988-11-29 | 1990-06-05 | Seiko Instr Inc | Liquid crystal element |
US5051736A (en) * | 1989-06-28 | 1991-09-24 | International Business Machines Corporation | Optical stylus and passive digitizing tablet data input system |
JP2505037Y2 (en) * | 1990-03-16 | 1996-07-24 | 日本電気株式会社 | Stylus pen |
US5044805A (en) * | 1990-04-11 | 1991-09-03 | Steve Kosteniuk | Mechanical pencil |
JP3150685B2 (en) * | 1990-08-06 | 2001-03-26 | 株式会社ワコム | Variable capacitance capacitor |
US20040130783A1 (en) * | 2002-12-02 | 2004-07-08 | Solomon Dennis J | Visual display with full accommodation |
JP2726594B2 (en) * | 1991-04-01 | 1998-03-11 | 八洲電機株式会社 | Memory pen |
JPH052447A (en) * | 1991-06-25 | 1993-01-08 | Hitachi Seiko Ltd | Writing pressure detecting pen |
US5166778A (en) * | 1991-09-05 | 1992-11-24 | General Electric Company | Single-lens color video stereoscopic helmet mountable display |
JPH0588809A (en) * | 1991-09-30 | 1993-04-09 | Toshiba Corp | Writing utensil type pointing device |
US5477012A (en) * | 1992-04-03 | 1995-12-19 | Sekendur; Oral F. | Optical position determination |
US5852434A (en) * | 1992-04-03 | 1998-12-22 | Sekendur; Oral F. | Absolute optical position determination |
ATE148952T1 (en) * | 1992-07-08 | 1997-02-15 | Smart Pen Inc | APPARATUS AND METHOD FOR REPRESENTING WRITTEN INFORMATION. |
JPH0635592A (en) * | 1992-07-13 | 1994-02-10 | Fujikura Rubber Ltd | Stylus pen |
US5571997A (en) * | 1993-08-02 | 1996-11-05 | Kurta Corporation | Pressure sensitive pointing device for transmitting signals to a tablet |
JPH09503879A (en) * | 1993-10-18 | 1997-04-15 | サマグラフィクス コーポレイション | Pressure sensitive stylus with elastically compressible tip element |
JPH07200215A (en) * | 1993-12-01 | 1995-08-04 | Internatl Business Mach Corp <Ibm> | Selection method of printing device and data processing network |
US5438275A (en) * | 1994-01-03 | 1995-08-01 | International Business Machines Corporation | Digitizing stylus having capacitive pressure and contact sensing capabilities |
JPH09508478A (en) * | 1994-02-07 | 1997-08-26 | バーチュアル・アイ/オゥ・インコーポレイテッド | Personal visual display |
JPH0821975A (en) * | 1994-07-06 | 1996-01-23 | Olympus Optical Co Ltd | Head-mounted type video display system |
GB2291304A (en) * | 1994-07-07 | 1996-01-17 | Marconi Gec Ltd | Head-mountable display system |
US5652412A (en) * | 1994-07-11 | 1997-07-29 | Sia Technology Corp. | Pen and paper information recording system |
US5661506A (en) * | 1994-11-10 | 1997-08-26 | Sia Technology Corporation | Pen and paper information recording system using an imaging pen |
TW275590B (en) * | 1994-12-09 | 1996-05-11 | Sega Enterprises Kk | Head mounted display and system for use therefor |
GB2337680B (en) * | 1994-12-09 | 2000-02-23 | Sega Enterprises Kk | Head mounted display, and head mounted video display system |
GB2301896B (en) * | 1995-06-07 | 1999-04-21 | Ferodo Ltd | Force transducer |
US6081261A (en) | 1995-11-01 | 2000-06-27 | Ricoh Corporation | Manual entry interactive paper and electronic document handling and processing system |
US5692073A (en) * | 1996-05-03 | 1997-11-25 | Xerox Corporation | Formless forms and paper web using a reference-based mark extraction technique |
US6847336B1 (en) * | 1996-10-02 | 2005-01-25 | Jerome H. Lemelson | Selectively controllable heads-up display system |
US6518950B1 (en) | 1997-10-07 | 2003-02-11 | Interval Research Corporation | Methods and systems for providing human/computer interfaces |
WO1999023524A1 (en) * | 1997-10-30 | 1999-05-14 | The Microoptical Corporation | Eyeglass interface system |
WO1999050736A1 (en) | 1998-04-01 | 1999-10-07 | Xerox Corporation | Paper indexing of recordings |
US6964374B1 (en) * | 1998-10-02 | 2005-11-15 | Lucent Technologies Inc. | Retrieval and manipulation of electronically stored information via pointers embedded in the associated printed material |
US6344848B1 (en) * | 1999-02-19 | 2002-02-05 | Palm, Inc. | Stylus assembly |
KR20000074397A (en) * | 1999-05-20 | 2000-12-15 | 윤종용 | Portable computer with function of power control by combination or separation of stylus |
AUPQ056099A0 (en) * | 1999-05-25 | 1999-06-17 | Silverbrook Research Pty Ltd | A method and apparatus (pprint01) |
US7123239B1 (en) * | 1999-05-25 | 2006-10-17 | Paul Lapstun | Computer system control with user data via interface surface |
US6120461A (en) * | 1999-08-09 | 2000-09-19 | The United States Of America As Represented By The Secretary Of The Army | Apparatus for tracking the human eye with a retinal scanning display, and method thereof |
US6836555B2 (en) * | 1999-12-23 | 2004-12-28 | Anoto Ab | Information management system with authenticity check |
US6261015B1 (en) * | 2000-01-28 | 2001-07-17 | Bic Corporation | Roller ball pen with adjustable spring tension |
JP2001325182A (en) * | 2000-03-10 | 2001-11-22 | Ricoh Co Ltd | Print system, print method, computer readable recording medium with program recorded therein, portable communication equipment of print system, printer, print server and client |
CA2420390A1 (en) * | 2000-08-24 | 2002-02-28 | Immersive Technologies, Llc. | Computerized image system |
US6856407B2 (en) * | 2000-09-13 | 2005-02-15 | Nextengine, Inc. | Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels |
SG152904A1 (en) * | 2000-10-20 | 2009-06-29 | Silverbrook Res Pty Ltd | Cartridge for an electronic pen |
JP2002358156A (en) * | 2001-05-31 | 2002-12-13 | Pentel Corp | Coordinate inputting pen with sensing pressure function |
JP2003315650A (en) * | 2002-04-26 | 2003-11-06 | Olympus Optical Co Ltd | Optical device |
US7003267B2 (en) * | 2002-05-14 | 2006-02-21 | Siemens Communications, Inc. | Internal part design, molding and surface finish for cosmetic appearance |
US7158122B2 (en) * | 2002-05-17 | 2007-01-02 | 3M Innovative Properties Company | Calibration of force based touch panel systems |
JP2003337665A (en) * | 2002-05-20 | 2003-11-28 | Fujitsu Ltd | Information system, print method and program |
US20040128163A1 (en) * | 2002-06-05 | 2004-07-01 | Goodman Philip Holden | Health care information management apparatus, system and method of use and doing business |
US7006709B2 (en) * | 2002-06-15 | 2006-02-28 | Microsoft Corporation | System and method deghosting mosaics using multiperspective plane sweep |
US7009594B2 (en) * | 2002-10-31 | 2006-03-07 | Microsoft Corporation | Universal computing device |
US20040095311A1 (en) * | 2002-11-19 | 2004-05-20 | Motorola, Inc. | Body-centric virtual interactive apparatus and method |
US6967781B2 (en) * | 2002-11-29 | 2005-11-22 | Brother Kogyo Kabushiki Kaisha | Image display apparatus for displaying image in variable direction relative to viewer |
US7077594B1 (en) * | 2003-02-25 | 2006-07-18 | Palm, Incorporated | Expandable and contractible stylus |
DE10316518A1 (en) * | 2003-04-10 | 2004-10-21 | Carl Zeiss Jena Gmbh | Imaging device for augmented imaging |
US6912920B2 (en) * | 2003-07-31 | 2005-07-05 | Delphi Technologies, Inc. | Frame-based occupant weight estimation load cell with ball-actuated force sensor |
US8041888B2 (en) * | 2004-02-05 | 2011-10-18 | Netapp, Inc. | System and method for LUN cloning |
US7627703B2 (en) * | 2005-06-29 | 2009-12-01 | Microsoft Corporation | Input device with audio capabilities |
- 2005
- 2005-08-01 AU AU2005269255A patent/AU2005269255A1/en not_active Abandoned
- 2005-08-01 WO PCT/AU2005/001124 patent/WO2006012679A1/en active Application Filing
- 2005-08-01 SG SG200905070-9A patent/SG155167A1/en unknown
- 2005-08-01 AU AU2005269254A patent/AU2005269254B2/en not_active Ceased
- 2005-08-01 JP JP2007524130A patent/JP2008508621A/en active Pending
- 2005-08-01 EP EP05764241A patent/EP1779178A4/en not_active Withdrawn
- 2005-08-01 AU AU2005269256A patent/AU2005269256B2/en not_active Ceased
- 2005-08-01 CA CA2576010A patent/CA2576010C/en not_active Expired - Fee Related
- 2005-08-01 KR KR1020077005171A patent/KR101084853B1/en not_active IP Right Cessation
- 2005-08-01 CN CN2005800261388A patent/CN1993688B/en not_active Expired - Fee Related
- 2005-08-01 US US11/193,482 patent/US20060028459A1/en not_active Abandoned
- 2005-08-01 US US11/193,481 patent/US20060028400A1/en not_active Abandoned
- 2005-08-01 CA CA002576016A patent/CA2576016A1/en not_active Abandoned
- 2005-08-01 EP EP05764195A patent/EP1779081A4/en not_active Withdrawn
- 2005-08-01 WO PCT/AU2005/001123 patent/WO2006012678A1/en active Application Filing
- 2005-08-01 CA CA002576026A patent/CA2576026A1/en not_active Abandoned
- 2005-08-01 US US11/193,479 patent/US20060028674A1/en not_active Abandoned
- 2005-08-01 WO PCT/AU2005/001122 patent/WO2006012677A1/en active Application Filing
- 2005-08-01 JP JP2007524129A patent/JP4638493B2/en not_active Expired - Fee Related
- 2005-08-01 US US11/193,435 patent/US7567241B2/en not_active Expired - Fee Related
- 2005-08-01 EP EP05764221A patent/EP1782228A1/en not_active Withdrawn
- 2007
- 2007-02-28 KR KR1020077004867A patent/KR101108266B1/en not_active IP Right Cessation
- 2009
- 2009-07-05 US US12/497,684 patent/US8308387B2/en not_active Expired - Fee Related
- 2010
- 2010-10-04 US US12/897,758 patent/US20110018903A1/en not_active Abandoned
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6378070B1 (en) * | 1998-01-09 | 2002-04-23 | Hewlett-Packard Company | Secure printing |
US6745234B1 (en) * | 1998-09-11 | 2004-06-01 | Digital:Convergence Corporation | Method and apparatus for accessing a remote location by scanning an optical code |
US6768821B2 (en) * | 1999-05-25 | 2004-07-27 | Silverbrook Research Pty Ltd | Sensing device with identifier |
US6627870B1 (en) * | 1999-10-25 | 2003-09-30 | Silverbrook Research Pty Ltd | Sensing device with interchangeable nibs |
US6379058B1 (en) * | 2000-03-30 | 2002-04-30 | Zih Corp. | System for RF communication between a host and a portable printer |
US20040004735A1 (en) * | 2002-07-03 | 2004-01-08 | Oakeson Kenneth Lee | Proximity-based print queue adjustment |
US7312887B2 (en) * | 2003-01-03 | 2007-12-25 | Toshiba Corporation | Internet print protocol print dispatch server |
US20050105734A1 (en) * | 2003-09-30 | 2005-05-19 | Mark Buer | Proximity authentication system |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7694889B2 (en) * | 2005-02-28 | 2010-04-13 | Fuji Xerox Co., Ltd. | Printed material having location identification function, two-dimensional coordinate identification apparatus, image-forming apparatus and the method thereof |
US20060193522A1 (en) * | 2005-02-28 | 2006-08-31 | Fuji Xerox Co., Ltd. | Printed material having location identification function, two-dimensional coordinate identification apparatus, image-forming apparatus and the method thereof |
US20070139711A1 (en) * | 2005-12-16 | 2007-06-21 | Brother Kogyo Kabushiki Kaisha | Image forming apparatus, image forming method, and recording sheet |
US8094341B2 (en) * | 2005-12-16 | 2012-01-10 | Brother Kogyo Kabushiki Kaisha | Image forming apparatus, image forming method, and recording sheet |
US8279463B2 (en) | 2006-03-16 | 2012-10-02 | Oce-Technologies B.V. | Printing via kickstart function |
US20070216947A1 (en) * | 2006-03-16 | 2007-09-20 | Oce-Technologies B.V. | Printing via kickstart function |
EP1835714A1 (en) | 2006-03-16 | 2007-09-19 | Océ-Technologies B.V. | Printing via kickstart function |
US20080130882A1 (en) * | 2006-12-05 | 2008-06-05 | International Business Machines Corporation | Secure printing via rfid tags |
US20090080015A1 (en) * | 2007-09-21 | 2009-03-26 | Silverbrook Research Pty Ltd | Printer driver for interactive printer |
US20090080017A1 (en) * | 2007-09-21 | 2009-03-26 | Silverbrook Research Pty Ltd | Printer driver configured for receiving print impression identity from a printer |
US8284428B2 (en) * | 2007-09-21 | 2012-10-09 | Silverbrook Research Pty Ltd | Printer driver for interactive printer |
US8051012B2 (en) * | 2008-06-09 | 2011-11-01 | Hewlett-Packard Development Company, L.P. | System and method for discounted printing |
US20090307029A1 (en) * | 2008-06-09 | 2009-12-10 | Krishnan Ramanathan | System and method for discounted printing |
US20110314539A1 (en) * | 2010-06-18 | 2011-12-22 | At&T Intellectual Property I, L.P. | Proximity Based Device Security |
US9443071B2 (en) * | 2010-06-18 | 2016-09-13 | At&T Intellectual Property I, L.P. | Proximity based device security |
US20130335758A1 (en) * | 2012-06-18 | 2013-12-19 | Canon Kabushiki Kaisha | Image-forming apparatus communicating with an information-processing apparatus |
US9007635B2 (en) * | 2012-06-18 | 2015-04-14 | Canon Kabushiki Kaisha | Image-forming apparatus communicating with an information-processing apparatus |
US20140114782A1 (en) * | 2012-10-22 | 2014-04-24 | NCR Corporation | Techniques for retail printing |
US10019702B2 (en) * | 2012-10-22 | 2018-07-10 | Ncr Corporation | Techniques for retail printing |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20060028674A1 (en) | Printer with user ID sensor | |
US8312281B2 (en) | Computer system incorporating a target and symbol data sensing arrangement | |
AU2005243106B2 (en) | Authentication of an object using a signature encoded in a number of data portions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SILVERBROOK RESEARCH PTY LTD, AUSTRALIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LAPSTUN, PAUL;SILVERBROOK, KIA;REEL/FRAME:016856/0655 Effective date: 20050713 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |