US20050088407A1 - Method and system for managing an interactive video display system - Google Patents
- Publication number: US20050088407A1 (application Ser. No. 10/973,335)
- Authority: United States (US)
- Prior art keywords: data, video, recited, spots, display system
- Prior art date: Oct. 24, 2003
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/16—Analogue secrecy systems; Analogue subscription systems
- H04N7/173—Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
- H04N21/25866—Management of end-user data
- H04N21/25891—Management of end-user data being end-user preferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41415—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/812—Monomedia components thereof involving advertisement data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- Embodiments of the present invention provide a way to account for the number of people in the venue where interactive video display system 210 is installed.
- One way to do this would be to simply obtain attendance data from the venue; many venues have such data by day and even by hour.
- the attendance data can be obtained by performing surveys or spot checks on the number of people in attendance, or by use of a wide-angle camera and analysis of the camera image to determine the number of people in attendance. The popularity of each spot relative to the overall number of people who saw the interactive display system can then be determined, and a given spot's popularity can be compared fairly across venues with very different levels of attendance.
- all the spots could be run at the same general time of day for the same days and in the same venues so that any differences in overall traffic level affect all spots equally.
- the popularity of a given spot may vary greatly depending on a variety of other conditions or factors.
- These external factors 240 include, but are not limited to, time of day, day of the week, season, weather, physical location of installation (geography), type of interactive installation (wall projection, floor projection etc.), gender, age, income demographic of people visiting the venue of the installation, and how frequently the spot is shown, etc.
- Spots that are popular under one set of conditions may be less popular under another set of conditions.
- data analysis module 234 is operable to incorporate external data 240 into its analysis of the gathered data. Since this data is either recorded in the spot logs or can be matched (based on time and location) to the spot logs, statistics on the popularity of a spot can be determined given some or all of these conditions.
- if a goal is to schedule a set of spots for a given interactive video display that optimizes the overall popularity of the display, then past popularity data from one or more interactive video displays can be analyzed to determine which spots should be chosen.
- the people responsible for determining the schedule of spots can make use of this database by querying for the popularity data for these spots from the database. They have the option of limiting this data to similar external factors (e.g., looking up the popularity of spots running on weekdays from 10:00 PM to 12:00 AM at mall installations in New England) so as to allow the most accurate judgments to be made. The database could then produce a list of the most popular spots given those conditions. Based on what the scheduler wants, different methods of computing popularity, such as the ones described earlier, may be employed.
- the spot popularity data can also be used to allow for automatic scheduling at video spot scheduler 236 , both at a micro and at a macro level.
- an interactive video display system could run spots that are found through the log processing to have high “first user attraction level” (as defined earlier) when there is no one at the display and run spots with high “group appeal” (as defined earlier) when there is currently one person at the display.
- the system could look at its own spot popularity data over the last few minutes or hours, and change the schedule to show popular spots more often or stop showing unpopular spots.
- the system could even directly solicit feedback from users. For example, the system could display an interactive button that asks users to touch the button if they like the spot and want to see more like it.
- machine learning processes can be employed, including but not limited to, neural networks, hidden Markov models, a mixture of Gaussian models, and principal component analysis, to build a model of the relationships between the popularity of each showing of each spot and the set of conditions under which that showing occurred. These machine learning processes could then automatically predict what set of spots would perform best at any given time and place, and automatically reschedule them to optimize their popularity.
- various embodiments of process 300 for managing an interactive video display system are described herein.
- in one embodiment, process 300 is carried out by processors and electrical components of an interactive video display system (e.g., interactive video display system 210 of FIG. 2 ) under the control of computer readable and computer executable instructions.
- a plurality of video spots are displayed on the interactive video display system.
- the plurality of video spots are displayed in a pseudo random order.
- the length of time that a particular video spot is displayed is adjusted.
- at step 320, data based on interaction with the interactive video display system corresponding to video spots of the plurality of video spots is gathered.
- the interaction is determined according to person tracking.
- the interaction is determined according to a foreground/background classification image.
- the data is stored, wherein the data is for use in managing presentation of the video spots.
- the data is stored at a local memory of the interactive video display system.
- the data is transmitted to an external computer system and is stored in a memory of the external computer system.
- the data is analyzed for use in managing presentation of the plurality of video spots.
- analyzing the data includes determining popularity for at least one video spot of the plurality of video spots based on the data.
- analyzing the data includes determining a first user attraction level for at least one video spot of the plurality of video spots based on the data.
- analyzing the data includes determining a group appeal for at least one video spot of the plurality of video spots based on the data.
- external data is gathered for use in analyzing the data.
- the display schedule of the plurality of video spots is adjusted based on analysis of the gathered data.
- the display schedule is automatically adjusted based on the analysis.
- the display schedule is manually adjusted based on the analysis.
- the display schedule is fed into step 310 of process 300 for displaying the video spots.
- the present invention is implemented using software in the form of control logic, in either an integrated or a modular manner.
- alternatively, hardware or a combination of software and hardware can also be used to implement the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Business, Economics & Management (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- Marketing (AREA)
- Development Economics (AREA)
- Social Psychology (AREA)
- Finance (AREA)
- General Health & Medical Sciences (AREA)
- Strategic Management (AREA)
- Accounting & Taxation (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- Economics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Game Theory and Decision Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Controls And Circuits For Display Device (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
- This application claims priority from co-pending U.S. Provisional Patent Application No. 60/514,232, filed on Oct. 24, 2003, entitled “METHOD AND SYSTEM FOR MANAGING CAPTURED IMAGE INFORMATION IN AN INTERACTIVE VIDEO DISPLAY SYSTEM,” by Bell et al., and assigned to the assignee of the present application, which is herein incorporated by reference.
- The present invention relates to the field of visual electronic displays. Specifically, embodiments of the present invention relate to a self-contained interactive video display system.
- Advertising is used to convey various types of information to an audience. To fully maximize the performance of an advertisement, it is desirable for the advertiser to be able to gather information regarding the effectiveness of the advertisement(s). This effectiveness measure may comprise both how many people saw the advertisement and whether they paid attention to it. In general, for traditional offline advertising (e.g., billboards, posters, television commercials), data gathering is typically based on information related to where the advertisement is placed. For example, for a billboard advertisement, effectiveness may be measured by the amount of automobile traffic that passes the billboard. Similarly, for a television commercial, effectiveness, or popularity, may be based on popularity ratings for the television show during which the television commercial was aired. In this way, information regarding the popularity of an advertisement may be inferred from the advertisement placement information. However, it is typically very difficult to directly measure the popularity of traditional advertisements. Because these advertisements are not interactive, it is difficult to know if people are actually paying attention to them.
- Recent technological advancements have led to the creation of a new type of advertising medium, the interactive video display system. Interactive video display systems allow real-time unencumbered human interactions with video displays. Natural physical motions by human users are captured by a computer vision system and used to drive visual effects. The computer vision system usually uses images captured by a video camera as input and has software processes that gather real-time information about people and other objects in the interactive area viewed by the camera.
- For interactive video display systems, it is possible to gather information regarding displayed images based on the same methods used for traditional offline advertising mediums. However, in addition to providing real-time interactions, it would be desirable to have an interactive video display system that is capable of capturing and managing information derived from such real-time interactions, information that is not available to traditional offline advertising mediums.
- Various embodiments of the present invention, a method and system for managing an interactive video display system, are described herein. In one embodiment, a plurality of video spots are displayed on the interactive video display system.
- Data based on interactions with the interactive video display system corresponding to video spots of the plurality of video spots is gathered. In one embodiment, the interaction is determined according to person tracking. In another embodiment, the interaction is determined according to a foreground/background classification image. In one embodiment, the data includes information about what virtual objects displayed by the interactive video display system were interacted with. In one embodiment, the data includes information about a location of the interaction relative to a display of the interactive video display system.
- The data may then be stored and may be used in managing presentation of the video spots. In one embodiment, the data is stored at a local memory of the interactive video display system. In another embodiment, the data is transmitted to an external computer system and is stored in a memory of the external computer system.
- In one embodiment, the data is analyzed for use in managing presentation of the plurality of video spots. In one embodiment, analyzing the data includes determining popularity for at least one video spot of the plurality of video spots based on the data. In another embodiment, analyzing the data includes determining a “first user attraction level” for at least one video spot of the plurality of video spots based on the data. In another embodiment, analyzing the data includes determining a group appeal for at least one video spot of the plurality of video spots based on the data. In one embodiment, external data is gathered for use in analyzing the data.
- In one embodiment, the display schedule of the plurality of video spots is adjusted based on analysis of the gathered data. In one embodiment, the display schedule is automatically adjusted based on the analysis. In another embodiment, the display schedule is manually adjusted based on the analysis.
- The accompanying drawings, which are incorporated in and form a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention:
-
FIG. 1A illustrates a projection interactive video display system in accordance with an embodiment of the present invention. -
FIG. 1B illustrates a self-contained interactive video display system in accordance with an embodiment of the present invention. -
FIG. 2 illustrates a system for managing an interactive video display system in accordance with an embodiment of the present invention. -
FIG. 3 illustrates a process for managing an interactive video display system in accordance with an embodiment of the present invention. - Reference will now be made in detail to various embodiments of the invention, a method and system for managing an interactive video display system, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with these embodiments, it is understood that they are not intended to limit the invention to these embodiments. On the contrary, the invention is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the invention as defined by the appended claims. Furthermore, in the following detailed description of the invention, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be recognized by one of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the invention.
- Some portions of the detailed descriptions, which follow, are presented in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer executed step, logic block, process, etc., is here, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
- It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as “presenting” or “gathering” or “storing” or “transmitting” or “analyzing” or “determining” or “adjusting” or “managing” or the like, refer to the action and processes of an electronic system (e.g., 200 of
FIG. 2 ), or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the electronic device's registers and memories into other data similarly represented as physical quantities within the electronic device memories or registers or other such information storage, transmission or display devices. - Various embodiments of the present invention in the form of one or more exemplary embodiments will now be described. In one embodiment, the interactive video display system includes a vision system that captures and processes information relating to a scene. The processed information is used to generate certain visual effects that are then displayed to viewers via an interactive display device. People are able to interact with such visual effects on a real-time basis.
-
FIG. 1A illustrates a projection interactive video display system 100 in accordance with an embodiment of the present invention. Projection interactive video display system 100 uses a camera system 105, an illuminator that illuminates surface 102 being viewed by camera 105, a projector 110 that projects an image 120 onto the interactive space 115 of surface 102, and a local computer (not shown) that takes as input the image of camera 105 and outputs a video image to projector 110.
- The local computer processes the camera 105 input to discern information about the position and movement of people (or moving objects) in the volume in front of surface 102. In one embodiment, the local computer processes the camera 105 input to discern on a pixel-by-pixel basis what portions of the volume in front of surface 102 (e.g., interactive space 115) are occupied by people (or moving objects) and what portions of surface 102 are background. The local computer may accomplish this by developing several evolving models of what the background is believed to look like, and then comparing its concepts of the background to what camera 105 is currently imaging. Components of the local computer that process camera 105 input are collectively known as the vision system. Various embodiments of projection interactive video display system 100 and the vision system are described in co-pending U.S. patent application Ser. No. 10/160,217, filed on May 28, 2002, entitled “INTERACTIVE VIDEO DISPLAY SYSTEM,” by Bell, and assigned to the assignee of the present application, and in co-pending U.S. Provisional Patent Application No. 60/514,024, filed on Oct. 24, 2003, entitled “METHOD AND SYSTEM FOR PROCESSING CAPTURED IMAGE INFORMATION IN AN INTERACTIVE VIDEO SYSTEM,” by Bell, and assigned to the assignee of the present application, both of which are herein incorporated by reference.
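- The patent describes this background-modeling step only in prose, so the following Python sketch is purely illustrative: a single evolving background model kept as a running average and compared pixel by pixel against the current camera frame. All names, the learning rate, and the threshold are assumptions (a production vision system would, as noted above, maintain several such models).

```python
import numpy as np

class EvolvingBackgroundModel:
    """Minimal stand-in for a vision-system background model: a slowly
    adapting running average of what the background is believed to look
    like, assuming grayscale camera frames as 2-D NumPy arrays."""

    def __init__(self, first_frame, learn_rate=0.01, threshold=25.0):
        self.background = first_frame.astype(np.float64)
        self.learn_rate = learn_rate
        self.threshold = threshold

    def classify(self, frame):
        """Return a boolean mask: True where the frame differs enough
        from the background model to count as foreground (people or
        moving objects)."""
        frame = frame.astype(np.float64)
        foreground = np.abs(frame - self.background) > self.threshold
        # Let the model evolve only where the scene currently looks like
        # background, so people are not gradually absorbed into it.
        bg = ~foreground
        self.background[bg] += self.learn_rate * (frame[bg] - self.background[bg])
        return foreground
```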
- In one embodiment, projection interactive video display system 100 is coupled to a remote computer system 130. In one embodiment, remote computer system 130 is a remote server for collecting data gathered by projection interactive video display system 100. In one embodiment, remote computer system 130 is configured to collect data gathered by multiple projection interactive video display systems 100 that are located in physically distinct locations. It should be appreciated that remote computer system 130 may be located in a different physical location than projection interactive video display system 100. -
FIG. 1B illustrates a self-contained interactive video display system 150 in accordance with an embodiment of the present invention. Self-contained interactive video display system 150 displays an image onto display screen 155, and uses a camera (not shown) to detect people and objects in interactive space 160. A local computer (not shown), also referred to as the image system, takes as input the image of the camera and outputs a video image to display screen 155. - Various embodiments of self-contained interactive video display system 150 are described in co-pending U.S. patent application Ser. No. 10/946,263, filed on Sep. 20, 2004, entitled “SELF-CONTAINED INTERACTIVE VIDEO DISPLAY SYSTEM,” by Bell et al., and assigned to the assignee of the present application, co-pending U.S. patent application Ser. No. 10/946,084, filed on Sep. 20, 2004, entitled “SELF-CONTAINED INTERACTIVE VIDEO DISPLAY SYSTEM,” by Bell, and assigned to the assignee of the present application, and co-pending U.S. patent application Ser. No.______, filed on Sep. 20, 2004, entitled “INTERACTIVE VIDEO WINDOW DISPLAY SYSTEM,” by Bell, and assigned to the assignee of the present application, all of which are herein incorporated by reference. Furthermore, various embodiments of the vision system are described in co-pending U.S. patent application Ser. No. 10/160,217, filed on May 28, 2002, entitled “INTERACTIVE VIDEO DISPLAY SYSTEM,” by Bell, and assigned to the assignee of the present application, and in co-pending U.S. Provisional Patent Application No. 60/514,024, filed on Oct. 24, 2003, entitled “METHOD AND SYSTEM FOR PROCESSING CAPTURED IMAGE INFORMATION IN AN INTERACTIVE VIDEO SYSTEM,” by Bell, and assigned to the assignee of the present application, both of which are herein incorporated by reference.
- In one embodiment, self-contained interactive video display system 150 is coupled to a remote computer system 170. In one embodiment, remote computer system 170 is a remote server for collecting data gathered by self-contained interactive video display system 150. In one embodiment, remote computer system 170 is configured to collect data gathered by multiple self-contained interactive video display systems 150 and/or projection interactive video display systems 100 (
FIG. 1A ) that are located in physically distinct locations. It should be appreciated that remote computer system 170 may be located in a different physical location than self-contained interactive video display system 150. Many other forms of interactive video display systems exist. These systems may use different kinds of display media, and different sensing apparatus. However, the techniques described in this patent are fully applicable to these other systems as well. -
FIG. 2 illustrates a system 200 for managing an interactive video display system in accordance with an embodiment of the present invention. System 200 includes interactive video display system 210 (e.g., projection interactive video display system 100 of FIG. 1A or self-contained interactive video display system 150 of FIG. 1B ) and remote computer system 230 (e.g., remote computer system 130 of FIG. 1A or remote computer system 170 of FIG. 1B ). Interactive video display system 210 includes display 212, vision system 214, video spot storage 216, and data gathering module 218. In one embodiment, display 212, vision system 214, video spot storage 216, and data gathering module 218 are components of a computer system of interactive video display system 210. Remote computer system 230 includes data storage 232, data analysis module 234, and video spot scheduler 236. It should be appreciated that in various embodiments of the present invention, any or all of the components and functionality of remote computer system 230 may be local and included in interactive video display system 210.
- In an exemplary aspect, interactive
video display system 210 presents a rich opportunity to gather data about interactions with and usage of the interactive displays. This data is gathered atdata gathering module 218, and is useful for many kinds of interactive contents and applications. In one exemplary embodiment, the data is useful for designing advertising contents 20 as it allows advertisers to analyze where and when their advertisements are most popular, thus, allowing them to adjust their advertisements and/or schedules accordingly. - The data may be analyzed locally by the same computer that handles
vision system 214. Alternatively, interactivevideo display system 210 may be networked, allowing the data for one or more systems to be analyzed fromremote computer system 230. The data from multiple systems may be pooled to increase the total amount of data or to identify differences between locations. - By using a person-tracking process, real-time information relating to the present and past positions of people viewed by a camera of
vision system 214 or a separate camera, can be determined. Using this information, additional information can be derived including, for example, the number of people on or near thedisplay 212, the time at which each person enters and leaves the display 212 (e.g.,interactive area 115 ofFIG. 1A ), and whether they are moving and interacting with the displayed image. In one embodiment of interactivevideo display system 210, such as, a floor or wall-projected interactive display, the boundaries of the display screen are known. Thus, based on the respective positions of the people, additional information about the people withindisplay 212 can be derived. For example, specific information can be determined based on the number of people who are interacting with Interactivevideo display system 210 and the number of people who are mere spectators. In addition, with face-recognition systems, it is also possible to gather demographic data such as gender. It should be appreciated that the person tracking system may take the form of a face-tracking system, in which heads are counted and tracked instead of entire bodies. A variety of person-tracking processes exist. These processes may use one or more of a variety of systems and techniques including, but not limited to, stereo cameras, time-of-flight cameras, depth segmentation, color segmentation, feature tracking, face detection, template matching, adaptive template matching, and blob tracking. - Similar data can be gathered from a foreground/background classification image, which may be produced by
vision system 214. The portion of the image that is classified as foreground is a rough approximation of the number of people viewed by the camera. If the width and height of the screen viewed by the camera and the approximate cross-sectional size of a typical person from the camera's point of view are known, then the approximate portion of the screen that is turned to foreground by the presence of a single person can be computed. Consequently, a conversion can be made from foreground portion to number of people. Alternatively, by doing some experiments with a person in front of the camera, the approximate portion of foreground that corresponds to a single person for a given installation can be determined. This “foreground portion” data can be used to roughly estimate how popular the display is at a given time, and whether people are entering or leaving the display. Because the physical boundaries of interactivevideo display system 210 may only take up a portion of the camera's view, the foreground/background classification image can be segmented into different regions, such as, “on the display”, “within one foot of the display's boundary” “off the display, but within 4 feet of it” and the foreground portions for each region can be recorded separately. Small thumbnail or full-size copies of the vision foreground-background classification image can also be directly logged for later analysis. - Data from the person-tracking information and/or the foreground portions may be referred to as vision data and can be written (e.g., stored) on a periodic basis (e.g., once per second) to a log file. In one embodiment, the log file is stored in
data storage 232. This log file can then be analyzed, either locally or centrally, at a later time by data analysis module 234. In addition, the log entries may be time-stamped and may contain information about what content is running at that time. - In addition to the vision data, data about the specific interactions that took place on the display can be gathered. For example, suppose that interactive
video display system 210 is showing a survey that asks users questions and takes their responses by having them touch virtual buttons. The votes on the virtual survey could be recorded in the logs along with the vision data. In the case of video game content, the scores and actions of the players could be added to the log file. In the case of interactive informational content, such as, a virtual shopping catalog, each instance of a product or item being viewed could be recorded in the log file. In general, any information about an instance of human interaction with a virtual object on the screen can be logged for later analysis. This can be used as a valuable feedback tool for advertisers, game designers, and content creators. - Use of Data for Monitoring
- In many situations, the display or screen of interactive
video display system 210 requires maintenance and cleaning. In public installations, this maintenance and cleaning generally takes place after hours, when the display is turned off. In order to verify that maintenance and cleaning is taking place on a regular basis, the data logs can be checked for activity after hours. - Analysis of Content Popularity
- In other situations, interactive
video display system 210 sequentially displays different pieces of content, which will be referred to as “spots” or “video spots”, each for a length of time. The spots are stored invideo spot storage 216. This series of spots can be conceptually compared to a series of television commercials. There is flexibility in the number of spots, the length of time that they play, and the order in which they play. In one embodiment,video spot scheduler 236 controls the scheduling of the spots, whereinvideo spot scheduler 236 may be located locally or remotely, as described above. - Data analysis module 234 is operable to perform data analysis on the gathered data stored in
data storage 232. In one embodiment, the popularity of each spot can be determined based on the logged data as described above. For example, a spot's popularity can be measured by taking the average of the logged values for the foreground portion or the number of people on the screen during the periods when the spot was showing. However, there are various issues that should be considered with the foregoing approach. First, the number of people on the screen when the spot begins playing is determined by the popularity of the previous spot. For example, a spot that follows an unpopular spot will tend to have lower activity levels than it otherwise would have. Second, the number of people interacting with a spot depends on the number of people near the display; if very few people are in the venue where the interactive system is installed, then relatively few people will interact with the system. Ways to handle the foregoing problems will be further described below. - Controlling Effects of Prior Spot Popularity
- Controlling Effects of Prior Spot Popularity
- There are several ways to reduce or eliminate the effects of the previous spot's popularity on the spot that is being evaluated. In one embodiment, the order in which spots are played is randomized, so that each spot follows each other spot with equal probability. There may be some deviation from true randomness to prevent the same spot from playing twice in too short a time period. Thus, any effect of the previous spot on the current spot's popularity would be averaged out across all spots, giving a somewhat better sense of the spot's true popularity.
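- A minimal sketch of such a near-random scheduler follows; the min_gap parameter is an assumption standing in for the unspecified "too short a time period".

```python
import random

def next_spot(spots, history, min_gap=3):
    """Pick the next spot uniformly at random, excluding any spot shown
    within the last `min_gap` plays (the small deviation from true
    randomness described above)."""
    recent = set(history[-min_gap:])
    candidates = [s for s in spots if s not in recent] or list(spots)
    choice = random.choice(candidates)
    history.append(choice)
    return choice

# Usage: every spot still follows every other spot with near-equal
# probability, so previous-spot effects average out over many plays.
played = []
for _ in range(10):
    next_spot(["A", "B", "C", "D", "E"], played)
print(played)
```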
- In another embodiment, the length of time that the spot runs can be increased so that the effects of the previous spot are limited.
- Furthermore, there are other ways of measuring various aspects of popularity that are less sensitive to the popularity of the previous spot. The following are some illustrative examples. In the case of examples that describe what to do for each showing of the spot, presume that the statistic gathered is averaged over the number of showings. Nearly all these techniques work for both person-tracking data and foreground portion data. Techniques described earlier explain how to turn foreground portion data into a rough estimate of the number of people. These techniques can be applied equally well to data from the camera's whole image or data from a specified region, such as the area within the interactive display or the area near the interactive display.
- In one embodiment, for each showing of the spot, only the data for the last few seconds of the showing is analyzed when computing the average number of people or foreground portion. This average is referred to as “average popularity”. In another embodiment, for each showing of the spot, the difference between the showing's average popularity and the average popularity during the previous spot is computed. This shows whether the current spot caused an increase in the number of people on the screen.
- The above two approaches can be combined: in another embodiment, the difference between the showing's average popularity and the average popularity during the previous spot is computed, but only the last few seconds of the current and previous showings are used when computing the averages.
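- A minimal sketch of these measures, assuming each showing's interaction data is a hypothetical per-second time series of people counts (or foreground portions):

```python
def tail_average(samples, tail_seconds=5, sample_rate=1.0):
    """Average only the last `tail_seconds` of a showing's samples:
    the "average popularity" computed over the end of the showing."""
    n = max(1, int(tail_seconds * sample_rate))
    tail = samples[-n:]
    return sum(tail) / len(tail)

def popularity_lift(current, previous, tail_seconds=5):
    """Combined measure: tail average of the current showing minus the
    tail average of the previous spot's showing. Positive values mean
    the current spot drew people in."""
    return (tail_average(current, tail_seconds)
            - tail_average(previous, tail_seconds))

print(popularity_lift([0, 1, 2, 3, 4, 4], [5, 4, 3, 2, 1, 0]))  # ~0.8
```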
- In another embodiment, for each showing of the spot, the difference between the number of people (or foreground portion) at the beginning of the showing and the number of people (or foreground portion) at the end of the showing is computed. The beginning and end could either be instantaneous or refer to the average of the first few seconds and last few seconds, respectively.
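- A sketch of this beginning-versus-end difference, under the same assumed per-showing time series; with a window of one sample it reduces to the instantaneous difference:

```python
def start_end_delta(samples, window=3):
    """Average of the last `window` samples minus the average of the
    first `window` samples of a showing. `window` is an assumed
    parameter standing in for the "first/last few seconds"."""
    head = samples[:window]
    tail = samples[-window:]
    return sum(tail) / len(tail) - sum(head) / len(head)

print(start_end_delta([0, 0, 1, 2, 3, 3]))  # ~2.33: the spot drew people in
```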
- It should be appreciated that the aforementioned approaches to determining popularity attempt to measure how many people were interacting with the display as a result of a particular spot. However, there are other types of statistics that can be gathered with regard to human interactions with the interactive display. The next few paragraphs describe other such measures.
- In a further embodiment, for each showing of the spot, the number of people who entered the display and the number of people who left the display during the showing are counted. Popular spots, of course, have more people enter and fewer people leave. In some cases, though, it would be desirable to have many people enter and leave, thus allowing the content to be seen by as many people as possible. The information about the number of people entering and leaving can be derived directly from person-tracking data or estimated from the foreground portion data by looking for quick rises and drops (of a particular minimum size) in the foreground portion data.
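- A minimal sketch of the foreground-data approximation, assuming a hypothetical minimum step size for what counts as a quick rise or drop (person-tracking data would yield these counts directly):

```python
def count_entries_exits(foreground, min_step=0.05):
    """Estimate entries and exits from a foreground-portion time series
    by counting sample-to-sample rises and drops of at least `min_step`
    (an assumed threshold)."""
    entries = exits = 0
    for prev, curr in zip(foreground, foreground[1:]):
        if curr - prev >= min_step:
            entries += 1
        elif prev - curr >= min_step:
            exits += 1
    return entries, exits

print(count_entries_exits([0.10, 0.18, 0.19, 0.12, 0.25]))  # (2, 1)
```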
- In another embodiment, for each showing of the spot, the average length of time that people playing with the spot have been on the display is recorded. Depending on the kind of message delivered by the interactive content, the person controlling the system may want either a long or a short length of stay.
- In another embodiment, the amount of movement that takes place during the spot's showings is recorded. The amount of movement can be derived as the average speed of people on the display, which can be found by examining the position information in the person-tracking data. An alternative measure of the amount of movement can be calculated from the foreground/background images by computing the portion of pixels that switched from foreground to background, or vice versa, between two such images taken a very short time apart. In order to compute the average amount of movement, this image difference would ideally be computed several times during each showing of the spot. This allows the person controlling the system to distinguish between spots that promote lively behavior and spots that create a more sedate atmosphere but are nonetheless popular.
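- A sketch of that pixel-difference measure, assuming the classification images are available as flattened binary masks of equal size:

```python
def movement_fraction(mask_a, mask_b):
    """Fraction of pixels whose foreground/background classification
    changed between two binary masks captured a short time apart.
    Masks are equal-length lists (flattened images) of 0/1 values,
    a simplified stand-in for the system's classification images."""
    changed = sum(1 for a, b in zip(mask_a, mask_b) if a != b)
    return changed / len(mask_a)

# Two 2x4 masks flattened row-major; 2 of 8 pixels flipped -> 0.25.
print(movement_fraction([0, 1, 1, 0, 0, 0, 1, 1],
                        [0, 1, 0, 0, 0, 1, 1, 1]))
```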
- In another example, for each spot, the number of showings for which no one was interacting with the display at the beginning (or no one interacted with the previous spot) but at least one person was interacting at the end is determined. This measures the spot's “first user attraction level,” e.g., how good it is at attracting a person to interact with the display when there is currently no one interacting with the display. In a similar way, a spot's “group appeal” can be characterized by the number of times that one person was playing with the spot at the beginning of the showing and multiple people were playing with the spot at the end of the showing.
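- A minimal sketch of both counts, assuming each showing is summarized by a hypothetical (people at start, people at end) pair:

```python
def attraction_stats(showings):
    """Count "first user attraction" events (no one at the start, at
    least one person at the end) and "group appeal" events (one person
    at the start, multiple people at the end) over a spot's showings."""
    first_user = sum(1 for start, end in showings if start == 0 and end >= 1)
    group_appeal = sum(1 for start, end in showings if start == 1 and end > 1)
    return first_user, group_appeal

# Showings of one spot: (people at start, people at end).
print(attraction_stats([(0, 1), (0, 0), (1, 3), (2, 2)]))  # (1, 1)
```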
- Controlling Overall Traffic Level
- Embodiments of the present invention provide a way to account for the number of people in the venue where interactive
video display system 210 is installed. One way to do this would be to simply obtain attendance data from the venue; many venues have such data by day and even by hour. Alternatively, the attendance data can be obtained by performing surveys or spot checks on the number of people in attendance, or by use of a wide-angle camera and analysis of the camera image to determine the number of people in attendance. The popularity of each spot relative to the overall number of people who saw the interactive display system can then be determined, and a given spot's popularity can be compared fairly across venues with very different levels of attendance. - Alternatively, if it is desired to compare the relative popularity of different spots, all the spots could be run at the same general time of day, on the same days, and in the same venues, so that any differences in overall traffic level affect all spots equally.
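- A minimal sketch of the attendance normalization described above; the division by venue attendance is the technique from the text, while the numbers are purely illustrative:

```python
def normalized_popularity(avg_people_interacting, venue_attendance):
    """Popularity relative to venue traffic: the average number of
    people interacting during a spot's showings divided by the venue
    attendance for the matching period, so spots can be compared
    fairly across venues with very different traffic levels."""
    return avg_people_interacting / venue_attendance

# Same raw interaction level, very different venues:
print(normalized_popularity(4.0, 200))   # 0.02 at a quiet venue
print(normalized_popularity(4.0, 2000))  # 0.002 at a busy venue
```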
- External Factors Affecting Popularity
- The popularity of a given spot may vary greatly depending on a variety of other conditions or factors. These
external factors 240 include, but are not limited to, time of day, day of the week, season, weather, physical location of the installation (geography), type of interactive installation (wall projection, floor projection, etc.), the gender, age, and income demographics of people visiting the venue of the installation, and how frequently the spot is shown. - Spots that are popular under one set of conditions may be less popular under another set of conditions. In one embodiment, data analysis module 234 is operable to receive
external data 240 and use it in its analysis of the gathered data. Since this external data is either recorded in the spot logs or can be matched (based on time and location) to the spot logs, statistics on the popularity of a spot can be determined given some or all of these conditions. - Using Popularity Information
- If a goal is to schedule a set of spots for a given interactive video display that optimizes the overall popularity of the display, then past popularity data from one or more interactive video displays can be analyzed to determine what spots should be chosen.
- It is useful to put the popularity log information, along with the associated external factors, into a central searchable database such as MySQL, especially if there are multiple installations of interactive video displays.
- The people responsible for determining the schedule of spots can make use of this database by querying it for the popularity data for these spots. They have the option of limiting this data to similar external factors (e.g., looking up the popularity of spots running on weekdays from 10:00 PM to 12:00 AM at mall installations in New England) so as to allow the most accurate judgments to be made. The database could then produce a list of the most popular spots given those conditions. Based on what the scheduler wants, different methods of computing popularity, such as the ones described earlier, may be employed.
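- A minimal sketch of such a query, using SQLite in place of MySQL and an assumed per-showing schema in which the external factors are stored alongside a precomputed popularity value:

```python
import sqlite3

# Hypothetical schema: one row per showing, with external factors
# recorded alongside whichever popularity measure the scheduler chose.
conn = sqlite3.connect(":memory:")  # stand-in for the central database
conn.execute("""CREATE TABLE showings (
    spot_id TEXT, venue_type TEXT, region TEXT,
    day_type TEXT, hour INTEGER, popularity REAL)""")
conn.executemany(
    "INSERT INTO showings VALUES (?, ?, ?, ?, ?, ?)",
    [("A", "mall", "New England", "weekday", 22, 3.5),
     ("B", "mall", "New England", "weekday", 23, 1.2),
     ("A", "mall", "New England", "weekday", 23, 4.1),
     ("B", "airport", "Midwest", "weekend", 10, 5.0)])

# Most popular spots for weekday 10 PM - 12 AM mall installations in
# New England, mirroring the example query in the text.
rows = conn.execute("""
    SELECT spot_id, AVG(popularity) AS avg_pop
    FROM showings
    WHERE venue_type = 'mall' AND region = 'New England'
      AND day_type = 'weekday' AND hour BETWEEN 22 AND 23
    GROUP BY spot_id
    ORDER BY avg_pop DESC""").fetchall()
print(rows)  # [('A', 3.8), ('B', 1.2)]
```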
- Automatic Scheduling
- The spot popularity data can also be used to allow for automatic scheduling at
video spot scheduler 236, both at a micro and at a macro level. At a micro level, an interactive video display system could run spots that are found through the log processing to have a high “first user attraction level” (as defined earlier) when there is no one at the display, and run spots with high “group appeal” (as defined earlier) when there is currently one person at the display. In addition, the system could look at its own spot popularity data over the last few minutes or hours, and change the schedule to show popular spots more often or stop showing unpopular spots. The system could even directly solicit feedback from users. For example, the system could display an interactive button that asks users to touch the button if they like the spot and want to see more like it.
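- A minimal sketch of the micro-level rule, assuming the attraction measures have been precomputed per spot; the measure names and the fallback on overall popularity are illustrative:

```python
def pick_next_spot(current_people, stats, spots):
    """Choose the next spot from the current audience size: with no one
    at the display, favor the highest "first user attraction level";
    with exactly one person, favor the highest "group appeal";
    otherwise fall back on overall popularity (an assumed default)."""
    if current_people == 0:
        key = "first_user_attraction"
    elif current_people == 1:
        key = "group_appeal"
    else:
        key = "popularity"
    return max(spots, key=lambda s: stats[s][key])

stats = {
    "A": {"first_user_attraction": 0.9, "group_appeal": 0.2, "popularity": 0.5},
    "B": {"first_user_attraction": 0.1, "group_appeal": 0.8, "popularity": 0.7},
}
print(pick_next_spot(0, stats, ["A", "B"]))  # A
print(pick_next_spot(1, stats, ["A", "B"]))  # B
```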
- At a macro level, a variety of machine learning processes can be employed, including, but not limited to, neural networks, hidden Markov models, Gaussian mixture models, and principal component analysis, to build a model of the relationships between the popularity of each showing of each spot and the set of conditions under which that showing occurred. These machine learning processes could then automatically predict what set of spots would perform best at any given time and place, and automatically reschedule them to optimize their popularity.
- In one embodiment, as shown in
FIG. 3, the present invention provides a process 300 for managing an interactive video display system. In one embodiment, process 300 is carried out by processors and electrical components (e.g., an interactive video display system) under the control of computer readable and computer executable instructions, such as interactive video display system 210 of FIG. 2. Although specific steps are disclosed in process 300, such steps are exemplary. That is, the embodiments of the present invention are well suited to performing various other steps or variations of the steps recited in FIG. 3. - At
step 310 of process 300, a plurality of video spots are displayed on the interactive video display system. In one embodiment, the plurality of video spots are displayed in a pseudo-random order. In one embodiment, the length of time that a particular video spot is displayed is adjusted. - At
step 320, data based on interaction with the interactive video display system corresponding to video spots of the plurality of video spots is gathered. In one embodiment, the interaction is determined according to person tracking. In another embodiment, the interaction is determined according to a foreground/background classification image. - At step 330, the data is stored, wherein the data is for use in managing presentation of the video spots. In one embodiment, the data is stored at a local memory of the interactive video display system. In another embodiment, the data is transmitted to an external computer system and is stored in a memory of the external computer system.
- At
step 340, in accordance with one embodiment, the data is analyzed for use in managing presentation of the plurality of video spots. In one embodiment, analyzing the data includes determining popularity for at least one video spot of the plurality of video spots based on the data. In another embodiment, analyzing the data includes determining a first user attraction level for at least one video spot of the plurality of video spots based on the data. In another embodiment, analyzing the data includes determining a group appeal for at least one video spot of the plurality of video spots based on the data. In one embodiment, external data is gathered for use in analyzing the data. - At
step 350, in one embodiment, the display schedule of the plurality of video spots is adjusted based on the analysis of the data. In one embodiment, the display schedule is automatically adjusted based on the analysis. In another embodiment, the display schedule is manually adjusted based on the analysis. In one embodiment, the adjusted display schedule is fed back into step 310 of process 300 for displaying the video spots.
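- Taken together, steps 310 through 350 form a feedback loop. The following is a high-level sketch of that loop; all five callables are assumed hooks rather than the patent's own interfaces:

```python
def manage_display(schedule, display, gather, analyze, adjust, cycles=1):
    """One pass per cycle: show each scheduled spot (step 310), gather
    interaction data for it (step 320), accumulate the data (step 330),
    then analyze the log and adjust the schedule (steps 340 and 350),
    feeding the new schedule back into the next cycle."""
    log = []
    for _ in range(cycles):
        for spot in schedule:
            display(spot)             # step 310: display the video spot
            log.append(gather(spot))  # step 320: gather interaction data
        # step 330: `log` holds the stored data; steps 340-350 follow.
        schedule = adjust(analyze(log))
    return schedule
```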
- In an exemplary implementation, the present invention is implemented using software in the form of control logic, in either an integrated or a modular manner. Alternatively, hardware or a combination of software and hardware can also be used to implement the present invention. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will know of other ways and/or methods to implement the present invention.
- It is understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application and scope of the appended claims. All publications, patents, and patent applications cited herein are hereby incorporated by reference for all purposes in their entirety.
- Various embodiments of the invention, a method and system for managing an interactive video display system, are thus described. While the present invention has been described in particular embodiments, it should be appreciated that the invention should not be construed as limited by such embodiments, but rather construed according to the below claims.
US8560972B2 (en) * | 2004-08-10 | 2013-10-15 | Microsoft Corporation | Surface UI for gesture-based interaction |
US20100027843A1 (en) * | 2004-08-10 | 2010-02-04 | Microsoft Corporation | Surface UI for gesture-based interaction |
US20060036944A1 (en) * | 2004-08-10 | 2006-02-16 | Microsoft Corporation | Surface UI for gesture-based interaction |
US8483437B2 (en) | 2005-01-07 | 2013-07-09 | Qualcomm Incorporated | Detecting and tracking objects in images |
US20060188849A1 (en) * | 2005-01-07 | 2006-08-24 | Atid Shamaie | Detecting and tracking objects in images |
US7574020B2 (en) | 2005-01-07 | 2009-08-11 | Gesturetek, Inc. | Detecting and tracking objects in images |
US8170281B2 (en) | 2005-01-07 | 2012-05-01 | Qualcomm Incorporated | Detecting and tracking objects in images |
US20080187178A1 (en) * | 2005-01-07 | 2008-08-07 | Gesturetek, Inc. | Detecting and tracking objects in images |
US20090295756A1 (en) * | 2005-01-07 | 2009-12-03 | Gesturetek, Inc. | Detecting and tracking objects in images |
US7853041B2 (en) | 2005-01-07 | 2010-12-14 | Gesturetek, Inc. | Detecting and tracking objects in images |
US20060192782A1 (en) * | 2005-01-21 | 2006-08-31 | Evan Hildreth | Motion-based tracking |
US8717288B2 (en) | 2005-01-21 | 2014-05-06 | Qualcomm Incorporated | Motion-based tracking |
US8144118B2 (en) | 2005-01-21 | 2012-03-27 | Qualcomm Incorporated | Motion-based tracking |
US20060218618A1 (en) * | 2005-03-22 | 2006-09-28 | Lorkovic Joseph E | Dual display interactive video |
US9128519B1 (en) | 2005-04-15 | 2015-09-08 | Intellectual Ventures Holding 67 Llc | Method and system for state-based control of objects |
US8081822B1 (en) | 2005-05-31 | 2011-12-20 | Intellectual Ventures Holding 67 Llc | System and method for sensing a feature of an object in an interactive video display |
US8098277B1 (en) | 2005-12-02 | 2012-01-17 | Intellectual Ventures Holding 67 Llc | Systems and methods for communication between a reactive video system and a mobile communication device |
US20100034457A1 (en) * | 2006-05-11 | 2010-02-11 | Tamir Berliner | Modeling of humanoid forms from depth maps |
US8249334B2 (en) | 2006-05-11 | 2012-08-21 | Primesense Ltd. | Modeling of humanoid forms from depth maps |
US8559676B2 (en) | 2006-12-29 | 2013-10-15 | Qualcomm Incorporated | Manipulation of virtual objects using enhanced interactive system |
US20080252596A1 (en) * | 2007-04-10 | 2008-10-16 | Matthew Bell | Display Using a Three-Dimensional Vision System |
US20090077504A1 (en) * | 2007-09-14 | 2009-03-19 | Matthew Bell | Processing of Gesture-Based User Interactions |
US9811166B2 (en) | 2007-09-14 | 2017-11-07 | Intellectual Ventures Holding 81 Llc | Processing of gesture-based user interactions using volumetric zones |
US10564731B2 (en) | 2007-09-14 | 2020-02-18 | Facebook, Inc. | Processing of gesture-based user interactions using volumetric zones |
US8230367B2 (en) | 2007-09-14 | 2012-07-24 | Intellectual Ventures Holding 67 Llc | Gesture-based user interactions with status indicators for acceptable inputs in volumetric zones |
US10990189B2 (en) | 2007-09-14 | 2021-04-27 | Facebook, Inc. | Processing of gesture-based user interaction using volumetric zones |
US9058058B2 (en) | 2007-09-14 | 2015-06-16 | Intellectual Ventures Holding 67 Llc | Processing of gesture-based user interactions using activation levels |
US8159682B2 (en) | 2007-11-12 | 2012-04-17 | Intellectual Ventures Holding 67 Llc | Lens system |
US8810803B2 (en) | 2007-11-12 | 2014-08-19 | Intellectual Ventures Holding 67 Llc | Lens system |
US9229107B2 (en) | 2007-11-12 | 2016-01-05 | Intellectual Ventures Holding 81 Llc | Lens system |
US20090251685A1 (en) * | 2007-11-12 | 2009-10-08 | Matthew Bell | Lens System |
US8166421B2 (en) | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
US20090183125A1 (en) * | 2008-01-14 | 2009-07-16 | Prime Sense Ltd. | Three-dimensional user interface |
US9035876B2 (en) | 2008-01-14 | 2015-05-19 | Apple Inc. | Three-dimensional user interface session control |
US10831278B2 (en) | 2008-03-07 | 2020-11-10 | Facebook, Inc. | Display with built in 3D sensing capability and gesture control of TV |
US8259163B2 (en) | 2008-03-07 | 2012-09-04 | Intellectual Ventures Holding 67 Llc | Display with built in 3D sensing |
US9247236B2 (en) | 2008-03-07 | 2016-01-26 | Intellectual Ventures Holdings 81 Llc | Display with built in 3D sensing capability and gesture control of TV |
US8595218B2 (en) | 2008-06-12 | 2013-11-26 | Intellectual Ventures Holding 67 Llc | Interactive display management systems and methods |
US20100121866A1 (en) * | 2008-06-12 | 2010-05-13 | Matthew Bell | Interactive display management systems and methods |
US20100082423A1 (en) * | 2008-09-30 | 2010-04-01 | Yahoo! Inc. | System for optimizing ad performance at campaign running time |
US8645205B2 (en) * | 2008-09-30 | 2014-02-04 | Yahoo! Inc. | System for optimizing ad performance at campaign running time |
US8624962B2 (en) | 2009-02-02 | 2014-01-07 | Ydreams-Informatica, S.A. | Systems and methods for simulating three-dimensional virtual interactions from two-dimensional camera images |
US9569001B2 (en) * | 2009-02-03 | 2017-02-14 | Massachusetts Institute Of Technology | Wearable gestural interface |
US20100199232A1 (en) * | 2009-02-03 | 2010-08-05 | Massachusetts Institute Of Technology | Wearable Gestural Interface |
US20100235786A1 (en) * | 2009-03-13 | 2010-09-16 | Primesense Ltd. | Enhanced 3d interfacing for remote devices |
US8565479B2 (en) | 2009-08-13 | 2013-10-22 | Primesense Ltd. | Extraction of skeletons from 3D maps |
US20110052006A1 (en) * | 2009-08-13 | 2011-03-03 | Primesense Ltd. | Extraction of skeletons from 3d maps |
US20110164032A1 (en) * | 2010-01-07 | 2011-07-07 | Prime Sense Ltd. | Three-Dimensional User Interface |
US8787663B2 (en) | 2010-03-01 | 2014-07-22 | Primesense Ltd. | Tracking body parts by combined color image and depth processing |
US20110211754A1 (en) * | 2010-03-01 | 2011-09-01 | Primesense Ltd. | Tracking body parts by combined color image and depth processing |
US8594425B2 (en) | 2010-05-31 | 2013-11-26 | Primesense Ltd. | Analysis of three-dimensional scenes |
US8824737B2 (en) | 2010-05-31 | 2014-09-02 | Primesense Ltd. | Identifying components of a humanoid form in three-dimensional scenes |
US8781217B2 (en) | 2010-05-31 | 2014-07-15 | Primesense Ltd. | Analysis of three-dimensional scenes with a surface model |
US9158375B2 (en) | 2010-07-20 | 2015-10-13 | Apple Inc. | Interactive reality augmentation for natural interaction |
US9201501B2 (en) | 2010-07-20 | 2015-12-01 | Apple Inc. | Adaptive projector |
US8582867B2 (en) | 2010-09-16 | 2013-11-12 | Primesense Ltd. | Learning-based pose estimation from depth maps |
US8959013B2 (en) | 2010-09-27 | 2015-02-17 | Apple Inc. | Virtual keyboard for a non-tactile three dimensional user interface |
US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
US9454225B2 (en) | 2011-02-09 | 2016-09-27 | Apple Inc. | Gaze-based display control |
US9342146B2 (en) | 2011-02-09 | 2016-05-17 | Apple Inc. | Pointing-based display interaction |
US9285874B2 (en) | 2011-02-09 | 2016-03-15 | Apple Inc. | Gaze detection in a 3D mapping environment |
US9459758B2 (en) | 2011-07-05 | 2016-10-04 | Apple Inc. | Gesture-based interface with enhanced features |
US9377865B2 (en) | 2011-07-05 | 2016-06-28 | Apple Inc. | Zoom-based gesture user interface |
US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd. | Zoom-based gesture user interface |
US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
US9218063B2 (en) | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
US9122311B2 (en) | 2011-08-24 | 2015-09-01 | Apple Inc. | Visual feedback for tactile and non-tactile user interfaces |
US9002099B2 (en) | 2011-09-11 | 2015-04-07 | Apple Inc. | Learning-based estimation of hand and finger pose |
US9229534B2 (en) | 2012-02-28 | 2016-01-05 | Apple Inc. | Asymmetric mapping for tactile and non-tactile user interfaces |
US9377863B2 (en) | 2012-03-26 | 2016-06-28 | Apple Inc. | Gaze-enhanced virtual touchscreen |
US11169611B2 (en) | 2012-03-26 | 2021-11-09 | Apple Inc. | Enhanced virtual touchpad |
US20230415041A1 (en) * | 2012-04-12 | 2023-12-28 | Supercell Oy | System and method for controlling technical processes |
US11771988B2 (en) * | 2012-04-12 | 2023-10-03 | Supercell Oy | System and method for controlling technical processes |
US20230083741A1 (en) * | 2012-04-12 | 2023-03-16 | Supercell Oy | System and method for controlling technical processes |
US9047507B2 (en) | 2012-05-02 | 2015-06-02 | Apple Inc. | Upper-body skeleton extraction from depth maps |
US11816897B2 (en) * | 2012-09-28 | 2023-11-14 | NEC Corporation | Information processing apparatus, information processing method, and information processing program |
US11321947B2 (en) | 2012-09-28 | 2022-05-03 | NEC Corporation | Information processing apparatus, information processing method, and information processing program |
US9019267B2 (en) | 2012-10-30 | 2015-04-28 | Apple Inc. | Depth mapping with enhanced resolution |
US20150301591A1 (en) * | 2012-10-31 | 2015-10-22 | Audi Ag | Method for inputting a control command for a component of a motor vehicle |
US9612655B2 (en) * | 2012-10-31 | 2017-04-04 | Audi Ag | Method for inputting a control command for a component of a motor vehicle |
US20190052938A1 (en) * | 2015-04-03 | 2019-02-14 | Mirriad Advertising Plc | Producing video data |
US10841667B2 (en) * | 2015-04-03 | 2020-11-17 | Mirriad Advertising Plc | Producing video data |
US20160328614A1 (en) * | 2015-05-04 | 2016-11-10 | International Business Machines Corporation | Measuring display effectiveness with interactive asynchronous applications |
US9898754B2 (en) * | 2015-05-04 | 2018-02-20 | International Business Machines Corporation | Measuring display effectiveness with interactive asynchronous applications |
US9892421B2 (en) * | 2015-05-04 | 2018-02-13 | International Business Machines Corporation | Measuring display effectiveness with interactive asynchronous applications |
US20160328741A1 (en) * | 2015-05-04 | 2016-11-10 | International Business Machines Corporation | Measuring display effectiveness with interactive asynchronous applications |
US9584753B2 (en) | 2015-05-18 | 2017-02-28 | Target Brands, Inc. | Interactive display fixture |
US10043279B1 (en) | 2015-12-07 | 2018-08-07 | Apple Inc. | Robust detection and classification of body parts in a depth map |
CN105828159A (en) * | 2016-03-22 | 2016-08-03 | Leshi Internet Information & Technology Corp. (Beijing) | Method and device for configuring a television operation corner mark |
US10366278B2 (en) | 2016-09-20 | 2019-07-30 | Apple Inc. | Curvature-based face detector |
CN109905753A (en) * | 2017-12-08 | 2019-06-18 | Tencent Technology (Shenzhen) Co., Ltd. | Footmark display method and device, storage medium, and electronic device |
US10911811B1 (en) * | 2019-10-10 | 2021-02-02 | Recentive Analytics | Systems and methods for automatically and dynamically generating a network map |
US10958957B1 (en) * | 2019-10-10 | 2021-03-23 | Recentive Analytics, Inc. | Systems and methods for automatically and dynamically generating a network map |
Also Published As
Publication number | Publication date |
---|---|
JP4794453B2 (en) | 2011-10-19 |
JP2007512729A (en) | 2007-05-17 |
EP1676442A2 (en) | 2006-07-05 |
US20090235295A1 (en) | 2009-09-17 |
US8487866B2 (en) | 2013-07-16 |
WO2005041578A2 (en) | 2005-05-06 |
KR101094119B1 (en) | 2011-12-15 |
KR20070006671A (en) | 2007-01-11 |
WO2005041578A3 (en) | 2006-02-02 |
CN102034197A (en) | 2011-04-27 |
CN1902930A (en) | 2007-01-24 |
CN1902930B (en) | 2010-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8487866B2 (en) | | Method and system for managing an interactive video display system |
US12124509B2 (en) | | Automated media analysis for sponsor valuation |
US7020336B2 (en) | | Identification and evaluation of audience exposure to logos in a broadcast event |
US7636456B2 (en) | | Selectively displaying information based on face detection |
US7921036B1 (en) | | Method and system for dynamically targeting content based on automatic demographics and behavior analysis |
US6873710B1 (en) | | Method and apparatus for tuning content of information presented to an audience |
US20140289754A1 (en) | | Platform-independent interactivity with media broadcasts |
US20100232644A1 (en) | | System and method for counting the number of people |
EP1566788A2 (en) | | Display |
US10296936B1 (en) | | Method and system for measuring effectiveness of a marketing campaign on digital signage |
JP4603975B2 (en) | | Content attention evaluation apparatus and evaluation method |
US11367083B1 (en) | | Method and system for evaluating content for digital displays by measuring viewer responses by demographic segments |
CN110324683B (en) | | Method for playing advertisement on digital signboard |
KR20140068634A (en) | | Face image analysis system for intelligent advertisement |
KR20220039871A (en) | | Apparatus for providing smart interactive advertisement |
KR20220039872A (en) | | Apparatus for providing smart interactive advertisement |
CN118246987B (en) | | Advertisement delivery position management system based on delivery effect analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: REACTRIX SYSTEMS, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BELL, MATTHEW;BELFER, RUSSELL;REEL/FRAME:015934/0430. Effective date: 20041024 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: REACTRIX (ASSIGNMENT FOR THE BENEFIT OF CREDITORS), LLC. Free format text: CONFIRMATORY ASSIGNMENT;ASSIGNOR:REACTRIX SYSTEMS, INC.;REEL/FRAME:022710/0296. Effective date: 20090406 |
| AS | Assignment | Owner name: DHANDO INVESTMENTS, INC., DELAWARE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REACTRIX (ASSIGNMENT FOR THE BENEFIT OF CREDITORS), LLC;REEL/FRAME:022741/0024. Effective date: 20090409 |
| AS | Assignment | Owner name: INTELLECTUAL VENTURES HOLDING 67 LLC, NEVADA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DHANDO INVESTMENTS, INC.;REEL/FRAME:022767/0631. Effective date: 20090409 |