WO2014171200A1 - Information processing device and information processing method, display device and display method, and information processing system - Google Patents
Information processing device and information processing method, display device and display method, and information processing system
- Publication number
- WO2014171200A1 (PCT/JP2014/055352)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual object
- real
- real object
- display
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/217—Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/55—Details of game data or player data management
- A63F2300/5546—Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
- A63F2300/5553—Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- the technology disclosed in the present specification relates to an information processing device and information processing method, a display device and a display method, and an information processing system that perform processing related to a virtual image combined with a real image in an augmented reality image.
- In particular, it relates to an information processing device, an information processing method, a display device, a display method, and an information processing system that present an interaction between a virtual object and a real space.
- Virtual creatures are widely used in the field of information processing as 2D or 3D game characters, avatars serving as incarnations of players (users), and computer user interfaces. This type of virtual creature is given actions by the information processing apparatus, moving freely around the screen or emitting sounds.
- In recent years, augmented reality (AR) and mixed reality, in which a virtual image is superimposed on and viewed together with a real image, have become popular.
- Virtual images to be superimposed on real images include virtual creatures such as game characters, avatars serving as incarnations of players (users), virtual pets, and computer user interfaces, as well as virtual moving objects such as automobiles.
- For example, an entertainment apparatus has been proposed that can interact with a user and that displays a virtual pet walking around a virtual screen combined with an image of the real environment.
- This entertainment device renders the virtual pet generated by the system unit onto a real image captured by a video camera and displays it on a display/sound output device such as a monitor or television set having a display and a speaker.
- However, the virtual pet moves only within the screen, and voices expressing its behavior and reactions are output only from the place where the display/sound output device is installed, so there is a limit to how realistically the interaction between the virtual pet and the real space can be expressed.
- Also proposed is an object display device in which the depth position of a 3D stereoscopic image displayed by a DFD (Depth Fusion Display) type 3D display device is matched to the depth position through which a real object passes, so that the virtual object appears stereoscopically in the same space as the real object and gives the observer the feeling that the two are interacting (see, for example, Patent Document 2).
- However, what this object display device can provide is limited to visual interaction effects. Furthermore, the effect of the interaction is limited to the installation location of the object display device that displays the virtual object.
- In addition, an image processing method has been proposed in which an image of a virtual space including a virtual object is superimposed on the real space and presented on a head-mounted display (see, for example, Patent Document 3).
- In this method, sound effects such as the sound of the virtual object moving around in the virtual space are output from a speaker.
- However, such sound effects represent the interaction of the virtual object within the virtual space and do not present any interaction of the virtual object with the real space.
- an information terminal device that displays an image of a virtual pet so that the virtual pet appears as a virtual image on the skin in the real field of view has been proposed (see, for example, Patent Document 4).
- In this device, an impression is given as if the virtual pet were reacting to the movement of the skin. That is, with this information terminal device, a user in the real space can perform an action on the virtual pet, but the virtual pet cannot, conversely, perform an action on the real space; in other words, it does not present an interaction between the virtual pet and the real space.
- An object of the technology disclosed in this specification is to provide an excellent information processing apparatus and information processing method, display apparatus and display method, and information processing system capable of suitably presenting an interaction between a virtual object and a real space.
- An information processing apparatus comprising: an output unit for applying an action to a real object; and a control unit that controls the output from the output unit in accordance with an action performed by a virtual object on the real object or an action performed by the real object on the virtual object.
- the information processing apparatus further includes identification information for identifying the information processing apparatus.
- the information processing apparatus further includes a receiving unit that receives a detection result of an operation performed on the real object by the virtual object.
- The control unit is configured to control the output from the output unit in accordance with the received detection result.
- the information processing apparatus further includes a virtual object control unit that controls the operation of the virtual object.
- The control unit is configured to control the output from the output unit in accordance with the action of the virtual object controlled by the virtual object control unit.
- the output unit of the information processing apparatus includes a vibration device, a pulse generation device, a heat generation device, and a cooling device incorporated in the real object.
- The technology described in claim 6 of the present application is an information processing method comprising: obtaining an action performed by a virtual object on a real object, or an action performed by the real object on the virtual object; and applying an action to the real object in response to the action performed by the virtual object on the real object or the action performed by the real object on the virtual object.
- A display device comprising: a detection unit for detecting a specific real object; and a display unit for displaying a virtual object in response to detection of the specific real object.
- The detection unit of the display device identifies the real object, and the display unit is configured to display the virtual object corresponding to the identified real object.
- The detection unit of the display device is configured to identify the real object based on identification information provided on the real object, or to identify the real object by object recognition of its image.
- The detection unit of the display device detects the real object within the user's field of view, and the display unit displays the virtual object so that it overlaps the real object.
- the display device further includes a virtual object control unit that controls the operation of the virtual object.
- The virtual object control unit of the display device according to claim 11 is configured to control the appearance or disappearance of the virtual object in accordance with the user's action.
- The virtual object control unit of the display device is configured to control the amount of information of the virtual object displayed by the display unit in accordance with the user's action, situation, or time zone.
- The virtual object control unit of the display device controls the action of the virtual object with respect to the real object, or controls the action of the virtual object in accordance with an action the virtual object receives from the real object.
- The detection unit of the display device detects the action of the virtual object with respect to the real object or the action that the virtual object receives from the real object, and the virtual object control unit is configured to control the action of the virtual object based on the detection result of the detection unit.
- The virtual object control unit of the display device is configured to control the action of the virtual object so as to be synchronized with the action of the real object.
- The display unit of the display device according to claim 7 is used by being mounted on the user's head or face, and the display device further comprises a position/orientation detection unit for detecting the position and posture of the user's head or face. The display unit is configured to correct the display of the virtual object in the direction opposite to a change in the position or posture of the user's head or face.
- The technology described in claim 18 of the present application is a display method comprising: a detection step of detecting a specific real object; and a display step of displaying a virtual object in response to detection of the specific real object.
- An information processing system comprising: a control device for controlling the action of a virtual object; a display device that detects a real object and displays the corresponding virtual object; and an output device that applies an action to the real object in response to an action performed by the virtual object on the real object or an action performed by the real object on the virtual object.
- The term "system" here refers to a logical collection of a plurality of devices (or functional modules that realize specific functions); it does not matter whether each device or functional module is in a single housing (the same applies hereinafter).
- Another information processing system comprises: a display device that detects a real object, displays the corresponding virtual object, and controls the action of the virtual object; and an output device that applies an action to the real object in response to an action performed by the virtual object on the real object or an action performed by the real object on the virtual object.
- According to the technology disclosed in this specification, an excellent information processing apparatus and information processing method, display apparatus and display method, and information processing system capable of suitably presenting an interaction between a virtual object and a real space can be provided.
- FIG. 1 is a diagram schematically illustrating a functional configuration of an information processing system 100 that presents an augmented reality image to a user.
- FIG. 2 is a diagram showing a specific configuration example 100-2 of the information processing system.
- FIG. 3 is a diagram showing an example of a communication sequence in the information processing system 100-2.
- FIG. 4 is a diagram showing a specific configuration example 100-4 of the information processing system.
- FIG. 5 is a diagram showing a modification 100-5 of the information processing system shown in FIG.
- FIG. 6 is a diagram showing an example of a communication sequence in the information processing system 100-4.
- FIG. 7 is a diagram showing an external configuration of an image display apparatus 700 applicable to the technology disclosed in this specification.
- FIG. 8 is a diagram illustrating a state in which the image display device 700 worn by the user is viewed from above.
- FIG. 9 is a diagram illustrating an internal configuration example of the image display apparatus 700.
- FIG. 10 is a diagram schematically illustrating an internal configuration example of the output device 202 configured as a dedicated hardware device.
- FIG. 11 is a diagram schematically illustrating a configuration example of the actuator unit 1020.
- FIG. 12A is a diagram illustrating a specific example of the output device 202.
- FIG. 12B is a diagram illustrating a specific example of the output device 202.
- FIG. 12C is a diagram illustrating a specific example of the output device 202.
- FIG. 13 is a diagram illustrating a configuration example of the vibration device 1101.
- FIG. 14 is a diagram illustrating an operation example of the vibration device 1101 illustrated in FIG. 13.
- FIG. 15 is a diagram illustrating an operation example of the vibration device 1101 illustrated in FIG. 13.
- FIG. 16 is a diagram illustrating an operation example of the output device 202 illustrated in FIG. 12C.
- FIG. 17 is a diagram illustrating an operation example of the output device 202 illustrated in FIG. 12C.
- FIG. 18 is a diagram illustrating an operation example of the output device 202 illustrated in FIG. 12C.
- FIG. 19 is a diagram illustrating an operation example of the output device 202 illustrated in FIG. 12C.
- FIG. 20 is a diagram illustrating an operation example of the output device 202 illustrated in FIG. 12C.
- FIG. 21 is a diagram showing a state in which the virtual object 150 is presented following the movement of the user's head.
- FIG. 22 is a diagram illustrating a state in which the virtual object 150 is presented without following the movement of the user's head.
- FIG. 23 is a diagram illustrating a state in which the movement of the image of the virtual object 150 is delayed with respect to the movement of the user's head.
- FIG. 24A is a diagram illustrating a state in which the user observes the real object 151 through the head mounted display 2401.
- FIG. 24B is a diagram illustrating a state in which the user observes the virtual object 150 through the head mounted display 2401.
- FIG. 25 is a diagram showing how each user observes the virtual objects 150A, 150B, 150C,...
- FIG. 26 is a diagram illustrating an example in which the area where the identification information 152 in the room is arranged becomes a real object.
- FIG. 27 is a diagram illustrating an example in which a part of the user's body becomes a real object.
- FIG. 28 is a diagram illustrating an example in which a part of the user's body becomes a real object.
- FIG. 29 is a diagram showing further modifications 100-29 of the information processing system shown in FIG.
- FIG. 1 schematically shows a functional configuration of an information processing system 100 that presents an augmented reality or mixed reality image in which a virtual image is superimposed on a real image to a user.
- The illustrated information processing system 100 synthesizes and presents a virtual object 150 as a virtual image in the real space, and includes a virtual object control unit 101, a virtual object generation unit 102, a display unit 103, and a detection unit 104.
- the “virtual object” referred to in the present embodiment is, for example, an avatar such as a user's incarnation, a 2D or 3D game character, a computer user interface, and the like.
- the virtual object 150 does not need to be a person but may be an animal such as a virtual pet. Further, the virtual object 150 does not have to be a living thing, and may be a moving body such as an automobile, a helicopter, and a ship.
- the virtual object 150 may be text instead of an object image.
- the virtual object 150 may be a voice (such as a voice assistant) (without displaying an image).
- Methods of compositing a virtual object with a real image include superimposing the virtual image on the real space that forms the user's field of view through a see-through head-mounted display, and superimposing the virtual image on an image of the real space captured by a camera on a video see-through head-mounted display.
- The virtual object control unit 101 controls the appearance and disappearance of the virtual object 150 (in the real space) and the action and behavior of the virtual object 150 while it appears. Basically, the virtual object control unit 101 makes the virtual object 150 corresponding to the real object 151 appear in response to the user seeing or finding the real object 151 (it is assumed that the virtual object 150 is associated with the real object 151 in advance).
- the actual object 151 is provided with identification information 152 such as an IC tag or a barcode (which may be a QR code (registered trademark) or a dot code).
- identification information 152 may be a display image such as a barcode.
- The real object 151 may be a printed matter or an object with distinctive features (in other words, one whose image can be easily recognized), and does not necessarily need to carry identification information 152 such as an IC tag.
- The above-mentioned appearance "in response to the user seeing or finding the real object 151" is realized by a process of specifying the real object 151, namely by reading the identification information 152 arranged on the real object 151 or by object recognition of its image. For each piece of identification information 152, information on the corresponding virtual object 150 is stored in a library.
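- As a purely illustrative sketch of this library lookup (the names VIRTUAL_OBJECT_LIBRARY and on_real_object_detected are assumptions, not taken from the patent), the correspondence between identification information 152 and virtual objects 150 could be held in a simple mapping:

```python
# Hypothetical sketch: mapping identification information 152 to virtual objects 150.
# Names (VIRTUAL_OBJECT_LIBRARY, on_real_object_detected) are illustrative, not from the patent.

VIRTUAL_OBJECT_LIBRARY = {
    "tag:0001": {"model": "virtual_pet_dog", "voice": "bark.wav"},
    "tag:0002": {"model": "avatar_user_a",   "voice": None},
}

def on_real_object_detected(identification_info: str):
    """Called when the detection unit reads an IC tag / barcode / QR code
    or recognizes the real object from a camera image."""
    entry = VIRTUAL_OBJECT_LIBRARY.get(identification_info)
    if entry is None:
        return None  # unknown real object: no virtual object appears
    # Hand the library entry to the virtual object control unit, which makes
    # the corresponding virtual object 150 appear on the real object 151.
    return entry

print(on_real_object_detected("tag:0001"))
```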
- Methods for causing the virtual object 150 to appear are classified into a pull (PULL) type, in which the virtual object appears when the user requests its appearance by a gesture or other input operation, and a push (PUSH) type, in which the virtual object appears when an event determined by the system side occurs, regardless of the user's intention.
- The above-described method of causing the virtual object 150 corresponding to the real object 151 to appear corresponds to the PULL type of appearance. Further, when an event occurs such as the user removing his or her line of sight from the region where the virtual object 150 or the real object 151 exists (or a predetermined time elapsing after the line of sight is removed), or the user ceasing to show interest in the virtual object 150 (for example, not speaking to it for a predetermined time or longer), the virtual object control unit 101 causes the virtual object 150 to disappear.
- Whether its appearance form is an image, text, or sound, the virtual object 150 preferably appears by a method and at a timing that are comfortable for the user.
- Therefore, the appearance timing and appearance method of the virtual object 150 are controlled while recognizing the user's actions. It is also necessary to control the amount of information of the virtual object 150 according to the user's behavior, situation, time zone, and so on (for example, when the user is sitting on a train, the virtual object 150 may be rich content including an image, whereas when the user is walking in a crowded area, the virtual object 150 may consist of sound only).
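- As a purely illustrative sketch of such a policy (the function and state names below are assumptions, not from the patent), the amount of information could be chosen from the detected user state roughly as follows:

```python
# Hypothetical policy for controlling the amount of information of the virtual object 150
# according to the user's behavior or situation (names and states are illustrative).

def select_presentation(user_state: str) -> dict:
    if user_state == "sitting_on_train":
        # the user can look at the display: rich content with an image
        return {"image": True, "text": True, "sound": True}
    if user_state == "walking_in_crowd":
        # keep the user's eyes free: sound only
        return {"image": False, "text": False, "sound": True}
    # default: modest presentation
    return {"image": False, "text": True, "sound": True}

print(select_presentation("walking_in_crowd"))  # {'image': False, 'text': False, 'sound': True}
```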
- The virtual object 150 that appears in the PUSH type may be made to disappear, for example, by a user operation (for example, by a gesture such as the user picking up and moving the displayed virtual object 150 or flicking it with a fingertip).
- the same extinction method may be applied to the virtual object 150 that appears in the PULL type.
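- Consolidating these appearance and disappearance rules, a minimal lifecycle controller might look like the following sketch (event names and the timeout value are assumptions, not from the patent):

```python
# Hypothetical lifecycle control of the virtual object 150: PULL/PUSH appearance and
# disappearance on loss of interest or a dismissal gesture (names are illustrative).
GAZE_AWAY_TIMEOUT_S = 10.0  # assumed value

class VirtualObjectLifecycle:
    def __init__(self):
        self.visible = False
        self.gaze_away_s = 0.0

    def on_event(self, event: str, dt: float = 0.0) -> None:
        if event in ("user_request", "real_object_found", "system_event"):
            # PULL type (user request / real object found) or PUSH type (system event)
            self.visible, self.gaze_away_s = True, 0.0
        elif event == "gaze_away" and self.visible:
            self.gaze_away_s += dt
            if self.gaze_away_s >= GAZE_AWAY_TIMEOUT_S:
                self.visible = False            # user has lost interest
        elif event == "flick_gesture":
            self.visible = False                # dismissed by a gesture

vo = VirtualObjectLifecycle()
vo.on_event("real_object_found")
vo.on_event("gaze_away", dt=12.0)
print(vo.visible)  # False
```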
- The virtual object control unit 101 also causes actions on the real object 151 to be performed as part of the motion or behavior of the virtual object 150.
- The virtual object control unit 101 controls the motion and behavior of the virtual object 150 based on, for example, any one of the following control rules (1) to (6), or a control rule consisting of a combination of two or more of (1) to (6).
- the virtual object 150 may operate autonomously or may operate by a user operation.
- the virtual object generation unit 102 generates an image of the virtual object 150 such as a person or an animal whose appearance and disappearance, operation and behavior are controlled by the virtual object control unit 101. Although various rendering techniques such as texture mapping, light source, and shading can be applied to the generation of the image data of the virtual object 150, detailed description thereof is omitted in this specification.
- the virtual object generation unit 102 may generate text and sound together with an image of the virtual object 150, or may generate a virtual object 150 consisting of only text or sound without an image (described above).
- the display unit 103 synthesizes the image of the virtual object 150 generated by the virtual object generation unit 102 so as to overlap the real image, and displays and outputs it on the screen.
- For example, the display unit 103 is configured as an image display device (head-mounted display) that is used by being mounted on the user's head or face (described later). The image of the virtual object 150 is displayed see-through, superimposed on the real space that forms the user's field of view, or displayed video see-through, superimposed on an image of the real space captured by a camera.
- Alternatively, a projection-type image device may be applied to the display unit 103 to project the virtual object 150 onto an object in the real space. For example, when the virtual object 150 appears in response to the user seeing or finding the real object 151 (described above), the virtual object 150 is displayed superimposed on the real object 151, as described later.
- the detecting unit 104 detects an event that occurs in the real space or the virtual space.
- For example, the detection unit 104 includes a camera, recognizes an image of the user's field of view captured by the camera, and identifies the real object 151 that is the target of the virtual object 150's action based on, for example, its shape.
- When identification information 152 such as an IC tag or a barcode (which may be a QR code (registered trademark) or a dot code) is disposed on the real object 151, the detection unit 104 can specify the real object 151 by reading the identification information 152.
- The detection unit 104 also detects such interactions, that is, actions that the virtual object 150 performs on the real object 151 and the reactions from the real object 151, as well as actions that the real object 151 performs on the virtual object 150 and the reactions of the virtual object 150.
- the actions that the virtual object 150 performs on the real object 151 are physical operations such as the virtual object 150 riding on the real object 151, stepping on the real object 151, and hitting the real object 151, for example.
- the detection unit 104 needs to constantly monitor the interaction between the virtual object 150 and the real object 151. However, since there is no need to monitor while the virtual object 150 does not appear, the operation of the camera may be turned off to reduce power consumption.
- The detection unit 104 may analyze the image of the virtual object 150 displayed on the display unit 103 to detect an action that the virtual object 150 is performing on the real object 151, or it may obtain information on the action directly from the virtual object control unit 101 that controls the motion or behavior of the virtual object 150.
- the detecting unit 104 also detects an action (described later) that the output unit 105 performs on the virtual object 150 through the real object 151.
- the action performed by the real object 151 can be detected based on the analysis result of the image of the virtual object 150 displayed superimposed on the real object 151, the recognition result of the captured image of the camera, or the like.
- the detection unit 104 may receive feedback from the output control unit 106 for an action that the output unit 105 has performed on the virtual object 150. This detection result is notified to the virtual object control unit 101.
- the virtual object control unit 101 controls the reaction of the virtual object 150 with respect to the action performed by the output unit 105.
- the detection unit 104 also detects an interaction between the user and the virtual object 150 (for example, when a part of the user's body becomes the real object 151). For the detection of the action or reaction performed by the user, gesture recognition using a motion capture technology, voice recognition, or the like is used.
- Based on the expression of the virtual object 150 displayed on the display unit 103, or on context such as the action the virtual object 150 has taken, the detection unit 104 may determine the mental state of the virtual object 150 (excited, calm, or other states). Alternatively, the detection unit 104 may acquire information on the mental state of the virtual object 150 from the virtual object control unit 101 and notify the output control unit 106 of it. The output control unit 106 can thus control the operation of the output unit 105 to perform an action or reaction, and tactile feedback to the user (described later), according to the mental state of the virtual object 150.
- The output unit 105 generates outputs such as vibration, electric pulses, heat generation or cooling, wind, sound, light emission, movement, and jumping for the real object 151 that is the target of the virtual object 150's action, thereby realizing the reaction of the real object 151.
- For example, the output unit 105 includes at least some output devices from among a vibration device, a pulse generation device, a heat generation device, a cooling device, a blower device, an acoustic device, a light emitting device, and a moving device such as wheels.
- the output unit 105 including these output devices is incorporated in the real object 151 (in other words, some real objects 151 include the output unit 105).
- the real object 151 will be described as including the output unit 105.
- the output control unit 106 controls driving of each output device constituting the output unit 105 as a reaction corresponding to the action that the virtual object 150 performs on the real object 151 detected by the detection unit 104.
- For example, the output control unit 106 adjusts the amplitude and period of vibration, the amount of heat generation, the volume of the voice or sound, the amount of light emission, and so on, that is, the outputs from the output devices 1101 to 1108 (see FIG. 11, described later), according to the strength with which the virtual object 150 steps on (or taps) the real object 151, thereby expressing the interaction between the virtual object 150 and the real space.
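- As a purely illustrative sketch (the class and parameter names, and the scaling used, are assumptions, not from the patent), the output control unit could derive drive levels for the output devices from the strength of the detected action roughly as follows:

```python
# Hypothetical mapping from the strength of the virtual object's action (e.g. a stomp)
# to drive parameters of the output devices built into the real object.
from dataclasses import dataclass

@dataclass
class OutputCommand:
    vibration_amplitude: float   # 0.0 .. 1.0
    vibration_period_ms: float
    sound_volume: float          # 0.0 .. 1.0
    light_level: float           # 0.0 .. 1.0

def reaction_for_action(strength: float) -> OutputCommand:
    """strength: normalized 0.0 (light touch) .. 1.0 (hard stomp)."""
    s = max(0.0, min(1.0, strength))
    return OutputCommand(
        vibration_amplitude=s,
        vibration_period_ms=200.0 - 120.0 * s,  # harder action -> faster vibration
        sound_volume=0.3 + 0.7 * s,
        light_level=s,
    )

print(reaction_for_action(0.8))
```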
- The output control unit 106 causes the output unit 105 to produce a reaction while accurately synchronizing it with the action performed by the virtual object 150 (that is, with the motion of the virtual object 150 displayed by the display unit 103), producing the feeling that the virtual object 150 really exists in that place.
- the output control unit 106 can use the output unit 105 to perform an action on the virtual object 150 instead of a reaction on the action of the virtual object 150.
- For example, it is also possible to conversely perform an action on the virtual object 150 from the real object side, such as shaking, heating, cooling, or blowing wind on it.
- the environment detection unit 107 detects information regarding the environment of the real space surrounding the virtual object 150 and the user.
- the virtual object control unit 101 may input the environment information detected by the environment detection unit 107 as necessary and use it for controlling the operation and behavior of the virtual object 150. Further, the output control unit 106 may adjust the output of the output unit 105 according to the environment information, and add the influence of the environment to the expression of the interaction between the virtual object 150 and the real object 151.
- the environment detection unit 107 detects, for example, ambient light intensity, acoustic intensity, position or location, temperature, weather, time, surrounding image, number of people outside, and the like as environment information.
- the environment detection unit includes various sensors such as a light quantity sensor, a microphone, a GPS (Global Positioning System) sensor, a temperature sensor, a humidity sensor, a clock, an image sensor (camera), and a radiation sensor.
- the state detection unit 108 detects the state of the user who uses the virtual object 150 (or observes the virtual object 150).
- the virtual object control unit 101 inputs the state information detected by the state detection unit 108 as necessary, and uses it to control the timing of appearance / disappearance of the virtual object 150 and the operation and behavior of the virtual object 150.
- the output control unit 106 may also adjust the output of the output unit 105 in accordance with the state information, and add the influence of the user state to the expression of the interaction between the virtual object 150 and the real space.
- the state detection unit 108 acquires, for example, information on the position and posture of the user's head or posture information in order to track the user's head movement.
- For example, the state detection unit 108 uses a sensor capable of detecting a total of nine axes, combining a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
- the state detection unit 108 may use any one or more sensors such as a GPS (Global Positioning System) sensor, a Doppler sensor, an infrared sensor, and a radio wave intensity sensor.
- The state detection unit 108 may also use information provided from various infrastructures, such as mobile phone base station information and PlaceEngine (registered trademark) information (radio field measurement information from wireless LAN access points), in combination for acquiring position information.
- The state detection unit 108 may also detect the user's working state and behavioral state (moving state such as stationary, walking, or running, eyelid open/closed state, line-of-sight direction), mental state (degree of immersion or concentration while observing the virtual object, emotion, mood, and so on), and even physiological state.
- To acquire such state information from the user, the state detection unit 108 may be provided with various state sensors such as a camera that captures the user's face, a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a temperature sensor that detects body temperature or ambient temperature, a sweat sensor, a myoelectric sensor, an electrooculogram sensor, an electroencephalogram sensor, an expiration sensor, and a gas sensor, as well as a timer.
- The environment detection unit 107 and the state detection unit 108 are optional components of the information processing system 100, and whether the virtual object control unit 101 controls the motion or behavior of the virtual object 150 based on the environment information and state information is also optional.
- The information processing system 100 configured as shown in FIG. 1 may be realized by a single device, or by two or more physically independent devices interconnected by a network or the like.
- FIG. 2 shows a specific configuration example 100-2 of the information processing system.
- The information processing system 100-2 consists of two physically independent devices: a display device 201 that synthesizes and displays the virtual object 150 in the real space, and an output device 202 that expresses the interaction performed by the virtual object 150 in the real space.
- the display device 201 is a game machine main body, for example, and includes a virtual object control unit 101, a virtual object generation unit 102, a display unit 103, and a detection unit 104.
- the virtual object control unit 101 controls the appearance and disappearance of the virtual object 150 and its operation or action according to rules such as a game.
- the virtual object generation unit 102 generates an image of the virtual object 150.
- the display unit 103 synthesizes the image of the virtual object 150 generated by the virtual object generation unit 102 so as to overlap the real image, and displays and outputs it on the screen.
- the detection unit 104 constantly monitors the operation and behavior of the virtual object 150 controlled by the virtual object control unit 101, and detects an action performed by the virtual object 150 in the real space (or in the virtual space).
- the display device 201 may include at least one of the environment detection unit 107 and the state detection unit 108.
- The virtual object control unit 101 may change the motion or behavior of the virtual object 150, or the action it performs on the real object 151, so as to be influenced by the detected environment information and state information.
- the output device 202 includes an output control unit 106 and an output unit 105 that are built in the real object 151.
- the output control unit 106 expresses a reaction to the virtual object 150 by controlling driving of each output device constituting the output unit 105 in accordance with an action performed by the virtual object 150 on the real object 151.
- the output control unit 106 performs output control for the output unit 105 so that the output device 202 performs an action on the virtual object 150.
- the display device 201 and the output device 202 include communication units 211 and 212, respectively.
- The display device 201 transmits, via the communication unit 211, information on the action of the virtual object 150 detected by the detection unit 104, or information on the action of the virtual object 150 specified by the virtual object control unit 101, to the output device 202.
- On the output device 202 side, the output control unit 106 controls the driving of each output device constituting the output unit 105 as a reaction to the action that the virtual object 150 is performing on the real object 151.
- the communication units 211 and 212 are connected using any communication means such as wireless or wired communication.
- Examples of the communication means include MHL (Mobile High-definition Link), USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Wi-Fi (registered trademark), Bluetooth (registered trademark) communication, BLE (Bluetooth (registered trademark) Low Energy) communication, ultra-low-power wireless communication such as ANT, mesh networks standardized by IEEE 802.11s, infrared communication, human body communication, and signal transmission via conductive fiber.
- the output device 202 may further include an environment detection unit 213.
- the environment detection unit 213 detects information regarding the real space surrounding the virtual object 150 that is performing an action on the real object 151.
- the output control unit 106 can adjust the output of the output unit 105 according to the environment information, and add the influence of the environment to the expression of the interaction between the virtual object 150 and the real space.
- the output device 202 can be configured as, for example, a dedicated hardware device that expresses an interaction performed by the virtual object 150 in real space.
- In order to express the reaction to the action performed by the virtual object 150 in various forms such as vibration of the real object 151, heat generation or cooling, wind, sound, light emission, movement, and jumping, the output unit 105 includes a vibration device, a pulse generation device, a heat generation device, a cooling device, a blower device, an acoustic device, a light emitting device, a moving device, and the like.
- the output device 202 can use these output devices to perform an action on the virtual object 150 instead of a reaction on the action of the virtual object 150. For example, it is also possible to perform an action on the virtual object 150 on the contrary, such as shaking, heating, cooling, or blowing wind.
- the output device 202 can be configured by using an existing electronic device instead of a dedicated hardware device, for example, a smartphone having a vibration function and a speaker, a multi-function terminal, or the like.
- In that case, however, the types of output devices available as the output unit 105 are limited, and the methods for expressing the interaction between the virtual object 150 and the real object 151 are also limited.
- Accurately synchronizing the action and reaction performed between the virtual object 150 and the real object 151, in other words, the motion of the virtual object 150 displayed on the display unit 103 and the driving of the output unit 105 on the output device 202 side, is important for showing a realistic interaction. Conversely, if the response delay with respect to the action is too long and the two are poorly synchronized, the interaction becomes unnatural. If the virtual object 150 is kept in constant motion, it is easier to give the user the impression that the interaction is taking place in real time.
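- The following sketch illustrates one way the output device side could guard this synchronization (the message fields and the delay budget are assumptions, not from the patent, and roughly synchronized clocks are assumed): reactions that would arrive too late to look synchronized are simply suppressed.

```python
# Hypothetical check on the output device side: only drive the output unit if the
# reaction can still be presented close enough in time to the displayed action.
import time

MAX_REACTION_DELAY_S = 0.10  # assumed perceptual budget, not a value from the patent

def handle_action_message(msg: dict, drive_output) -> bool:
    """msg = {'action': 'stomp', 'strength': 0.8, 'timestamp': <sender clock>}"""
    delay = time.time() - msg["timestamp"]
    if delay > MAX_REACTION_DELAY_S:
        # Too late: a delayed reaction would look unnatural, so skip it.
        return False
    drive_output(msg["action"], msg["strength"])
    return True

print(handle_action_message(
    {"action": "stomp", "strength": 0.8, "timestamp": time.time()},
    lambda action, strength: None))  # True
```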
- FIG. 3 shows an example of a communication sequence in the information processing system 100-2.
- When the display device 201 identifies the real object 151 by reading the identification information 152 or by object recognition, it synthesizes and displays the virtual object 150 corresponding to the identified real object 151 in the real space (SEQ301). The display device 201 refreshes the motion or behavior of the virtual object 150 at a predetermined control cycle.
- the display device 201 detects an action performed by the virtual object 150 on the real object 151, the display device 201 transmits a message including the detection result to the output device 202, for example, at a predetermined time interval (SEQ302).
- Alternatively, instead of transmitting information on the action of the virtual object 150, the display device 201 may convert the reaction to the action of the virtual object 150 into an instruction for expressing the reaction with the real object 151, or into control information for the output unit 105, and transmit that.
- the output device 202 operates the output unit 105 as a reaction on the real object 151 side (SEQ303) to express the interaction between the virtual object 150 and the real object 151.
- The output device 202 may return to the display device 201 an acknowledgment of the message from the display device 201, a report that the reaction output (interaction) has been completed, and the like (SEQ304).
- the output device 202 can also perform an action on the virtual object 150, not as a reaction of the real object 151 with respect to the action of the virtual object 150 (SEQ305).
- the output device 202 may transmit information related to the action performed by the output unit 105 to the display device 201 so that the virtual object 150 performs the reaction (SEQ 306).
- When the display device 201 receives action information for the virtual object 150 from the output device 202, it generates an image of the virtual object 150 taking a reaction to that action, and synthesizes and displays the image on the real object 151 in the real space.
- If the reaction of the virtual object 150 is performed in exact synchronization (or with only a small delay) with the action performed by the output device 202, that is, by the real object 151, the user can be given the impression that the interaction is actually taking place.
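- The sequence SEQ301 to SEQ306 can be pictured with the following illustrative sketch (class names, message formats, and transport are assumptions, not from the patent): the display device reports detected actions, the output device drives its output unit and acknowledges, and the output device can also originate actions for the virtual object to react to.

```python
# Hypothetical message exchange between the display device 201 and output device 202.
class OutputDevice:
    def on_action(self, msg: dict) -> dict:             # SEQ302 -> SEQ303/SEQ304
        self._drive_output_unit(msg["action"], msg.get("strength", 0.5))
        return {"ack": True, "completed": msg["action"]}

    def originate_action(self) -> dict:                  # SEQ305/SEQ306
        return {"action": "shake", "strength": 0.6}

    def _drive_output_unit(self, action: str, strength: float) -> None:
        print(f"output unit: reacting to {action} (strength {strength})")

class DisplayDevice:
    def __init__(self, output_device: OutputDevice):
        self.output_device = output_device

    def report_virtual_object_action(self, action: str, strength: float) -> None:
        reply = self.output_device.on_action({"action": action, "strength": strength})
        print("display device: got", reply)

    def on_real_object_action(self, msg: dict) -> None:
        # Render the virtual object's reaction in sync with the real object's action.
        print(f"virtual object reacts to {msg['action']}")

display = DisplayDevice(OutputDevice())
display.report_virtual_object_action("stomp", 0.8)                        # SEQ302-304
display.on_real_object_action(display.output_device.originate_action())  # SEQ305-306
```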
- FIG. 4 shows another specific configuration example 100-4 of the information processing system.
- The information processing system 100-4 is composed of three physically independent devices: a control device 401 that controls the appearance or disappearance of the virtual object 150 and its motion or behavior, a display device 402 that displays the virtual object 150 by combining it with the real space, and an output device 403 that expresses the interaction between the virtual object 150 and the real space.
- For example, the control device 401 may be a server installed at home or on the cloud, and the display device 402 may be a user terminal that logs into the server. Multiple user terminals may be logged into the same server at the same time.
- control device 401 and the display device 402, the display device 402 and the output device 403, and the output device 403 and the control device 401 are interconnected by wireless or wired communication means.
- a communication medium such as a LAN (Local Area Network) such as a home network or a wide area network such as the Internet can be used.
- the control device 401 includes a virtual object control unit 101, and controls the appearance or disappearance of the virtual object 150, its operation or action in accordance with, for example, a rule such as a game or the context of a user's action.
- the control device 401 can be configured by one server device installed on the Internet or a combination of a plurality of server devices.
- the display device 402 includes a virtual object generation unit 102, a display unit 103, and a detection unit 104.
- The virtual object generation unit 102 generates an image of the virtual object 150 when it receives information on the motion or behavior of the virtual object 150 from the control device 401. Then, the display unit 103 synthesizes the image of the virtual object 150 generated by the virtual object generation unit 102 so as to overlap the corresponding real object 151, and displays it on the screen.
- The detection unit 104 constantly monitors the motion and behavior of the virtual object 150 controlled by the virtual object control unit 101 (or the image of the virtual object 150 displayed by the display unit 103), and detects actions that the virtual object 150 performs on the real object 151.
- the display device 402 may include at least one of an environment detection unit and a state detection unit.
- the display device 402 includes a state detection unit 414.
- When the virtual object control unit 101 receives the environment information and state information detected by the display device 402, it may change the motion or behavior of the virtual object 150, or the action it performs on the real object 151, as if the virtual object 150 were affected by that environment or state.
- The state detection unit 414 includes sensors that detect the position and orientation of the display device 402, such as an acceleration sensor and a gyro sensor. The detection results of these sensors are used to move the display position of the virtual object 150 on the screen in accordance with the movement of the field of view.
- the output device 403 includes an output control unit 106 and an output unit 105 built in the real object 151.
- the output control unit 106 controls driving of each output device constituting the output unit 105 as a reaction of the real object 151 with respect to an action performed by the virtual object 150 on the real object 151.
- The output device 403 receives the detection result of the virtual object 150's action from the display device 402 or the like, or receives information on the motion or behavior of the virtual object 150 directly from the control device 401, and based on this, the output unit 105 outputs a reaction to the virtual object 150 and expresses the interaction between the virtual object 150 and the real space.
- the output device 403 may further include an environment detection unit 413.
- the environment detection unit 413 detects information related to the real space surrounding the virtual object 150 performing an action on the real object 151.
- the output control unit 106 can adjust the output of the output unit 105 according to the environment information, and add the influence from the environment to the expression of the interaction performed with the virtual object 150.
- The output device 403 may further include a state detection unit 415.
- the state detection unit 415 includes a gyro sensor, for example, and detects a change in the position and orientation of the real object 151 that occurs when an action or reaction is performed on the virtual object 150.
- The behavior of the virtual object 150 placed on the real object 151 can be controlled according to such changes in the position or orientation of the real object 151, so that the interaction between the virtual object 150 and the real object 151 can be expressed further.
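- As a purely illustrative sketch (the function name, thresholds, and animation states are assumptions, not from the patent), the behavior of a virtual object standing on the real object could be switched according to the tilt reported by the state detection unit 415:

```python
# Hypothetical reaction of a virtual object standing on the real object when the
# state detection unit (e.g. a gyro sensor) reports that the real object was tilted.
def virtual_object_pose_update(tilt_deg: float) -> str:
    """tilt_deg: change in the real object's orientation reported by the gyro sensor."""
    if abs(tilt_deg) < 5.0:
        return "standing"       # small disturbance: keep standing
    if abs(tilt_deg) < 20.0:
        return "staggering"     # moderate tilt: play a balancing animation
    return "falling_off"        # large tilt: the virtual object tumbles off

for tilt in (2.0, 12.0, 30.0):
    print(tilt, "->", virtual_object_pose_update(tilt))
```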
- FIG. 5 shows a modification 100-5 of the information processing system shown in FIG.
- Whereas in FIG. 4 the virtual object generation unit 102 is provided in the display device 402, in this modification the virtual object generation unit 102 is provided in the control device 401, such as a game machine body; this is the point of difference.
- the control device 401 and the display device 402, the display device 402 and the output device 403, and the output device 403 and the control device 401 are interconnected by wireless or wired communication means.
- The control device 401 transmits information on the motion or behavior of an individual virtual object, or an image of the individual virtual object, to each display device.
- For example, the control device 401 serving as the game machine main body may transmit, to the display device 402 serving as the game controller possessed by each of a plurality of users participating in a game, information on the same virtual object such as a game character, or on a virtual object unique to each user (such as an avatar). Further, the control device 401 may transmit to each display device information on a plurality of virtual objects (the virtual objects of the respective users), such as the avatars of other nearby users together with the user's own avatar (see FIG. 25 and later).
- Here too, accurately synchronizing the action and reaction performed between the virtual object 150 and the real object 151, in other words, the motion of the virtual object 150 displayed on the display unit 103 and the driving of the output unit 105 on the output device side, is important for showing a realistic interaction. If the response delay with respect to the action is too long and the two are poorly synchronized, the interaction becomes unnatural. If the virtual object 150 is kept in constant motion, it is easier to give the user the impression that the reaction is taking place in real time.
- FIG. 6 shows an example of a communication sequence in the information processing system 100-4 (or 100-5).
- When the display device 402 specifies the real object 151 by reading the identification information 152 or by recognizing an object in the user's field of view (SEQ601), it notifies the control device 401 (SEQ602).
- The control device 401 causes the virtual object 150 corresponding to the read identification information 152 to appear (SEQ603), and transmits the operation or action information of the virtual object 150, or an image of the virtual object 150 derived from that information, to the display device 402 (SEQ604).
- the control device 401 refreshes the operation or action of the virtual object 150 at a predetermined control cycle.
- the display device 402 displays the image of the virtual object 150 corresponding to the specified real object 151 by combining with the real space (SEQ605).
- When the display device 402 detects the state of the user who is using it, or the environment information surrounding the user (SEQ606), it transmits the detection result to the control device 401 (SEQ607). The control device 401 then performs image update processing according to the received state information (SEQ608) and transmits the necessary information to the display device 402 (SEQ609). This exchange is summarized in the sketch below.
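- The following minimal Python sketch walks through the SEQ601–SEQ609 exchange, with direct function calls standing in for the actual wireless or wired messages; all identifiers and the update policy are illustrative assumptions, not the disclosed protocol.

```python
# Minimal sketch of the SEQ601-SEQ609 exchange between the display device (402)
# and the control device (401); direct calls stand in for real messages.

class ControlDevice:                                      # control device 401
    def __init__(self):
        self.virtual_objects = {}

    def on_real_object_identified(self, object_id):      # SEQ602 received
        vo = {"id": object_id, "pose": (0.0, 0.0, 0.0)}   # SEQ603: make it appear
        self.virtual_objects[object_id] = vo
        return vo                                          # SEQ604: send info/image

    def on_user_state(self, object_id, state):            # SEQ607 received
        vo = self.virtual_objects[object_id]
        # SEQ608: update the virtual object according to the user state
        if state.get("looking_away"):
            vo["pose"] = None       # illustrative policy: hide it
        return vo                                          # SEQ609

class DisplayDevice:                                       # display device 402
    def __init__(self, control):
        self.control = control

    def run(self):
        object_id = self.identify_real_object()                     # SEQ601
        vo = self.control.on_real_object_identified(object_id)      # SEQ602/604
        self.render(vo)                                              # SEQ605
        state = {"looking_away": False}                              # SEQ606
        vo = self.control.on_user_state(object_id, state)           # SEQ607/609
        self.render(vo)

    def identify_real_object(self):
        return "real-object-151"   # e.g. decoded from identification info 152

    def render(self, vo):
        print("render virtual object:", vo)

DisplayDevice(ControlDevice()).run()
```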
- When the display device 402 is a device worn by the user, such as a head-mounted display, or a device mounted on a moving body, the real image seen through the see-through display or shown by video see-through moves as the display device 402 moves.
- In order to keep the virtual object 150 overlaid on the real object 151,
- the display area of the virtual object 150 is moved so as to cancel the movement of the user's head; in this way the virtual object 150 can be presented so that it remains fixed on the real object 151 in the real space even as the user's head moves.
- For this purpose, the display device 402 includes sensors for detecting its own position and orientation, such as an acceleration sensor and a gyro sensor.
- the display device 402 transmits the detection result to the control device 401 (SEQ 611).
- The control device 401 performs image correction processing so that the display area of the virtual object 150 is moved to cancel the movement of the user's head and the virtual object 150 overlaps the appropriate place in the real space (for example, on the real object 151) (SEQ612).
- The image correction processing may instead be performed on the display device 402 side.
- In the image correction processing, servo control (for example, D (differential) control as in PID control)
- and motion prediction are applied so that the error in the display position of the virtual object 150 converges to zero within a certain time.
- The control device 401 transmits information on the virtual object 150 after the image correction to the display device 402 (SEQ613), and the display device 402 displays the corrected virtual object 150 (SEQ614). A simple one-dimensional sketch of this correction follows.
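- As a hedged illustration only: the sketch below shifts the virtual object's draw position opposite to the detected head rotation and then servoes the residual error (as measured, e.g., by image recognition) toward zero with proportional and differential terms. The one-dimensional simplification, gains, and numbers are assumptions.

```python
# Sketch (1-D simplification): keep the virtual object over the real object by
# shifting its draw position opposite to the detected head rotation, and servo
# the remaining error toward zero with P and D terms (cf. the D control above).

def correct_display_position(image_x, head_yaw_deg, pixels_per_deg,
                             residual_err, prev_err, dt, kp=0.5, kd=0.01):
    # Cancel the head motion: shift the drawing position the opposite way.
    cancelled = image_x - head_yaw_deg * pixels_per_deg
    # Servo the residual error to zero; the D term damps overshoot.
    derivative = (residual_err - prev_err) / dt
    correction = kp * residual_err + kd * derivative
    return cancelled + correction

x = 640.0            # current horizontal draw position of the virtual object (px)
prev_err = 0.0
for step in range(3):
    head_yaw = 2.0   # degrees turned this frame (from the gyro)
    residual = -1.5  # px error still measured against the real object
    x = correct_display_position(x, head_yaw, pixels_per_deg=20.0,
                                 residual_err=residual, prev_err=prev_err,
                                 dt=1 / 60)
    prev_err = residual
    print(f"frame {step}: draw at {x:.1f} px")
```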
- When the display device 402 detects an action performed by the virtual object 150 on the real object 151 (SEQ615), it transmits a message including the detection result to the output device 403, for example at predetermined time intervals (SEQ616). Alternatively, the control device 401 notifies the output device 403 that it has instructed the virtual object 150 to perform an action (SEQ617, SEQ618).
- In response, the output device 403 drives the output unit 105 as a reaction on the real object 151 side (SEQ619), thereby expressing the interaction between the virtual object 150 and the real space.
- By producing the reaction on the real object 151 side,
- the user can be given the impression that the interaction with the virtual object 150 is actually taking place in the real space.
- The output device 403 may return, to the display device 402 or the control device 401, an acknowledgement of the received message, a report that the reaction output (the interaction) has been completed, or the like (SEQ620, SEQ621).
- The output device 403 can also perform an action on the virtual object 150, rather than only a reaction to an action by the virtual object 150 (SEQ622).
- In this case, the output device 403 may transmit information on the action performed by the output unit 105 to the control device 401 or the display device 402 so that the virtual object 150 performs a reaction (SEQ623, SEQ624).
- The control device 401 can then cause the virtual object 150 to react to the action of the real object 151, thereby expressing the interaction between the virtual object 150 and the real space (SEQ625).
- If the reaction of the virtual object 150 is performed in exact synchronization with the action performed by the output device 403, that is, the real object 151, the interaction appears realistic to the user.
- FIG. 29 shows a further modification 100-29 of the information processing system shown in FIG. 4.
- the functions of the control device 401 such as the virtual object control unit 101 are integrated with the output device 403.
- the control device 401 is assumed to be a server on the cloud (described above).
- the output device 403 is a game machine main body installed in the home
- the display device 402 is a game controller connected to the game machine main body. It is also assumed that a plurality of users' game controllers are connected to one game machine body.
- the virtual object control unit 101 in the output device 403 controls the appearance and disappearance of the virtual object 150 and its operation or action according to rules such as a game.
- the virtual object generation unit 102 in the display device 402 generates an image of the virtual object 150 when receiving information about the operation or action of the virtual object 150 from the virtual object control unit 101. Then, the display unit 103 synthesizes the image of the virtual object 150 generated by the virtual object generation unit 102 so as to overlap the real image, and displays and outputs it on the screen.
- When the output control unit 106 in the output device 403 receives, from the virtual object control unit 101, information on the action that the virtual object 150 is performing on the real object 151, it controls the driving of each output device constituting the output unit 105 as the reaction of the real object 151.
- It is preferable that the action and reaction exchanged between the virtual object 150 and the real object 151 be accurately synchronized; in other words, the motion of the virtual object 150 displayed on the display unit 103 and the driving of the output unit 105 on the output device 403 side should be synchronized.
- If the response delay with respect to an action is too long and synchronization is lost, the user is given the impression that the interaction is unnatural.
- Conversely, if the virtual object 150 is kept in constant motion, it easily appears to the user as if it is reacting in real time.
- The appearance form of the virtual object 150 varies: it may appear as an image as in the systems shown in FIGS. 2, 4, 5, and 29; the image may be accompanied by a voice, such as words spoken by the virtual object 150;
- or no image may be displayed and only a sound emitted from a predetermined direction may appear.
- It is also assumed that the virtual object 150 may appear in a push (PUSH) type manner,
- triggered by the occurrence of an event predetermined on the system side, regardless of the user's intention.
- The display device 201 shown in FIG. 2 and the display device 402 shown in FIGS. 4, 5, and 29 can each be configured as an image display device (head-mounted display) that is used by being worn on the user's head or face.
- FIG. 24A shows a state where the user observes the real object 151 on which the identification information 152 is pasted through the head-mounted display 2401. As described above, when the user sees or finds the real object 151, the virtual object 150 associated with the real object 151 (or identification information 152) appears.
- FIG. 24B shows a state where the user observes the appearing virtual object 150 through the head mounted display 2401.
- a plurality of user terminals log in to the same server at the same time, and virtual objects corresponding to the user terminals appear simultaneously on the same real object.
- a plurality of user terminals share one output device 403.
- FIG. 25 shows a scene in which a plurality of users wearing head-mounted displays 2501, 2502, 2503, ... are looking at the same real object 151.
- Different virtual objects 150A, 150B, 150C, ... are associated with the real object 151 (or the identification information 152) for each user, and through his or her own head-mounted display 2501, 2502, 2503, ... each user can observe his or her own virtual object as well as the virtual objects 150A, 150B, 150C, ... of the other users. Further, each of the virtual objects 150A, 150B, 150C, ... interacts with the real object 151, and the virtual objects 150A, 150B, 150C, ... can also interact with one another.
- FIG. 7 shows an external configuration of an image display apparatus 700 according to an embodiment of the technique disclosed in this specification.
- the image display device 700 is a head-mounted display that is used by being worn on the head or face of a user, and displays an image for each of the left and right eyes.
- The illustrated image display device 700 is of a transmissive, or see-through, type: while an image is being displayed, the user can still view the real-world landscape through it. Therefore, an AR (Augmented Reality) image including the virtual object 150 can be superimposed on the real-world landscape. Further, since the displayed image cannot be seen from the outside (that is, by other people), privacy is easily protected when information is displayed.
- the image display device 700 can be used as the display device 201 shown in FIG. 2 or the display device 402 shown in FIGS. 4, 5, and 29.
- the illustrated image display apparatus 700 has a structure similar to eyesight correction glasses.
- Virtual image optical units 701L and 701R, each including a transparent light guide unit and the like, are disposed at the positions of the image display device 700 main body that face the user's left and right eyes, and an image to be observed by the user (not shown) is displayed inside each of the virtual image optical units 701L and 701R.
- Each of the virtual image optical units 701L and 701R is supported by a support body 702 having a shape similar to a spectacle frame, for example.
- An outer camera 712 for inputting a surrounding image is installed at the approximate center of the spectacle frame-like support 702.
- the outer camera 712 can photograph a landscape in the user's line-of-sight direction, for example.
- the outer camera 712 can be configured with a plurality of cameras so that the three-dimensional information of the surrounding image can be acquired using the parallax information.
- Alternatively, images may be captured while moving the camera using SLAM (Simultaneous Localization and Mapping) image recognition, parallax information may be calculated from a plurality of temporally successive frame images (see, for example, Patent Document 5), and the three-dimensional information of the surrounding image may be acquired from the calculated parallax information.
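- For reference, the relation used to turn parallax into depth is the standard stereo formula Z = f·B/d for rectified pinhole cameras; the sketch below applies it with purely illustrative numbers (focal length in pixels, baseline in meters).

```python
# Standard stereo relation for recovering depth from parallax (disparity):
# depth Z = f * B / d, assuming rectified pinhole cameras with focal length f
# (in pixels) and baseline B (in meters). The values below are only illustrative.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    if disparity_px <= 0:
        return float("inf")   # no measurable parallax -> effectively at infinity
    return focal_px * baseline_m / disparity_px

# A point seen 8 px apart by two cameras 6 cm apart with f = 700 px:
print(depth_from_disparity(8.0, 700.0, 0.06))   # -> 5.25 m
```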
- The outer camera 712 can capture the user's field of view, which follows the virtual object 150.
- a virtual object 150 corresponding to the real object 151 is displayed on at least one of the virtual image optical units 701L and 701R.
- microphones 703L and 703R are installed near the left and right ends of the support 702, respectively.
- By arranging the microphones 703L and 703R substantially symmetrically on the left and right, it is possible to recognize only the voice localized at the center (the user's own voice) and to separate it from ambient noise and other people's voices. For example, this prevents erroneous operation caused by picking up voices other than the user's when operations are performed by voice input through the microphones 703L and 703R, as sketched below.
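- As a crude, hedged illustration of the idea (not the disclosed method): a source centered between two symmetric microphones arrives with nearly equal level and near-zero inter-channel delay, so frames that violate this can be rejected as off-center. A real implementation would use proper beamforming; the thresholds and signals below are arbitrary.

```python
import numpy as np

# Crude sketch: accept a frame as the user's (center-localized) voice only if the
# left/right levels are nearly equal and the inter-channel delay is near zero.

def is_center_voice(left, right, sample_rate, max_level_diff_db=3.0,
                    max_delay_ms=0.2):
    rms_l = np.sqrt(np.mean(left ** 2)) + 1e-12
    rms_r = np.sqrt(np.mean(right ** 2)) + 1e-12
    level_diff_db = abs(20 * np.log10(rms_l / rms_r))
    # Inter-channel delay from the peak of the cross-correlation.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    delay_ms = abs(lag) / sample_rate * 1000
    return level_diff_db <= max_level_diff_db and delay_ms <= max_delay_ms

fs = 16000
t = np.arange(fs // 10) / fs
voice = np.sin(2 * np.pi * 220 * t)             # identical on both channels
print(is_center_voice(voice, voice, fs))        # True: treated as the user's voice
print(is_center_voice(voice, 0.2 * voice, fs))  # False: strongly lateralized source
```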
- FIG. 8 shows a state in which the image display device 700 worn by the user is viewed from above.
- display panels 704L and 704R for displaying and outputting left-eye and right-eye images are disposed at the left and right ends of the image display device 700, respectively.
- Each of the display panels 704L and 704R includes a micro display such as a liquid crystal display or an organic EL element, or a laser scanning display such as a retina direct drawing display.
- the left and right display images output from the display panels 704L and 704R are guided to near the left and right eyes by the virtual image optical units 701L and 701R, and the enlarged virtual images are formed on the user's pupil.
- Each of the virtual image optical units 701L and 701R includes an optical system that collects the light emitted from the microdisplay, a light guide plate disposed at the position where the light passing through the optical system is incident, a deflection filter that reflects the light incident on the light guide plate, and a deflection filter that emits the light, which has been totally reflected and propagated inside the light guide plate, toward the user's eyes.
- the display panels 704L and 704R are used for see-through display of the virtual object 150 as one of the AR images.
- the image display device 700 used by being mounted on the user's head or face may display the virtual object 150 by video see-through instead of see-through.
- FIG. 9 shows an internal configuration example of the image display device 700. Hereinafter, each part will be described.
- the control unit 901 includes a ROM (Read Only Memory) 901A and a RAM (Random Access Memory) 901B.
- the ROM 901A stores program codes executed by the control unit 901 and various data.
- the control unit 901 executes the program loaded into the RAM 901B, thereby starting the image display control and comprehensively controlling the operation of the entire image display device 700.
- Examples of the programs and data stored in the ROM 901A include: an image display control program; a control program for generating the virtual object 150 and controlling its display; a detection program for detecting an action performed by the virtual object 150 in the real space (for example, identifying the real object 151 on which the virtual object 150 performs a physical operation and detecting the action performed on that real object 151); a communication processing program for communicating with an external device such as a server (not shown) on the Internet; a library of virtual objects 150 corresponding to each real object 151; and identification information 152 unique to the device 700. However, the library of virtual objects 150 need not be held locally by the image display device 700 and may instead be acquired sequentially from a server (not shown) on the Internet.
- The input operation unit 902 includes one or more operation elements on which the user performs input operations, such as keys, buttons, and switches; it receives user instructions via the operation elements and outputs them to the control unit 901.
- the input operation unit 902 similarly accepts a user instruction including a remote control command received by the remote control reception unit 903 and outputs it to the control unit 901.
- the user may perform the appearance and disappearance of the display of the virtual object 150 and the operation on the virtual object 150 via the input operation unit 902.
- The state information acquisition unit 904 is a functional module that acquires state information of the image display device 700 main body or of the user wearing the device 700.
- It may itself be equipped with various sensors for detecting state information,
- or it may acquire the state information via the communication unit 905 (described later) from an external device equipped with some or all of these sensors (for example, a smartphone, wristwatch, or other multi-function terminal worn by the user).
- In order to track the user's head movement, the state information acquisition unit 904 acquires, for example, information on the position and posture of the user's head.
- For this purpose, the state information acquisition unit 904 includes, for example, a sensor capable of detecting a total of nine axes: a three-axis gyro sensor, a three-axis acceleration sensor, and a three-axis geomagnetic sensor.
- The state information acquisition unit 904 may further use, in combination, any one or more sensors such as a GPS (Global Positioning System) sensor, a Doppler sensor, an infrared sensor, and a radio wave intensity sensor.
- In addition, information provided by various infrastructures, such as mobile phone base station information and PlaceEngine (registered trademark) information (radio field strength measurement information from wireless LAN access points), may also be used in combination to acquire position and orientation information. A simple way of combining the inertial sensor outputs is sketched below.
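- The following is a minimal sketch of one common way to fuse a gyro (fast but drifting) with an accelerometer (absolute but noisy) to track head pitch; a full nine-axis fusion would add the geomagnetic sensor for yaw. The blend coefficient and sample values are assumptions, not parameters from this disclosure.

```python
import math

# Complementary-filter sketch for head pitch: integrate the gyro for fast
# response and pull slowly toward the gravity direction from the accelerometer.

def update_pitch(pitch_deg, gyro_rate_dps, accel_x, accel_z, dt, alpha=0.98):
    gyro_pitch = pitch_deg + gyro_rate_dps * dt                # integrate gyro
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))   # gravity reference
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

pitch = 0.0
for _ in range(100):
    # Pretend the head pitches at 10 deg/s while gravity indicates about 10 deg.
    pitch = update_pitch(pitch, gyro_rate_dps=10.0,
                         accel_x=0.17, accel_z=0.98, dt=0.01)
print(round(pitch, 2))
```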
- The state information acquisition unit 904 for tracking head movement is assumed here to be built into the head-mounted display serving as the image display device 700, but it may instead be configured as an accessory part externally attached to the head-mounted display or the like.
- In that case, the externally attached state information acquisition unit 904 expresses the posture information of the head in the form of, for example, a rotation matrix, and sends it to the head-mounted display main body via wireless communication such as Bluetooth (registered trademark) communication or via a high-speed wired interface such as USB (Universal Serial Bus).
- In addition, the state information acquisition unit 904 acquires, as state information of the user wearing the image display device 700, for example: the user's working state (whether or not the image display device 700 is worn); the user's action state (a movement state such as standing still, walking, or running, the open/closed state of the eyelids, the line-of-sight direction, the pupil size, and physical operations such as actions or reactions performed on the virtual object 150); the mental state (the degree of excitement, arousal, or emotion, for example whether the user is immersed in or concentrating on observing the displayed image); and the physiological state.
- In order to acquire such state information from the user, the state information acquisition unit 904 may include a wearing sensor such as a mechanical switch, an inner camera that captures the user's face, a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a temperature sensor that detects body temperature or air temperature, a sweat sensor, a myoelectric sensor, an electro-oculogram sensor, an electroencephalogram sensor, a breath sensor, a gas/ion concentration sensor, and other state sensors, as well as a timer (none of which are shown).
- The appearance or disappearance of the display of the virtual object 150, the operation of the virtual object 150, and the operation of the real object 151 on the virtual object 150 may be controlled based on the state, or the change in state, acquired by the state information acquisition unit 904.
- The environment information acquisition unit 916 is a functional module that acquires information on one or more environmental factors surrounding the image display device 700 main body or the user wearing the device 700.
- the environmental factors referred to here are electromagnetic waves (ultraviolet rays, blue light, radio waves), heat rays (infrared rays), radiation, harmful chemical substances floating in the atmosphere, noise, negative ions, and the like that the device 700 or the user receives.
- the environmental information acquisition unit 916 may be equipped with various sensors for detecting such environmental factors.
- Alternatively, the environment information acquisition unit 916 may acquire information on environmental factors via the communication unit 905 (described later) from an external device equipped with some or all of these sensors (for example, a smartphone, watch, or other multi-function terminal worn by the user) or from a server that collects environment information.
- The appearance or disappearance of the display of the virtual object 150, the operation of the virtual object 150, and the operation of the real object 151 on the virtual object 150 may also be controlled based on the environment acquired by the environment information acquisition unit 916.
- the outer camera 912 is disposed, for example, in the approximate center of the front surface of the image display apparatus 700 (see the outer camera 712 in FIG. 7), and can capture a surrounding image.
- In other words, an image in the direction of the user's own line of sight can be captured with the outer camera 912.
- the outer camera 912 can be configured by a plurality of cameras so that the three-dimensional information of the surrounding image can be acquired using the parallax information.
- Alternatively, images may be captured while moving the camera using SLAM (Simultaneous Localization and Mapping) image recognition,
- parallax information may be calculated from a plurality of temporally successive frame images (see, for example, Patent Document 5), and the three-dimensional information of the surrounding image may be acquired from the calculated parallax information.
- The user can adjust the zoom of the outer camera 912 through operation of the input operation unit 902, through the pupil size recognized by the inner camera, or by voice input.
- the captured image of the outer camera 912 can be displayed and output on the display unit 909, and can also be stored in the storage unit 906.
- The user's field of view, which follows the virtual object 150, can be captured by the outer camera 912. Therefore, in this embodiment, the outer camera 912 can also be used as the detection unit 104 described above.
- The real object 151 can be specified by reading the identification information 152, such as a barcode attached to the real object 151, from the image captured by the outer camera 912, or by recognizing the object from that captured image.
- the communication unit 905 performs communication processing with other image display devices, multi-function terminals, external devices such as a server (not shown) on the Internet, and modulation / demodulation and encoding / decoding processing of communication signals.
- the control unit 901 transmits transmission data to the external device from the communication unit 905.
- the configuration of the communication unit 905 is arbitrary.
- the communication unit 905 can be configured in accordance with a communication method used for transmission / reception operations with an external device serving as a communication partner.
- the communication method may be either wired or wireless.
- Communication standards mentioned here include MHL (Mobile High-definition Link), USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Wi-Fi (registered trademark), Bluetooth (registered trademark) communication, BLE (Bluetooth (registered trademark) Low Energy) communication, ultra-low-power wireless communication such as ANT, mesh networks standardized by IEEE 802.11s, infrared communication, human body communication, and signal transmission via conductive fiber.
- Alternatively, the communication unit 905 may be a cellular radio transceiver that operates according to a standard such as W-CDMA (Wideband Code Division Multiple Access) or LTE (Long Term Evolution).
- information on the virtual object 150 generated inside the image display device 700, acquired state information, and environment information may be transmitted to an external device via the communication unit 905.
- information (such as a library) generated by an external server or the like for controlling the display of the virtual object 150 may be received via the communication unit 905.
- the storage unit 906 is a large-capacity storage device configured by an SSD (Solid State Drive) or the like.
- the storage unit 906 stores application programs executed by the control unit 901 and various data. For example, information (such as a library) of the virtual object 150 whose display is controlled by the control unit 901 and a display image of the virtual object 150 displayed on the display unit 909 may be stored in the storage unit 906.
- the image processing unit 907 further performs signal processing such as image quality correction on the image signal output from the control unit 901 and converts the image signal to a resolution that matches the screen of the display unit 909.
- the display driving unit 908 sequentially selects the pixels of the display unit 909 for each row and performs line sequential scanning, and supplies a pixel signal based on the image signal that has been subjected to signal processing.
- the display unit 909 includes a display panel configured by, for example, a micro display such as an organic EL (Electro-Luminescence) element or a liquid crystal display, or a laser scanning display such as a direct retina display.
- the virtual image optical unit 910 enlarges and projects the display image of the display unit 909 and causes the user to observe it as an enlarged virtual image.
- the virtual object 150 and other AR images are made to be observed by the user through the virtual image optical unit 910.
- The outer display unit 915 has a display screen that faces the outside of the image display device 700 (in the direction opposite to the face of the user wearing it) and can display an image on that screen. For example, when the image of the virtual object 150 displayed on the display unit 909 is also displayed on the outer display unit 915, virtual reality can be shared with other users nearby. For the detailed configuration of the outer display unit 915, see, for example, Japanese Patent Application Nos. 2012-200902 and 2012-200903, already assigned to the applicant.
- The audio processing unit 913 performs signal processing such as sound quality correction and amplification on the audio signal output from the control unit 901, and also processes the input audio signal.
- the voice input / output unit 914 outputs the voice after the voice processing to the outside and inputs the voice from the microphone (described above).
- the audio input / output unit 914 can output a binaural sound source.
- the basic operation is that the virtual object 150 appears when the user sees or finds the real object 151.
- That is, the identification information 152 is read by capturing the user's field of view with the outer camera 912 and analyzing the captured image, or the real object 151 present in the field of view is specified by object recognition,
- and the virtual object 150 then appears on the display unit 909 accordingly, roughly along the lines of the lookup sketched below.
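- A toy Python sketch of this basic flow follows: decode identification information found in the camera frame, look up the virtual object associated with that real object, and hand it to the display side. The decoder and the library contents are stand-ins invented for the example, not data from this disclosure.

```python
# Toy sketch: identification info in the captured frame -> real object ID ->
# corresponding virtual object from a local (or server-provided) library.

VIRTUAL_OBJECT_LIBRARY = {
    "marker-151": {"name": "pet character", "anchor": "on top of real object 151"},
}

def decode_identification(frame):
    # Stand-in for barcode reading / object recognition on the captured frame.
    return "marker-151" if "151" in frame else None

def on_camera_frame(frame):
    object_id = decode_identification(frame)
    if object_id is None:
        return None                                   # no known real object in view
    return VIRTUAL_OBJECT_LIBRARY.get(object_id)      # virtual object to display

print(on_camera_frame("frame containing real object 151"))
print(on_camera_frame("empty desk"))
```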
- the display device 201 or the display device 402 can be configured by using an information device (for example, see Patent Document 6) such as a smartphone or a digital camera that displays a captured image of the camera through.
- FIG. 10 schematically shows an internal configuration example of the output device 202 or 403 configured as a dedicated hardware device.
- the output device 202 is a hardware device that represents an interaction performed by the virtual object 150 in the real space, and is integrated with or built in the real object 151, for example.
- the output device 202 performs a reaction on the action of the virtual object 150 or performs an action on the virtual object 150.
- the illustrated output device 202 includes a control unit 1001, a communication unit 1002, a state detection unit 1003, an environment detection unit 1004, an actuator unit 1020, and a drive control unit 1010.
- the control unit 1001 includes a ROM 1001A and a RAM 1001B.
- The ROM 1001A stores program codes executed by the control unit 1001 and various data.
- the control unit 1001 controls the operation of the actuator unit 1020 via the drive control unit 1010 by executing a program loaded into the RAM 1001B.
- Specifically, the control unit 1001 controls the operation of the actuator unit 1020 so as to express a reaction to an action performed by the virtual object 150 on the real object 151, an action performed by the real object 151 on the virtual object 150, or the like.
- Further, data such as operation patterns of the actuator unit 1020 for expressing actions or reactions to be performed on the virtual object 150 through the real object 151 may also be stored in the ROM 1001A.
- the state detection unit 1003 detects the state of the real object 151 on which the output device 202 is mounted or a change in the state.
- The state detection unit 1003 includes one or more state sensors such as a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a temperature sensor that detects body temperature or air temperature, a sweat sensor, a myoelectric sensor, an electro-oculogram sensor, an electroencephalogram sensor, a breath sensor, and a gas/ion concentration sensor.
- the environment detection unit 1004 detects one or more environmental factors surrounding the real object 151 on which the output device 202 is mounted, or the virtual object 150 performing a physical operation on the real object 151, or a change thereof.
- the actuator unit 1020 includes one or more output device groups for outputting a reaction or action of the real object 151 with respect to the virtual object 150.
- FIG. 11 schematically shows a configuration example of the actuator unit 1020.
- The actuator unit 1020 includes at least one output device among a vibration device 1101, a heat generation device 1102, a cooling device 1103, a blower device 1104 such as a fan, an acoustic device 1105 such as a speaker, a light emitting device 1106, a moving device 1107 such as wheels, and a pulse generation device 1108.
- the control unit 1001 can control the operations of these output devices 1101 to 1108 through the drive control unit 1010.
- The actuator unit 1020 can generate outputs such as vibration, heat generation or cooling, wind, sound, light emission, movement,
- and jumping on the real object 151 on which the virtual object 150 performs an action, as a reaction or an action toward the virtual object 150.
- The more types of output devices the actuator unit 1020 is equipped with, and the more of them are operated in combination, the more varied the interactions that can be presented, as sketched below.
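- The sketch below illustrates one way such combinations could be organized: a table mapping an action of the virtual object to the set of output devices to drive together. The action names, levels, and print-based device handles are invented for the example and are not part of this disclosure.

```python
# Illustrative sketch: expressing one action of the virtual object through a
# combination of output devices in the actuator unit (1020).

RECIPES = {
    # action of the virtual object -> (device, drive level) pairs
    "stamp":    [("vibration", 0.6), ("sound", 0.4)],
    "run_past": [("vibration", 0.8), ("sound", 0.9), ("blower", 0.7), ("light", 0.3)],
    "excited":  [("heat", 0.5)],
    "calmed":   [("cooling", 0.5)],
}

def express(action, devices):
    for name, level in RECIPES.get(action, []):
        devices[name](level)

# Print-based stand-ins for the real actuator drivers.
devices = {name: (lambda n: (lambda v: print(f"{n}: level {v}")))(name)
           for name in ["vibration", "sound", "blower", "light", "heat", "cooling"]}

express("run_past", devices)   # several devices together give a richer effect
```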
- When the control unit 1001 receives, from the display device 201 or 402 via the communication unit 1002, a detection result of an action performed by the virtual object 150 in the real space (for example, a physical operation performed on the real object 151), it calculates the reaction to that action and controls the operation of the actuator unit 1020 via the drive control unit 1010. Alternatively, when the control unit 1001 receives a reaction calculation result from the display device 201 or 402 via the communication unit 1002, it controls the operation of the actuator unit 1020 accordingly.
- Likewise, when the control unit 1001 receives, from the display device 201 or 402 via the communication unit 1002, information on the user's state or a change in that state, or information on the environment surrounding the display device and the user or a change in that environment, it calculates the reaction or action of the real object 151 corresponding to that state or environment (or its change) and controls the operation of the actuator unit 1020 via the drive control unit 1010. Alternatively, when the control unit 1001 receives from the display device 201 or 402 the calculation result of a reaction or action corresponding to the state or environment, it controls the operation of the actuator unit 1020 accordingly.
- the state of the user referred to here includes a physical operation such as an action or a reaction performed on the virtual object 150 by the user himself / herself.
- The control unit 1001 also calculates a reaction or action according to the state of the real object 151, or its change, detected by the state detection unit 1003, or according to the environmental factors surrounding the virtual object 150, or their change, detected by the environment detection unit 1004, and controls the operation of the actuator unit 1020 accordingly.
- the control unit 1001 may feed back the action or reaction of the real object 151 performed using the actuator unit 1020 to the display devices 201 and 402 via the communication unit 1002.
- The shape and size of the output device 202, or of the real object 151 on which the output device 202 is mounted, are arbitrary (hereinafter, the output device 202 and the real object 151 on which it is mounted are collectively referred to as the "output device 202").
- For example, a flat, cushion-like output device 202 as shown in FIG. 12A and a disk-shaped output device 202 as shown in FIG. 12B are conceivable.
- the appropriate size of the output device 202 varies depending on the size of the virtual object 150 that performs an action or receives a reaction. For example, when the virtual object 150 has a palm size, the output device 202 may also have a palm size, as shown in FIG. 12C.
- the upper surface is an operation surface 1201 for performing physical operations (actions performed by the virtual object 150 and reactions performed by the output device 202 on the virtual object 150).
- the virtual object 150 rides on the operation surface 1201 and performs various activities such as walking, running, jumping, and playing a musical instrument.
- In FIGS. 12A to 12C, only a single virtual object 150 appears on one output device 202 (real object 151), but a scene in which two or more virtual objects appear and coexist simultaneously on the same output device is also assumed.
- one of the plurality of virtual objects is the user's own avatar or the character of the game being played by the user, and the other virtual objects are other users' avatars or game characters.
- In the examples described so far, the real object 151 includes the output unit 105 or is integrated with the output unit 105, so that it can react to the action of the virtual object 150 appearing on it, or perform an action on the virtual object 150.
- Alternatively, the real object 151 may not include the output unit 105: it may simply be an object on which the identification information 152 is arranged, or an object that can be recognized, serving only as a stage on which the virtual object 150 appears and performing no physical operation on it.
- the real object 151 may be a region 2601 in which the identification information 152 is disposed indoors as shown in FIG. 26, or a place that can be specified by image recognition or the like.
- the virtual object 150 is displayed so as to overlap the area 2601.
- the place where the identification information 152 is disposed is merely an anchor for calling the virtual object 150, and no physical operation is performed on the virtual object 150 (no interaction with the virtual object 150 is performed).
- a user wearing a head-mounted display 2701 turns to the right and looks at the right shoulder.
- the fact that the user has turned to the right can be detected based on, for example, recognition processing of a captured image of the outer camera 912 or sensor output such as a gyro sensor.
- the head mounted display 2701 displays the virtual object 150 so as to be placed directly on the right shoulder of the user.
- the appearance of turning to the right corresponds to the PULL type (described above).
- In response, a voice spoken by the virtual object 150 is generated from near the user's right shoulder.
- Alternatively, the virtual object 150 may speak first, and its display may start when the user turns toward the right shoulder in response; it is a natural interaction for the virtual object 150 to speak to the user from a fixed direction such as the right shoulder.
- The appearance method in which the virtual object 150 autonomously speaks to the user when it wants to appear is classified as the PUSH type (described above). Further, when the user turns away from the right shoulder and the virtual object 150 placed on the right shoulder goes out of the field of view, the display of the virtual object 150 disappears.
- As another example, when it is detected that the user wearing the head-mounted display 2801 is looking at his or her palm 2802, the head-mounted display 2801
- displays the virtual object 150 so that it appears to be placed directly on the user's palm 2802. At this time, whether the palm really belongs to the user may be confirmed by, for example, authentication processing of a fingerprint captured by the outer camera 912.
- Alternatively, a rougher method may be used in which the virtual object 150 is displayed on any palm; palms can be identified by machine learning. Further, the display of the virtual object 150 may be extinguished when the user withdraws the palm or closes the hand.
- Instead of driving a hardware device such as the actuator unit 1020, an action on the virtual object 150 can also be performed by human motions,
- such as raising and lowering or shaking the right shoulder, raising and lowering the palm, or bringing both hands together.
- An action performed by the user on the virtual object 150 can be detected based on, for example, the image captured by the outer camera 912.
- In response to the detected action on the virtual object 150, the virtual object control unit 101 may control reaction motions of the virtual object 150, such as staggering on the right shoulder or the palm, jumping, or tumbling down.
- the vibration device 1101 is configured by a combination of one or more devices that convert an electrical signal into force or mechanical distortion, such as a piezoelectric element.
- For example, the operation surface 1201 of the output device 202 is disposed so as to be supported by three or more piezoelectric elements 1301, 1302, and 1303. Then, as shown in FIG. 14, by driving the piezoelectric elements 1301, 1302, and 1303 with different strokes, a freely inclined surface can be expressed on the operation surface 1201.
- When the operation surface 1201 is tilted in this way, the walking virtual object 150 will fall off the operation surface 1201 unless it leans forward, changes the ground contact position of its feet, or jumps.
- The detection unit 104 detects that the operation surface 1201 of the real object 151 has been tilted, that is, an action of the output device 202 (real object 151) on the virtual object 150, based on the recognition result of the image captured by the outer camera 912 or the like. Alternatively, the detection unit 104 receives from the output control unit 106 a notification that the actuator unit 1020 has been driven so as to tilt the operation surface 1201. The detection unit 104 then notifies the virtual object control unit 101 of the action that the output device 202 has performed on the virtual object 150. In response, the virtual object control unit 101 controls the virtual object 150 so that it moves or changes its posture so as not to fall off, or so that it falls off without changing its posture.
- the display unit 103 displays a virtual object 150 that takes a reaction.
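- As a geometric illustration only: given the strokes of three supporting piezo elements, the tilt of the operation surface follows from the plane through the three support points, and a simple threshold can decide whether the virtual object needs to react. Support coordinates and the threshold below are arbitrary assumptions.

```python
import numpy as np

# Sketch: compute the tilt of an operation surface supported at three points by
# piezo elements 1301-1303 from their strokes (heights), and decide whether the
# standing virtual object should react (lean, change footing, jump) or fall.

def surface_tilt_deg(strokes_mm, support_xy_mm):
    p = np.column_stack([support_xy_mm, strokes_mm])   # three 3-D support points
    normal = np.cross(p[1] - p[0], p[2] - p[0])
    normal /= np.linalg.norm(normal)
    # Angle between the surface normal and the vertical (z) axis.
    return float(np.degrees(np.arccos(abs(normal[2]))))

supports = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 86.6]])  # triangle (mm)
tilt = surface_tilt_deg(np.array([0.0, 0.0, 15.0]), supports)
print(f"tilt = {tilt:.1f} deg")
if tilt > 8.0:          # assumed threshold at which the virtual object must react
    print("virtual object: lean forward / change footing / jump, or fall off")
```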
- The operation of the actuator unit 1020 serves not only as an action on the virtual object 150 but also as tactile feedback to the user.
- For example, when the virtual object 150 stamps its feet on the operation surface, the vibration device 1101 vibrates
- and the tactile sensation 1601 of the stamping is transmitted from the lower surface of the output device 202 (real object 151) to the user's palm.
- At the same time, the acoustic device 1105 may generate a stamping sound (a pseudo sound such as "thump, thump") 1602 to enhance the sense of presence. Further, the heat generation device 1102 may generate heat to express, by temperature, that a physical operation is being received from the virtual object 150. Moreover, by displaying the virtual object 150 in constant motion, such as continuing to stamp its feet, it is possible to give the impression that the virtual object 150 is interacting without delay.
- Also, when the virtual object 150 passes by, the vibration 1701 of the vibration device 1101 and a louder footstep sound 1702 from the acoustic device 1105 may be generated, the blower device 1104 may generate a breeze 1703, the heat generation device 1102 may generate heat, or the light emitting device 1106 may emit a flashing light 1704, so that the virtual object 150 seems to have actually passed by.
- The output device 202 can also convey the mental state of the virtual object 150 to the user's palm by temperature, using the heat generation device 1102 and the cooling device 1103.
- For example, when the heat generation device 1102 generates heat,
- the warmth 1801 conveys the excited state of the virtual object 150 from the palm to the user (see FIG. 18).
- Conversely, when the cooling device 1103 cools the user's palm,
- the coolness 1901 can convey from the palm to the user that the virtual object 150 has cooled down (or is freezing) (see FIG. 19).
- the output device 202 can add a sound effect or the like to the action or reaction performed by the virtual object 150 using the acoustic device 1105 to give a sense of reality.
- For example, when the virtual object 150 is playing a musical instrument such as the violin 2000,
- the sounds of other musical instruments 2001, 2002, and so on can be emitted from the acoustic device 1105 to enhance the performance (see FIG. 20).
- In addition, the sound of applause 2010 may be emitted from the acoustic device 1105 in synchronization with the timing at which the state information acquisition unit 904 or the like detects that the user is impressed by the performance of the virtual object 150, further enhancing the atmosphere.
- FIG. 10 and FIG. 11 show a configuration example of a high-spec output device 202 equipped with a plurality of output devices 1101, 1102, and so on.
- When the actuator unit 1020 is equipped with various types of output devices, the actions and reactions of the virtual object 150 in the real space can be expressed in more diverse ways, as can be seen with reference to the figures described above.
- the output device 202 can be substituted by an information terminal equipped with only a limited output device.
- a flat palm-sized information processing device such as a smartphone equipped with a vibrator function such as an eccentric motor can be used as the output device 202 according to the present embodiment. .
- If the virtual object 150 does not operate in synchronization with the action or reaction performed by the output device 403, the motion of the virtual object 150 appears unnatural.
- the display device 402 is configured as a head-mounted display, when the user swings, the display area of the virtual object 150 is moved so as to cancel the head movement detected by the gyro sensor or the like. Thus, the virtual object 150 that follows the movement of the user's head can be presented.
- For example, when the user turns the head to the right, the display area of the virtual object 150 moves to the opposite, left side; since the virtual object 150 thus stays on the real object 151 where it appeared, it looks to the user as if it exists in the real space.
- Conversely, if the display area of the virtual object 150 simply follows the movement of the user's head, that is, the field of view (so that the virtual object 150 has to be brought back to the position of the original real object 151 afterwards),
- the virtual object 150 gives a strong impression of being merely a display image of the head-mounted display 2101 and does not appear to exist in the real space.
- Moreover, if the movement of the image lags behind the movement of the head, the display becomes unnatural.
- When the control device 401 that controls the virtual object 150 and the display device 402 that displays the virtual object 150 are configured as physically independent devices, a delay due to communication processing also occurs, so this latency problem becomes more serious.
- FIG. 23 shows a state in which the movement of the image of the virtual object 150 is delayed with respect to the movement of the user's head.
- the actual movement of the user's head is indicated by a solid line 2301
- the detected amount of head movement recognized from the captured image of a camera is indicated by a dashed line 2302;
- with image recognition, the correct value is known only after a time delay.
- the movement of the user's head itself can be detected by a sensor such as a gyro sensor (described above).
- a detected amount of head movement by a sensor such as a gyro sensor is indicated by a dotted line 2303.
- Since the gyro sensor is a relative sensor, it can detect the movement of the user's head per unit time, but the absolute coordinates are unknown, and there is a concern that the estimate will gradually drift.
- Therefore, a process of matching the gyro output to the correct value may be performed in parallel using the image recognition result indicated by reference numeral 2302. Specifically, servo control is applied so that the difference between the amount of head movement detected by the gyro sensor and the amount detected by image recognition converges to zero within a certain time, that is, so that the difference is zero when the head is stationary.
- In addition, applying differential control to the detected gyro value (motion prediction, such as feedforward use of differential control in which a differential signal is superimposed) also has a certain effect. A rough sketch of this kind of fusion follows.
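- The sketch below is a hedged, simplified illustration of the idea: integrate the gyro for a fast estimate, servo the difference against the delayed image-recognition value toward zero, and add a differential term as a predictive correction. Gains, the bias, and the delay model are assumptions.

```python
# Sketch: the gyro gives fast but drifting yaw estimates; image recognition gives
# the correct value only after a delay. Servo the integrated gyro value so its
# difference from the image value converges to zero, plus a differential term.

def fuse(yaw_est, gyro_rate_dps, image_yaw_delayed, prev_diff, dt,
         k_servo=2.0, k_diff=0.05):
    yaw_est += gyro_rate_dps * dt              # fast path: integrate the gyro
    diff = image_yaw_delayed - yaw_est         # error vs. delayed image recognition
    yaw_est += k_servo * diff * dt             # servo the difference toward zero
    yaw_est += k_diff * (diff - prev_diff)     # differential (predictive) term
    return yaw_est, diff

yaw, prev_diff, true_yaw = 0.0, 0.0, 0.0
for step in range(300):
    true_yaw += 20.0 * 0.01                    # head turning at 20 deg/s
    gyro = 20.0 + 0.5                          # gyro with a 0.5 deg/s bias
    image = true_yaw - 20.0 * 0.05             # image value arriving ~50 ms late
    yaw, prev_diff = fuse(yaw, gyro, image, prev_diff, dt=0.01)
print(f"estimate {yaw:.2f} deg vs true {true_yaw:.2f} deg")
```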
- the virtual object 150 takes a reaction to the action of the real object 151, there is a latency problem as well.
- For example, when the real object 151 is tilted, the virtual object 150 performs a reaction such as leaning forward, changing the ground contact position of its feet, jumping, or falling down.
- If the delay until the virtual object 150 takes a reaction to the movement of the real object 151 is too long, the motion of the virtual object 150 does not look like a reaction and the image appears unnatural. If the real object 151 is photographed with a camera and its movement is detected by image recognition, the correct value is known only after a time delay.
- Moreover, when the detection unit 104 in the display device 402 detects the action of the real object 151 from the camera image or the like, the control device 401 generates the corresponding motion of the virtual object 150, and the display device 402 displays it,
- so a delay occurs due to the arithmetic processing in each of the devices 401 and 402 and the communication processing between them.
- Therefore, a change in the position or orientation of the real object 151 detected by the gyro sensor (described above) provided as the state detection unit 415 in the output device 403 may be used
- to control the motion of the corresponding virtual object 150.
- That is, in synchronization with this detection, the virtual object 150 is caused to take a reaction such as leaning forward, changing the ground contact position of its feet, jumping, or falling down.
- However, since the gyro sensor is a relative sensor, it can detect only the movement per unit time; the absolute coordinates are not known, so the detected value should be matched against the image information, as described above.
- the technology disclosed in this specification relates to a technology for presenting an interaction between a virtual object and a real space.
- As system configurations for realizing the technology, the configurations shown in FIG. 2, FIG. 4, FIG. 5, and FIG. 29 have been described as examples,
- but the implementation is not limited to these.
- a head-mounted display is used as a display device that presents a virtual object to the user.
- the technology disclosed in this specification is not limited to this.
- Various information terminals having a display screen such as a smartphone, a tablet terminal, and a game controller can be used as a display device.
- FIGS. 12, 26, 27, and 28 are given as real objects in which virtual objects appear, but the technology disclosed in this specification is not limited to these.
- (1) An information processing apparatus comprising: an output unit that applies an action to a real object; and a control unit that controls the output from the output unit according to an action that a virtual object has performed on the real object, or an action that the real object performs on the virtual object.
- (2) The information processing apparatus according to (1) above, further comprising identification information for identifying the information processing apparatus.
- (3) The information processing apparatus according to (1) above, further comprising a receiving unit that receives a detection result of an action that the virtual object performs on the real object, wherein the control unit controls the output from the output unit according to the received detection result.
- (4) The information processing apparatus according to (1) above, further comprising a virtual object control unit that controls the operation of the virtual object, wherein the control unit controls the output from the output unit according to the operation of the virtual object controlled by the virtual object control unit.
- (5) The information processing apparatus according to (4) above, wherein the virtual object control unit controls the appearance and disappearance of the virtual object.
- (6) The information processing apparatus according to (1) above, wherein the output unit includes at least one output device, incorporated in the real object, among a vibration device, a pulse generation device, a heat generation device, a cooling device, a blower device, an acoustic device, a light emitting device, and a moving device.
- (7) The information processing apparatus according to (1) above, wherein the control unit controls the output from the output unit in synchronization with the operation of the virtual object displayed by a display device.
- (8) An information processing method comprising: a step of acquiring an action that a virtual object has performed on a real object, or an action that the real object performs on the virtual object; and a step of applying an action to the real object according to the acquired action.
- (9) A display device comprising: a detection unit that detects a specific real object; and a display unit that displays a virtual object in response to detecting the specific real object.
- (10) The display device according to (9) above, wherein the detection unit identifies the real object, and the display unit displays the virtual object corresponding to the identified real object.
- (11) The display device according to (9) above, wherein the detection unit identifies the real object based on identification information included in the real object, or identifies the real object by object recognition.
- (12) The display device according to (9) above, wherein the detection unit detects the real object from within the user's field of view, and the display unit displays the virtual object superimposed on the real object.
- (13) The display device according to (9) above, further comprising a virtual object control unit that controls the operation of the virtual object.
- (14) The display device according to (13) above, wherein the virtual object control unit controls the appearance and disappearance of the virtual object.
- (15) The display device according to (13) above, wherein the virtual object control unit controls the appearance or disappearance of the virtual object according to the user's action.
- (16) The display device according to (13) above, wherein the virtual object control unit controls the amount of information of the virtual object displayed by the display unit according to the user's action or situation, or the time of day.
- (17) The display device according to (13) above, wherein the virtual object control unit controls the operation of the virtual object with respect to the real object.
- (18) The display device according to (13) above, wherein the virtual object control unit controls the operation of the virtual object according to an action that the virtual object receives from the real object.
- (19) The display device according to (13) above, wherein the detection unit detects an action of the virtual object on the real object, or an action that the virtual object receives from the real object, and the virtual object control unit controls the operation of the virtual object based on the detection result of the detection unit.
- (20) The display device according to (13) above, wherein the virtual object control unit controls the operation of the virtual object so as to be synchronized with the operation of the real object.
- (21) The display device according to (9) above, further comprising a transmission unit that transmits the detection result of the detection unit to an external device.
- (22) The display device according to (9) above, wherein the virtual object displayed on the display unit is kept in constant motion.
- (23) The display device according to (9) above, wherein the display unit is used by being worn on the user's head or face.
- (24) The display device according to (23) above, further comprising a position and orientation detection unit that detects the position and orientation of the user's head or face, wherein the display unit corrects the display of the virtual object in the direction opposite to the change in the position or orientation of the user's head or face.
- (25) A display method comprising: a detection step of detecting a specific real object; and a display step of displaying a virtual object in response to detecting the specific real object.
- (26) An information processing system comprising: a control device that controls the operation of a virtual object; a display device that detects a real object and displays the corresponding virtual object; and an output device that applies an action to the real object according to an action that the virtual object has performed on the real object, or an action that the real object performs on the virtual object.
- (27) An information processing system comprising: a display device that detects a real object, displays the corresponding virtual object, and controls the operation of the virtual object; and an output device that applies an action to the real object according to an action that the virtual object has performed on the real object, or an action that the real object performs on the virtual object.
- 1002… Communication unit, 1003… State detection unit, 1004… Environment detection unit, 1010… Drive control unit, 1020… Actuator unit, 1101… Vibration device, 1102… Heat generation device, 1103… Cooling device, 1104… Blower device, 1105… Acoustic device, 1106… Light emitting device, 1107… Moving device, 1108… Pulse generation device
Abstract
Description
An output unit that applies an action to a real object; and
a control unit that controls the output from the output unit according to an action that a virtual object has performed on the real object, or an action that the real object performs on the virtual object:
an information processing apparatus comprising the above.
A step of acquiring an action that a virtual object has performed on the real object, or an action that the real object performs on the virtual object; and
a step of applying an action to the real object according to the action that the virtual object has performed on the real object, or the action that the real object performs on the virtual object:
an information processing method comprising the above.
A detection unit that detects a specific real object; and
a display unit that displays a virtual object in response to detecting the specific real object:
a display device comprising the above.
A detection step of detecting a specific real object; and
a display step of displaying a virtual object in response to detecting the specific real object:
a display method comprising the above.
A control device that controls the operation of a virtual object;
a display device that detects a real object and displays the corresponding virtual object; and
an output device that applies an action to the real object according to an action that the virtual object has performed on the real object, or an action that the real object performs on the virtual object:
an information processing system comprising the above.
A display device that detects a real object, displays the corresponding virtual object, and controls the operation of the virtual object; and
an output device that applies an action to the real object according to an action that the virtual object has performed on the real object, or an action that the real object performs on the virtual object:
an information processing system comprising the above.
(2) An action or motion that the user is performing in the real world
(3) An action or motion that the user performs in the virtual space (for example, game operations performed via a mouse, touch panel, or other input device)
(4) The user's current state in the real space (or the virtual space)
(5) The environment of the real space (or the virtual space) in which the virtual object 150 is present
(6) An action or reaction that the real object 151 (the output unit 105) performs on the virtual object 150
(1) An information processing apparatus including: an output unit that applies an action to a real object; and a control unit that controls output from the output unit in accordance with an action performed by a virtual object on the real object or an action performed by the real object on the virtual object.
(2) The information processing apparatus according to (1) above, further including identification information for identifying the information processing apparatus.
(3) The information processing apparatus according to (1) above, further including a reception unit that receives a detection result of an action performed by the virtual object on the real object, wherein the control unit controls output from the output unit in accordance with the received detection result.
(4) The information processing apparatus according to (1) above, further including a virtual object control unit that controls the action of the virtual object, wherein the control unit controls output from the output unit in accordance with the action of the virtual object controlled by the virtual object control unit.
(5) The information processing apparatus according to (4) above, wherein the virtual object control unit controls appearance and disappearance of the virtual object.
(6) The information processing apparatus according to (1) above, wherein the output unit includes at least one output device, incorporated in the real object, among a vibration device, a pulse generating device, a heat generating device, a cooling device, an air blowing device, an acoustic device, a light emitting device, and a moving device.
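A minimal sketch of the output unit described in (6), assuming a hypothetical OutputDevice enumeration and OutputUnit class: the real object embeds only a subset of the listed device types, and a drive command is honored only for devices that are actually installed.

```python
from enum import Enum, auto

class OutputDevice(Enum):
    """The output device types enumerated in (6); a real object may embed any subset."""
    VIBRATION = auto()
    PULSE = auto()
    HEAT = auto()
    COOLING = auto()
    AIR_BLOW = auto()
    ACOUSTIC = auto()
    LIGHT = auto()
    MOVE = auto()

class OutputUnit:
    """Drives whichever output devices are actually built into the real object."""

    def __init__(self, available: set[OutputDevice]):
        self.available = available

    def drive(self, device: OutputDevice, intensity: float) -> bool:
        """Send a drive command; returns False if the device is not installed."""
        if device not in self.available:
            return False
        # Device-specific hardware access is omitted here.
        print(f"driving {device.name} at intensity {intensity:.2f}")
        return True
```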
(7) The information processing apparatus according to (1) above, wherein the control unit controls output from the output unit in synchronization with the action of the virtual object displayed by a display apparatus.
(8) An information processing method including: a step of acquiring an action performed by a virtual object on a real object or an action performed by the real object on the virtual object; and a step of applying an action to the real object in accordance with the action performed by the virtual object on the real object or the action performed by the real object on the virtual object.
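The two steps of (8) can be pictured as a small event loop on the output side. This is an illustrative sketch only; the event queue, the action names, and the action-to-output mapping are hypothetical.

```python
import queue

# Hypothetical queue of actions that the virtual object has performed on the
# real object (or that the real object performed on the virtual object),
# delivered for example over the communication unit.
action_events: "queue.Queue[str]" = queue.Queue()

# Hypothetical mapping from an acquired action to the physical output that is
# applied to the real object in response.
ACTION_TO_OUTPUT = {
    "virtual_pet_jumps_on": ("vibration", 0.6),
    "virtual_pet_breathes_on": ("air_blow", 0.4),
    "real_object_shaken": ("acoustic", 0.8),   # e.g. play a startled cry
}

def apply_output(device: str, intensity: float) -> None:
    """Placeholder for the device-specific actuator drive call."""
    print(f"apply {device} at {intensity:.1f}")

def processing_loop() -> None:
    """The two steps of the method: acquire the action, then apply an action."""
    while not action_events.empty():
        action = action_events.get()                        # acquiring step
        device, intensity = ACTION_TO_OUTPUT.get(action, ("acoustic", 0.1))
        apply_output(device, intensity)                     # applying step
```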
(9) A display apparatus including: a detection unit that detects a specific real object; and a display unit that displays a virtual object in response to detection of the specific real object.
(10) The display apparatus according to (9) above, wherein the detection unit identifies the real object, and the display unit displays the virtual object corresponding to the identified real object.
(11) The display apparatus according to (9) above, wherein the detection unit identifies the real object on the basis of identification information provided on the real object, or identifies the real object by object recognition.
(12) The display apparatus according to (9) above, wherein the detection unit detects the real object within the user's field of view, and the display unit displays the virtual object superimposed on the real object.
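As an illustration of (9) through (12), the following sketch maps an identified real object to its virtual object and computes where to draw it so that it appears superimposed on (here, standing on top of) the detected object. The catalogue, the identifiers, and the simple image-plane anchoring are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DetectedObject:
    """Result of the detection unit: where the real object sits in the camera image."""
    object_id: str
    x: float        # image-plane position of the bounding box (pixels)
    y: float
    width: float
    height: float

# Hypothetical catalogue mapping an identified real object to its virtual object.
VIRTUAL_OBJECT_FOR = {
    "toy_house_001": "pet_character_A",
    "cushion_002": "pet_character_B",
}

def place_virtual_object(detection: DetectedObject) -> Optional[Tuple[str, Tuple[float, float]]]:
    """Return (virtual object id, draw position) so that the virtual object is
    rendered superimposed on the detected real object."""
    virtual_id = VIRTUAL_OBJECT_FOR.get(detection.object_id)
    if virtual_id is None:
        return None                                      # unknown object: display nothing
    anchor_x = detection.x + detection.width / 2.0       # horizontal centre of the object
    anchor_y = detection.y                               # top edge of the real object
    return virtual_id, (anchor_x, anchor_y)
```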
(13) The display apparatus according to (9) above, further including a virtual object control unit that controls the action of the virtual object.
(14) The display apparatus according to (13) above, wherein the virtual object control unit controls appearance and disappearance of the virtual object.
(15) The display apparatus according to (13) above, wherein the virtual object control unit controls appearance or disappearance of the virtual object in accordance with the user's behavior.
(16) The display apparatus according to (13) above, wherein the virtual object control unit controls the amount of information of the virtual object displayed by the display unit in accordance with the user's behavior or situation, or the time of day.
(17) The display apparatus according to (13) above, wherein the virtual object control unit controls the action of the virtual object on the real object.
(18) The display apparatus according to (13) above, wherein the virtual object control unit controls the action of the virtual object in accordance with an action that the virtual object receives from the real object.
(19) The display apparatus according to (13) above, wherein the detection unit detects an action of the virtual object on the real object or an action that the virtual object receives from the real object, and the virtual object control unit controls the action of the virtual object on the basis of the detection result of the detection unit.
(20) The display apparatus according to (13) above, wherein the virtual object control unit controls the action of the virtual object so as to synchronize with the action of the real object.
(21) The display apparatus according to (9) above, further including a transmission unit that transmits the detection result of the detection unit to an external device.
(22) The display apparatus according to (9) above, wherein the virtual object displayed on the display unit is kept in motion at all times.
(23) The display apparatus according to (9) above, wherein the display unit is used by being worn on the user's head or face.
(24) The display apparatus according to (23) above, further including a position and posture detection unit that detects the position and posture of the user's head or face, wherein the display unit corrects the display of the virtual object in the direction opposite to a change in the position or posture of the user's head or face.
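Item (24) amounts to keeping the virtual object registered to the real world while the head moves. Below is a minimal sketch assuming a simple pinhole mapping of yaw angle to panel pixels; the field-of-view and panel-width values are placeholders, not values taken from the disclosure.

```python
FOV_DEG = 90.0          # assumed horizontal field of view of the display panel
SCREEN_WIDTH_PX = 1920  # assumed panel width in pixels

def corrected_screen_x(object_world_yaw_deg: float, head_yaw_deg: float) -> float:
    """Shift the virtual object opposite to the head's yaw change so that it
    stays registered to the real object while the head turns.

    object_world_yaw_deg: direction of the real object in world coordinates.
    head_yaw_deg:         current head yaw from the position/posture detection unit.
    """
    # The apparent direction of the object moves opposite to the head motion.
    relative_yaw = object_world_yaw_deg - head_yaw_deg
    # Map the relative direction onto the panel (simple pinhole approximation).
    px_per_deg = SCREEN_WIDTH_PX / FOV_DEG
    return SCREEN_WIDTH_PX / 2.0 + relative_yaw * px_per_deg
```

As head_yaw_deg increases (the user turns right), relative_yaw decreases and the object is drawn further to the left; that is, the display is corrected in the direction opposite to the change in posture.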
(25) A display method including: a detection step of detecting a specific real object; and a display step of displaying a virtual object in response to detection of the specific real object.
(26) An information processing system including: a control device that controls the action of a virtual object; a display device that detects a real object and displays the corresponding virtual object; and an output device that applies an action to the real object in accordance with an action performed by the virtual object on the real object or an action performed by the real object on the virtual object.
(27) An information processing system including: a display device that detects a real object, displays the corresponding virtual object, and controls the action of the virtual object; and an output device that applies an action to the real object in accordance with an action performed by the virtual object on the real object or an action performed by the real object on the virtual object.
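One plausible way to wire the systems of (26) and (27) is a small message exchange between the display device and the output device. The sketch below uses JSON over UDP purely for illustration; the address, the message fields, and the actuator mapping are all hypothetical.

```python
import json
import socket

OUTPUT_DEVICE_ADDR = ("192.168.0.50", 9000)   # hypothetical address of the output device

def notify_output_device(target_id: str, action: str, intensity: float) -> None:
    """Display-device side: report an action that the virtual object performed on
    the real object so the output device can apply a matching physical action."""
    message = {
        "target_id": target_id,    # identification information of the real object
        "action": action,          # e.g. "jump_on", "scratch"
        "intensity": intensity,
    }
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(json.dumps(message).encode("utf-8"), OUTPUT_DEVICE_ADDR)

def drive_actuator(device: str, intensity: float) -> None:
    """Placeholder for the device-specific drive control of the embedded actuator."""
    print(f"{device} <- {intensity:.2f}")

def handle_message(raw: bytes) -> None:
    """Output-device side: decode the report and drive the embedded actuator,
    synchronizing the physical action with the displayed virtual object."""
    message = json.loads(raw.decode("utf-8"))
    if message["action"] == "jump_on":
        drive_actuator("vibration", message["intensity"])
    elif message["action"] == "scratch":
        drive_actuator("acoustic", message["intensity"])
```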
103 ... Display unit, 104 ... Detection unit
105 ... Output unit, 105 ... Output control unit
107 ... Environment detection unit, 108 ... State detection unit
150 ... Virtual object, 151 ... Real object, 152 ... Identification information
201 ... Display device, 202 ... Output device, 213 ... Environment detection unit
401 ... Control device, 402 ... Display device, 403 ... Output device
413 ... Environment detection unit, 414 ... State detection unit, 415 ... State detection unit
700 ... Image display device (see-through type)
701L, 701R ... Virtual image optical units, 702 ... Support body
703L, 703R ... Microphones, 704L, 704R ... Display panels
901 ... Control unit, 901A ... ROM, 901B ... RAM
902 ... Input operation unit, 903 ... Remote control receiving unit
904 ... State information acquisition unit, 905 ... Communication unit, 906 ... Storage unit
907 ... Image processing unit, 908 ... Display driving unit
909 ... Display unit, 910 ... Virtual image optical unit, 912 ... Outer camera
913 ... Audio processing unit, 914 ... Audio input/output unit
915 ... Outer display unit, 916 ... Environment information acquisition unit
1001 ... Control unit, 1001A ... ROM, 1001B ... RAM
1002 ... Communication unit, 1003 ... State detection unit, 1004 ... Environment detection unit
1010 ... Drive control unit, 1020 ... Actuator unit
1101 ... Vibration device, 1102 ... Heat generating device
1103 ... Cooling device, 1104 ... Air blowing device
1105 ... Acoustic device, 1106 ... Light emitting device
1107 ... Moving device, 1108 ... Pulse generating device
Claims (20)
- An information processing apparatus comprising: an output unit that applies an action to a real object; and a control unit that controls output from the output unit in accordance with an action performed by a virtual object on the real object or an action performed by the real object on the virtual object.
- The information processing apparatus according to claim 1, further comprising identification information for identifying the information processing apparatus.
- The information processing apparatus according to claim 1, further comprising a reception unit that receives a detection result of an action performed by the virtual object on the real object, wherein the control unit controls output from the output unit in accordance with the received detection result.
- The information processing apparatus according to claim 1, further comprising a virtual object control unit that controls the action of the virtual object, wherein the control unit controls output from the output unit in accordance with the action of the virtual object controlled by the virtual object control unit.
- The information processing apparatus according to claim 1, wherein the output unit includes at least one output device, incorporated in the real object, among a vibration device, a pulse generating device, a heat generating device, a cooling device, an air blowing device, an acoustic device, a light emitting device, and a moving device.
- An information processing method comprising: a step of acquiring an action performed by a virtual object on a real object or an action performed by the real object on the virtual object; and a step of applying an action to the real object in accordance with the action performed by the virtual object on the real object or the action performed by the real object on the virtual object.
- A display apparatus comprising: a detection unit that detects a specific real object; and a display unit that displays a virtual object in response to detection of the specific real object.
- The display apparatus according to claim 7, wherein the detection unit identifies the real object, and the display unit displays the virtual object corresponding to the identified real object.
- The display apparatus according to claim 7, wherein the detection unit identifies the real object on the basis of identification information provided on the real object, or identifies the real object by object recognition.
- The display apparatus according to claim 7, wherein the detection unit detects the real object within the user's field of view, and the display unit displays the virtual object superimposed on the real object.
- The display apparatus according to claim 7, further comprising a virtual object control unit that controls the action of the virtual object.
- The display apparatus according to claim 11, wherein the virtual object control unit controls appearance or disappearance of the virtual object in accordance with the user's behavior.
- The display apparatus according to claim 11, wherein the virtual object control unit controls the amount of information of the virtual object displayed by the display unit in accordance with the user's behavior or situation, or the time of day.
- The display apparatus according to claim 11, wherein the virtual object control unit controls the action of the virtual object on the real object, or controls the action of the virtual object in accordance with an action that the virtual object receives from the real object.
- The display apparatus according to claim 11, wherein the detection unit detects an action of the virtual object on the real object or an action that the virtual object receives from the real object, and the virtual object control unit controls the action of the virtual object on the basis of the detection result of the detection unit.
- The display apparatus according to claim 11, wherein the virtual object control unit controls the action of the virtual object so as to synchronize with the action of the real object.
- The display apparatus according to claim 7, wherein the display unit is used by being worn on the user's head or face, the display apparatus further comprises a position and posture detection unit that detects the position and posture of the user's head or face, and the display unit corrects the display of the virtual object in the direction opposite to a change in the position or posture of the user's head or face.
- A display method comprising: a detection step of detecting a specific real object; and a display step of displaying a virtual object in response to detection of the specific real object.
- An information processing system comprising: a control device that controls the action of a virtual object; a display device that detects a real object and displays the corresponding virtual object; and an output device that applies an action to the real object in accordance with an action performed by the virtual object on the real object or an action performed by the real object on the virtual object.
- An information processing system comprising: a display device that detects a real object, displays the corresponding virtual object, and controls the action of the virtual object; and an output device that applies an action to the real object in accordance with an action performed by the virtual object on the real object or an action performed by the real object on the virtual object.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/782,638 US10269180B2 (en) | 2013-04-16 | 2014-03-04 | Information processing apparatus and information processing method, display apparatus and display method, and information processing system |
BR112015025869A BR112015025869A2 (pt) | 2013-04-16 | 2014-03-04 | aparelhos de processamento de informação e de exibição, métodos para processamento de informação e para exibição, e, sistema de processamento de informação |
EP14785048.1A EP2988275A4 (en) | 2013-04-16 | 2014-03-04 | INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD, DISPLAY DEVICE AND DISPLAY METHOD AND INFORMATION PROCESSING SYSTEM |
JP2015512346A JP6217747B2 (ja) | 2013-04-16 | 2014-03-04 | 情報処理装置及び情報処理方法 |
CN201480020555.0A CN105144248B (zh) | 2013-04-16 | 2014-03-04 | 信息处理设备和信息处理方法、显示设备和显示方法与信息处理系统 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-086023 | 2013-04-16 | ||
JP2013086023 | 2013-04-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014171200A1 true WO2014171200A1 (ja) | 2014-10-23 |
Family
ID=51731161
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/055352 WO2014171200A1 (ja) | 2013-04-16 | 2014-03-04 | 情報処理装置及び情報処理方法、表示装置及び表示方法、並びに情報処理システム |
Country Status (6)
Country | Link |
---|---|
US (1) | US10269180B2 (ja) |
EP (1) | EP2988275A4 (ja) |
JP (1) | JP6217747B2 (ja) |
CN (1) | CN105144248B (ja) |
BR (1) | BR112015025869A2 (ja) |
WO (1) | WO2014171200A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016126772A (ja) * | 2014-12-31 | 2016-07-11 | イマージョン コーポレーションImmersion Corporation | 拡張及び仮想現実アプリケーションのための触覚的に向上されたオブジェクトを生成するシステム及び方法 |
JP2017033334A (ja) * | 2015-08-03 | 2017-02-09 | 株式会社オプティム | ヘッドマウントディスプレイ、データ出力方法、及びヘッドマウントディスプレイ用プログラム。 |
WO2017076785A1 (de) * | 2015-11-07 | 2017-05-11 | Audi Ag | Virtual-reality-brille und verfahren zum betreiben einer virtual-reality-brille |
JP2018088102A (ja) * | 2016-11-28 | 2018-06-07 | 株式会社スクウェア・エニックス | プログラム、コンピュータ装置、及び、判定方法 |
US10235809B2 (en) | 2016-06-30 | 2019-03-19 | Microsoft Technology Licensing, Llc | Reality to virtual reality portal for dual presence of devices |
WO2019150781A1 (ja) * | 2018-01-30 | 2019-08-08 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
JP2019213231A (ja) * | 2014-11-07 | 2019-12-12 | ソニー株式会社 | 情報処理システム、制御方法、および記憶媒体 |
US20200053501A1 (en) * | 2016-11-16 | 2020-02-13 | Sony Corporation | Information processing apparatus, information processing method, and program |
JP2020511048A (ja) * | 2017-07-18 | 2020-04-09 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | バーチャルプロップ割り当て方法、サーバー、クライアント及び記憶媒体 |
US11054894B2 (en) | 2017-05-05 | 2021-07-06 | Microsoft Technology Licensing, Llc | Integrated mixed-input system |
Families Citing this family (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6138566B2 (ja) * | 2013-04-24 | 2017-05-31 | 川崎重工業株式会社 | 部品取付作業支援システムおよび部品取付方法 |
JP6396463B2 (ja) * | 2014-07-01 | 2018-09-26 | シャープ株式会社 | 姿勢制御装置、ロボット、プログラム、および姿勢制御方法 |
EP2990085B1 (en) * | 2014-08-29 | 2022-08-10 | Nintendo Co., Ltd. | Method and apparatus for estimating the value of an input in presence of a perturbing factor |
US9715865B1 (en) * | 2014-09-26 | 2017-07-25 | Amazon Technologies, Inc. | Forming a representation of an item with light |
CN107111314B (zh) * | 2014-11-07 | 2021-10-08 | 索尼公司 | 控制系统、控制方法以及存储介质 |
CN114461062A (zh) * | 2014-11-07 | 2022-05-10 | 索尼公司 | 信息处理系统、控制方法和计算机可读存储介质 |
US10664975B2 (en) * | 2014-11-18 | 2020-05-26 | Seiko Epson Corporation | Image processing apparatus, control method for image processing apparatus, and computer program for generating a virtual image corresponding to a moving target |
US9568994B2 (en) * | 2015-05-19 | 2017-02-14 | Spotify Ab | Cadence and media content phase alignment |
US9536560B2 (en) | 2015-05-19 | 2017-01-03 | Spotify Ab | Cadence determination and media content selection |
CN106412469B (zh) * | 2015-08-03 | 2019-05-24 | 中强光电股份有限公司 | 投影系统、投影装置与投影系统的投影方法 |
US20170038829A1 (en) * | 2015-08-07 | 2017-02-09 | Microsoft Technology Licensing, Llc | Social interaction for remote communication |
JP6367166B2 (ja) * | 2015-09-01 | 2018-08-01 | 株式会社東芝 | 電子機器及び方法 |
US10583361B2 (en) * | 2016-02-04 | 2020-03-10 | Disney Enterprises, Inc. | Incorporating and coordinating multiple home systems into a play experience |
US10163198B2 (en) * | 2016-02-26 | 2018-12-25 | Samsung Electronics Co., Ltd. | Portable image device for simulating interaction with electronic device |
CN109069927A (zh) * | 2016-06-10 | 2018-12-21 | Colopl株式会社 | 用于提供虚拟空间的方法、用于使计算机实现该方法的程序以及用于提供虚拟空间的系统 |
CN106293876A (zh) * | 2016-08-04 | 2017-01-04 | 腾讯科技(深圳)有限公司 | 基于虚拟现实场景的信息认证方法和装置 |
CN107835288A (zh) * | 2016-09-16 | 2018-03-23 | 天津思博科科技发展有限公司 | 应用智能终端实现的互动娱乐装置 |
GB2554914B (en) * | 2016-10-14 | 2022-07-20 | Vr Chitect Ltd | Virtual reality system and method |
KR102369905B1 (ko) * | 2016-10-31 | 2022-03-03 | 주식회사 테그웨이 | 피드백 디바이스 및 이를 이용하는 열적 피드백 제공 방법 |
CN107357416A (zh) * | 2016-12-30 | 2017-11-17 | 长春市睿鑫博冠科技发展有限公司 | 一种人机交互装置及交互方法 |
US11422626B2 (en) * | 2016-12-19 | 2022-08-23 | Sony Corporation | Information processing device, and information processing method, for outputting sensory stimulation to a user |
CN109863471A (zh) * | 2016-12-20 | 2019-06-07 | 三星电子株式会社 | 显示装置及其显示方法 |
JP6866646B2 (ja) * | 2017-01-16 | 2021-04-28 | オムロン株式会社 | センサ支援システム、端末、センサおよびセンサ支援方法 |
CN106843532A (zh) * | 2017-02-08 | 2017-06-13 | 北京小鸟看看科技有限公司 | 一种虚拟现实场景的实现方法和装置 |
US10627895B2 (en) * | 2017-03-21 | 2020-04-21 | Lenovo (Singapore) Pte Ltd | Providing a virtual control |
US10251011B2 (en) | 2017-04-24 | 2019-04-02 | Intel Corporation | Augmented reality virtual reality ray tracing sensory enhancement system, apparatus and method |
US20190038978A1 (en) * | 2017-08-01 | 2019-02-07 | Intel Corporation | Extendable platforms for transfer of data between physical objects and a virtual environment |
WO2019123744A1 (ja) * | 2017-12-22 | 2019-06-27 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
CN108334192A (zh) * | 2018-01-02 | 2018-07-27 | 联想(北京)有限公司 | 一种信息处理方法、穿戴式装置及存储介质 |
WO2019176236A1 (ja) * | 2018-03-13 | 2019-09-19 | ソニー株式会社 | 情報処理装置、情報処理方法、および記録媒体 |
GB2571956B (en) | 2018-03-14 | 2022-04-27 | Sony Interactive Entertainment Inc | Head-mountable apparatus and methods |
CN108509043B (zh) * | 2018-03-29 | 2021-01-15 | 联想(北京)有限公司 | 一种交互控制方法及系统 |
US10679393B2 (en) * | 2018-07-24 | 2020-06-09 | Snap Inc. | Conditional modification of augmented reality object |
CN111083391A (zh) * | 2018-10-19 | 2020-04-28 | 舜宇光学(浙江)研究院有限公司 | 虚实融合系统及其方法 |
CN111103967A (zh) * | 2018-10-25 | 2020-05-05 | 北京微播视界科技有限公司 | 虚拟对象的控制方法和装置 |
JPWO2020195292A1 (ja) * | 2019-03-26 | 2020-10-01 | ||
CN110766788B (zh) * | 2019-10-15 | 2023-03-24 | 三星电子(中国)研发中心 | 将虚拟物体映射到现实世界的方法及装置 |
US11861802B2 (en) | 2019-11-11 | 2024-01-02 | Spin Master Ltd. | Augmented reality system |
US11360733B2 (en) | 2020-09-10 | 2022-06-14 | Snap Inc. | Colocated shared augmented reality without shared backend |
CN114820835A (zh) * | 2021-01-28 | 2022-07-29 | 索尼半导体解决方案公司 | 信息处理方法、信息处理装置和非易失性存储介质 |
US12051163B2 (en) | 2022-08-25 | 2024-07-30 | Snap Inc. | External computer vision for an eyewear device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002184398A (ja) | 2000-12-13 | 2002-06-28 | Shin Etsu Chem Co Ltd | ニッケル水素電池の正極作成用スラリー、ニッケル水素電池の正極及びその製造方法 |
US7352356B2 (en) * | 2001-12-13 | 2008-04-01 | United States Of America | Refreshable scanning tactile graphic display for localized sensory stimulation |
WO2011119118A1 (en) * | 2010-03-26 | 2011-09-29 | Agency For Science, Technology And Research | A haptic system, a method of forming a haptic system and a method of controlling a haptic system |
KR101194957B1 (ko) | 2010-09-13 | 2012-10-25 | 한양대학교 산학협력단 | 마커 기반 증강 현실에서 촉감을 제공하는 시스템 및 방법 |
US8493353B2 (en) * | 2011-04-13 | 2013-07-23 | Longsand Limited | Methods and systems for generating and joining shared experience |
CA2835120C (en) * | 2011-05-06 | 2019-05-28 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
US8894462B2 (en) * | 2011-12-22 | 2014-11-25 | Activision Publishing, Inc. | Interactive video game with visual lighting effects |
US8963805B2 (en) * | 2012-01-27 | 2015-02-24 | Microsoft Corporation | Executable virtual objects associated with real objects |
US9183676B2 (en) * | 2012-04-27 | 2015-11-10 | Microsoft Technology Licensing, Llc | Displaying a collision between real and virtual objects |
TWI501109B (zh) * | 2012-11-05 | 2015-09-21 | Univ Nat Taiwan | 擬真觸覺力回饋裝置及其方法 |
US9286725B2 (en) * | 2013-11-14 | 2016-03-15 | Nintendo Co., Ltd. | Visually convincing depiction of object interactions in augmented reality images |
-
2014
- 2014-03-04 CN CN201480020555.0A patent/CN105144248B/zh not_active Expired - Fee Related
- 2014-03-04 EP EP14785048.1A patent/EP2988275A4/en not_active Ceased
- 2014-03-04 JP JP2015512346A patent/JP6217747B2/ja active Active
- 2014-03-04 WO PCT/JP2014/055352 patent/WO2014171200A1/ja active Application Filing
- 2014-03-04 BR BR112015025869A patent/BR112015025869A2/pt not_active Application Discontinuation
- 2014-03-04 US US14/782,638 patent/US10269180B2/en not_active Expired - Fee Related
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000184398A (ja) * | 1998-10-09 | 2000-06-30 | Sony Corp | 仮想画像立体合成装置、仮想画像立体合成方法、ゲ―ム装置及び記録媒体 |
JP2002112286A (ja) * | 2000-09-27 | 2002-04-12 | Mixed Reality Systems Laboratory Inc | 複合現実感提示装置及びその方法並びに記憶媒体 |
JP2002304246A (ja) * | 2001-04-04 | 2002-10-18 | Nippon Telegr & Teleph Corp <Ntt> | 力覚提示装置及び仮想空間システム |
JP2005012385A (ja) | 2003-06-18 | 2005-01-13 | Nippon Telegr & Teleph Corp <Ntt> | オブジェクト表示方法およびオブジェクト表示装置 |
JP2005165776A (ja) | 2003-12-03 | 2005-06-23 | Canon Inc | 画像処理方法、画像処理装置 |
JP2006072667A (ja) * | 2004-09-01 | 2006-03-16 | Sony Computer Entertainment Inc | 画像処理装置、ゲーム装置および画像処理方法 |
JP2006262980A (ja) | 2005-03-22 | 2006-10-05 | Olympus Corp | 情報端末装置及び仮想ペット表示方法 |
JP2008304268A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2009069918A (ja) * | 2007-09-10 | 2009-04-02 | Canon Inc | 情報処理装置、情報処理方法 |
JP2011521318A (ja) * | 2008-04-16 | 2011-07-21 | バーチュアル プロテインズ ベー.フェー. | インタラクティブな仮想現実画像生成システム |
JP2010049690A (ja) | 2008-08-19 | 2010-03-04 | Sony Computer Entertainment Europe Ltd | エンタテイメント装置、システム、及び方法 |
JP2012155655A (ja) | 2011-01-28 | 2012-08-16 | Sony Corp | 情報処理装置、報知方法及びプログラム |
JP2012248930A (ja) | 2011-05-25 | 2012-12-13 | Kyocera Corp | 携帯端末、表示制御プログラムおよび表示制御方法 |
JP2014010838A (ja) * | 2012-06-29 | 2014-01-20 | Disney Enterprises Inc | 拡張現実感シミュレーション連続体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2988275A4 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019213231A (ja) * | 2014-11-07 | 2019-12-12 | ソニー株式会社 | 情報処理システム、制御方法、および記憶媒体 |
JP2016126772A (ja) * | 2014-12-31 | 2016-07-11 | イマージョン コーポレーションImmersion Corporation | 拡張及び仮想現実アプリケーションのための触覚的に向上されたオブジェクトを生成するシステム及び方法 |
JP2017033334A (ja) * | 2015-08-03 | 2017-02-09 | 株式会社オプティム | ヘッドマウントディスプレイ、データ出力方法、及びヘッドマウントディスプレイ用プログラム。 |
WO2017076785A1 (de) * | 2015-11-07 | 2017-05-11 | Audi Ag | Virtual-reality-brille und verfahren zum betreiben einer virtual-reality-brille |
US10235809B2 (en) | 2016-06-30 | 2019-03-19 | Microsoft Technology Licensing, Llc | Reality to virtual reality portal for dual presence of devices |
US10986458B2 (en) * | 2016-11-16 | 2021-04-20 | Sony Corporation | Information processing apparatus and information processing method |
US20200053501A1 (en) * | 2016-11-16 | 2020-02-13 | Sony Corporation | Information processing apparatus, information processing method, and program |
JP2018088102A (ja) * | 2016-11-28 | 2018-06-07 | 株式会社スクウェア・エニックス | プログラム、コンピュータ装置、及び、判定方法 |
US11054894B2 (en) | 2017-05-05 | 2021-07-06 | Microsoft Technology Licensing, Llc | Integrated mixed-input system |
JP2020511048A (ja) * | 2017-07-18 | 2020-04-09 | テンセント・テクノロジー・(シェンジェン)・カンパニー・リミテッド | バーチャルプロップ割り当て方法、サーバー、クライアント及び記憶媒体 |
US11228811B2 (en) | 2017-07-18 | 2022-01-18 | Tencent Technology (Shenzhen) Company Limited | Virtual prop allocation method, server, client, and storage medium |
WO2019150781A1 (ja) * | 2018-01-30 | 2019-08-08 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
US11262838B2 (en) | 2018-01-30 | 2022-03-01 | Sony Corporation | Information processing device and information processing method |
Also Published As
Publication number | Publication date |
---|---|
BR112015025869A2 (pt) | 2017-07-25 |
EP2988275A4 (en) | 2016-11-30 |
JP6217747B2 (ja) | 2017-10-25 |
EP2988275A1 (en) | 2016-02-24 |
CN105144248B (zh) | 2019-08-06 |
US20160093107A1 (en) | 2016-03-31 |
CN105144248A (zh) | 2015-12-09 |
US10269180B2 (en) | 2019-04-23 |
JPWO2014171200A1 (ja) | 2017-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6217747B2 (ja) | 情報処理装置及び情報処理方法 | |
US10453248B2 (en) | Method of providing virtual space and system for executing the same | |
JP6263252B1 (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
US10438394B2 (en) | Information processing method, virtual space delivering system and apparatus therefor | |
US10546407B2 (en) | Information processing method and system for executing the information processing method | |
JP6244593B1 (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
US10313481B2 (en) | Information processing method and system for executing the information method | |
JP6290467B1 (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム | |
US20190018479A1 (en) | Program for providing virtual space, information processing apparatus for executing the program, and method for providing virtual space | |
US10459599B2 (en) | Method for moving in virtual space and information processing apparatus for executing the method | |
US20180357817A1 (en) | Information processing method, program, and computer | |
US20180196506A1 (en) | Information processing method and apparatus, information processing system, and program for executing the information processing method on computer | |
US10564801B2 (en) | Method for communicating via virtual space and information processing apparatus for executing the method | |
US10410395B2 (en) | Method for communicating via virtual space and system for executing the method | |
US20180348986A1 (en) | Method executed on computer for providing virtual space, program and information processing apparatus therefor | |
JPWO2016013269A1 (ja) | 画像表示装置及び画像表示方法、並びにコンピューター・プログラム | |
US20180348987A1 (en) | Method executed on computer for providing virtual space, program and information processing apparatus therefor | |
US20180299948A1 (en) | Method for communicating via virtual space and system for executing the method | |
US20180374275A1 (en) | Information processing method and apparatus, and program for executing the information processing method on computer | |
JP2018089228A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
JP6368404B1 (ja) | 情報処理方法、プログラム及びコンピュータ | |
JP2018125003A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム | |
JP2018124981A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
US10319346B2 (en) | Method for communicating via virtual space and system for executing the method | |
JP2018092635A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480020555.0 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14785048 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015512346 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14782638 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014785048 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015025869 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112015025869 Country of ref document: BR Kind code of ref document: A2 Effective date: 20151009 |