
US12086924B2 - Terminal device, method, and computer program - Google Patents


Info

Publication number: US12086924B2
Application number: US18/172,357
Other versions: US20230206546A1 (en)
Authority: US (United States)
Prior art keywords: terminal device, unit, display, photographing, space
Legal status: Active
Inventors: Mahoko Niiyama, Tomoaki Ogata, Jun Oki, Kazuya Tanabe, Kaoru Toyoguchi, Tomohide Inomata
Current Assignee: JVCKenwood Corp
Original Assignee: JVCKenwood Corp
Application filed by JVCKenwood Corp
Assigned to JVCKENWOOD CORPORATION (assignors: Mahoko Niiyama, Jun Oki, Tomohide Inomata, Tomoaki Ogata, Kazuya Tanabe, Kaoru Toyoguchi)
Publication of US20230206546A1
Application granted; publication of US12086924B2

Classifications

    • G06T15/20: Perspective computation (3D image rendering; geometric effects)
    • A63F13/213: Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/26: Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A63F13/428: Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F13/655: Generating or modifying game content automatically by game devices or servers from real world data, by importing photos, e.g. of the player
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • A63F2300/301: Game output arrangements using an additional display connected to the game console, e.g. on the controller
    • G06T2207/10016: Image acquisition modality: video; image sequence
    • G06T2207/30204: Subject of image: marker

Definitions

  • An image capture unit 11 photographs various kinds of video around the terminal device 10 .
  • The image capture unit 11 photographs the display device 20, for example.
  • The image capture unit 11 includes an image capture element, a circuit that generates video data on the basis of the output of the image capture element, and the like, which are not illustrated.
  • Examples of the image capture element include, but are not limited to, a complementary metal oxide semiconductor (CMOS) image sensor and a charge coupled device (CCD).
  • The display unit 12 displays various kinds of information.
  • The display unit 12 displays, for example, various kinds of video.
  • The display unit 12 displays, for example, an identifier for calculating the relative position between the terminal device 10 and the display device 20.
  • The identifier is described below.
  • The display unit 12 includes, for example, a liquid crystal display (LCD) or an organic electro-luminescence (EL) display.
  • The sound output unit 13 outputs various kinds of sounds.
  • The sound output unit 13 can be realized by, for example, a speaker.
  • The operation unit 14 receives various operations on the terminal device 10 from the user.
  • The operation unit 14 includes a button, a switch, or a touch panel, for example.
  • The operation unit 14 receives, for example, operations to start or end communication with the display device 20.
  • The operation unit 14 receives, for example, operations for photographing and acquiring video displayed on the display device 20 and for storing the photographed video in the storage unit 15.
  • The storage unit 15 stores various kinds of information therein.
  • The storage unit 15 stores, for example, video data of the scenery and various objects, such as buildings and characters, in the VR space displayed on the display device 20.
  • The storage unit 15 may store therein video data rendered as two-dimensional video, information specifying the three-dimensional object and the photographing range, or only information specifying the photographing range.
  • The storage unit 15 can be realized by, for example, a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or a solid state drive.
  • The communication unit 16 performs communication between the terminal device 10 and an external device.
  • The communication unit 16 performs, for example, communication with the display device 20.
  • The control unit 17 controls the operation of each part of the terminal device 10.
  • The control unit 17 is realized by, for example, a central processing unit (CPU) or a micro-processing unit (MPU) executing a computer program (for example, the computer program according to the present disclosure) stored in a storage unit, which is not illustrated, using a RAM or the like as a working area.
  • The control unit 17 may be realized by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA), for example.
  • The control unit 17 may be realized by a combination of hardware and software.
  • The control unit 17 includes a calculation unit 171, a video data acquisition unit 172, a display control unit 173, an output control unit 174, an operation control unit 175, an image capture control unit 176, a storage control unit 177, and a communication control unit 178.
  • The calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20.
  • The calculation unit 171 calculates the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, the size of the identifier displayed on the display unit 12 in video photographed by an external image capture device and acquired by the video data acquisition unit 172.
  • The calculation unit 171 may calculate the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, a table, which is not illustrated, in which the relative distance between the terminal device 10 and the display device 20 is associated with the size of the identifier in the photographed video.
  • The calculation unit 171 calculates the tilt of the terminal device 10 relative to the display device 20 on the basis of, for example, the degree and direction of distortion of the identifier displayed on the display unit 12.
  • The calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 on the basis of, for example, the relative distance between the terminal device 10 and the display device 20 and the tilt of the terminal device 10 relative to the display device 20.
  • FIG. 3 is a diagram for describing the identifiers displayed on the display unit 12.
  • The calculation unit 171 calculates the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, the size of an identifier 111 displayed on the display unit 12 that is photographed by an image capture unit 21 of the display device 20.
  • The calculation unit 171 calculates the tilt of the terminal device 10 relative to the display device 20 on the basis of, for example, the degree and direction of distortion of the identifier 111 displayed on the display unit 12 that is photographed by the image capture unit 21 of the display device 20.
  • The calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 on the basis of, for example, the relative distance between the terminal device 10 and the display device 20 and the tilt of the terminal device 10 relative to the display device 20.
  • The calculation unit 171 may, for example, calculate the relative position between the terminal device 10 and the display device 20 on the basis of the size of the identifier 111 photographed by another external camera.
  • The identifiers 111 are displayed at the four corners of the display unit 12, for example.
  • The identifier 111 is, for example, but not limited to, a QR code (registered trademark).
  • One identifier 111 may be displayed in the center of the display unit 12, for example.
  • The calculation unit 171 may calculate the relative position between the terminal device 10 and the display device 20 on the basis of photographing data of the display device 20 photographed by the image capture unit 11.
  • The calculation unit 171 may, for example, determine the size of the display device 20 included in the photographing data using a well-known image recognition process and calculate the relative position between the terminal device 10 and the display device 20 on the basis of the determined size.
  • The calculation unit 171 may calculate the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, a table in which the size of the display device 20 and the relative distance between the terminal device 10 and the display device 20 are associated.
  • The calculation unit 171 may calculate the tilt of the terminal device 10 relative to the display device 20 on the basis of the degree and direction of distortion of the display device 20.
  • The calculation unit 171 may calculate the relative position between the terminal device 10 and the display device 20 on the basis of photographing data of an identifier, such as a QR code, provided on a housing of the display device 20 and photographed by the image capture unit 11.
  • The calculation unit 171 may, for example, determine the size and distortion of the identifier in the photographing data using a well-known image recognition process, and calculate the relative distance between the terminal device 10 and the display device 20 and the tilt of the terminal device 10 relative to the display device 20 on the basis of the determined size and distortion.
  • The calculation unit 171 may then calculate the relative position between the terminal device 10 and the display device 20 from the relative distance and tilt obtained from the determined size and distortion of the identifier.
  • The calculation unit 171 may calculate the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, a table in which the size of the identifier and the relative distance between the terminal device 10 and the display device 20 are associated.
  • The calculation unit 171 may calculate the tilt of the terminal device 10 relative to the display device 20 on the basis of, for example, the degree and direction of distortion of the identifier.
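  • As a rough illustration of the estimation described above, the following Python sketch derives the relative distance from the apparent size of the identifier 111 under a simple pinhole-camera model, and the tilt from its foreshortening. The marker size, focal length, and function names are illustrative assumptions, not values from this disclosure.

      import math

      MARKER_SIZE_MM = 50.0     # assumed physical size of the identifier 111
      FOCAL_LENGTH_PX = 1400.0  # assumed focal length of the photographing camera

      def relative_distance_mm(marker_width_px: float) -> float:
          # Pinhole model / similar triangles:
          # distance = real_size * focal_length / apparent_size
          return MARKER_SIZE_MM * FOCAL_LENGTH_PX / marker_width_px

      def relative_tilt_deg(width_px: float, height_px: float) -> float:
          # A square marker photographed at an angle is foreshortened along
          # one axis, so cos(tilt) is roughly shorter side / longer side.
          ratio = min(width_px, height_px) / max(width_px, height_px)
          return math.degrees(math.acos(ratio))

      # Example: a marker imaged 70 px wide and 61 px tall gives
      # relative_distance_mm(70) = 1000.0 mm and relative_tilt_deg(70, 61) of about 29.4 degrees.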
  • The video data acquisition unit 172 acquires various kinds of video data.
  • The video data acquisition unit 172 acquires, from the display device 20, the video data photographed by a virtual terminal device in the VR space displayed on the display device 20.
  • When the operation control unit 175 has acquired an operation signal, input to the operation unit 14, for photographing video in the VR space displayed on the display device 20, the video data acquisition unit 172 acquires the video data photographed by the virtual terminal device in the VR space displayed on the display device 20.
  • Photographing with the virtual terminal device in the VR space refers to generating video that captures a predetermined range of the VR space from the position of a virtual camera provided in the virtual terminal device in the VR space.
  • The virtual terminal device in the VR space may have a shape and size that are either the same as or different from those of the terminal device 10.
  • The position of the virtual camera in the virtual terminal device may be either the same as or different from the position of the image capture unit 11 in the terminal device 10.
  • The characteristics of the virtual camera may be either the same as or different from those of the image capture unit 11.
  • The characteristics of the virtual camera may be changed as needed by the user's operation in the VR space, such as a wide-angle or telephoto operation (zoom in or zoom out), as in the sketch below. The user may be able to perform operations in the VR space through an operation screen displayed on a display unit of the virtual terminal device.
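  • A minimal sketch of how such a changeable virtual camera might be modeled, assuming a field-of-view representation; the class, default value, and clamping range below are assumptions for illustration, not part of this disclosure.

      from dataclasses import dataclass

      @dataclass
      class VirtualCamera:
          fov_deg: float = 68.0  # assumed default field of view

          def zoom(self, factor: float) -> None:
              # Zooming in (factor > 1) narrows the field of view; zooming
              # out (factor < 1) widens it, clamped to a plausible range.
              self.fov_deg = min(120.0, max(10.0, self.fov_deg / factor))

      cam = VirtualCamera()
      cam.zoom(2.0)   # telephoto: 68 -> 34 degrees
      cam.zoom(0.25)  # wide angle: 34 -> 120 degrees (clamped from 136)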
  • The display control unit 173 causes the display unit 12 to display various kinds of video.
  • The display control unit 173 causes the display unit 12 to display, for example, an identifier for calculating the relative position between the terminal device 10 and the display device 20.
  • The display control unit 173 causes the display unit 12 to display the video related to the video data photographed in the VR space that the video data acquisition unit 172 has acquired from the display device 20.
  • The display control unit 173 controls the photographing range to be displayed on the display unit of the virtual terminal device when various objects in the VR space are photographed with the virtual terminal device in the VR space displayed on the display device 20.
  • The display control unit 173 controls the photographing range to be displayed on the display unit of the virtual terminal device in the VR space on the basis of the relative position between the real terminal device 10 and the display device 20.
  • The display control unit 173 performs control so that the photographing range displayed on the display unit of the virtual terminal device in the VR space changes in accordance with changes in the relative position between the real terminal device 10 and the display device 20, for example.
  • The display control unit 173 outputs, through the communication unit 16, a control signal to the display device 20 for changing the photographing range to be displayed on the virtual terminal device in the VR space, for example, as in the sketch below.
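  • The format of this control signal is not specified in the disclosure; the following is a hypothetical sketch of how the display control unit 173 might package the calculated relative position for the display device 20 so that the photographing range tracks movements of the real terminal. The message fields and function name are assumptions.

      import json

      def build_range_update_signal(distance_mm: float, tilt_deg: float) -> bytes:
          # Hypothetical message: the virtual terminal device is posed in the
          # VR space using the measured distance and tilt, and the display
          # device redraws the photographing range accordingly.
          payload = {
              "type": "update_photographing_range",
              "relative_position": {
                  "distance_mm": distance_mm,
                  "tilt_deg": tilt_deg,
              },
          }
          return json.dumps(payload).encode("utf-8")

      # Sent via the communication unit 16 each time the calculation unit 171
      # produces a new relative position.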
  • The display control unit 173 generates, for example, an avatar of the user who uses the terminal device 10 and the display device 20 to be displayed in the video displayed on the display device 20.
  • The display control unit 173 generates the avatar on the basis of, for example, video data of the face of the user who uses the terminal device 10 and the display device 20.
  • The display control unit 173 may, for example, change each part of the avatar's face on the basis of operation information received from the user through the operation unit 14.
  • The display control unit 173 outputs, for example, a control signal for displaying the generated avatar on the display device 20 to the display device 20 via the communication unit 16.
  • The output control unit 174 controls the sound output unit 13 to output sounds.
  • The operation control unit 175 acquires the operation signal related to an operation input to the operation unit 14.
  • The operation control unit 175 outputs, for example, a control signal related to the acquired operation signal to control the operation of the terminal device 10.
  • The operation control unit 175 acquires, for example, an operation signal related to an operation, input to the operation unit 14, for photographing video in the VR space displayed on the display device 20.
  • The image capture control unit 176 controls the image capture unit 11.
  • The image capture control unit 176 sets image capture conditions for the image capture unit 11 and causes the image capture unit 11 to capture images.
  • The image capture control unit 176 controls the image capture unit 11 to capture an image of an identifier, such as a QR code, provided on the housing of the display device 20.
  • The storage control unit 177 stores various kinds of data in the storage unit 15.
  • The storage control unit 177 causes the storage unit 15 to store therein the video data, acquired by the video data acquisition unit 172, related to the video photographed in the VR space displayed on the display device 20.
  • The communication control unit 178 controls the communication between the terminal device 10 and an external device by controlling the communication unit 16.
  • The communication control unit 178 controls the communication between the terminal device 10 and the display device 20 by controlling the communication unit 16.
  • FIG. 4 is a block diagram illustrating a structure example of the display device according to the embodiment.
  • The display device 20 includes the image capture unit 21, a display unit 22, a sound output unit 23, an operation unit 24, and a communication unit 25.
  • The image capture unit 21, the display unit 22, the sound output unit 23, the operation unit 24, and the communication unit 25 are connected to each other through a bus B3.
  • The image capture unit 21 photographs various kinds of video around the display device 20.
  • The image capture unit 21 photographs, for example, the identifier 111 displayed on the display unit 12 of the terminal device 10.
  • The image capture unit 21 includes an image capture element, a circuit that generates video data on the basis of the output of the image capture element, and the like, which are not illustrated.
  • The image capture element may be, but is not limited to, a CMOS image sensor or a CCD.
  • The display unit 22 displays various kinds of video.
  • The display unit 22 displays, for example, the video of the VR space.
  • The display unit 22 is realized, for example, as an HMD.
  • The sound output unit 23 outputs various kinds of sounds.
  • The sound output unit 23 can be realized by, for example, a speaker.
  • The operation unit 24 receives various operations on the display device 20 from the user.
  • The operation unit 24 includes, for example, a button or a switch.
  • The operation unit 24 receives, for example, operations to start or end communication with the terminal device 10.
  • The communication unit 25 performs communication between the display device 20 and an external device.
  • The communication unit 25 performs, for example, communication with the terminal device 10.
  • The communication unit 25 performs, for example, communication with the server 30.
  • A control unit 26 controls the operation of each part of the display device 20.
  • The control unit 26 is realized by, for example, a CPU or an MPU executing a computer program stored in a storage unit, which is not illustrated, using a RAM or the like as a working area.
  • The control unit 26 may be realized by an integrated circuit, such as an ASIC or an FPGA.
  • The control unit 26 may be realized by a combination of hardware and software.
  • The control unit 26 includes an acquisition unit 261, a display control unit 262, an output control unit 263, an operation control unit 264, an image capture control unit 265, and a communication control unit 266.
  • The acquisition unit 261, the display control unit 262, the output control unit 263, the operation control unit 264, the image capture control unit 265, and the communication control unit 266 are connected to each other through a bus B4.
  • The acquisition unit 261 acquires various kinds of information.
  • The acquisition unit 261 acquires, for example, the video data related to the video of the identifier 111 displayed on the display unit 12 of the terminal device 10 that is photographed by the image capture unit 21.
  • The display control unit 262 controls the video displayed on the display unit 22.
  • The display control unit 262 causes the display unit 22 to display the video related to VR, for example.
  • The display control unit 262 causes the display unit 22 to display, for example, video that lets users experience a theme park or the like in the VR space.
  • The display control unit 262, for example, changes the display range of the video displayed on the display unit 22 or displays the avatar in the video displayed on the display unit 22 in accordance with the control signal received from the terminal device 10.
  • The display control unit 262 changes, for example, the photographing range displayed on the virtual terminal device in the VR space displayed on the display unit 22 in accordance with the control signal received from the terminal device 10.
  • The control signal that the display control unit 262 receives from the terminal device 10 is, for example, generated based on the user's operation, performed with the operation unit 14 of the terminal device 10, on the operation screen displayed on the display unit of the virtual terminal device in the VR space.
  • In reality, the user operates the terminal device 10 in his/her hand, but the display unit 12 displays the identifier 111 and no operation screen is displayed.
  • The control signal generated based on the operation is transmitted from the terminal device 10 to the display device 20, and the display control unit 262 controls, on the basis of the control signal, the operation screen displayed on the display unit of the virtual terminal device in the VR space corresponding to the terminal device 10.
  • The user perceives the behavior of the virtual operation screen visually through the display device 20 and perceives the actual operation through touch on the terminal device 10.
  • The terminal device 10 may generate vibration with an actuator, which is not illustrated, to provide feedback of the operation to the user.
  • The output control unit 263 causes the sound output unit 23 to output various kinds of sounds.
  • The output control unit 263 outputs, for example, the sound of other users in the VR space.
  • The output control unit 263 causes the sound output unit 23 to output the sound of other users who are experiencing a theme park or the like together in the VR space.
  • The operation control unit 264 acquires the operation signal related to an operation input to the operation unit 24.
  • The operation control unit 264 outputs, for example, a control signal related to the acquired operation signal to control the operation of the display device 20.
  • The image capture control unit 265 controls the image capture unit 21.
  • The image capture control unit 265 sets image capture conditions for the image capture unit 21 and causes the image capture unit 21 to capture images.
  • The communication control unit 266 controls the communication unit 25 to control the communication between the display device 20 and an external device.
  • The communication control unit 266 controls the communication between the terminal device 10 and the display device 20 by controlling the communication unit 25.
  • The communication control unit 266 controls, for example, the communication unit 25 to control the communication between the display device 20 and the server 30.
  • FIG. 5 is a block diagram illustrating a structure example of the server according to the embodiment.
  • The server 30 includes a communication unit 31, a storage unit 32, and a control unit 33.
  • The communication unit 31 performs communication between the server 30 and an external device.
  • The communication unit 31 performs, for example, communication with the terminal device 10.
  • The communication unit 31 performs, for example, communication with the display device 20.
  • The storage unit 32 stores various kinds of information therein.
  • The storage unit 32 stores, for example, map information of the VR space therein.
  • The map information includes various kinds of information such as scenery, buildings, and characters in the VR space.
  • The storage unit 32 can be realized by a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or a solid state drive.
  • The control unit 33 controls the operation of each part of the server 30.
  • The control unit 33 is realized by, for example, a CPU or an MPU executing a computer program stored in a storage unit, which is not illustrated, using a RAM or the like as a working area.
  • The control unit 33 may be realized by an integrated circuit, such as an ASIC or an FPGA.
  • The control unit 33 may be realized by a combination of hardware and software.
  • The control unit 33 generates a VR space on the basis of the map information stored in the storage unit 32.
  • The control unit 33 generates, for example, a VR space where a virtual theme park or the like can be experienced.
  • The connection between the display device 20 and the server 30 allows the user of the display device 20 to experience, for example, a virtual theme park together with multiple users using avatars.
  • FIG. 6 and FIG. 7 are diagrams for describing how to use the terminal device and the display device according to the embodiment.
  • The display device 20 is worn on the head of a user U, for example.
  • The user U can experience a virtual theme park or the like in the VR space using the display device 20.
  • The user U can, for example, photograph scenery or objects such as buildings and characters as memories of the theme park in the VR space, or take pictures with multiple other users, by operating the terminal device 10 while using the display device 20.
  • FIG. 7 expresses the photographing data in the VR space.
  • The user U can photograph an image including an avatar A1 of the user U and avatars A2, A3, and A4 of other users in the VR space using the virtual terminal device in the VR space.
  • The image to be photographed may be either a still image or a moving image.
  • The display range of the image to be photographed varies depending on the relative position between the terminal device 10 and the display device 20.
  • The user U can change the photographing range of the VR space by adjusting the relative position between the terminal device 10 and the display device 20 in the real space.
  • The photographing range in the VR space is displayed on the display unit of the virtual terminal device in the VR space.
  • The photographing range to be displayed on the display unit of the virtual terminal device can be adjusted.
  • The user U can adjust the photographing range so that the avatar A1, the avatar A2, the avatar A3, and the avatar A4 appear in one picture in the VR space by adjusting the relative position between the terminal device 10 and the display device 20 in the real space.
  • FIG. 8 is a flowchart expressing one example of the procedure of the process in the terminal device according to the embodiment.
  • The operation control unit 175 determines, via the operation unit 14, whether the photographing mode of the virtual camera in the VR space is set to the photographing mode for photographing with a rear camera (step S10). If it is determined that the mode is set to the photographing mode for photographing with the rear camera (Yes at step S10), the process advances to step S12. If it is determined that the mode is not set to the photographing mode for photographing with the rear camera (No at step S10), the process advances to step S11.
  • The operation control unit 175 determines, via the operation unit 14, whether the photographing mode of the virtual camera in the VR space is set to the photographing mode for photographing with a front camera (step S11). If it is determined that the mode is set to the photographing mode for photographing with the front camera (Yes at step S11), the process advances to step S12. If it is determined that the mode is not set to the photographing mode for photographing with the front camera (No at step S11), the process advances to step S18.
  • The calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 (step S12). Specifically, the calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 on the basis of the size of the identifier 111 displayed on the display unit 12 of the terminal device 10 that is photographed by the image capture unit 21 of the display device 20. Then, the process advances to step S13.
  • The display control unit 173 recognizes the photographing range in the VR space displayed on the display device 20 on the basis of the relative position calculated by the calculation unit 171 (step S13). Specifically, the display control unit 173 recognizes the photographing range displayed on the display unit of the virtual terminal device in the VR space. Then, the process advances to step S14.
  • The operation control unit 175 determines whether an operation to photograph the photographing range in the VR space recognized at step S13 is received via the operation unit 14 (step S14). If it is determined that the operation to photograph the photographing range in the VR space is received (Yes at step S14), the process advances to step S15. If it is determined that the operation to photograph the photographing range in the VR space is not received (No at step S14), the process advances to step S18.
  • The video data acquisition unit 172 acquires, from the display device 20, the photographing data related to the photographing range photographed in the VR space (step S15).
  • The display control unit 173 may cause the display unit 12 to display the video related to the photographing data in the VR space acquired by the video data acquisition unit 172.
  • The photographing data may be either a still image or a moving image. This allows the user to check the video photographed in the VR space. Then, the process advances to step S16.
  • The operation control unit 175 determines whether an operation to save the photographing data acquired at step S15 is received via the operation unit 14 (step S16). If it is determined that the operation to save the photographing data is received (Yes at step S16), the process advances to step S17. If it is determined that the operation to save the photographing data is not received (No at step S16), the process advances to step S18.
  • If the determination is Yes at step S16, the storage control unit 177 saves the photographing data acquired at step S15 in the storage unit 15 (step S17). Then, the process advances to step S18.
  • The control unit 17 determines whether to terminate the process (step S18). Specifically, the operation control unit 175 determines that the process is to be terminated upon the reception of an operation to terminate photographing or an operation to turn off the power of the terminal device 10. If it is determined that the process is to be terminated (Yes at step S18), the process in FIG. 8 ends. If it is determined that the process is not to be terminated (No at step S18), the process returns to step S10.
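  • Condensed into Python, the FIG. 8 flow might look as follows; the helper objects and method names merely stand in for the units described above and are assumptions, not code from this disclosure.

      def photographing_loop(operation, calculation, display_ctrl, video_acq, storage):
          while True:
              # S10/S11: proceed only if a rear- or front-camera mode is set.
              if operation.camera_mode() in ("rear", "front"):
                  rel_pos = calculation.relative_position()            # S12
                  photo_range = display_ctrl.recognize_range(rel_pos)  # S13
                  if operation.shutter_pressed():                      # S14
                      data = video_acq.acquire(photo_range)            # S15
                      display_ctrl.preview(data)                       # show the shot
                      if operation.save_requested():                   # S16
                          storage.save(data)                           # S17
              if operation.should_terminate():                         # S18
                  return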
  • As described above, the photographing range in the VR space displayed on the display device 20 is photographed by an operation using the terminal device 10, and the photographed video data is saved in the terminal device 10. Since the scenery in the VR space and memories in the virtual theme park or the like can be recorded in the terminal device 10 in this embodiment, it is easy to share photos taken in the VR space on a social network service (SNS).
  • In this embodiment, it is possible to experience the virtual theme park and communicate with other users using the avatars in the VR space.
  • This embodiment also allows the user to take a group photo with his/her own avatar and the avatars of other users in the same VR space.
  • FIG. 9 is a block diagram illustrating a structure example of a terminal device according to the first modification.
  • A terminal device 10A differs from the terminal device 10 illustrated in FIG. 2 in that a control unit 17A includes a posture detection unit 179.
  • The posture detection unit 179 detects the posture of the user experiencing VR using the display device 20.
  • The posture detection unit 179 detects, for example, the posture of each part of the user, including the user's head, arms, and legs.
  • The video data acquisition unit 172 acquires the photographing data of the user from an external photographing device, which is not illustrated.
  • The posture detection unit 179 may detect the user's posture on the basis of the photographing data acquired by the video data acquisition unit 172.
  • The posture detection unit 179 may detect the user's posture using motion capture or other known techniques.
  • The display control unit 173 changes the posture of the user's avatar displayed on the display device 20 on the basis of the detection results of the posture detection unit 179.
  • The display control unit 173 raises the right hand of the user's avatar, for example, if the posture detection unit 179 detects that the user is raising his/her right hand. That is to say, the display control unit 173 changes the avatar's posture according to the user's posture detected by the posture detection unit 179, as in the sketch below.
  • In the first modification, the posture of the avatar in the VR space is changed in real time according to the posture of the user in the real space.
  • Accordingly, the usability is improved.
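  • A minimal sketch of the right-hand example above, assuming posture detection yields named landmarks with the y coordinate growing upward; the landmark names and avatar API are illustrative assumptions.

      def update_avatar_posture(landmarks: dict, avatar) -> None:
          # landmarks: e.g. {"right_wrist": (x, y, z), "right_shoulder": (x, y, z)}
          # obtained from external photographing data via motion capture or
          # another known technique.
          wrist_y = landmarks["right_wrist"][1]
          shoulder_y = landmarks["right_shoulder"][1]
          if wrist_y > shoulder_y:
              avatar.raise_right_hand()   # mirror the user's raised hand
          else:
              avatar.lower_right_hand()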
  • FIG. 10 is a block diagram illustrating a structure example of a terminal device according to the second modification.
  • A terminal device 10B differs from the terminal device 10 illustrated in FIG. 2 in that the terminal device 10B includes a sensor 18.
  • The sensor 18 includes various kinds of sensors.
  • The sensor 18 includes, for example, a sensor that detects the relative position between the terminal device 10B and the display device 20.
  • Examples of the sensor 18 include, but are not limited to, a laser radar (e.g., Laser Imaging Detection and Ranging (LIDAR)), an infrared sensor that includes an infrared illuminator and a light receiving sensor, and a time-of-flight (ToF) sensor.
  • The calculation unit 171 of a control unit 17B calculates the relative position between the terminal device 10B and the display device 20 on the basis of the detection results of the sensor 18. In other words, in the second modification, the calculation unit 171 calculates the relative position between the terminal device 10B and the display device 20 using a spatial grasping means different from the image capture unit 11.
  • In the second modification, the relative position between the terminal device 10B and the display device 20 is calculated based on the detection results of various kinds of sensors.
  • Accordingly, the degree of freedom in design can be improved in the second modification.
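  • For instance, with a ranging sensor the relative position could be derived without any camera. The sample format below is an assumption for illustration, not the sensor interface of this disclosure.

      import math

      def relative_position_from_ranging(samples: list) -> tuple:
          # samples: (beam_angle_rad, measured_distance_mm) pairs across the
          # sensor's field of view; the nearest return is taken as the HMD.
          angle, dist = min(samples, key=lambda s: s[1])
          # Decompose into a lateral offset and a forward distance.
          return (dist * math.sin(angle), dist * math.cos(angle))

      # e.g. relative_position_from_ranging([(0.0, 900.0), (0.3, 1200.0)])
      # -> (0.0, 900.0): the HMD is 900 mm straight ahead.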
  • FIG. 11 is a block diagram illustrating a structure example of a display device according to the third modification.
  • A display device 20A differs from the display device 20 illustrated in FIG. 4 in that the display device 20A includes a sensor 27 and a control unit 26A includes a calculation unit 267.
  • The sensor 27 includes various kinds of sensors.
  • The sensor 27 includes, for example, a sensor that detects the relative position between the terminal device 10 and the display device 20A.
  • Examples of the sensor 27 include, but are not limited to, a laser radar (for example, LIDAR), an infrared sensor that includes an infrared illuminator and a light receiving sensor, and a ToF sensor.
  • The calculation unit 267 calculates the relative position between the terminal device 10 and the display device 20A on the basis of the detection results of the sensor 27.
  • In the third modification, the relative position between the terminal device 10 and the display device 20A may be calculated by the display device 20A instead of the terminal device 10.
  • The display device 20A transmits the information regarding the calculated relative position between the terminal device 10 and the display device 20A to the terminal device 10 via the communication unit 25.
  • The display control unit 173 changes, for example, the photographing range to be displayed on the display unit of the virtual terminal device in the VR space displayed on the display device 20A on the basis of the relative position between the terminal device 10 and the display device 20A calculated by the calculation unit 267.
  • In the third modification, the relative position between the terminal device 10 and the display device 20A is calculated by the display device 20A on the basis of the detection results of the sensor.
  • Accordingly, the degree of freedom in design can be improved in the third modification.
  • As described above, memories of activities in the VR space can be recorded.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Telephone Function (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A terminal device includes a calculation unit that calculates a relative position between the terminal device and a display device that displays virtual reality to be used by a user, a display control unit that changes a photographing range to be photographed in the virtual reality displayed on the display device, based on the relative position, and a video data acquisition unit that acquires video data related to video in the virtual reality photographed in the photographing range that is changed by the display control unit.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is a Continuation of PCT International Application No. PCT/JP2020/047829 filed on Dec. 22, 2020, which claims the benefit of priority from Japanese Patent Application No. 2020-146358 filed on Aug. 31, 2020, the entire contents of both of which are incorporated herein by reference.
BACKGROUND
1. Technical Field
The present disclosure relates to a terminal device, a method, and a computer program.
2. Description of the Related Art
A technology that enables a plurality of users to experience a virtual reality (VR) space in which VR is realized has been known.
For example, Japanese Patent Application Laid-open No. 2019-211835 describes a technology that allows a plurality of users to experience VR while simultaneously moving the positions and directions of their viewpoints within the VR space.
It is assumed that a variety of communications between multiple users will be realized in the VR space. For example, a situation is assumed in which each user uses an individually created avatar to virtually go to an exhibition facility such as a museum or a theme park, go on a trip, or participate in a common game in a VR space. Therefore, it is desirable to be able to record memories in the VR space by taking pictures of scenery or group photos among avatars through a predetermined operation in the real world.
SUMMARY
It is an object of the present disclosure to at least partially solve the problems in the conventional technology.
A terminal device according to an embodiment of the present disclosure includes a calculation unit that calculates a relative position between the terminal device and a display device that displays virtual reality to be used by a user, a display control unit that causes a virtual terminal device to display in accordance with a control signal from the terminal device in a space of the virtual reality and based on the relative position, changes a photographing range displayed on the virtual terminal device in the space of the virtual reality displayed on the display device, and a video data acquisition unit that acquires photographing data related to video in the space of the photographing range that is changed by the display control unit in the space of the virtual reality. The display control unit causes a display unit of the virtual terminal device to display video related to the photographing data acquired by the video data acquisition unit, the virtual terminal device has a photographing mode of a front camera for photographing an image in the photographing range including an avatar of the user and an avatar of another user in the same space of the virtual reality, and a storage control unit that saves, in a storage unit, the photographing data in the photographing range according to the photographing mode acquired by the video data acquisition unit, based on a saving operation is further provided.
A method according to an embodiment of the present disclosure includes calculating a relative position between a terminal device and a display device that displays virtual reality to be used by a user, causing a virtual terminal device to display in accordance with a control signal from the terminal device in a space of the virtual reality and based on the relative position, changing a photographing range displayed on the virtual terminal device in the space of the virtual reality displayed on the display device, acquiring photographing data related to video in a space of the photographing range that is changed in the virtual space, causing a display unit of the virtual terminal device to display video related to the photographing data acquired by the video data acquisition unit, setting a photographing mode of the virtual terminal device to a photographing mode of a front camera for photographing an image in the photographing range including an avatar of the user and an avatar of another user in the same space of the virtual reality, and saving the photographing data in the photographing range according to the photographing mode in a storage unit, based on a saving operation.
A non-transitory computer readable recording medium storing therein a computer program according to an embodiment of the present disclosure is disclosed. The computer program causes a computer to execute calculating a relative position between a terminal device and a display device that displays virtual reality to be used by a user, causing a virtual terminal device to display in accordance with a control signal from the terminal device in a space of the virtual reality and based on the relative position, changing a photographing range displayed on the virtual terminal device in the space of the virtual reality displayed on the display device, acquiring photographing data related to video in a space of the photographing range that is changed in the virtual space, causing a display unit of the virtual terminal device to display video related to the photographing data acquired by the video data acquisition unit, setting a photographing mode of the virtual terminal device to a photographing mode of a front camera for photographing an image in the photographing range including an avatar of the user and an avatar of another user in the same space of the virtual reality, and saving the photographing data in the photographing range according to the photographing mode in a storage unit, based on a saving operation.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram for describing a structure example of an information processing system according to an embodiment;
FIG. 2 is a block diagram illustrating a structure example of a terminal device according to the embodiment;
FIG. 3 is a diagram for describing identifiers displayed on a display unit;
FIG. 4 is a block diagram illustrating a structure example of a display device according to the embodiment;
FIG. 5 is a block diagram illustrating a structure example of a server according to the embodiment;
FIG. 6 is a diagram for describing how to use the terminal device and the display device according to the embodiment;
FIG. 7 is a diagram for describing how to use the terminal device and the display device according to the embodiment;
FIG. 8 is a flowchart expressing one example of a procedure of the terminal device according to the embodiment;
FIG. 9 is a block diagram illustrating a structure example of a terminal device according to a first modification;
FIG. 10 is a block diagram illustrating a structure example of a terminal device according to a second modification; and
FIG. 11 is a block diagram illustrating a structure example of a display device according to a third modification.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Embodiments of the present disclosure will hereinafter be described in detail with reference to the accompanying drawings. The present disclosure is not limited by these embodiments, and where there is more than one embodiment, combinations of the embodiments are also included. In the following description, the same parts are denoted by the same reference symbols, and redundant explanation thereof is omitted.
Information Processing System
With reference to FIG. 1 , an information processing system according to an embodiment is described. FIG. 1 is a diagram for describing a structure example of the information processing system according to the embodiment.
As illustrated in FIG. 1, an information processing system 1 includes a terminal device 10, a display device 20, and a server 30. The terminal device 10, the display device 20, and the server 30 are connected to each other via a network N so that they can communicate with one another. The information processing system 1 is a system in which memories in a VR space can be recorded. The terminal device 10 is, for example, a smartphone or a tablet terminal. The display device 20 is a device that allows a user to experience VR, for example, a device including a head mounted display (HMD) worn on the user's head. While wearing the display device 20 on the head, a user can experience, for example, a theme park in a VR space provided by the server 30, using avatars together with other users.
Terminal Device
With reference to FIG. 2 , a structure of the terminal device according to the embodiment is described. FIG. 2 is a block diagram illustrating a structure example of the terminal device according to the embodiment.
As illustrated in FIG. 2, the terminal device 10 includes an image capture unit 11, a display unit 12, a sound output unit 13, an operation unit 14, a storage unit 15, a communication unit 16, and a control unit 17. These units are connected to each other through a bus B1.
The image capture unit 11 photographs various kinds of video around the terminal device 10, for example, the display device 20. The image capture unit 11 includes an image capture element, a circuit that generates video data on the basis of the output of the image capture element, and the like, which are not illustrated. Examples of the image capture element include, but are not limited to, a complementary metal oxide semiconductor (CMOS) image sensor and a charge coupled device (CCD).
The display unit 12 displays various kinds of information, for example, various kinds of video. The display unit 12 also displays an identifier used for calculating the relative position between the terminal device 10 and the display device 20; the identifier is described below. The display unit 12 includes, for example, a liquid crystal display (LCD) or an organic electro-luminescence (EL) display.
The sound output unit 13 outputs various kinds of sounds. The sound output unit 13 can be realized by, for example, a speaker.
The operation unit 14 receives various operations on the terminal device 10 from the user. The operation unit 14 includes a button, a switch, or a touch panel, for example. The operation unit 14 receives, for example, operations to start or end communication with the display device 20. The operation unit 14 receives, for example, operations for photographing and acquiring video displayed on the display device 20 and for storing the photographed video in the storage unit 15.
The storage unit 15 stores various kinds of information therein. For example, the storage unit 15 stores therein video data of the scenery and various objects, such as buildings and characters, in the VR space displayed on the display device 20. The storage unit 15 may store therein video data rendered as two-dimensional video, information specifying the three-dimensional objects and the photographing range, or only information specifying the photographing range. The storage unit 15 can be realized, for example, by a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or a solid state drive.
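For illustration only, these three storage formats could be modeled as a small record type. The following Python sketch is not part of the disclosed embodiment; the names CaptureFormat and SavedCapture, and the field layout, are hypothetical.

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import List, Optional

    class CaptureFormat(Enum):
        RENDERED_2D = auto()        # video data already rendered as two-dimensional video
        OBJECTS_AND_RANGE = auto()  # information specifying the 3D objects and the photographing range
        RANGE_ONLY = auto()         # only the information specifying the photographing range

    @dataclass
    class SavedCapture:
        fmt: CaptureFormat
        photographing_range: dict               # e.g. virtual camera pose and angle of view
        video_bytes: Optional[bytes] = None     # used only for RENDERED_2D
        object_ids: Optional[List[str]] = None  # used only for OBJECTS_AND_RANGE

Storing only the range (or the objects plus the range) would trade storage space for a re-rendering step when the saved memory is viewed later.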
The communication unit 16 performs communication between the terminal device 10 and an external device. The communication unit 16, for example, performs the communication with the display device 20.
The control unit 17 controls the operation of each part of the terminal device 10. The control unit 17 is realized by, for example, a central processing unit (CPU) or a micro-processing unit (MPU) executing a computer program (for example, the computer program according to the present disclosure) stored in a storage unit, which is not illustrated, using a RAM or the like as a working area. The control unit 17 may instead be realized by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA), or by a combination of hardware and software.
The control unit 17 includes a calculation unit 171, a video data acquisition unit 172, a display control unit 173, an output control unit 174, an operation control unit 175, an image capture control unit 176, a storage control unit 177, and a communication control unit 178.
The calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20. For example, the calculation unit 171 calculates the relative distance between the terminal device 10 and the display device 20 on the basis of the size, in video acquired by the video data acquisition unit 172, of the identifier displayed on the display unit 12 and photographed by an external image capture device. The calculation unit 171 may perform this calculation using a table, which is not illustrated, in which the relative distance between the terminal device 10 and the display device 20 is associated with the size of the identifier in the photographed video. The calculation unit 171 calculates the tilt of the terminal device 10 relative to the display device 20 on the basis of, for example, the degree and direction of distortion of the identifier displayed on the display unit 12. The calculation unit 171 then calculates the relative position between the terminal device 10 and the display device 20 from the relative distance and the tilt.
FIG. 3 is a diagram for describing the identifiers displayed on the display unit 12. As illustrated in FIG. 3 , the calculation unit 171 calculates the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, the size of an identifier 111 displayed on the display unit 12 that is photographed by an image capture unit 21 of the display device 20. The calculation unit 171 calculates the tilt of the terminal device 10 relative to the display device 20 on the basis of, for example, the degree of distortion and the direction of distortion of the identifier 111 displayed on the display unit 12 that is photographed by the image capture unit 21 of the display device 20. The calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 on the basis of, for example, the relative distance between the terminal device 10 and the display device 20 and the tilt of the terminal device 10 relative to the display device 20. The calculation unit 171 may, for example, calculate the relative position between the terminal device 10 and the display device 20 on the basis of the size of the identifier 111 photographed by another external camera. The identifiers 111 are displayed at the four corners of the display unit 12, for example. The identifier 111 is, for example, but not limited to, a QR code (registered trademark). One identifier 111 may be displayed in the center of the display unit 12, for example.
The calculation unit 171 may calculate the relative position between the terminal device 10 and the display device 20 on the basis of photographing data of the display device 20 that is photographed by the image capture unit 11. The calculation unit 171 may, for example, determine the size of the display device 20 included in the photographing data using a well-known image recognition process and calculate the relative position between the terminal device 10 and the display device 20 on the basis of the determined size. In this case, the calculation unit 171 may calculate the relative distance between the terminal device 10 and the display device 20 on the basis of, for example, a table in which the size of the display device 20 and the relative distance between the terminal device 10 and the display device 20 are associated. For example, the calculation unit 171 may calculate the tilt of the terminal device 10 relative to the display device 20 on the basis of the degree of distortion and the direction of distortion of the display device 20.
The calculation unit 171 may calculate the relative position between the terminal device 10 and the display device 20 on the basis of photographing data, photographed by the image capture unit 11, of an identifier such as a QR code provided on a housing of the display device 20. For example, the calculation unit 171 may determine the size and distortion of the identifier in the photographing data using a well-known image recognition process, derive the relative distance between the terminal device 10 and the display device 20 from the determined size and the tilt of the terminal device 10 relative to the display device 20 from the degree and direction of the distortion, and calculate the relative position from the relative distance and the tilt. In this case, the calculation unit 171 may use, for example, a table in which the size of the identifier is associated with the relative distance between the terminal device 10 and the display device 20.
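The size-and-distortion analysis performed by the calculation unit 171 corresponds to standard marker pose estimation. The following is a minimal Python sketch using OpenCV's solvePnP, assuming a square identifier of known physical size whose four corners have already been detected (for example, with cv2.QRCodeDetector); the marker size and function names are illustrative assumptions, not part of the disclosure.

    import cv2
    import numpy as np

    MARKER_SIZE = 0.05  # assumed physical edge length of the identifier, in meters

    # Corner coordinates of the identifier in its own coordinate system,
    # ordered top-left, top-right, bottom-right, bottom-left.
    OBJECT_POINTS = np.array([
        [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
        [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
        [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
        [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    ], dtype=np.float32)

    def relative_pose(corners_px, camera_matrix, dist_coeffs):
        """Estimate relative distance and tilt from the identifier's apparent
        size and distortion. corners_px is a 4x2 array of pixel coordinates in
        the same order as OBJECT_POINTS."""
        ok, rvec, tvec = cv2.solvePnP(
            OBJECT_POINTS, np.asarray(corners_px, dtype=np.float32),
            camera_matrix, dist_coeffs)
        if not ok:
            return None
        distance = float(np.linalg.norm(tvec))  # relative distance
        rotation, _ = cv2.Rodrigues(rvec)       # 3x3 rotation matrix: the tilt
        return distance, rotation, tvec

A lookup table associating identifier size with distance, as described above, would be an alternative to the full pose computation when only the distance is needed.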
The video data acquisition unit 172 acquires various kinds of video data. For example, the video data acquisition unit 172 acquires, from the display device 20, video data photographed by a virtual terminal device in the VR space displayed on the display device 20. In a case where the operation control unit 175 has acquired an operation signal, input to the operation unit 14, for photographing video in the VR space displayed on the display device 20, the video data acquisition unit 172 acquires the video data photographed by the virtual terminal device in that VR space. Photographing with the virtual terminal device in the VR space refers to generating video that captures a predetermined range of the VR space from the position of a virtual camera provided in the virtual terminal device. The virtual terminal device in the VR space may have the same shape and size as the terminal device 10 or a different shape and size. When the terminal device 10 includes the image capture unit 11, the position of the virtual camera in the virtual terminal device may be the same as or different from the position of the image capture unit 11 in the terminal device 10, and the characteristics of the virtual camera may be the same as or different from those of the image capture unit 11. The characteristics of the virtual camera may be changed as needed by the user's operation in the VR space, such as widening the angle of view or a telephoto operation (zooming in or out). The user may perform such operations in the VR space through an operation screen displayed on a display unit of the virtual terminal device.
The display control unit 173 causes the display unit 12 to display various kinds of video. The display control unit 173 causes the display unit 12 to display an identifier for calculating the relative position between the terminal device 10 and the display device 20, for example. The display control unit 173 causes the display unit 12 to display the video related to the video data photographed in the VR space that the video data acquisition unit 172 has acquired from the display device 20.
The display control unit 173 also controls the photographing range displayed on the display unit of the virtual terminal device when various objects in the VR space are photographed with the virtual terminal device in the VR space displayed on the display device 20. The display control unit 173 controls this photographing range on the basis of the relative position between the real terminal device 10 and the display device 20; specifically, it changes the photographing range displayed on the display unit of the virtual terminal device in accordance with changes in that relative position. To do so, the display control unit 173 outputs to the display device 20, through the communication unit 16, a control signal for changing the photographing range displayed on the virtual terminal device in the VR space.
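As one way to realize this, the terminal device could encode the calculated relative position into a small message and send it to the display device whenever the pose changes. The JSON encoding below is a hypothetical sketch; the disclosure does not specify a signal format.

    import json
    import numpy as np

    def make_range_control_signal(tvec) -> bytes:
        """Build a control signal describing where the virtual camera of the
        virtual terminal device should point, from the translation vector of
        the terminal device 10 relative to the display device 20."""
        tx, ty, tz = (float(v) for v in np.asarray(tvec).flatten())
        distance = (tx * tx + ty * ty + tz * tz) ** 0.5
        yaw = float(np.arctan2(tx, tz))                            # horizontal offset angle
        pitch = float(np.arctan2(ty, (tx * tx + tz * tz) ** 0.5))  # vertical offset angle
        return json.dumps({
            "type": "update_photographing_range",
            "distance_m": distance,
            "yaw_rad": yaw,
            "pitch_rad": pitch,
        }).encode("utf-8")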
The display control unit 173 also generates an avatar, displayed in the video on the display device 20, of the user who uses the terminal device 10 and the display device 20. The display control unit 173 generates the avatar on the basis of, for example, video data of the face of the user. The display control unit 173 may change each part of the avatar's face on the basis of operation information received from the user through the operation unit 14. The display control unit 173 outputs a control signal for displaying the generated avatar to the display device 20 via the communication unit 16.
The output control unit 174 controls the sound output unit 13 to output a sound.
The operation control unit 175 acquires the operation signal related to the operation input to the operation unit 14. The operation control unit 175, for example, outputs a control signal related to the acquired operation signal to control the operation of the terminal device 10. The operation control unit 175, for example, acquires an operation signal related to the operation for photographing the video in the VR space displayed on the display device 20 that is input to the operation unit 14.
The image capture control unit 176 controls the image capture unit 11. The image capture control unit 176 sets image capture conditions for the image capture unit 11 and causes the image capture unit 11 to capture images. For example, the image capture control unit 176 controls the image capture unit 11 to capture an image of an identifier such as a QR code provided on the housing of the display device 20. When the display unit 12 of the terminal device 10 displays the identifier such as a QR code, the image capture unit 11 and the image capture control unit 176 may be omitted.
The storage control unit 177 stores various kinds of data in the storage unit 15. The storage control unit 177, for example, causes the storage unit 15 to store therein the video data related to the video photographed in the VR space displayed on the display device 20 that is acquired by the video data acquisition unit 172.
The communication control unit 178 controls the communication between the terminal device 10 and the external device by controlling the communication unit 16. The communication control unit 178, for example, controls the communication between the terminal device 10 and the display device 20 by controlling the communication unit 16.
Display Device
With reference to FIG. 4 , a structure of the display device according to the embodiment is described. FIG. 4 is a block diagram illustrating a structure example of the display device according to the embodiment.
As illustrated in FIG. 4, the display device 20 includes the image capture unit 21, a display unit 22, a sound output unit 23, an operation unit 24, a communication unit 25, and a control unit 26. These units are connected to each other through a bus B3.
In this embodiment, the display device 20 is described as a device including an HMD worn on the user's head, but the display device 20 is not limited to this. The display device 20 may be a display device including a display such as a liquid crystal display or an organic EL display that is installed, for example, on a desk in a room.
The image capture unit 21 photographs various kinds of video around the display device 20. The image capture unit 21 photographs the identifier 111 displayed on the display unit 12 of the terminal device 10, for example. The image capture unit 21 includes an image capture element, a circuit that generates video data on the basis of the output of the image capture element, and the like, which are not illustrated. The image capture element may be, but not limited to, a CMOS image sensor or a CCD.
The display unit 22 displays various kinds of video. The display unit 22, for example, displays the video in the VR space. The display unit 22 is realized, for example, as an HMD.
The sound output unit 23 outputs various kinds of sounds. The sound output unit 23 can be realized by, for example, a speaker.
The operation unit 24 receives various operations from the user for the display device 20. The operation unit 24 includes, for example, a button or a switch. The operation unit 24 receives, for example, the operations to start or end the communication with the terminal device 10.
The communication unit 25 performs the communication between the display device 20 and the external device. The communication unit 25, for example, performs the communication with the terminal device 10. The communication unit 25, for example, performs the communication with the server 30.
The control unit 26 controls the operation of each part of the display device 20. The control unit 26 is realized by, for example, a CPU or an MPU executing a computer program stored in a storage unit, which is not illustrated, using a RAM or the like as a working area. The control unit 26 may instead be realized by an integrated circuit such as an ASIC or an FPGA, or by a combination of hardware and software.
The control unit 26 includes an acquisition unit 261, a display control unit 262, an output control unit 263, an operation control unit 264, an image capture control unit 265, and a communication control unit 266. The acquisition unit 261, the display control unit 262, the output control unit 263, the operation control unit 264, the image capture control unit 265, and the communication control unit 266 are connected to each other through a bus B4.
The acquisition unit 261 acquires various kinds of information. The acquisition unit 261, for example, acquires the video data related to the video of the identifier 111 displayed on the display unit 12 of the terminal device 10 that is photographed by the image capture unit 21.
The display control unit 262 controls the video displayed on the display unit 22. For example, the display control unit 262 causes the display unit 22 to display video related to VR, such as video that lets users experience a theme park or the like in the VR space. In accordance with a control signal received from the terminal device 10, the display control unit 262 changes the display range of the video displayed on the display unit 22, displays the avatar on that video, or changes the photographing range displayed on the virtual terminal device in the VR space displayed on the display unit 22. The control signal received from the terminal device 10 is generated, for example, on the basis of the user's operation, made with the operation unit 14 of the terminal device 10, on the operation screen displayed on the display unit of the virtual terminal device in the VR space. When the display unit 12 of the terminal device 10 displays the identifier 111, the user actually operates the terminal device 10 in his/her hand, but the display unit 12 shows the identifier 111 rather than the operation screen. Meanwhile, the control signal generated on the basis of the operation is transmitted from the terminal device 10 to the display device 20, and the display control unit 262 controls, on the basis of that control signal, the operation screen displayed on the display unit of the virtual terminal device corresponding to the terminal device 10 in the VR space. The user thus perceives the behavior of the virtual operation screen visually through the display device 20 and perceives the actual operation through touch on the terminal device 10. The terminal device 10 may generate vibration with an actuator, which is not illustrated, to give the user feedback on the operation.
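A minimal sketch of the display-device side of this exchange follows, assuming the hypothetical JSON signal format shown earlier; VirtualTerminalScreen stands in for the operation screen rendered in the VR space and is not an API of the disclosed device.

    import json

    class VirtualTerminalScreen:
        """Stand-in for the virtual terminal device's screen in the VR space."""

        def __init__(self):
            self.photographing_range = {"distance_m": 0.3, "yaw_rad": 0.0, "pitch_rad": 0.0}

        def handle_control_signal(self, payload: bytes) -> None:
            msg = json.loads(payload.decode("utf-8"))
            if msg["type"] == "update_photographing_range":
                # Follow the real device's pose so the range shown on the
                # virtual display unit tracks the relative position.
                self.photographing_range.update(
                    distance_m=msg["distance_m"],
                    yaw_rad=msg["yaw_rad"],
                    pitch_rad=msg["pitch_rad"],
                )
            elif msg["type"] == "touch":
                # Mirror a touch made on the real operation unit 14 onto the
                # operation screen the user sees through the HMD.
                self.press_button(msg["x"], msg["y"])

        def press_button(self, x: float, y: float) -> None:
            pass  # hit-test the virtual screen layout and trigger the widget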
The output control unit 263 causes the sound output unit 23 to output various kinds of sounds, for example, the voices of other users who are experiencing a theme park or the like together in the VR space.
The operation control unit 264 acquires the operation signal related to the operation input to the operation unit 24. The operation control unit 264, for example, outputs a control signal related to the acquired operation signal to control the operation of the display device 20.
The image capture control unit 265 controls the image capture unit 21. The image capture control unit 265 sets an image capture condition by the image capture unit 21 and causes the image capture unit 21 to capture images.
The communication control unit 266 controls the communication unit 25 to control the communication between the display device 20 and the external device. The communication control unit 266, for example, controls the communication between the terminal device 10 and the display device 20 by controlling the communication unit 25. The communication control unit 266 controls the communication unit 25 to control the communication between the display device 20 and the server 30, for example.
Server
With reference to FIG. 5 , a structure of the server according to the embodiment is described. FIG. 5 is a block diagram illustrating a structure example of the server according to the embodiment.
As illustrated in FIG. 5 , the server 30 includes a communication unit 31, a storage unit 32, and a control unit 33.
The communication unit 31 performs the communication between the server 30 and an external device. The communication unit 31, for example, performs the communication with the terminal device 10. The communication unit 31, for example, performs the communication with the display device 20.
The storage unit 32 stores various kinds of information therein. The storage unit 32 stores map information in the VR space therein, for example. The map information includes various kinds of information such as scenery, buildings, and characters in the VR space. The storage unit 32 can be realized by a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or a solid state drive.
The control unit 33 controls the operation of each part of the server 30. The control unit 33 is achieved in a manner that a CPU, an MPU, or the like executes a computer program stored in a storage unit, which is not illustrated, using a RAM or the like as a working area. The control unit 33 may be realized by an integrated circuit, such as an ASIC or an FPGA. The control unit 33 may be achieved by a combination of hardware and software.
The control unit 33 generates a VR space on the basis of the map information stored in the storage unit 32. The control unit 33 generates a VR space where a virtual theme park or the like can be experienced, for example. The connection between the display device 20 and the server 30 allows the user of the display device 20 to experience, for example, a virtual theme park with multiple users using avatars.
How to Use Terminal Device and Display Device
With reference to FIG. 6 and FIG. 7 , how to use the terminal device and the display device according to the embodiment is described. FIG. 6 and FIG. 7 are diagrams for describing how to use the terminal device and the display device according to the embodiment.
As illustrated in FIG. 6, the display device 20 is worn on the head of a user U, who can thereby experience a virtual theme park or the like in a VR space. By operating the terminal device 10 while using the display device 20, the user U can, for example, photograph scenery or objects such as buildings and characters as memories of the theme park in the VR space, or take pictures together with multiple other users.
FIG. 7 illustrates photographing data in the VR space. By operating the terminal device 10, the user U can photograph, with the virtual terminal device in the VR space, an image including an avatar A1 of the user U and avatars A2, A3, and A4 of other users. The image to be photographed may be either a still image or a moving image. The display range of the image to be photographed varies depending on the relative position between the terminal device 10 and the display device 20; in other words, the user U can change the photographing range of the VR space, which is displayed on the display unit of the virtual terminal device, by adjusting the relative position between the terminal device 10 and the display device 20 in the real space. Specifically, the user U can adjust the photographing range so that the avatars A1, A2, A3, and A4 all appear in one picture.
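Whether all four avatars fit in one picture reduces to a field-of-view test against the virtual camera. The following sketch makes simplifying assumptions (a single angle of view, point-like avatars) that are not stated in the disclosure.

    import numpy as np

    def avatars_in_frame(camera_pos, camera_forward, avatar_positions,
                         fov_rad=np.radians(70.0)):
        """Return True if every avatar lies within the given angle of view of
        the virtual camera in the VR space."""
        fwd = np.asarray(camera_forward, dtype=float)
        fwd = fwd / np.linalg.norm(fwd)
        for pos in avatar_positions:
            to_avatar = np.asarray(pos, dtype=float) - np.asarray(camera_pos, dtype=float)
            to_avatar = to_avatar / np.linalg.norm(to_avatar)
            angle = np.arccos(np.clip(np.dot(fwd, to_avatar), -1.0, 1.0))
            if angle > fov_rad / 2:
                return False
        return True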
Process in Terminal Device
With reference to FIG. 8 , a procedure of the process in the terminal device according to the embodiment is described. FIG. 8 is a flowchart expressing one example of the procedure of the process in the terminal device according to the embodiment.
First, the operation control unit 175 determines, via the operation unit 14, whether the photographing mode of the virtual camera in the VR space is set to the photographing mode for photographing with a rear camera (step S10). If it is determined that the mode is set to the photographing mode of the rear camera (Yes at step S10), the process advances to step S12. If it is determined that the mode is not set to the photographing mode of the rear camera (No at step S10), the process advances to step S11.
If the determination is No at step S10, the operation control unit 175 determines via the operation unit 14 whether the photographing mode of the virtual camera in the VR space is set to the photographing mode for photographing with a front camera (step S11). If it is determined that the mode is set to the photographing mode of photographing with the front camera (Yes at step S11), the process advances to step S12. If it is determined that the mode is not set to the photographing mode of photographing with the front camera (No at step S11), the process advances to step S18.
If the determination is Yes at step S10 or at step S11, the calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 (step S12). Specifically, the calculation unit 171 calculates the relative position between the terminal device 10 and the display device 20 on the basis of the size of the identifier 111 displayed on the display unit 12 of the terminal device 10 that is photographed by the image capture unit 21 of the display device 20. Then, the process advances to step S13.
The display control unit 173 recognizes the photographing range in the VR space displayed on the display device 20 on the basis of the relative position calculated by the calculation unit 171 (step S13). Specifically, the display control unit 173 recognizes the photographing range displayed on the display unit of the virtual terminal device in the VR space. Then, the process advances to step S14.
The operation control unit 175 determines whether an operation to photograph the photographing range in the VR space recognized at step S13 is received via the operation unit 14 (step S14). If it is determined that the operation to photograph the photographing range in the VR space is received (Yes at step S14), the process advances to step S15. If it is determined that the operation to photograph the photographing range in the VR space is not received (No at step S14), the process advances to step S18.
If the determination is Yes at step S14, the video data acquisition unit 172 acquires, from the display device 20, the photographing data related to the photographed range in the VR space (step S15). Here, the display control unit 173 may cause the display unit 12 to display the video related to the photographing data acquired by the video data acquisition unit 172. The photographing data may be either a still image or a moving image. This allows the user to check the video photographed in the VR space. Then, the process advances to step S16.
The operation control unit 175 determines whether an operation to save the photographing data acquired at step S15 is received via the operation unit 14 (step S16). If it is determined that the operation to save the photographing data is received (Yes at step S16), the process advances to step S17. If it is determined that the operation to save the photographing data is not received (No at step S16), the process advances to step S18.
If the determination is Yes at step S16, the storage control unit 177 saves the photographing data acquired at step S15 in the storage unit 15 (step S17). Then, the process advances to step S18.
The control unit 17 determines whether to terminate the process (step S18). Specifically, the operation control unit 175 determines that the process is to be terminated upon reception of an operation to terminate the photographing or an operation to turn off the power of the terminal device 10. If it is determined that the process is to be terminated (Yes at step S18), the process in FIG. 8 ends. If not (No at step S18), the process returns to step S10.
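The procedure of FIG. 8 can be summarized in Python. The terminal and display objects below are hypothetical stand-ins for the units of the terminal device 10 and the display device 20; the disclosure defines the steps, not this interface.

    def photographing_loop(terminal, display):
        """Sketch of the FIG. 8 procedure (steps S10 to S18)."""
        while True:
            # S10/S11: continue only if the rear- or front-camera mode is set.
            if terminal.photographing_mode() in ("rear", "front"):
                rel_pos = terminal.calculate_relative_position(display)        # S12
                photo_range = terminal.recognize_photographing_range(rel_pos)  # S13
                if terminal.photograph_requested():                            # S14
                    data = display.acquire_photographing_data(photo_range)     # S15
                    terminal.preview(data)  # optional display on display unit 12
                    if terminal.save_requested():                              # S16
                        terminal.storage.save(data)                            # S17
            if terminal.should_terminate():                                    # S18
                break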
As described above, in the present embodiment, the photographing range in the VR space displayed on the display device 20 is photographed by an operation using the terminal device 10, and the photographed video data is saved in the terminal device 10. Since scenery in the VR space and memories of a virtual theme park or the like can thus be recorded in the terminal device 10, photos taken in the VR space can easily be shared on social networking services (SNS).
In this embodiment, it is possible to experience the virtual theme park and communicate with other users using the avatars in the VR space. This embodiment also allows the user to take a group photo with his/her own avatar and the avatars of other users in the same VR space.
First Modification
With reference to FIG. 9 , a first modification of the present embodiment is described. FIG. 9 is a block diagram illustrating a structure example of a terminal device according to the first modification.
As illustrated in FIG. 9 , a terminal device 10A differs from the terminal device 10 illustrated in FIG. 2 in that a control unit 17A includes a posture detection unit 179.
The posture detection unit 179 detects the posture of the user experiencing VR using the display device 20. The posture detection unit 179 detects, for example, the posture of each part of the user, including the user's head, arms, and legs. In the first modification, for example, the video data acquisition unit 172 acquires the photographing data of the user from an external photographing device, which is not illustrated. In this case, the posture detection unit 179 may detect the user's posture on the basis of the photographing data acquired by the video data acquisition unit 172. The posture detection unit 179 may detect the user's posture using motion capture or other known techniques.
The display control unit 173 changes the posture of the user's avatar displayed on the display device 20 on the basis of the detection results of the posture detection unit 179. The display control unit 173, for example, raises the right hand of the user's avatar if the posture detection unit 179 detects that the user is raising his/her right hand. That is to say, the display control unit 173 changes the avatar's posture according to the user's posture detected by the posture detection unit 179.
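A minimal sketch of this mapping, assuming the posture detection unit 179 yields per-joint poses (the joint names are hypothetical; the disclosure mentions only the head, arms, and legs):

    TRACKED_JOINTS = ("head", "right_hand", "left_hand", "right_foot", "left_foot")

    def update_avatar_posture(avatar, detected_pose):
        """Copy each detected joint pose of the real user onto the avatar so
        that, e.g., raising the right hand raises the avatar's right hand."""
        for joint in TRACKED_JOINTS:
            if joint in detected_pose:
                avatar.set_joint_pose(joint, detected_pose[joint])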
As described above, in the first modification of this embodiment, the posture of the avatar in the VR space is changed in real time according to the posture of the user in the real space. Since the avatar's posture in the VR space can thus be changed easily, usability is improved.
Second Modification
With reference to FIG. 10 , a second modification of the present embodiment is described. FIG. 10 is a block diagram illustrating a structure example of a terminal device according to the second modification.
As illustrated in FIG. 10 , a terminal device 10B differs from the terminal device 10 illustrated in FIG. 2 in that the terminal device 10B includes a sensor 18.
The sensor 18 includes various kinds of sensors. The sensor 18 includes, for example, a sensor that detects the relative position between the terminal device 10B and the display device 20. Examples of the sensor 18 include, but are not limited to, a laser radar (e.g., Laser Imaging Detection and Ranging (LIDAR)), an infrared sensor that includes an infrared illuminator and a light receiving sensor, and a time-of-flight (ToF) sensor.
The calculation unit 171 of the control unit 17B calculates the relative position between the terminal device 10B and the display device 20 on the basis of the detection results of the sensor 18. In other words, in the second modification, the calculation unit 171 calculates the relative position between the terminal device 10B and the display device 20 using a spatial sensing means different from the image capture unit 11.
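As an illustration, a depth sensor such as a ToF camera or LIDAR returns point measurements from which the relative position can be derived. The sketch below assumes the points belonging to the display device have already been segmented out, which the disclosure does not detail.

    import numpy as np

    def relative_position_from_sensor(display_points_xyz):
        """Estimate the relative position of the display device 20 from
        sensor 18's point measurements (in the terminal device's frame)."""
        pts = np.asarray(display_points_xyz, dtype=float)
        centroid = pts.mean(axis=0)               # approximate device position
        distance = float(np.linalg.norm(centroid))
        direction = centroid / distance           # bearing from terminal device 10B
        return distance, direction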
As described above, in the second modification of this embodiment, the relative position between the terminal device 10B and the display device 20 is calculated based on the detection results of various kinds of sensors. Thus, the degree of freedom in design can be improved in the second modification.
Third Modification
With reference to FIG. 11 , a third modification of the present embodiment is described. FIG. 11 is a block diagram illustrating a structure example of a display device according to the third modification.
As illustrated in FIG. 11 , a display device 20A differs from the display device 20 illustrated in FIG. 4 in that the display device 20A includes a sensor 27 and a control unit 26A includes a calculation unit 267.
The sensor 27 includes various kinds of sensors, for example, a sensor that detects the relative position between the terminal device 10 and the display device 20A. Examples of the sensor 27 include, but are not limited to, a laser radar (for example, LIDAR), an infrared sensor that includes an infrared illuminator and a light receiving sensor, and a ToF sensor.
The calculation unit 267 calculates the relative position between the terminal device 10 and the display device 20A on the basis of the detection results of the sensor 27. In other words, the relative position between the terminal device 10 and the display device 20A may be calculated by the display device 20A instead of the terminal device 10. In this case, the display device 20A transmits information regarding the calculated relative position to the terminal device 10 via the communication unit 25. The display control unit 173 then changes, for example, the photographing range displayed on the display unit of the virtual terminal device in the VR space displayed on the display device 20A, on the basis of the relative position calculated by the calculation unit 267.
As mentioned above, in the third modification of this embodiment, the relative position between the terminal device 10 and the display device 20A is calculated by the display device 20A on the basis of the detection results of the sensor. Thus, the degree of freedom in design can be improved in the third modification.
According to the present disclosure, memories of activities in the VR space can be recorded.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.

Claims (6)

What is claimed is:
1. A terminal device comprising:
a calculation unit that calculates a relative position between the terminal device and a display device that displays virtual reality to be used by a user;
a display control unit that causes a virtual terminal device to display in accordance with a control signal from the terminal device in a space of the virtual reality and, based on the relative position, changes a photographing range displayed on the virtual terminal device in the space of the virtual reality displayed on the display device; and
a video data acquisition unit that acquires photographing data related to video in the space of the photographing range that is changed by the display control unit, wherein
the display control unit causes a display unit of the virtual terminal device to display video related to the photographing data acquired by the video data acquisition unit,
the virtual terminal device has a photographing mode of a front camera for photographing the photographing range including an avatar of the user and an avatar of another user in a same space of the virtual reality in an image, and
the terminal device further comprises a storage control unit that saves, in a storage unit and based on a saving operation, the photographing data in the photographing range according to the photographing mode acquired by the video data acquisition unit.
2. The terminal device according to claim 1, wherein
the video data acquisition unit acquires video data related to video of an identifier displayed on a display unit of the terminal device that is photographed by an external image capture device or an image capture unit of the display device, or acquires video data related to video of an identifier provided on the display device that is photographed by the image capture unit of the terminal device, and
the calculation unit calculates the relative position based on the video data of the identifier.
3. The terminal device according to claim 2, further comprising a posture detection unit that detects a posture of the user, wherein
the display control unit changes a posture of an avatar of the user displayed on the display device according to the posture of the user detected by the posture detection unit.
4. The terminal device according to claim 1, further comprising a posture detection unit that detects a posture of the user, wherein
the display control unit changes a posture of an avatar of the user displayed on the display device according to the posture of the user detected by the posture detection unit.
5. A method comprising:
calculating a relative position between a terminal device and a display device that displays virtual reality to be used by a user;
causing a virtual terminal device to display in accordance with a control signal from the terminal device in a space of the virtual reality and, based on the relative position, changing a photographing range displayed on the virtual terminal device in the space of the virtual reality displayed on the display device;
acquiring photographing data related to video in the photographing range that is changed in the space;
causing a display unit of the virtual terminal device to display video related to the photographing data;
setting a photographing mode of the virtual terminal device to a photographing mode of a front camera for photographing the photographing range including an avatar of the user and an avatar of another user in a same space of the virtual reality in an image; and
saving, based on a saving operation, the photographing data in the photographing range according to the photographing mode in a storage unit.
6. A non-transitory computer readable recording medium storing therein a computer program that, in response to execution by a computer, causes the computer to execute:
calculating a relative position between a terminal device and a display device that displays virtual reality to be used by a user;
causing a virtual terminal device to display in accordance with a control signal from the terminal device in a space of the virtual reality;
based on the relative position, changing a photographing range displayed on the virtual terminal device in the space of the virtual reality displayed on the display device;
acquiring photographing data related to video in the photographing range that is changed in the space;
causing a display unit of the virtual terminal device to display video related to the photographing data;
setting a photographing mode of the virtual terminal device to a photographing mode of a front camera for photographing the photographing range including an avatar of the user and an avatar of another user in a same space of the virtual reality in an image; and
saving, based on a saving operation, the photographing data in the photographing range according to the photographing mode in a storage unit.
US18/172,357 2020-08-31 2023-02-22 Terminal device, method, and computer program Active US12086924B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-146358 2020-08-31
JP2020146358A JP2022041258A (en) 2020-08-31 2020-08-31 Terminal device, method, and program
PCT/JP2020/047829 WO2022044357A1 (en) 2020-08-31 2020-12-22 Terminal device, method, and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/047829 Continuation WO2022044357A1 (en) 2020-08-31 2020-12-22 Terminal device, method, and program

Publications (2)

Publication Number Publication Date
US20230206546A1 US20230206546A1 (en) 2023-06-29
US12086924B2 true US12086924B2 (en) 2024-09-10

Family

ID=80355024

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/172,357 Active US12086924B2 (en) 2020-08-31 2023-02-22 Terminal device, method, and computer program

Country Status (3)

Country Link
US (1) US12086924B2 (en)
JP (1) JP2022041258A (en)
WO (1) WO2022044357A1 (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070092161A1 (en) 2005-07-27 2007-04-26 Canon Kabushiki Kaisha Information processing method and information processing apparatus
JP2007033257A (en) 2005-07-27 2007-02-08 Canon Inc Information processing method and device
US20070069026A1 (en) 2005-09-27 2007-03-29 Honda Motor Co., Ltd. Two-dimensional code detector and program thereof, and robot control information generator and robot
JP2007090448A (en) 2005-09-27 2007-04-12 Honda Motor Co Ltd Two-dimensional code detecting device, program for it, and robot control information generating device and robot
US11348326B2 (en) * 2013-03-28 2022-05-31 Sony Corporation Display control device, display control method, and recording medium
JP2017146651A (en) 2016-02-15 2017-08-24 株式会社コロプラ Image processing method and image processing program
US20180322681A1 (en) 2017-03-02 2018-11-08 Colopl, Inc. Information processing method, program, virtual space delivering system and apparatus therefor
US20190025586A1 (en) 2017-07-13 2019-01-24 Colopl, Inc. Information processing method, information processing program, information processing system, and information processing apparatus
JP2019020908A (en) 2017-07-13 2019-02-07 株式会社コロプラ Information processing method, information processing program, information processing system, and information processing device
US20190043263A1 (en) * 2017-07-19 2019-02-07 Colopl, Inc. Program executed on a computer for providing vertual space, method and information processing apparatus for executing the program
JP2019061434A (en) 2017-09-26 2019-04-18 株式会社コロプラ Program, information processing apparatus, information processing system, and information processing method
JP2019211835A (en) 2018-05-31 2019-12-12 凸版印刷株式会社 Multiplayer simultaneous operation system in vr, method and program
JP2020035392A (en) 2018-08-31 2020-03-05 真一 福重 Remote communication system and the like
JP2020091897A (en) 2020-02-25 2020-06-11 株式会社ドワンゴ Information display terminal, information transmission method, and computer program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report and Written Opinion for International Application No. PCT/JP2020/047829 mailed on Mar. 23, 2021, 9 pages.

Also Published As

Publication number Publication date
US20230206546A1 (en) 2023-06-29
JP2022041258A (en) 2022-03-11
WO2022044357A1 (en) 2022-03-03


Legal Events

Date Code Title Description
AS Assignment

Owner name: JVCKENWOOD CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NIIYAMA, MAHOKO;OGATA, TOMOAKI;OKI, JUN;AND OTHERS;SIGNING DATES FROM 20230202 TO 20230214;REEL/FRAME:062760/0653

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE