CN110837295A - Handheld control equipment and tracking and positioning method, equipment and system thereof - Google Patents
- Publication number
- CN110837295A CN110837295A CN201910989132.4A CN201910989132A CN110837295A CN 110837295 A CN110837295 A CN 110837295A CN 201910989132 A CN201910989132 A CN 201910989132A CN 110837295 A CN110837295 A CN 110837295A
- Authority
- CN
- China
- Prior art keywords
- handheld control
- handheld
- control device
- state
- tracking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G06T2207/10016—Video; Image sequence
Abstract
The application aims to provide a handheld control device and a tracking and positioning method, device and system thereof. Compared with the prior art, the application provides a handheld control device provided with a spiral portion, the spiral portion carrying a replaceable flexible connection circuit that contains a plurality of illuminable feature points. The handheld control device can therefore be conveniently photographed by the camera device from multiple angles, so its tracking and positioning are more accurate. Moreover, when the flexible connection circuit wears out, or the arrangement of the illuminable feature points on it is upgraded, only the flexible connection circuit needs to be replaced rather than the whole handheld control device, which makes upgrading convenient, saves cost and improves the user experience. In addition, this way of tracking and positioning the handheld control device tracks hand motion with higher accuracy, saves power on the handheld control device and extends its battery life.
Description
Technical Field
The application relates to the technical field of computers, in particular to a handheld control device and a tracking and positioning technology thereof.
Background
Virtual Reality (VR) technology helps people obtain a more realistic visual effect on a screen by displaying pictures in 3D form. With the development of virtual reality technology, one current application is to let a user have an interactive experience in a large-scale virtual scene through devices such as a VR all-in-one headset. For example, by establishing VR experience halls with different themed content, capturing the user's displacement and motion in the hall, and mapping them to specific logic in the virtual reality scene, the user can interact with the theme in the virtual reality scene. Since tracking and positioning the handheld control device is one of the methods for tracking and positioning the user in virtual reality technology, which handheld control device to use and how to position it has become one of the core technologies of virtual reality applications.
Disclosure of Invention
The application aims to provide a handheld control device and a tracking and positioning method, device and system thereof.
According to an embodiment of the application, a handheld control device is provided, wherein the handheld control device includes a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion and contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points.
Optionally, the illuminable feature points are arranged, via the flexible connection circuit, on one side or on both sides of the spiral portion.
Optionally, the handheld control device further comprises one or more sensors.
Optionally, a vibration feedback system and/or a gesture detection system are further included on the handheld control device.
Optionally, the lighting states of the plurality of illuminable feature points are determined based on the current tracking state.
Optionally, if the tracking state is a pose determination state, the illuminable feature points execute brightness changes based on a predetermined brightness change rule.
According to an embodiment of the present application, a processing device is provided, wherein the processing device includes:
first means for establishing a connection with a handheld control device, wherein the handheld control device comprises a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion and contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points;
second means for determining the lighting state of the illuminable feature points on the handheld control device according to the current tracking state of the handheld control device, and sending a corresponding control instruction to the handheld control device;
third means for acquiring a plurality of consecutive images of the handheld control device captured by the camera device, wherein the handheld control device causes the illuminable feature points to execute the corresponding lighting state according to the control instruction, and the plurality of consecutive images are captured by the camera device based on that lighting state;
and fourth means for determining the position and posture of the handheld control device according to the plurality of consecutive images.
Optionally, the processing device further comprises:
fifth means for determining the current tracking state of the handheld control device.
Optionally, if the tracking state is a pose determination state, the second means is configured to:
determine, according to the pose determination state, that the lighting state of the illuminable feature points on the handheld control device is to execute brightness changes based on a predetermined brightness change rule, and send a corresponding control instruction to the handheld control device.
Optionally, the processing device further comprises:
sixth means for acquiring additional information sent by the handheld control device, wherein the additional information includes one or more types of sensing information acquired by one or more sensors of the handheld control device;
wherein the fourth means is for:
determining the position and posture of the handheld control device according to the plurality of consecutive images and the one or more types of sensing information.
According to an embodiment of the present application, a method for tracking and positioning a handheld control device at a processing device is provided, wherein the method includes the following steps:
establishing a connection with the handheld control device, wherein the handheld control device comprises a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion and contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points;
determining the lighting state of the illuminable feature points on the handheld control device according to the current tracking state of the handheld control device, and sending a corresponding control instruction to the handheld control device;
acquiring a plurality of consecutive images of the handheld control device captured by the camera device, wherein the handheld control device causes the illuminable feature points to execute the corresponding lighting state according to the control instruction, and the plurality of consecutive images are captured by the camera device based on that lighting state;
and determining the position and posture of the handheld control device according to the plurality of consecutive images.
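As a hypothetical illustration only (the patent does not prescribe an implementation, and all names and values below are invented), the processing-side steps above can be sketched as a small state machine: the lighting instruction is chosen from the current tracking state, and the device moves from pose determination to tracking once enough consecutive images have been acquired.

```python
# Hypothetical sketch of the processing-device flow; names and values are
# illustrative assumptions, not taken from the patent.
POSE_DETERMINATION, TRACKING = "pose_determination", "tracking"

class HandheldTracker:
    def __init__(self):
        self.state = POSE_DETERMINATION

    def lighting_instruction(self):
        # In the pose-determination state the feature points run a
        # predetermined brightness-change pattern so they can be told apart;
        # while tracking, steady illumination suffices and saves power.
        if self.state == POSE_DETERMINATION:
            return {"mode": "brightness_sequence", "pattern": (1.0, 0.5, 1.0, 0.25)}
        return {"mode": "steady", "brightness": 0.6}

    def update(self, consecutive_images):
        # A real implementation would estimate position and posture from the
        # consecutive images here; this sketch only models the state change.
        if self.state == POSE_DETERMINATION and len(consecutive_images) >= 4:
            self.state = TRACKING
        return self.state

tracker = HandheldTracker()
assert tracker.lighting_instruction()["mode"] == "brightness_sequence"
tracker.update(consecutive_images=[object()] * 4)
assert tracker.lighting_instruction()["mode"] == "steady"
```

Driving the lighting state from the tracking state in this way is what lets the controller fall back to a lower-power steady mode once its pose is known.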
Optionally, the method further comprises:
determining a current tracking state of the handheld control device.
Optionally, if the tracking state is the pose determination state, the step of determining the lighting state of the illuminable feature points on the handheld control device includes:
determining, according to the pose determination state, that the lighting state of the illuminable feature points on the handheld control device is to execute brightness changes based on a predetermined brightness change rule, and sending a corresponding control instruction to the handheld control device.
Optionally, the method further comprises:
acquiring additional information sent by the handheld control device, wherein the additional information comprises one or more types of sensing information acquired by one or more types of sensors of the handheld control device;
wherein the step of determining the position and attitude of the handheld control device comprises:
determining the position and posture of the handheld control device according to the plurality of consecutive images and the one or more types of sensing information.
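The patent does not specify how the images and the sensing information are combined. One common option, shown here as a hedged one-dimensional sketch with invented numbers, is a complementary filter: the gyroscope rate is integrated for smooth, high-rate updates, while the slower but drift-free image-based estimate continually pulls the result back toward the absolute value.

```python
# Hypothetical 1-D complementary filter fusing an image-based yaw estimate
# with an integrated gyroscope rate; all values are illustrative.
alpha = 0.98          # weight on the gyro-integrated estimate
dt = 0.01             # seconds per step
gyro_rate = 0.0       # deg/s; controller held still...
gyro_bias = 2.0       # ...but the gyro has an (unknown) 2 deg/s bias
camera_yaw = 0.0      # deg; absolute, drift-free image-based estimate

yaw_gyro_only, yaw_fused = 0.0, 0.0
for _ in range(1000):  # 10 seconds of updates
    measured_rate = gyro_rate + gyro_bias
    yaw_gyro_only += measured_rate * dt                       # pure integration drifts
    yaw_fused = alpha * (yaw_fused + measured_rate * dt) \
                + (1 - alpha) * camera_yaw                    # camera bounds the drift

# Pure integration has drifted by 20 deg; the fused estimate converges
# toward the filter's fixed point of ~0.98 deg and stays bounded.
print(yaw_gyro_only, yaw_fused)
```

This illustrates the background section's point that a purely inertial method accumulates error over time, and why the image-based measurements are the anchor of the proposed approach.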
According to an embodiment of the application, there is provided a head display apparatus, wherein the head display apparatus comprises a processing apparatus as described in any one of the above.
Optionally, the head display device further includes a camera device to capture the handheld control device to generate a plurality of continuous images.
According to an embodiment of the present application, a system for tracking and positioning a handheld control device is provided, wherein the system comprises the handheld control device as described in any one of the above, and any one of the following:
the processing device as described above, and a head display device comprising the camera device, wherein the camera device captures the handheld control device to generate a plurality of consecutive images;
the processing device as described above, a head display device, and a camera device for capturing the handheld control device to generate a plurality of consecutive images, wherein the camera device is fixed at a fixed position in space;
a head display device comprising the processing device as described above, and a camera device for capturing the handheld control device to generate a plurality of consecutive images, wherein the camera device is fixed at a fixed position in space;
a head display device comprising the processing device and the camera device as described above.
According to an embodiment of the present application, there is provided a computer apparatus including:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of the above.
According to an embodiment of the application, a computer-readable storage medium is provided, on which a computer program is stored, the computer program being executable by a processor for performing the method according to any of the above.
Compared with the prior art, the application provides a handheld control device provided with a spiral portion, the spiral portion carrying a replaceable flexible connection circuit that contains a plurality of illuminable feature points. The handheld control device can therefore be conveniently photographed by the camera device from multiple angles, so its tracking and positioning are more accurate. Moreover, when the flexible connection circuit wears out, or the arrangement of the illuminable feature points on it is upgraded, only the flexible connection circuit needs to be replaced rather than the whole handheld control device, which makes upgrading convenient, saves cost and improves the user experience.
In addition, the present application establishes connections among the processing device, the handheld control device and the camera device: the processing device determines the lighting state of the illuminable feature points on the handheld control device according to the current tracking state of the handheld control device, the handheld control device executes the corresponding lighting state, and the processing device determines the position and posture of the handheld control device from a plurality of consecutive images that the camera device captures of that lighting state. This way of tracking and positioning the handheld control device tracks hand motion with high accuracy and low latency, giving the user a stronger sense of immersion. The camera device can be mounted flexibly, so it can capture images freely and shooting interference is easier to resolve. Meanwhile, determining the lighting state from the tracking state is more flexible, saves power on the handheld control device and extends its battery life.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 to 4 respectively show schematic views of a handheld control device according to an embodiment of the present application from different viewing angles;
FIG. 5 shows a schematic diagram of a processing device for performing handheld control device tracking location according to one embodiment of the present application;
FIG. 6 is a flow chart illustrating a method for performing tracking and positioning of a handheld control device on a processing device according to an embodiment of the present application;
FIG. 7 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
Methods for tracking a handheld control device mainly include inertial, ultrasonic and electromagnetic approaches. Analysis of these approaches shows, however, that each has drawbacks: in inertial tracking, the navigation algorithm accumulates error over time, so tracking accuracy is low; in the ultrasonic approach, the ultrasonic controller has a small working range owing to the limitation of the speed of sound; and in the electromagnetic approach, tracking accuracy is easily disturbed by metallic materials in the environment, making electromagnetic tracking unstable, while its high power consumption gives the handheld control device poor battery endurance.
Therefore, the present application provides a novel handheld control device and a method, a device and a system for tracking and positioning the same, so as to solve one or more of the defects of the handheld control device and the positioning method thereof.
The present application is described in further detail below with reference to the attached figures.
The camera device includes, but is not limited to, any stand-alone camera device, such as a camera, or a camera device integrated into other equipment, such as an electronic device that includes a camera function.
The camera device can be wirelessly connected to the handheld control device (e.g., via WiFi or Bluetooth) and captures the handheld control device to generate a plurality of consecutive images.
The camera device can be connected to the processing device wirelessly or by wire, so that the processing device can instruct the camera device to capture the handheld control device at a corresponding shooting frequency; the camera device can also send the plurality of captured consecutive images to the processing device so that the processing device can determine the position and posture of the handheld control device. Here, the plurality of consecutive images captured by the camera device may also reach the processing device via another device, for example by relaying.
The handheld control device can establish a wireless connection (e.g., WiFi or Bluetooth) with the processing device, so that the processing device and the handheld control device can exchange corresponding instruction information and/or response information.
In one embodiment, the imaging device is fixed at a fixed position in space.
In one embodiment, the camera device is included in the head display device in a fixed or built-in manner.
In one embodiment, the image pickup apparatus supports a Global shutter mode (Global shutter exposure mode) to achieve better tracking accuracy.
In one embodiment, the head display device of the present application includes the processing device in any of the embodiments described herein, i.e., the head display device is responsible for executing the processing operation of the processing device.
The head display device includes, but is not limited to, any device that can execute virtual reality technology and be worn by a user to display corresponding information, such as a VR (Virtual Reality)/AR (Augmented Reality)/MR (Mixed Reality) all-in-one machine or VR/AR/MR glasses. In one embodiment, the virtual reality/augmented reality/mixed reality device may also be a device formed by integrating an existing virtual reality/augmented reality/mixed reality device with other devices through a network, wherein the other devices include user equipment and/or network equipment.
The processing device described herein includes, but is not limited to, any stand-alone computer device or a computer device integrated with other devices. The computer devices include, but are not limited to, user devices and/or network devices. In one embodiment, the processing device may be integrated into a head-up display device.
The user equipment includes, but is not limited to, any electronic product capable of human-computer interaction with a user, such as a virtual reality personal terminal, a personal computer, a smartphone or a tablet computer, and the electronic product may run any operating system, such as Windows, Android or iOS. The network device includes electronic devices capable of automatically performing numerical calculation and information processing according to preset or stored instructions; its hardware includes, but is not limited to, microprocessors, Application Specific Integrated Circuits (ASIC), Programmable Logic Devices (PLD), Field Programmable Gate Arrays (FPGA), Digital Signal Processors (DSP), embedded devices, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of multiple servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, a kind of distributed computing in which one virtual supercomputer consists of a collection of loosely coupled computers. The network includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN network, a wireless ad hoc network, and the like.
Of course, those skilled in the art will appreciate that the above-described devices are merely exemplary, and that other devices, existing or hereafter developed, that may be suitable for use in the present application, are also within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
The handheld control device described herein includes, but is not limited to, any handheld control device that may need to perform tracking positioning, such as a joystick for games, a handheld control device for virtual reality applications, and the like.
The handheld control device of the present application includes a handheld portion and a spiral portion; the spiral portion is fixed to the handheld portion and contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points.
Here, the spiral portion and the handheld portion may be fixed together by any fixing means. In one embodiment, the position at which the spiral portion is fixed to the handheld portion does not interfere with the user's grip on the handheld portion; for example, that position can be determined according to the shape of the handheld portion and where it is held. In one embodiment, the spiral portion may be hinged to the handheld portion so that, under a predetermined condition (e.g., applying a certain external force or turning on a specific switch), the spiral portion can rotate on the handheld portion, which facilitates storage of the handheld control device.
The flexible connection circuit may contain a plurality of illuminable feature points, and the light they emit includes, but is not limited to, visible light, infrared light or blue light. The feature points may be distributed uniformly over the whole flexible connection circuit, or arranged on it as multiple feature patterns, where a feature pattern may be any pattern or shape. In addition, other arrangements of the illuminable feature points are also applicable to the present application and are included in its scope.
The flexible connection circuit can supply power to the illuminable feature points; in one embodiment, the flexible connection circuit can also obtain, execute or send control instructions of the handheld control device. The shape of the flexible connection circuit itself may be any regular or irregular shape.
In one embodiment, the arrangement of the illuminable feature points on the flexible connection circuit satisfies multi-angle visibility, so that the camera device can capture the lit illuminable feature points on the handheld control device from multiple angles.
In one embodiment, the number of the illuminable feature points is greater than or equal to 4.
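The requirement of at least four feature points matches classic pose-from-points estimation, where each observed point constrains the six-degree-of-freedom pose. As a hedged, simplified sketch (the patent does not specify a solver, and all geometry below is invented), the snippet recovers the controller's translation from the pinhole projections of four feature points, assuming the rotation is already known (e.g., from an inertial sensor); each observed point then contributes two equations that are linear in the translation.

```python
import numpy as np

# Hedged sketch: recover the controller's translation t from the image
# positions of 4 illuminable feature points, assuming the rotation R is
# already known. All coordinates are illustrative assumptions.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 2.0])

# Four feature points in the controller's own coordinate frame (metres).
X = np.array([[0.00, 0.00,  0.00],
              [0.10, 0.00,  0.00],
              [0.00, 0.10,  0.02],
              [0.05, 0.05, -0.03]])

def project(X, R, t):
    """Pinhole projection with unit focal length and centred principal point."""
    Xc = X @ R.T + t
    return Xc[:, :2] / Xc[:, 2:3]

uv = project(X, R, t_true)

# u = (r1.X + tx)/(r3.X + tz)  =>  tx - u*tz = u*(r3.X) - r1.X   (linear in t)
# v = (r2.X + ty)/(r3.X + tz)  =>  ty - v*tz = v*(r3.X) - r2.X
A, b = [], []
for (u, v), Xi in zip(uv, X):
    A.append([1.0, 0.0, -u]); b.append(u * (R[2] @ Xi) - R[0] @ Xi)
    A.append([0.0, 1.0, -v]); b.append(v * (R[2] @ Xi) - R[1] @ Xi)
t_est, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
# 8 equations, 3 unknowns: exact in this noise-free sketch.
```

Recovering the full rotation as well is a standard perspective-n-point problem, for which at least four points are needed to obtain a unique, well-conditioned solution — consistent with the lower bound stated above.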
The flexible connection circuit can be attached and fixed to the spiral portion by bonding, bolting or similar means, so that it can be replaced easily. Thus, when the flexible connection circuit is damaged (for example, an illuminable feature point fails) or the arrangement of illuminable feature points on it is upgraded, only the flexible connection circuit needs to be replaced, rather than the whole handheld control device, which is convenient for the user.
In one embodiment, the illuminable feature points may be arranged, via the flexible connection circuit, on one side or on both sides of the spiral portion.
In one embodiment, the handheld control device further comprises one or more sensors. Wherein, the sensor can be arranged on the hand-held part and also can be arranged on the spiral part. The sensors include, but are not limited to, IMU (Inertial Measurement Unit) sensors, electromagnetic sensors, ultrasonic sensors, etc., and sensing information obtained by the sensors may be sent to the processing device to assist in resolving the attitude and the position.
In one embodiment, the handheld control device further comprises one or more vibration feedback systems and/or gesture detection systems, which may be disposed on the handheld portion or on the spiral portion. The vibration feedback system may be built from, for example, one or more vibration sensors to provide real-time feedback on the user's actions; the gesture detection system may be based on, for example, capacitive sensors or depth sensors to detect the state of the user's hand and fingers, including the open, closed and bent states of the palm and/or the individual fingers.
In one embodiment, the lighting states of the plurality of illuminable feature points are determined based on the current tracking state.
The tracking state may be divided in various ways; for example, the tracking state may include a pose determination state and a tracking state, where the pose determination state is the state in which the position and posture of the handheld control device are being determined, and the tracking state is the state, after pose determination, in which the user holds the handheld control device (whether still or moving). Alternatively, the tracking state may include a static state, a motion state, and the like. Here, the way the tracking state is divided may be determined according to information such as the type of the handheld control device, its application field and its application scenario.
The tracking state and/or the lighting states of the plurality of illuminable feature points may be determined by the processing device corresponding to the handheld control device, or jointly by the processing device and the handheld control device. For example, the processing device determines the tracking state, and then the lighting state of the illuminable feature points, from a plurality of consecutive images sent by the camera device; or the handheld control device sends corresponding parameters to the processing device, and the processing device determines the current tracking state and then the lighting state of the illuminable feature points.
The light-emitting state of an illuminable feature point includes, but is not limited to, flashing and continuous lighting; if the state is flashing, it further includes information such as flashing brightness, interval, and frequency.
In one embodiment, if the tracking state is the pose determination state, the illuminable feature point performs brightness change based on a predetermined brightness change rule. The predetermined brightness change rule may be a rule set by the handheld control device as a default, or may be a brightness change rule obtained according to an instruction sent by the processing device.
In one embodiment, if the tracking state is the tracking state, the illuminable feature point may continuously illuminate or flash.
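As a non-authoritative illustration, the selection of a lighting state from the tracking status described above can be sketched as follows; the state names and instruction fields are assumptions for illustration, not part of any actual device protocol.

```python
# Illustrative state names; the real device may use different identifiers.
POSE_DETERMINATION = "pose_determination"
TRACKING = "tracking"

def lighting_state(tracking_state):
    """Return a lighting instruction for the illuminable feature points."""
    if tracking_state == POSE_DETERMINATION:
        # During pose determination, each feature point follows a
        # predetermined brightness change rule so that individual
        # points can later be identified in the captured images.
        return {"mode": "brightness_change", "rule": "predetermined"}
    # Once tracking is under way, the points may simply stay lit
    # (continuous lighting) or, alternatively, flash.
    return {"mode": "continuous"}
```
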
Fig. 1 to 4 respectively show schematic views of a handheld control device according to an embodiment of the present application from different viewing angles.
The handheld control device comprises a spiral portion 10 and a handheld portion 20. The spiral portion 10 includes a replaceable flexible connection line 101; the multiple reference numerals 101 shown in fig. 1 all denote the same flexible connection line. The flexible connection line 101 includes a plurality of illuminable feature points 102 and is fixed to the spiral portion 10 by a fixing device 103.
As can be seen from the handheld control device shown in fig. 1 to 4, the flexible connection line 101 may be nested on the spiral portion 10 and then fixed by the fixing device 103. After the fixing device 103 is unscrewed or pulled out, the flexible connection line 101 can be taken out of the nesting groove and removed from the spiral portion 10, allowing the flexible connection line to be replaced.
The plurality of illuminable feature points 102 are distributed on both sides of the spiral portion 10.
The spiral state of the spiral portion 10 is not limited to that shown in figs. 1 to 4; other spiral states are equally applicable and fall within the scope of the present application.
FIG. 5 shows a schematic diagram of a processing device for performing handheld control device tracking location according to one embodiment of the present application.
The processing apparatus 50 comprises a first means 501, a second means 502, a third means 503 and a fourth means 504.
Specifically, the first means 501 establishes connections with the handheld control device and the camera device, where the handheld control device includes a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion and carries a replaceable flexible connection line, and the flexible connection line can be attached and fixed to the spiral portion and includes a plurality of illuminable feature points.
Here, the first means 501 may connect to the handheld control device wirelessly, and to the camera device by a wired or wireless connection. In one embodiment, the camera device may be integrated with the processing apparatus 50.
The second device 502 determines the light-emitting state of the feature point capable of emitting light on the handheld control device according to the current tracking state of the handheld control device, and sends a corresponding control instruction to the handheld control device.
Here, the second apparatus 502 may determine the tracking status in real time, or may obtain the current tracking status of the handheld control device from the handheld control device or other apparatuses.
In one embodiment, the processing device 50 further comprises a fifth means (not shown), wherein the fifth means determines the current tracking status of the handheld control device.
Here, the fifth device determines the tracking status in one or more of the following manners:
- determining the current tracking status of the handheld control device based on a triggering event in which the handheld control device establishes a connection with the first means 501; for example, when the first means 501 establishes a connection with the handheld control device, the tracking status is determined as the pose determination state.
- determining the current tracking status of the handheld control device from a triggering event of the user holding the handheld control device; for example, when the user presses a specific key on the handheld control device, or a sensor on the handheld control device detects that the device is being gripped, the tracking status is determined as the pose determination state.
-setting said tracking state to a tracking state when said pose determination state is completed.
- detecting a change in displacement of the handheld control device: if the handheld control device is not displaced, the tracking status is determined as the static state; otherwise, it is determined as the motion state.
The above-mentioned manner for determining the tracking status is only an example, and is not a limitation to the present application. Other ways of determining the tracking status are equally applicable and within the scope of the present application.
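The state determination rules listed above can be sketched as a simple state machine; the enumeration and trigger functions below are illustrative assumptions, not an implementation taken from the application.

```python
from enum import Enum

class TrackingState(Enum):
    """Illustrative tracking states matching the divisions described above."""
    POSE_DETERMINATION = "pose_determination"
    TRACKING = "tracking"
    STATIC = "static"
    MOTION = "motion"

def on_connection_established():
    # Establishing a connection with the processing side triggers
    # the pose determination state.
    return TrackingState.POSE_DETERMINATION

def on_pose_determined():
    # Once the position and pose are known, switch to tracking.
    return TrackingState.TRACKING

def on_displacement_change(displaced):
    # Within tracking, classify static vs. motion by displacement.
    return TrackingState.MOTION if displaced else TrackingState.STATIC
```
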
The second device 502 determines the light-emitting state of the feature point capable of emitting light on the handheld control device according to the current tracking state of the handheld control device, and sends a corresponding control instruction to the handheld control device. Here, the light-emitting state includes, but is not limited to, continuous light emission or blinking light emission, and the control instruction includes, but is not limited to, information such as light-emitting time, light-emitting brightness, and light-emitting frequency, so as to control the illuminable feature point to emit light.
In an embodiment, if the tracking state is the pose determination state, the second apparatus 502 determines, according to the pose determination state, that the light emitting state of the illuminable feature point on the handheld control device is to execute a brightness change based on a predetermined brightness change rule, and sends a corresponding control instruction to the handheld control device.
Here, the brightness change rule may set a corresponding brightness change including brightness, a bright-dark time, a bright-dark frequency, and the like for each illuminable feature point.
In one embodiment, the predetermined brightness change rule includes that brightness changes on a plurality of consecutive images of the illuminable feature point are unique during the pose determination, wherein the number of images of the plurality of images is determined according to the number of the illuminable feature points.
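One possible way to make each feature point's brightness change unique over a number of frames that grows with the point count is to assign binary on/off codes, as in this sketch; the encoding scheme itself is an assumption for illustration and is not taken from the application.

```python
import math

def blink_codes(num_points):
    """Assign each illuminable feature point a unique on/off pattern
    over a minimal number of consecutive frames.

    The frame count grows with the number of points, matching the idea
    that the number of images is determined by the number of feature
    points. Codes start at 1 so that no point is dark in every frame.
    """
    frames = max(1, math.ceil(math.log2(num_points + 1)))
    return {
        point: [(code >> f) & 1 for f in range(frames)]
        for point, code in enumerate(range(1, num_points + 1))
    }
```
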
The third means 503 acquires multiple continuous images captured by the image capture device on the handheld control device, where the handheld control device causes the illuminable feature point to execute a corresponding lighting state according to the control instruction, and the multiple continuous images are captured by the image capture device based on the lighting state.
The handheld control device receives the control instruction and causes the illuminable feature points to execute the corresponding lighting state; the camera device then photographs the handheld control device according to the light-emitting state to generate a plurality of continuous images.
Here, the camera device may capture a corresponding image for each distinct light-emitting state, finally obtaining a plurality of continuous images; it may also photograph the handheld control device at regular intervals based on a preset shooting period. The camera device then transmits the plurality of continuous images to the third means 503 of the processing apparatus.
In one embodiment, when an illuminable feature point executes a brightness change based on the predetermined brightness change rule, its on-time is greater than or equal to the exposure time the camera device needs to capture each brightness change.
In one embodiment, after the wireless connection is established between the handheld control device and the image pickup device, the image pickup device performs synchronization with the handheld control device according to a predetermined rule or a predetermined period, and images of the handheld control device are picked up by the image pickup device after each synchronization. For example, before each brightness change of the handheld control apparatus, the image pickup apparatus performs synchronization with the handheld control apparatus, and photographs the handheld control apparatus after the brightness change. Here, the image pickup apparatus may interact with the processing apparatus to acquire a brightness change rule of the handheld control apparatus, and perform a corresponding synchronization flow based thereon.
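The synchronize-then-capture flow described above can be sketched as follows; the callables stand in for real device I/O and are purely illustrative assumptions.

```python
def capture_sequence(num_changes, sync, change_brightness, capture):
    """Sketch of the synchronization flow: before each brightness
    change the camera synchronizes with the controller, the controller
    applies the change, and the camera then photographs the result."""
    images = []
    for i in range(num_changes):
        sync(i)                     # camera/controller synchronization
        change_brightness(i)        # controller applies the next change
        images.append(capture(i))   # camera captures after the change
    return images
```
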
In an embodiment, the image capturing apparatus may send the multiple continuous images to other apparatuses, and the third device 503 may interact with the other apparatuses to obtain the multiple continuous images captured by the image capturing apparatus on the handheld control apparatus.
The fourth means 504 determines the position and the posture of the handheld control device from the continuous images.
Here, the fourth means 504 determines the correspondence between the illuminable feature points in the plurality of continuous images and each illuminable feature point on the handheld control device, based on the brightness of the feature points in the images combined with the brightness change rule in effect when the images were captured.
Based on the corresponding relationship, the fourth device 504 determines a mapping relationship from the three-dimensional space coordinate of each feature point on the handheld control device to the two-dimensional coordinate of the image according to the position information of each feature point on the image and the position information of each feature point on the handheld control device.
Based on the plurality of continuous images and the mapping, for each image, between the three-dimensional space coordinates of each illuminable feature point on the handheld control device and its two-dimensional image coordinates, the fourth means 504 determines the spatial three-dimensional coordinates of the handheld control device in the camera device's coordinate system. Here, the fourth means 504 may use a relevant algorithm, for example a PnP (Perspective-n-Point) algorithm.
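The mapping from three-dimensional space coordinates to two-dimensional image coordinates that a PnP solver inverts is, under a pinhole-camera assumption, the projection below; the parameter names are conventional camera intrinsics, not values from the application.

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3-D point (camera coordinate system) onto the image
    plane with a pinhole model: u = fx*x/z + cx, v = fy*y/z + cy.
    PnP solves the inverse problem: given several such 2-D/3-D pairs,
    it recovers the pose of the handheld control device."""
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    return (fx * x / z + cx, fy * y / z + cy)
```
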
The fourth means 504 converts the three-dimensional space coordinates of the handheld control device in the camera device's coordinate system into coordinates in the world coordinate system according to the transformation matrix between the two coordinate systems, thereby determining the position and pose of the handheld control device and realizing its tracking and positioning.
Here, the transformation matrix may be a preset matrix acquired by the fourth means 504, or one computed on demand; the fourth means 504 may determine the transformation matrix between the camera coordinate system and the world coordinate system based on the position of the camera device.
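Applying a camera-to-world transformation matrix to a point, as described above, can be sketched as follows; the 4x4 homogeneous row-major layout is an assumption for illustration.

```python
def camera_to_world(point_cam, transform):
    """Apply a 4x4 camera-to-world transformation matrix (row-major
    nested lists, last row [0, 0, 0, 1]) to a 3-D point expressed in
    the camera coordinate system, returning world coordinates."""
    x, y, z = point_cam
    p = (x, y, z, 1.0)  # homogeneous coordinates
    return tuple(
        sum(transform[r][c] * p[c] for c in range(4)) for r in range(3)
    )
```
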
In one embodiment, the processing device 50 further includes a sixth device (not shown), wherein the sixth device obtains additional information transmitted by the handheld control device, wherein the additional information includes one or more sensing information obtained by one or more sensors of the handheld control device; the fourth means 504 determines the position and orientation of the handheld control device from the continuous images in combination with the one or more sensory information.
Herein, the sensors include, but are not limited to, inertial sensors, ultrasonic sensors, electromagnetic sensors, sensors included in the vibration feedback system and/or gesture detection system, and the like. An inertial sensor can directly acquire motion information. An ultrasonic sensor can collect ultrasonic information; since movement changes time-of-flight (TOF) related information, motion can be reflected by collecting ultrasonic information. A magnetic field sensor can collect electromagnetic information; since movement changes the magnetic field intensity, motion can be reflected by collecting electromagnetic information. Capacitive sensors, depth sensors, and the like can detect the state of the user's hand and fingers.
The sixth means obtains one or more pieces of additional information acquired by the one or more sensors, and the fourth means 504 calculates the motion trajectory of the handheld control device based on this additional information, for example in combination with a navigation algorithm, achieving lower tracking latency.
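As an illustration of how inertial information alone can yield a low-latency (if drift-prone) trajectory estimate, here is a minimal Euler dead-reckoning sketch; it is not the fusion method of the application, which combines such data with the image-based pose.

```python
def integrate_trajectory(accels, dt, v0=(0.0, 0.0, 0.0)):
    """Dead-reckon positions from a sequence of 3-axis accelerometer
    samples using simple Euler integration. A real fusion pipeline
    would periodically correct the accumulated drift with the
    image-based position and pose."""
    vx, vy, vz = v0
    x = y = z = 0.0
    trajectory = []
    for ax, ay, az in accels:
        vx += ax * dt; vy += ay * dt; vz += az * dt  # integrate accel
        x += vx * dt; y += vy * dt; z += vz * dt     # integrate velocity
        trajectory.append((x, y, z))
    return trajectory
```
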
Fig. 6 is a flowchart illustrating a method for performing tracking and positioning of a handheld control device on a processing device according to an embodiment of the present application.
Specifically, in step S601, the processing device establishes connections with a handheld control device and a camera device, where the handheld control device includes a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion and includes a replaceable flexible connection line, and the flexible connection line can be attached and fixed to the spiral portion and includes a plurality of illuminable feature points.
Here, in step S601, the processing device may be connected to the handheld control device by a wireless connection, and may be connected to the image capturing device by a wired connection or a wireless connection. In one embodiment, the image capture device may be integrated with the processing device.
In step S602, the processing device determines a light-emitting state of a feature point capable of emitting light on the handheld control device according to a current tracking state of the handheld control device, and sends a corresponding control instruction to the handheld control device.
Here, the processing device may determine the tracking status in real time, or may obtain the current tracking status of the handheld control device from the handheld control device or other apparatuses.
In one embodiment, the method further comprises step S605 (not shown), wherein in step S605, the processing device determines a current tracking status of the handheld control device.
Here, in step S605, the processing device determines the tracking status in one or more of the following manners:
-determining a current tracking status of the handheld control device based on a triggering event of the handheld control device establishing a connection with the processing device; for example, the tracking state is determined to be a pose determination state when the processing device establishes a connection with the handheld control device.
- determining the current tracking status of the handheld control device from a triggering event of the user holding the handheld control device; for example, when the user presses a specific key on the handheld control device, or a sensor on the handheld control device detects that the device is being gripped, the tracking status is determined as the pose determination state.
-setting said tracking state to a tracking state when said pose determination state is completed.
- detecting a change in displacement of the handheld control device: if the handheld control device is not displaced, the tracking status is determined as the static state; otherwise, it is determined as the motion state.
The above-mentioned manner for determining the tracking status is only an example, and is not a limitation to the present application. Other ways of determining the tracking status are equally applicable and within the scope of the present application.
In step S602, the processing device determines a light-emitting state of a feature point capable of emitting light on the handheld control device according to a current tracking state of the handheld control device, and sends a corresponding control instruction to the handheld control device. Here, the light-emitting state includes, but is not limited to, continuous light emission or blinking light emission, and the control instruction includes, but is not limited to, information such as light-emitting time, light-emitting brightness, and light-emitting frequency, so as to control the illuminable feature point to emit light.
In one embodiment, if the tracking state is the pose determination state, in step S602, the processing device determines, according to the pose determination state, that the light emitting state of the illuminable feature point on the handheld control device is to execute a brightness change based on a predetermined brightness change rule, and sends a corresponding control instruction to the handheld control device.
Here, the brightness change rule may set a corresponding brightness change including brightness, a bright-dark time, a bright-dark frequency, and the like for each illuminable feature point.
In one embodiment, the predetermined brightness change rule includes that brightness changes on a plurality of consecutive images of the illuminable feature point are unique during the pose determination, wherein the number of images of the plurality of images is determined according to the number of the illuminable feature points.
In step S603, the processing device acquires a plurality of continuous images captured by the imaging device on the handheld control device, wherein the handheld control device causes the illuminable feature point to execute a corresponding lighting state according to the control instruction, and the plurality of continuous images are captured by the imaging device based on the lighting state.
The handheld control device receives the control instruction and causes the illuminable feature points to execute the corresponding lighting state; the camera device then photographs the handheld control device according to the light-emitting state to generate a plurality of continuous images.
Here, the camera device may capture a corresponding image for each distinct light-emitting state, finally obtaining a plurality of continuous images; it may also photograph the handheld control device at regular intervals based on a preset shooting period. The camera device then transmits the plurality of continuous images to the processing device.
In one embodiment, when an illuminable feature point executes a brightness change based on the predetermined brightness change rule, its on-time is greater than or equal to the exposure time the camera device needs to capture each brightness change.
In one embodiment, after the wireless connection is established between the handheld control device and the image pickup device, the image pickup device performs synchronization with the handheld control device according to a predetermined rule or a predetermined period, and images of the handheld control device are picked up by the image pickup device after each synchronization. For example, before each brightness change of the handheld control apparatus, the image pickup apparatus performs synchronization with the handheld control apparatus, and photographs the handheld control apparatus after the brightness change. Here, the image pickup apparatus may interact with the processing apparatus to acquire a brightness change rule of the handheld control apparatus, and perform a corresponding synchronization flow based thereon.
In one embodiment, the camera device may send the plurality of consecutive images to another device, and the processing device may interact with the other device to obtain the plurality of consecutive images captured by the camera device on the handheld control device.
In step S604, the processing device determines the position and the posture of the handheld control device according to the continuous images.
Here, in step S604, the processing device determines the correspondence between the illuminable feature points in the plurality of continuous images and each illuminable feature point on the handheld control device, based on the brightness of the feature points in the images combined with the brightness change rule in effect when the images were captured.
Based on the correspondence, the processing device determines the mapping from the three-dimensional space coordinates of each illuminable feature point on the handheld control device to its two-dimensional image coordinates, according to the position information of each feature point in the image and on the handheld control device.
Based on the plurality of continuous images and the mapping, for each image, between the three-dimensional space coordinates of each illuminable feature point on the handheld control device and its two-dimensional image coordinates, the processing device determines the spatial three-dimensional coordinates of the handheld control device in the camera device's coordinate system, for example using a relevant algorithm such as a PnP (Perspective-n-Point) algorithm.
The processing device converts the three-dimensional space coordinates of the handheld control device in the camera device's coordinate system into coordinates in the world coordinate system according to the transformation matrix between the two coordinate systems, thereby determining the position and pose of the handheld control device and realizing its tracking and positioning.
Here, the transformation matrix may be a preset matrix acquired by the processing device, or one computed on demand; the processing device may determine the transformation matrix between the camera coordinate system and the world coordinate system based on the position of the camera device.
In one embodiment, the method further includes step S606 (not shown), wherein in step S606, the processing device acquires additional information transmitted by the handheld control device, wherein the additional information includes one or more sensing information acquired by one or more sensors of the handheld control device; in step S604, the processing device determines the position and the posture of the handheld control device according to the continuous images and the one or more sensing information.
Herein, the sensors include, but are not limited to, inertial sensors, ultrasonic sensors, electromagnetic sensors, sensors included in the vibration feedback system and/or gesture detection system, and the like. An inertial sensor can directly acquire motion information. An ultrasonic sensor can collect ultrasonic information; since movement changes time-of-flight (TOF) related information, motion can be reflected by collecting ultrasonic information. A magnetic field sensor can collect electromagnetic information; since movement changes the magnetic field intensity, motion can be reflected by collecting electromagnetic information. Capacitive sensors, depth sensors, and the like can detect the state of the user's hand and fingers.
The processing device obtains one or more pieces of additional information acquired by the one or more sensors; in step S604, it calculates the motion trajectory of the handheld control device based on this additional information, for example in combination with a navigation algorithm, achieving lower tracking latency.
FIG. 7 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
In some embodiments, system 700 can be implemented as any of the processing devices of the embodiments shown in fig. 5, 6, or other described embodiments. In some embodiments, system 700 may include one or more computer-readable media (e.g., system memory or NVM/storage 720) having instructions and one or more processors (e.g., processor(s) 705) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described herein.
For one embodiment, system control module 710 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 705 and/or to any suitable device or component in communication with system control module 710.
The system control module 710 may include a memory controller module 730 to provide an interface to the system memory 715. Memory controller module 730 may be a hardware module, a software module, and/or a firmware module.
System memory 715 may be used, for example, to load and store data and/or instructions for system 700. For one embodiment, system memory 715 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 715 may include a double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, system control module 710 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 720 and communication interface(s) 725.
For example, NVM/storage 720 may be used to store data and/or instructions. NVM/storage 720 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 720 may include storage resources that are physically part of a device on which system 700 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 720 may be accessed over a network via communication interface(s) 725.
Communication interface(s) 725 may provide an interface for system 700 to communicate over one or more networks and/or with any other suitable devices. System 700 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 705 may be packaged together with logic for one or more controller(s) of system control module 710, such as memory controller module 730. For one embodiment, at least one of the processor(s) 705 may be packaged together with logic for one or more controller(s) of the system control module 710 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 705 may be integrated on the same die with logic for one or more controller(s) of the system control module 710. For one embodiment, at least one of the processor(s) 705 may be integrated on the same die with logic for one or more controller(s) of system control module 710 to form a system on a chip (SoC).
In various embodiments, system 700 may be, but is not limited to being: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 700 may have more or fewer components and/or different architectures. For example, in some embodiments, system 700 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
Further, the present application provides a system for tracking and positioning a handheld control device, wherein the system comprises the handheld control device according to any embodiment of the present application, together with any of the following:
- a processing device according to any embodiment of the present application, and a head display device comprising a camera device, the camera device photographing the handheld control device to generate a plurality of continuous images;
- a processing device, a head display device, and a camera device that photographs the handheld control device to generate a plurality of continuous images, wherein the camera device is fixed at a fixed position in space;
- a head display device according to any embodiment of the present application, and a camera device that photographs the handheld control device to generate a plurality of continuous images, wherein the camera device is fixed at a fixed position in space;
-a head display device according to any of the embodiments of the present application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the form in which the computer program instructions reside on a computer-readable medium includes, but is not limited to, source files, executable files, installation package files, and the like, and that the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executes the instruction, or the computer compiles the instruction and then executes the corresponding compiled program, or the computer reads and executes the instruction, or the computer reads and installs the instruction and then executes the corresponding installed program. Computer-readable media herein can be any available computer-readable storage media or communication media that can be accessed by a computer.
Communication media includes media by which communication signals, including, for example, computer readable instructions, data structures, program modules, or other data, are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optics, coaxial, etc.) and wireless (non-conductive transmission) media capable of propagating energy waves such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules, or other data may be embodied in a modulated data signal, for example, in a wireless medium such as a carrier wave or similar mechanism such as is embodied as part of spread spectrum techniques. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or hybrid modulation techniques.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); and magnetic and optical storage devices (hard disk, tape, CD, DVD); or other now known media or later developed that can store computer-readable information/data for use by a computer system.
An embodiment according to the present application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or a solution according to the aforementioned embodiments of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Claims (19)
1. A handheld control device, wherein the handheld control device comprises a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion, the spiral portion contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points.
2. The handheld control device of claim 1, wherein the illuminable feature points are disposed, via the flexible connection circuit, on one side or both sides of the spiral portion.
3. The handheld control device of claim 1 or 2, wherein the handheld control device further comprises one or more sensors.
4. The handheld control device of claim 1 or 2, further comprising a vibration feedback system and/or a gesture detection system thereon.
5. The handheld control device according to any one of claims 1 to 4, wherein the lighting state of the plurality of illuminable feature points is determined based on a current tracking state.
6. The handheld control device according to claim 5, wherein, if the tracking state is a pose determination state, the illuminable feature points perform a brightness change based on a predetermined brightness change rule.
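Claim 6 requires only that some predetermined brightness change rule exists in the pose determination state; it does not specify the rule. As a hedged illustration (the function name, the two-level pattern, and the per-LED phase offset are hypothetical choices, not taken from the patent), one such rule might step each feature point through a repeating brightness pattern so that consecutive captured frames can distinguish individual LEDs:

```python
def brightness_schedule(num_points, frame_idx, pattern=(1.0, 0.5)):
    """Return one brightness level per illuminable feature point for a frame.

    Each LED is offset in phase within a repeating brightness pattern, so
    successive frames observe different brightness combinations and can
    disambiguate the points. All names and the pattern are illustrative.
    """
    n = len(pattern)
    # offset each LED's phase so the sequence of frames tells points apart
    return [pattern[(frame_idx + i) % n] for i in range(num_points)]
```

Any scheme would serve, as long as the processing device and the handheld control device agree on the pattern in advance.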
7. A processing device, wherein the processing device is configured to track and position a handheld control device, the handheld control device comprising a handheld portion and a spiral portion, wherein the spiral portion is fixed to the handheld portion and contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points; and wherein the processing device comprises:
a second device for determining the light-emitting state of the illuminable feature points on the handheld control device according to the current tracking state of the handheld control device, and sending a corresponding control instruction to the handheld control device;
a third device for acquiring a plurality of consecutive images of the handheld control device captured by a camera device, wherein the handheld control device causes the illuminable feature points to execute the corresponding light-emitting state according to the control instruction, and the plurality of consecutive images are captured by the camera device based on that light-emitting state;
and a fourth device for determining the position and pose of the handheld control device according to the plurality of consecutive images.
8. The processing device of claim 7, wherein the processing device further comprises:
a fifth device for determining the current tracking state of the handheld control device.
9. The processing device according to claim 7 or 8, wherein, if the tracking state is a pose determination state, the second device is configured to:
determine, according to the pose determination state, that the light-emitting state of the illuminable feature points on the handheld control device is to perform a brightness change based on a predetermined brightness change rule, and send a corresponding control instruction to the handheld control device.
10. The processing device of any of claims 7 to 9, wherein the processing device further comprises:
a sixth device for acquiring additional information sent by the handheld control device, wherein the additional information comprises one or more types of sensor information acquired by one or more sensors of the handheld control device;
wherein the fourth device is configured to:
determine the position and pose of the handheld control device according to the plurality of consecutive images and the one or more types of sensor information.
11. A method, performed at a processing device, for tracking and positioning a handheld control device, wherein the handheld control device comprises a handheld portion and a spiral portion, the spiral portion is fixed to the handheld portion and contains a replaceable flexible connection circuit, and the flexible connection circuit can be attached and fixed to the spiral portion and contains a plurality of illuminable feature points, and wherein the method comprises:
determining the light-emitting state of the illuminable feature points on the handheld control device according to the current tracking state of the handheld control device, and sending a corresponding control instruction to the handheld control device;
acquiring a plurality of consecutive images of the handheld control device captured by a camera device, wherein the handheld control device causes the illuminable feature points to execute the corresponding light-emitting state according to the control instruction, and the plurality of consecutive images are captured by the camera device based on that light-emitting state;
and determining the position and pose of the handheld control device according to the plurality of consecutive images.
12. The method of claim 11, wherein the method further comprises:
determining a current tracking state of the handheld control device.
13. The method according to claim 11 or 12, wherein, if the tracking state is a pose determination state, the step of determining the light-emitting state of the illuminable feature points on the handheld control device comprises:
determining, according to the pose determination state, that the light-emitting state of the illuminable feature points on the handheld control device is to perform a brightness change based on a predetermined brightness change rule, and sending a corresponding control instruction to the handheld control device.
14. The method of any of claims 11 to 13, wherein the method further comprises:
acquiring additional information sent by the handheld control device, wherein the additional information comprises one or more types of sensor information acquired by one or more sensors of the handheld control device;
wherein the step of determining the position and pose of the handheld control device comprises:
determining the position and pose of the handheld control device according to the plurality of consecutive images and the one or more types of sensor information.
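Claim 14 determines position and pose from both the consecutive images and the sensor information, but the claims leave the fusion method open. A minimal sketch of one plausible approach, a complementary filter that dead-reckons from inertial data and corrects with the drift-free optical fix recovered from the LED feature points (the function name, frame fields, and blend weight are illustrative assumptions, not the patent's method):

```python
def track(frames, dt=1.0 / 60.0, alpha=0.98):
    """Fuse optical position fixes with inertial motion over consecutive frames.

    Each frame is a dict with (hypothetical field names):
      'cam_pos' - 3-vector position recovered from the illuminable feature
                  points in the captured image, or None if the controller
                  was not visible in that frame
      'imu_vel' - 3-vector velocity obtained from the controller's sensors

    Returns one fused position estimate per frame.
    """
    positions = []
    est = [0.0, 0.0, 0.0]
    for f in frames:
        # dead-reckon: advance the estimate by the sensor-derived velocity
        est = [e + v * dt for e, v in zip(est, f['imu_vel'])]
        cam = f.get('cam_pos')
        if cam is not None:
            # pull the estimate toward the drift-free optical measurement
            est = [alpha * c + (1.0 - alpha) * e for c, e in zip(cam, est)]
        positions.append(list(est))
    return positions
```

The inertial term keeps tracking alive between camera frames or during brief occlusion, while the optical term bounds the accumulated drift.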
15. A head display device, wherein the head display device comprises the processing device according to any one of claims 7 to 10.
16. The head display device according to claim 15, wherein the head display device further comprises a camera device for photographing the handheld control device to generate a plurality of consecutive images.
17. A system for tracking and positioning a handheld control device, wherein the system comprises the handheld control device according to any one of claims 1 to 6, and any one of the following:
the processing device according to any one of claims 7 to 10, and a head display device comprising a camera device, wherein the camera device photographs the handheld control device to generate a plurality of consecutive images;
the processing device according to any one of claims 7 to 10, a head display device, and a camera device for photographing the handheld control device to generate a plurality of consecutive images, wherein the camera device is fixed at a fixed position in space;
the head display device according to claim 15, and a camera device for photographing the handheld control device to generate a plurality of consecutive images, wherein the camera device is fixed at a fixed position in space;
or the head display device according to claim 16.
18. A computer device, the computer device comprising:
one or more processors;
a memory for storing one or more computer programs;
wherein the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 11 to 14.
19. A computer-readable storage medium, on which a computer program is stored, which computer program can be executed by a processor to perform the method of any one of claims 11 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910989132.4A CN110837295A (en) | 2019-10-17 | 2019-10-17 | Handheld control equipment and tracking and positioning method, equipment and system thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110837295A true CN110837295A (en) | 2020-02-25 |
Family
ID=69575505
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910989132.4A Pending CN110837295A (en) | 2019-10-17 | 2019-10-17 | Handheld control equipment and tracking and positioning method, equipment and system thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110837295A (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107548470A (en) * | 2015-04-15 | 2018-01-05 | 索尼互动娱乐股份有限公司 | Nip and holding gesture navigation on head mounted display |
US20160357261A1 (en) * | 2015-06-03 | 2016-12-08 | Oculus Vr, Llc | Virtual Reality System with Head-Mounted Display, Camera and Hand-Held Controllers |
CN106326930A (en) * | 2016-08-24 | 2017-01-11 | 王忠民 | Method for determining position of tracked object in virtual reality and device and system thereof |
CN107168515A (en) * | 2017-03-31 | 2017-09-15 | 北京奇艺世纪科技有限公司 | The localization method and device of handle in a kind of VR all-in-ones |
CN110573993A (en) * | 2017-04-26 | 2019-12-13 | 脸谱科技有限责任公司 | Handheld controller using LED tracking ring |
US20180329484A1 (en) * | 2017-05-09 | 2018-11-15 | Microsoft Technology Licensing, Llc | Object and environment tracking via shared sensor |
CN109069920A (en) * | 2017-08-16 | 2018-12-21 | 广东虚拟现实科技有限公司 | Hand-held controller, method for tracking and positioning and system |
WO2019033322A1 (en) * | 2017-08-16 | 2019-02-21 | 广东虚拟现实科技有限公司 | Handheld controller, and tracking and positioning method and system |
CN108227920A (en) * | 2017-12-26 | 2018-06-29 | 中国人民解放军陆军航空兵学院 | Move enclosure space method for tracing and tracing system |
CN110572635A (en) * | 2019-08-28 | 2019-12-13 | 重庆爱奇艺智能科技有限公司 | Method, equipment and system for tracking and positioning handheld control equipment |
Non-Patent Citations (1)
Title |
---|
CAI Wenming et al., "Premiere/VR Landscape Video Editing and Design", Huazhong University of Science and Technology Press, pages: 188 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111459279A (en) * | 2020-04-02 | 2020-07-28 | 重庆爱奇艺智能科技有限公司 | Active light filling equipment, 3DOF handle, VR equipment and tracking system |
CN112451962A (en) * | 2020-11-09 | 2021-03-09 | 青岛小鸟看看科技有限公司 | Handle control tracker |
CN112451962B (en) * | 2020-11-09 | 2022-11-29 | 青岛小鸟看看科技有限公司 | Handle control tracker |
US11712619B2 (en) | 2020-11-09 | 2023-08-01 | Qingdao Pico Technology Co., Ltd. | Handle controller |
WO2022227374A1 (en) * | 2021-04-27 | 2022-11-03 | 青岛小鸟看看科技有限公司 | Control method and apparatus for handle control tracker, and head-mounted display device |
US11896894B2 (en) | 2021-04-27 | 2024-02-13 | Qingdao Pico Technology Co., Ltd. | Control method and apparatus for handle control, and head mounted display |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7535089B2 (en) | Depth Sensing Techniques for Virtual, Augmented, and Mixed Reality Systems | |
US11625841B2 (en) | Localization and tracking method and platform, head-mounted display system, and computer-readable storage medium | |
CN102763422B (en) | Projectors and depth cameras for deviceless augmented reality and interaction | |
CN102681958B (en) | Use physical gesture transmission data | |
JP6469706B2 (en) | Modeling structures using depth sensors | |
CN110572635A (en) | Method, equipment and system for tracking and positioning handheld control equipment | |
EP3028120A1 (en) | Ergonomic physical interaction zone cursor mapping | |
CN112971740B (en) | Method and equipment for diagnosing pulse through pulse diagnosing equipment | |
CN109584375B (en) | Object information display method and mobile terminal | |
CN110837295A (en) | Handheld control equipment and tracking and positioning method, equipment and system thereof | |
JP2021523347A (en) | Reduced output behavior of time-of-flight cameras | |
US12067157B2 (en) | Drift cancelation for portable object detection and tracking | |
CN112424832A (en) | System and method for detecting 3D association of objects | |
JP7043601B2 (en) | Methods and devices for generating environmental models and storage media | |
CN108646917B (en) | Intelligent device control method and device, electronic device and medium | |
CN112449691A (en) | Refining virtual mesh models by physical contact | |
CN104808792B (en) | A kind of information processing method and electronic equipment | |
CN105204613A (en) | Information processing method and wearable equipment | |
US20160091966A1 (en) | Stereoscopic tracking status indicating method and display apparatus | |
US20150042621A1 (en) | Method and apparatus for controlling 3d object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20200225 |