
CN116009687A - Virtual display device and virtual display method - Google Patents

Virtual display device and virtual display method Download PDF

Info

Publication number
CN116009687A
CN116009687A CN202211525455.6A CN202211525455A
Authority
CN
China
Prior art keywords
axis
experimenter
ultrasonic
freedom
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211525455.6A
Other languages
Chinese (zh)
Inventor
杜伟华
陈丽莉
姚朝权
张�浩
韩鹏
何惠东
石娟娟
秦瑞峰
姜倩文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202211525455.6A priority Critical patent/CN116009687A/en
Publication of CN116009687A publication Critical patent/CN116009687A/en
Priority to PCT/CN2023/121571 priority patent/WO2024114071A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06Systems determining the position data of a target
    • G01S15/08Systems for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Acoustics & Sound (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A virtual display device, comprising: a motion detection unit configured to detect pose coordinates of a body part of an experimenter relative to a body reference point in a virtual space; a degree-of-freedom detection unit configured to detect translation amounts of the translational degrees of freedom of the experimenter along the X-axis, Y-axis, and Z-axis of a spatial coordinate system relative to an external reference point in the virtual space, and further configured to detect rotation angles of the rotational degrees of freedom of the body reference point of the experimenter about the X-axis, Y-axis, and Z-axis relative to the external reference point; and a main control unit configured to calculate, from the pose coordinates, translation amounts, and rotation angles, the translation amounts of the translational degrees of freedom along the X-axis, Y-axis, and Z-axis and the rotation angles of the rotational degrees of freedom about the X-axis, Y-axis, and Z-axis of the body part of the experimenter relative to the external reference point. Six-degree-of-freedom tracking of the experimenter's whole body is thereby realized.

Description

Virtual display device and virtual display method
Technical Field
The invention belongs to the field of display, and particularly relates to virtual display equipment and a virtual display method.
Background
Currently, VR (Virtual Reality) glasses or VR headsets with 3DOF (three degrees of freedom) can detect free rotation of the head about the X-axis, Y-axis, and Z-axis of a spatial coordinate system (the rotational degrees of freedom), but cannot detect the spatial displacement of the head and other body parts along those axes (the translational degrees of freedom).
Disclosure of Invention
To address the problem that existing VR glasses or VR headsets cannot detect the spatial displacement of the head and other body parts along the translational degrees of freedom of the X-axis, Y-axis, and Z-axis of a spatial coordinate system, an embodiment of the present disclosure provides a virtual display device, comprising: a main control unit, a motion detection unit, and a degree-of-freedom detection unit, the motion detection unit and the degree-of-freedom detection unit each being connected to the main control unit;
the motion detection unit is configured to detect pose coordinates of a body part of an experimenter relative to a body reference point in a virtual space;
the degree-of-freedom detection unit is configured to detect translation amounts of the translational degrees of freedom of the experimenter along the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to an external reference point in the virtual space, and is further configured to detect rotation angles of the rotational degrees of freedom of the body reference point of the experimenter about the X-axis, Y-axis, and Z-axis relative to the external reference point;
the main control unit is configured to calculate, from the pose coordinates, the translation amounts, and the rotation angles, the translation amounts of the translational degrees of freedom along the X-axis, Y-axis, and Z-axis and the rotation angles of the rotational degrees of freedom about the X-axis, Y-axis, and Z-axis of the body part of the experimenter relative to the external reference point.
In some embodiments, the motion detection unit includes an electromyographic signal electrode, a skin tension strain gauge, and a master control module, where the electromyographic signal electrode and the skin tension strain gauge are respectively connected to the master control module;
the electromyographic signal electrode and the skin tension strain gauge are attached to the surrounding skin around each joint part of the experimenter;
the electromyographic signal electrode is configured to detect an electromyographic current, the electromyographic current being indicative of the degree of contraction of the muscle;
the skin tension strain gauge is configured to detect skin stress changes during muscle contraction and relaxation and convert the stress changes into current signals for output;
the main control module is configured to calculate, from the current signals generated by the myoelectric current and the skin stress changes, the translation amounts of the translational degrees of freedom of each joint part of the experimenter along the X-axis, Y-axis, and Z-axis of the spatial coordinate system and the rotation angles of the rotational degrees of freedom about the X-axis, Y-axis, and Z-axis, relative to the body reference point in the virtual space.
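As a rough illustration of how the main control module might combine the two sensor signals, the sketch below maps a myoelectric current reading and a strain-gauge current reading to a joint rotation angle. The linear model, the gain values, and the equal-weight fusion are illustrative assumptions, not values given in this disclosure:

```python
def estimate_joint_angle(emg_current_ma, strain_current_ma,
                         emg_gain_deg_per_ma=30.0,
                         strain_gain_deg_per_ma=15.0):
    # Two independent linear estimates of the joint rotation angle, one
    # from the myoelectric current and one from the strain-gauge current,
    # fused by simple averaging. The gains are hypothetical calibration
    # constants chosen for illustration only.
    angle_from_emg = emg_gain_deg_per_ma * emg_current_ma
    angle_from_strain = strain_gain_deg_per_ma * strain_current_ma
    return 0.5 * (angle_from_emg + angle_from_strain)
```

In practice the mapping from current to angle would come from per-user calibration rather than fixed gains.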
In some embodiments, the degree-of-freedom detection unit includes a translational degree-of-freedom detection unit and a rotational degree-of-freedom detection unit.
The rotational degree-of-freedom detection unit is configured to detect the rotation angles of the rotational degrees of freedom of the experimenter about the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point in the virtual space.
The translational degree-of-freedom detection unit comprises a plurality of groups of ultrasonic transceiver modules distributed around the waist of the experimenter, with an acute angle formed between any two adjacent groups. Each group comprises at least three ultrasonic transceiver modules, arranged in sequence from the top to the bottom of the waist, with an acute angle formed between any two adjacent modules.
Each ultrasonic transceiver module includes an ultrasonic transmitter configured to transmit ultrasonic waves of a set frequency and an ultrasonic receiver configured to receive ultrasonic waves.
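Transmitting at a set frequency and timing the reception allows ranging by time of flight. A minimal sketch, assuming echo-style round-trip propagation in air at room temperature (the propagation model and the speed-of-sound constant are assumptions, not stated in this disclosure):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C (assumed)

def distance_to_obstacle(round_trip_time_s):
    # Echo ranging: the ultrasonic wave travels to the reflecting surface
    # and back, so the one-way distance is half the propagation path.
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```

For a 10 ms round trip this gives a distance of about 1.7 m; a direct transmitter-to-receiver link would omit the factor of two.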
In some embodiments, the ultrasonic transceiver module further comprises a main control part, a wavelet function generator, a multiplier and an integrator,
The ultrasonic receiver, the multiplier and the integrator are respectively connected with the main control part;
the wavelet function generator, the multiplier and the integrator are connected in sequence;
the wavelet function generator is configured to generate a wavelet function
$$\psi_{\alpha,\tau}(t) = \frac{1}{\sqrt{\alpha}}\,\psi\!\left(\frac{t-\tau}{\alpha}\right)$$
where α and τ are the two variables of the wavelet function: α represents the scale, τ represents the shift amount, and t represents time;
the multiplier is configured to multiply the waveform function f(t) of the ultrasonic wave received by the ultrasonic receiver by the wavelet function ψ_{α,τ}(t), where the independent variable of the waveform function f(t) is time and the dependent variable is the waveform amplitude;
the integrator is configured to integrate the product of the waveform function f(t) and the wavelet function,
$$W(\alpha,\tau) = \int f(t)\,\psi_{\alpha,\tau}(t)\,dt,$$
obtaining the transmitting frequency of the ultrasonic transmitter whose wave the ultrasonic receiver received, and the time taken from the transmission of the ultrasonic wave by the ultrasonic transmitter to the reception of the ultrasonic wave of the corresponding frequency by the ultrasonic receiver; the scale α corresponds to the transmitting frequency, and the shift τ corresponds to the propagation time;
The main control part is configured to receive the transmitting frequency of the ultrasonic transmitter whose wave the ultrasonic receiver received and the time taken from transmission to reception of the ultrasonic wave of the corresponding frequency, to calculate from them the translation amounts of the translational degrees of freedom of the ultrasonic receiver along the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point, and then to calculate the translation amounts of the translational degrees of freedom of the body reference point of the experimenter in the virtual space relative to the external reference point from the translation amounts of the ultrasonic receiver and the coordinates of the ultrasonic receiver relative to the body reference point of the experimenter.
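The multiplier-plus-integrator pipeline amounts to evaluating the inner product of the received waveform with the scaled, shifted wavelet over a grid of (scale, shift) pairs and locating the peak. A numerical sketch under assumed choices (a real Morlet-style mother wavelet, a discrete scale/shift grid, rectangle-rule integration), none of which are specified in this disclosure:

```python
import numpy as np

def wavelet(t, alpha, tau):
    # Scaled and shifted real Morlet-style mother wavelet. The disclosure
    # does not name a specific mother wavelet; Morlet is a common choice
    # when the goal is to identify a frequency component (an assumption).
    u = (t - tau) / alpha
    return np.cos(5.0 * u) * np.exp(-u ** 2 / 2.0) / np.sqrt(alpha)

def decompose(f, t, alphas, taus):
    # Multiplier + integrator: W(alpha, tau) = integral of f(t)*psi(t),
    # approximated by a rectangle rule on a uniform time grid.
    dt = t[1] - t[0]
    W = np.zeros((len(alphas), len(taus)))
    for i, a in enumerate(alphas):
        for j, s in enumerate(taus):
            W[i, j] = np.sum(f * wavelet(t, a, s)) * dt
    # The location of the peak identifies the received frequency (via the
    # scale alpha) and the arrival time (via the shift tau).
    i, j = np.unravel_index(np.argmax(np.abs(W)), W.shape)
    return alphas[i], taus[j]
```

Because the L2 norm of the scaled wavelet is scale-invariant, the inner product is maximized exactly when the candidate scale and shift match the received burst, which is what lets one grid search recover both the frequency and the propagation time.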
In some embodiments, the external reference point includes a location point where the external obstacle is located.
In some embodiments, the main control unit is further configured to calculate and obtain the distance between the body part of the experimenter and each point of the external obstacle surface according to the translation amount of the translational degrees of freedom of the body part of the experimenter along the X-axis, the Y-axis and the Z-axis of the spatial coordinate system relative to each point of the external obstacle surface;
The virtual display device further comprises a virtual construction unit connected with the main control unit and configured to envelope the virtual outer surface of the external obstacle according to the distance between the body part of the experimenter and each point on the surface of the external obstacle.
In some embodiments, the system further comprises a display unit connected with the main control unit and the virtual construction unit, configured to display the virtual space, and further configured to display a virtual outer surface of the external obstacle into the virtual space.
In some embodiments, the system further comprises a safety alarm unit connected with the main control unit and configured to send out an alarm signal to inform the experimenter when the distance between the body part of the experimenter and each point on the surface of the external obstacle is less than or equal to a preset safety distance.
In some embodiments, the display unit may be worn on the head of the experimenter;
the master control unit may be worn on the torso of the experimenter.
In some embodiments, the body reference point is a central location of the master control unit.
The embodiment of the disclosure also provides a virtual display method, which comprises the following steps:
detecting pose coordinates of a body part of an experimenter relative to a body reference point in a virtual space;
detecting translation amounts of the translational degrees of freedom of the body reference point of the experimenter in the virtual space along the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point;
detecting rotation angles of the rotational degrees of freedom of the body reference point of the experimenter in the virtual space about the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point;
and according to the pose coordinates, the translation amount and the rotation angle of the experimenter, calculating and obtaining the translation amount of the translation degree of freedom of the body part of the experimenter along the X axis, the Y axis and the Z axis of the space coordinate system and the rotation angle of the rotation degree of freedom around the X axis, the Y axis and the Z axis of the space coordinate system relative to the external reference point.
In some embodiments, detecting the translation amounts of the translational degrees of freedom of the body reference point of the experimenter in the virtual space along the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point comprises:
the ultrasonic transmitter transmits ultrasonic waves with set frequency;
the ultrasonic receiver receives a superimposed wave in which the ultrasonic wave of the set frequency is mixed with ultrasonic waves of other frequencies;
multiplying the waveform function of the superimposed wave with a wavelet function having two variables of scale and translation;
performing an integral operation on the product of the waveform function of the superimposed wave and the wavelet function to complete the wavelet decomposition, obtaining the transmitting frequency of the ultrasonic transmitter whose wave the ultrasonic receiver received and the time taken from transmission to reception of the ultrasonic wave of the corresponding frequency; the scale corresponds to the transmitting frequency, and the shift amount corresponds to the propagation time;
According to the emission frequency of the ultrasonic emitter correspondingly received by the ultrasonic receiver and the time taken for the ultrasonic emitter to emit ultrasonic waves to the ultrasonic receiver to receive ultrasonic waves with corresponding frequency, calculating and obtaining the translational quantity of the translational degrees of freedom of the ultrasonic receiver relative to an external reference point along the X axis, the Y axis and the Z axis of a space coordinate system;
and calculating the translation amounts of the translational degrees of freedom of the body reference point of the experimenter in the virtual space along the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point from the translation amounts of the ultrasonic receiver along those axes relative to the external reference point and the coordinates of the ultrasonic receiver relative to the body reference point of the experimenter in the virtual space.
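Once per-receiver distances to the external reference point are known, the translation along the three axes can be recovered by multilateration. A sketch assuming four range measurements to anchors at known coordinates (the anchor layout and the least-squares linearization are illustrative, not taken from this disclosure):

```python
import numpy as np

def locate(anchors, ranges):
    # Solve |p - anchors[i]| = ranges[i] for p by subtracting the first
    # sphere equation from the others, which linearizes the system:
    #   2 (a_i - a_0) . p = r_0^2 - r_i^2 + |a_i|^2 - |a_0|^2
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0 ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2))
    return np.linalg.lstsq(A, b, rcond=None)[0]
```

With four non-coplanar anchors the linear system has a unique solution; extra anchors simply over-determine it, and the least-squares solve averages out range noise.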
In some embodiments, the body part of the experimenter includes a foot;
the pose coordinates of the foot of the experimenter in the virtual space relative to the body reference point are calculated by the formula
$$
{}^{o_1}T_{o_2} =
\begin{bmatrix} 1 & 0 & l_2 \\ 0 & 1 & -l_1 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} c_2 & -s_2 & 0 \\ s_2 & c_2 & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & -l_3 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} c_3 & -s_3 & 0 \\ s_3 & c_3 & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & -l_4 \\ 0 & 0 & 1 \end{bmatrix},
$$
wherein o_1 represents the body reference point and o_2 represents the foot position; θ_2 is the rotation angle of the hip joint and θ_3 is the rotation angle of the knee joint; the rotation matrix in θ_2 is the pose coordinate matrix relative to the body reference point; c_2 is cos θ_2 and s_2 is sin θ_2 (likewise c_3 = cos θ_3, s_3 = sin θ_3);
l_1 is the vertical distance from the body reference point to the hip joint;
l_2 is the horizontal distance from the body reference point to the hip joint;
l_3 is the distance from the hip joint to the knee joint;
l_4 is the distance from the knee joint to the ankle joint.
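The foot-pose calculation is planar two-link forward kinematics, which can be sketched as a chain of homogeneous transforms. The frame directions and sign conventions below are assumptions; the disclosure defines only the link lengths l1 to l4 and the joint angles θ2 and θ3:

```python
import numpy as np

def foot_pose(theta2, theta3, l1, l2, l3, l4):
    # Chain: body reference point -> hip (offset l2 horizontally, l1 down)
    # -> knee (link l3 rotated by the hip angle theta2) -> ankle (link l4
    # rotated further by the knee angle theta3), all in a sagittal plane.
    def trans(dx, dy):
        return np.array([[1.0, 0.0, dx], [0.0, 1.0, dy], [0.0, 0.0, 1.0]])

    def rot(th):
        c, s = np.cos(th), np.sin(th)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    T = (trans(l2, -l1) @ rot(theta2) @ trans(0.0, -l3)
         @ rot(theta3) @ trans(0.0, -l4))
    return T[0, 2], T[1, 2]  # (x, y) of the foot in the o1 frame
```

With both joint angles at zero the leg hangs straight, so the foot sits at (l2, -(l1 + l3 + l4)) relative to the body reference point, which is a quick sanity check on the conventions chosen here.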
In some embodiments, the external reference point includes a location point where the external obstacle is located;
the virtual display method further comprises the following steps:
according to the translational amount of the translational degrees of freedom of the body part of the experimenter along the X axis, the Y axis and the Z axis of the space coordinate system relative to the points on the surface of the external obstacle, calculating and obtaining the distance between the body part of the experimenter and the points on the surface of the external obstacle;
enveloping the virtual outer surface of the external obstacle according to the distance between the body part of the experimenter and each point on the surface of the external obstacle;
displaying the virtual outer surface of the external obstacle into the virtual space.
In some embodiments, the method further comprises: sending out an alarm signal to notify the experimenter when the distance between the body part of the experimenter and any point on the surface of the external obstacle is less than or equal to a preset safety distance.
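The alarm condition reduces to a threshold test over the measured distances. A minimal sketch; the 0.5 m default safety distance is an illustrative value, not one specified in this disclosure:

```python
def check_safety(distances_m, safe_distance_m=0.5):
    # Raise an alarm (True) when any sampled surface point of the
    # obstacle is at or within the preset safety distance.
    return any(d <= safe_distance_m for d in distances_m)
```

The caller would feed in the per-point distances computed from the translational degrees of freedom and trigger the safety alarm unit whenever this returns True.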
In some embodiments, the method further comprises: during the first experience, recording initial body data of the experimenter by photographing, where the initial body data includes initial pose coordinates of the body part of the experimenter relative to the body reference point in the virtual space and the physical dimensions of the body part.
The invention has the following beneficial effects. With the main control unit, the motion detection unit, and the degree-of-freedom detection unit, the virtual display device provided by the invention can track the relative pose coordinates of the body part of the experimenter with respect to an external reference object in three translational degrees of freedom and three rotational degrees of freedom, and can therefore track physical state changes within a 360-degree range around the experimenter. It realizes all-round six-degree-of-freedom tracking of the experimenter's whole body, captures changes of dynamic objects around the experimenter in real time, detects obstacle conditions in the 360-degree space around the experimenter in real time, and thus provides an obstacle-avoidance function during the experience. By enabling whole-body six-degree-of-freedom tracking, the virtual display device provides a technical basis for virtual scenes that require whole-body participation, such as football games and fighting games, and greatly promotes the popularization of virtual reality display technology.
The virtual display method provided by the invention likewise tracks the relative pose coordinates of the body part of the experimenter with respect to the external reference object in three translational and three rotational degrees of freedom, and therefore offers the same benefits: all-round six-degree-of-freedom tracking of the whole body, real-time capture of dynamic objects and obstacle conditions in the surrounding 360-degree space, obstacle avoidance during the experience, and a technical basis for virtual scenes requiring whole-body participation.
Drawings
Fig. 1 is a schematic block diagram of a virtual display device in an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of wearing effects of a virtual display device on an experimenter in an embodiment of the disclosure.
Fig. 3 is a schematic block diagram of an action detection unit in an embodiment of the present disclosure.
Fig. 4 is a functional block diagram of each unit in the virtual display device according to the embodiment of the present disclosure.
Fig. 5 is a functional block diagram of a translational degree of freedom detection unit in an embodiment of the present disclosure.
Fig. 6 is a flowchart of a virtual display method in an embodiment of the disclosure.
Fig. 7 is a schematic diagram of pose coordinate calculation of an experimenter foot in virtual space relative to a body reference point in an embodiment of the disclosure.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present invention, a virtual display device and a virtual display method according to the present invention are described in further detail below with reference to the accompanying drawings and detailed description.
In the disclosed technology, there are also 6DOF (six degrees of freedom) VR glasses or VR headsets that can detect not only free rotation of the head about the rotational degrees of freedom of the X-axis, Y-axis, and Z-axis of a spatial coordinate system, but also translation of the head along those axes. Such devices mainly collect image information of the surrounding environment through dual or multiple cameras on the headset: by photographing the same object at different angles from cameras at different positions, the depth of the object is calculated according to the parallax principle, and the distance between the object and the experimenter is tracked. This whole calculation process requires the support of a processor with strong computing power; the high computing power increases the power consumption of the VR glasses or VR headset, so the heat dissipation area must be increased, and components such as the cameras make the headset part bulkier.
On the other hand, current 6DOF VR glasses or VR headsets can track only the hands and the head, so an experimenter can participate only with hands and head during the VR display experience, which is poor compared with full-body participation in a real environment.
In addition, current 6DOF VR glasses or VR headsets cannot avoid collisions with other real objects during the VR display experience, so when an object suddenly enters the area where the experimenter is located, a collision between the experimenter and that object cannot be avoided. Although current 6DOF VR glasses or VR headsets can calibrate a range for the experience environment at first use, the experimenter's range of motion must not exceed this range during the experience, and a reminder appears in the virtual display picture when the experimenter approaches it; but if, for example, a pedestrian suddenly passes behind the experimenter, the VR glasses or VR headset cannot perceive it.
In order to solve the problem that VR glasses or VR headsets in the disclosed technology track too few parts of the human body, an embodiment of the present invention provides a virtual display device. Referring to fig. 1, a schematic block diagram of the virtual display device in the embodiment of the present disclosure, and fig. 2, a schematic diagram of the wearing effect of the virtual display device on an experimenter: the virtual display device includes a main control unit 1, a motion detection unit 2, and a degree-of-freedom detection unit, where the degree-of-freedom detection unit includes a translational degree-of-freedom detection unit 3 and a rotational degree-of-freedom detection unit 4, each connected to the main control unit 1. The motion detection unit 2 is configured to detect pose coordinates of a body part of the experimenter relative to a body reference point in the virtual space. The translational degree-of-freedom detection unit 3 is configured to detect the translation amounts of the translational degrees of freedom of the experimenter along the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point in the virtual space. The rotational degree-of-freedom detection unit 4 is configured to detect the rotation angles of the rotational degrees of freedom of the experimenter about the X-axis, Y-axis, and Z-axis of the spatial coordinate system relative to the external reference point in the virtual space. The main control unit 1 is configured to calculate, from the pose coordinates, the translation amounts, and the rotation angles, the translation amounts of the translational degrees of freedom along the X-axis, Y-axis, and Z-axis and the rotation angles of the rotational degrees of freedom about the X-axis, Y-axis, and Z-axis of the body part of the experimenter relative to the external reference point.
According to the virtual display device, through the arrangement of the main control unit 1, the action detection unit 2, the translational degree of freedom detection unit 3 and the rotational degree of freedom detection unit 4, the relative pose coordinates of the body part of the experimenter relative to the external reference object in three translational degrees of freedom and three rotational degrees of freedom can be tracked, so that the physical state change in the 360-degree range around the experimenter can be tracked; the full-scale tracking of the whole body of the experimenter in six degrees of freedom is realized, meanwhile, the change of dynamic objects around the experimenter can be captured in real time, the obstacle condition of 360-degree space around the experimenter can be detected in real time, and the obstacle avoidance function of the experimenter in the experience process is realized; the virtual display device capable of realizing the whole-body six-degree-of-freedom tracking of the experimenter provides a technical basis for some virtual scenes needing whole-body participation, such as football games, fight games and the like, and the virtual display device realizes the whole-body six-degree-of-freedom tracking and has great promotion effect on the popularization of virtual reality display technology.
In some embodiments, referring to fig. 3, which is a schematic block diagram of the action detection unit in an embodiment of the disclosure, and fig. 4, which is a functional block diagram of the units in the virtual display device according to an embodiment of the present disclosure, the action detection unit 2 includes an electromyographic signal electrode 21, a skin tension strain gauge 22 and a main control module 23 (MCU), wherein the electromyographic signal electrode 21 and the skin tension strain gauge 22 are respectively connected with the main control module 23. The electromyographic signal electrode 21 and the skin tension strain gauge 22 are attached to the skin around each joint of the experimenter. The electromyographic signal electrode 21 is configured to detect the myoelectric current, which characterizes the degree of contraction of the muscle. The skin tension strain gauge 22 is configured to detect skin stress changes during muscle contraction and relaxation and to convert the stress changes into a current signal output. The main control module 23 is configured to calculate, according to the current signals generated by the myoelectric current and the skin stress changes, the translation amounts of the translational degrees of freedom of each joint of the experimenter along the X-axis, Y-axis and Z-axis of the spatial coordinate system and the rotation angles of the rotational degrees of freedom about the X-axis, Y-axis and Z-axis, relative to the body reference point in the virtual space.
In some embodiments, taking the elbow joint as an example, motion detection units 2 are disposed on the inner and outer sides of the upper arm, respectively. Each motion detection unit 2 consists of one electromyographic signal electrode 21 and one skin tension strain gauge 22. When the muscle contracts, the electromyographic signal electrode 21 detects a myoelectric current; the greater the degree of muscle contraction, the greater the myoelectric current produced. By detecting the magnitude of the myoelectric current, the degree of muscle contraction can be calculated, and from it the rotation angle of the elbow joint. The skin tension strain gauge 22 detects skin stress changes during muscle contraction and relaxation: when the muscle contracts, the skin is subjected to a squeezing force and the strain gauge registers pressure; when the muscle relaxes, the skin is subjected to surface tension and the strain gauge registers tension. By matching the surface force detected by the skin tension strain gauge 22 with the myoelectric current signal, the elbow joint rotation angle is calculated more accurately. Meanwhile, the main control module 23 can establish a spatial coordinate system centered on the experimenter's body reference point and, from the position coordinates of the elbow joint relative to the body reference point in that coordinate system and the initial position coordinates of the elbow joint relative to the body reference point, calculate the translation amounts of the elbow joint along the X-axis, Y-axis and Z-axis relative to the body reference point, so that the pose of the elbow joint is obtained through the motion detection unit 2.
The initial position coordinates of the elbow joint relative to the body reference point can be obtained by photographing the experimenter during the first experience and recording initial body data, including the initial pose coordinates of the experimenter's elbow joint relative to the body reference point in the virtual space and the physical dimensions of the body part. In later experiences, the pose of the experimenter's body parts can be detected in real time by the action detection unit 2, using the initial pose coordinates as the reference. Similarly, the body posture of the experimenter can be determined from the rotation angles of the other joints of the human body and their translation amounts relative to the body reference point, completing the tracking of all parts of the whole body.
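As a rough illustration of the joint-angle estimation described above, the following sketch maps normalized myoelectric and strain-gauge readings to an elbow rotation angle. It is only a minimal sketch under an assumed linear calibration: the function name, the channel weights and the 0-140 degree range are hypothetical, not taken from the disclosure; a real device would fit these per user from the photographed initial body data.

```python
# Hypothetical sketch: fusing an EMG amplitude and a strain-gauge signal into an
# elbow rotation angle via a per-user linear calibration (all constants assumed).
def estimate_joint_angle(emg_amplitude, strain_signal,
                         emg_gain=0.8, strain_gain=0.2,
                         angle_at_rest=0.0, angle_at_max=140.0):
    """Map normalized sensor readings (0..1) to a joint angle in degrees.

    emg_amplitude  -- rectified, normalized myoelectric current
                      (0 = relaxed, 1 = maximum contraction)
    strain_signal  -- normalized skin-stress signal
                      (0 = relaxed, 1 = maximum compression)
    The two channels are blended with fixed weights; a real device would fit
    these gains per user during the initial photographed calibration.
    """
    blended = emg_gain * emg_amplitude + strain_gain * strain_signal
    blended = min(max(blended, 0.0), 1.0)  # clamp to the calibrated range
    return angle_at_rest + blended * (angle_at_max - angle_at_rest)
```

A fully relaxed muscle maps to the rest angle and full contraction to the maximum flexion angle; intermediate readings interpolate linearly.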
In some embodiments, referring to fig. 4, when the experimenter moves, the electromyographic signal electrode 21 and the skin tension strain gauge 22 generate weak currents. The weak current signal is amplified by the band-pass amplifier 24, the analog current signal is converted into a digital signal by the analog-to-digital (A/D) sub-module 230 in the main control module 23, and the specific joint rotation angle is then calculated by the calculation sub-module (not shown in the figure) in the main control module 23. The action detection unit 2 of each body part reports the translation amounts and rotation angles of each joint to the main control unit 1 (AP), yielding the human body posture of the experimenter.
In some embodiments, the motion detection units 2 are disposed on the outer surfaces of the main skeletal muscles of the human body. The motion detection units 2 can detect the myoelectric signals generated by the skeletal muscles during contraction and the skin surface tension during movement, and from these data infer the rotation angle of each joint of the human body, and thus the movement posture of the human body.
In some embodiments, referring to fig. 2, 4 and 5, fig. 5 is a functional block diagram of the translational degree of freedom detection unit in an embodiment of the present disclosure. The translational degree of freedom detection unit 3 includes a plurality of groups of ultrasonic transceiver modules 30 distributed around the experimenter's waist, with an acute included angle formed between any two adjacent groups of ultrasonic transceiver modules 30. Each group includes at least three ultrasonic transceiver modules 30, arranged in sequence from the top to the bottom of the waist, with an acute included angle θ formed between any two adjacent ultrasonic transceiver modules 30. Each ultrasonic transceiver module 30 includes an ultrasonic transmitter 301 configured to transmit ultrasonic waves of a set frequency, and an ultrasonic receiver 302 configured to receive the ultrasonic waves.
A ring of ultrasonic transceiver modules 30 is thus arranged around the experimenter's waist, with the modules in each group arranged in sequence from top to bottom and an acute included angle θ between any two adjacent modules. On the one hand, this enables distance tracking over the full 360-degree range in the horizontal direction of the human body; on the other hand, it covers distance information in the height direction of the human body (i.e., the vertical, gravity direction) as far as possible. In each ultrasonic transceiver module 30, the ultrasonic transmitter 301 transmits ultrasonic waves at a set frequency and the ultrasonic receiver 302 receives the set-frequency ultrasonic waves reflected by surrounding objects; the distance between a surrounding object (such as an obstacle) and the ultrasonic transceiver module 30 can be calculated from the time between transmission and reception of the ultrasonic waves.
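The time-of-flight ranging step described above can be sketched in a few lines; the halving of the round trip and the nominal speed of sound in air are the only assumptions.

```python
# Minimal time-of-flight ranging sketch: an obstacle's distance follows from the
# round-trip time between emission and reception of the set-frequency ultrasound.
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed nominal value)

def tof_to_distance(t_emit, t_receive):
    """Return the obstacle distance in metres for a reflected ultrasonic pulse."""
    round_trip = t_receive - t_emit
    if round_trip < 0:
        raise ValueError("reception must follow emission")
    return SPEED_OF_SOUND * round_trip / 2.0  # the wave travels out and back
```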
In this embodiment, in order to obtain distance information for the objects surrounding the experimenter, the ultrasonic transceiver modules 30 are densely distributed. In practical ultrasonic ranging, however, when several ultrasonic transceiver modules 30 are close in orientation or position, crosstalk exists between them: the ultrasonic waves received by an ultrasonic receiver 302 are not necessarily those emitted by its own ultrasonic transmitter 301. Each ultrasonic transmitter 301 emits ultrasonic waves at its set frequency, and each ultrasonic receiver 302 actually receives a superposition of ultrasonic waves of different frequencies. The problem is therefore how to extract, from the superimposed ultrasonic waves, the acoustic signal at the frequency each ultrasonic receiver 302 needs, and then calculate the distance between the surrounding objects and the ultrasonic receiver 302 from the reception time of that frequency; in other words, how to filter the superimposed ultrasonic waves to obtain the ultrasonic wave of the wanted frequency while retaining the time information of its reception, thereby obtaining more accurate position information of the objects surrounding the experimenter.
In some embodiments, to solve the above-mentioned problem of filtering the superimposed ultrasonic waves, the ultrasonic transceiver module 30 further includes a main control part 303 (MCU), a wavelet function generator 304, a multiplier 305 and an integrator 306, wherein the ultrasonic receiver 302, the multiplier 305 and the integrator 306 are respectively connected with the main control part 303, and the wavelet function generator 304, the multiplier 305 and the integrator 306 are connected in sequence.

The wavelet function generator 304 is configured to generate a wavelet function

ψ_{α,τ}(t) = (1/√α) · ψ((t − τ)/α),

wherein α and τ are the two variables of the wavelet function, α representing the scale, τ representing the translation amount, and t representing time.

The multiplier 305 is configured to multiply the waveform function f(t) of the ultrasonic wave received by the ultrasonic receiver 302 with the wavelet function ψ_{α,τ}(t), the independent variable of the waveform function f(t) being time and the dependent variable being the waveform amplitude.

The integrator 306 is configured to integrate the product of the waveform function f(t) and the wavelet function ψ_{α,τ}(t),

WT(α, τ) = ∫ f(t) · ψ_{α,τ}(t) dt = (1/√α) ∫ f(t) · ψ((t − τ)/α) dt,

obtaining the transmission frequency of the ultrasonic transmitter 301 that the ultrasonic receiver 302 correspondingly receives, and the time taken from the ultrasonic transmitter 301 transmitting the ultrasonic wave to the ultrasonic receiver 302 receiving the ultrasonic wave of the corresponding frequency; the scale α corresponds to the transmission frequency of the ultrasonic transmitter 301 received by the ultrasonic receiver 302, and the translation amount τ corresponds to the time taken from the ultrasonic transmitter 301 transmitting to the ultrasonic receiver 302 receiving the ultrasonic wave of the corresponding frequency.

The main control part 303 is configured to receive the transmission frequency of the ultrasonic transmitter 301 correspondingly received by the ultrasonic receiver 302 and the time taken, and to calculate from them the translation amounts of the translational degrees of freedom of the ultrasonic receiver 302 along the X-axis, Y-axis and Z-axis of the spatial coordinate system relative to the external reference point; it is further configured to calculate the translation amounts of the translational degrees of freedom of the physical datum point of the experimenter in the virtual space along the X-axis, Y-axis and Z-axis of the spatial coordinate system relative to the external reference point, according to the translation amounts of the ultrasonic receiver 302 and the coordinates of the ultrasonic receiver 302 relative to the physical datum point of the experimenter in the virtual space.
The scale α is inversely proportional to the transmission frequency of the ultrasonic transmitter 301 received by the corresponding ultrasonic receiver 302. In this embodiment, the wavelet transform of the superimposed ultrasonic waves received by the ultrasonic receiver 302 is implemented by a hardware circuit: from the transmission frequency of the corresponding ultrasonic transmitter 301 and the time taken from transmission to reception of the corresponding frequency, the ultrasonic wave of the wanted frequency is obtained while the time information of its reception is retained, yielding more accurate position information of the objects surrounding the experimenter. Compared with computing the wavelet transform in software on a computer, performing it with a hardware circuit allows the superimposed ultrasonic waves received by the ultrasonic receiver 302 to be processed in real time, giving the virtual display device higher real-time performance in its application scenarios.
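A software analogue of the multiplier-plus-integrator stage can illustrate how wavelet-style correlation separates the superimposed frequencies. This is a minimal numerical sketch, not the hardware circuit of the disclosure: the burst shape, sampling rate and frequencies (40 kHz wanted, 50 kHz crosstalk) are assumed purely for illustration.

```python
import numpy as np

fs = 1_000_000                   # assumed 1 MHz sampling rate
t = np.arange(0.0, 0.004, 1/fs)  # 4 ms observation window

def burst(freq, arrival, width=0.0003):
    """Gaussian-windowed ultrasonic burst arriving at time `arrival`."""
    return np.exp(-((t - arrival) / width) ** 2) * np.sin(2*np.pi*freq*(t - arrival))

# Superposition seen by one receiver: its own 40 kHz echo plus 50 kHz crosstalk.
received = burst(40_000, 0.0015) + burst(50_000, 0.0025)

def wavelet_response(signal, freq, tau, width=0.0003):
    """Magnitude of the inner product of the signal with a scaled and shifted
    complex Morlet-style atom: the software analogue of multiplier + integrator."""
    atom = np.exp(-((t - tau)/width)**2) * np.exp(-1j*2*np.pi*freq*(t - tau))
    return np.abs(np.sum(signal * atom)) / fs  # discrete approximation of the integral

# Scan the shift (translation amount) for the wanted 40 kHz component only.
taus = np.arange(0.0005, 0.0035, 0.00005)
responses = [wavelet_response(received, 40_000, tau) for tau in taus]
arrival_40k = taus[int(np.argmax(responses))]  # peaks near 0.0015 s despite the 50 kHz overlap
```

Because the frequency mismatch between the 40 kHz atom and the 50 kHz crosstalk burst is large relative to the atom's bandwidth, the response peaks only where the wanted frequency actually arrives, which is exactly the time information the main control part 303 needs for ranging.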
The wavelet transform (WT) is a transform analysis method that inherits and develops the localization idea of the short-time Fourier transform while overcoming its drawback of a window size that does not change with frequency; it provides a "time-frequency" window that changes with frequency, making it an ideal tool for time-frequency analysis and processing of signals. Its main characteristic is that it can fully highlight features of certain aspects of a problem through the transform and perform localized analysis in time (space) and frequency. Through scaling and translation operations it progressively refines the signal (function) at multiple scales, finally achieving time subdivision at high frequencies and frequency subdivision at low frequencies, automatically adapting to the requirements of time-frequency signal analysis. It can thus focus on arbitrary details of a signal, solving the difficult problems of the Fourier transform, and is regarded as a major breakthrough in scientific methods since the Fourier transform.
In some embodiments, a plurality of action detection units 2 and a plurality of ultrasonic transceiver modules 30 are provided; in fig. 4, only a few of them are shown. To increase the processing speed of each action detection unit 2 and each ultrasonic transceiver module 30, each action detection unit 2 performs data processing with its own independent main control module 23, and each ultrasonic transceiver module 30 performs data processing with its own independent main control part 303. All main control modules 23, all main control parts 303 and the main control unit 1 are connected into one network through a CAN bus, and the data of each action detection unit 2 and each ultrasonic transceiver module 30 are transmitted to the main control unit 1 over the CAN bus. On the premise of knowing the pose coordinates of the experimenter, the virtual display device can thus determine, through the ultrasonic transceiver modules 30 arranged around the waist, the relative positions of all parts of the experimenter's whole body with respect to external reference points (i.e., surrounding objects or interactors).
In some embodiments, the external reference point comprises a location point where the external obstacle is located.
In some embodiments, referring to fig. 1 and 4, the main control unit 1 is further configured to calculate and obtain the distance between the body part of the experimenter and each point of the external obstacle surface according to the translation amount of the translational degrees of freedom of the body part of the experimenter along the X-axis, the Y-axis and the Z-axis of the spatial coordinate system relative to each point of the external obstacle surface; the virtual display device further comprises a virtual construction unit 5 connected to the main control unit 1 configured to envelope the virtual outer surface of the external obstacle according to the distance between the body part of the experimenter and the points of the external obstacle surface.
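The enveloping step can be sketched as converting each module's measured distance and facing direction into a surface hit point and bounding those points. The axis-aligned box used here is a hypothetical stand-in for whatever envelope the virtual construction unit 5 actually builds, and the function name is assumed.

```python
import numpy as np

def obstacle_envelope(sensor_positions, directions, distances):
    """Given each ultrasonic module's position, facing direction and measured
    distance, return the surface hit points and a simple axis-aligned box that
    envelopes the obstacle (a stand-in for the virtual outer surface)."""
    sensor_positions = np.asarray(sensor_positions, dtype=float)
    directions = np.asarray(directions, dtype=float)
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)  # unit vectors
    hits = sensor_positions + directions * np.asarray(distances)[:, None]
    return hits, (hits.min(axis=0), hits.max(axis=0))
```

Each hit point is the endpoint of a measured distance vector; collecting them around the full waist ring outlines the obstacle from all sides.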
In some embodiments, referring to fig. 1 and 4, the virtual display device further includes a display unit 6 connected to the main control unit 1 and the virtual construction unit 5, configured to display the virtual space, and further configured to display the virtual outer surface of the external obstacle in the virtual space.
In some embodiments, referring to fig. 1 and 4, the virtual display device further includes a safety alarm unit 7 connected to the main control unit 1 and configured to issue an alarm signal to notify the experimenter when the distance between the body part of the experimenter and each point on the surface of the external obstacle is less than or equal to a preset safety distance.
When the distance between any point on the surface of the external obstacle and the body part of the experimenter falls within the preset safe distance, the safety alarm unit 7 can give a warning in the virtual space, either visually or acoustically. The virtual display device therefore needs to acquire, on the one hand, the real-time dynamic positions of the obstacles around the experimenter and, on the other hand, the pose coordinates of the experimenter's body parts; taking these as references, it acquires in real time the relative positions of the points on the external obstacle surface and the experimenter's body parts.
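The safety check itself reduces to a nearest-distance test; the following is a minimal sketch, with the 0.5 m default safe distance chosen arbitrarily for illustration and the function name assumed.

```python
def should_alarm(body_points, obstacle_points, safe_distance=0.5):
    """Return True when any tracked body point comes within the preset safe
    distance of any sampled obstacle surface point (Euclidean metric)."""
    for bx, by, bz in body_points:
        for ox, oy, oz in obstacle_points:
            d = ((bx-ox)**2 + (by-oy)**2 + (bz-oz)**2) ** 0.5
            if d <= safe_distance:
                return True
    return False
```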
In some embodiments, referring to fig. 2, the display unit 6 may be worn on the head of the experimenter, and the main control unit 1 may be worn on the torso, for example at the chest position. With this arrangement, the volume and weight of the virtual display device can be greatly reduced relative to VR glasses or VR head-mounted devices in the prior art.
In some embodiments, the body reference point is the central position of the main control unit 1. In this embodiment, the virtual display device takes the form of a tight-fitting suit covered with motion detection units 2.
The virtual display device provided in this embodiment extends the current 6DOF tracking of the experimenter's hands and head in the VR field into 6DOF tracking of the experimenter's whole body, realizing interaction between the whole-body pose of the experimenter and the virtual display. It replaces camera-based hand and head tracking with a wearable device based on the ultrasonic transceiver modules 30 combined with the motion detection units 2, and separates the main control unit 1 from the head-mounted part, greatly reducing the volume of the head-mounted part of the virtual display device.
According to the virtual display device provided by the embodiment of the disclosure, through the arrangement of the main control unit 1, the action detection unit 2, the translational degree of freedom detection unit 3 and the rotational degree of freedom detection unit 4, the relative pose coordinates of the body part of the experimenter with respect to an external reference object can be tracked in three translational degrees of freedom and three rotational degrees of freedom, so that physical state changes within the 360-degree range around the experimenter can be tracked. Full six-degree-of-freedom tracking of the experimenter's whole body is thereby realized; at the same time, changes of dynamic objects around the experimenter can be captured in real time, the obstacle situation in the 360-degree space around the experimenter can be detected in real time, and an obstacle avoidance function during the experience is realized. A virtual display device capable of whole-body six-degree-of-freedom tracking of the experimenter provides a technical basis for virtual scenes requiring whole-body participation, such as football games and fighting games, and greatly promotes the popularization of virtual reality display technology.
Based on the virtual display device in the foregoing embodiment, the embodiment of the disclosure further provides a virtual display method, referring to fig. 6, which is a flowchart of the virtual display method in the embodiment of the disclosure, where the virtual display method includes: step S101: pose coordinates of a body part of an experimenter in virtual space relative to a body reference point are detected.
Step S102: and detecting the translation amount of the translation freedom degree of the physical datum point of the experimenter in the virtual space along the X axis, the Y axis and the Z axis of the space coordinate system relative to the external reference point.
Step S103: and detecting the rotation angles of the rotation degrees of freedom of the physical datum point of the experimenter in the virtual space around the X axis, the Y axis and the Z axis of the space coordinate system relative to the external reference point.
Step S104: according to pose coordinates, translation amounts and rotation angles of the experimenters, the translation amounts of the translation degrees of freedom of the body parts of the experimenters along the X axis, the Y axis and the Z axis of the space coordinate system and the rotation angles of the rotation degrees of freedom around the X axis, the Y axis and the Z axis of the space coordinate system are calculated and obtained.
Step S102: detecting the translational amount of the translational degrees of freedom of the physical datum point of the experimenter in the virtual space along the X axis, the Y axis and the Z axis of the space coordinate system relative to the external reference point comprises: the ultrasonic transmitter transmits ultrasonic waves with set frequency; the ultrasonic receiver receives the ultrasonic wave with the set frequency and the superimposed wave mixed with the ultrasonic wave with other frequencies; multiplying the waveform function of the superimposed wave with a wavelet function having two variables of scale and translation; performing integral operation on the product of the waveform function of the superimposed wave and the wavelet function to complete wavelet decomposition, and obtaining the transmitting frequency of an ultrasonic transmitter correspondingly received by an ultrasonic receiver and the time taken from the ultrasonic transmitter to the ultrasonic receiver receiving the ultrasonic wave with the corresponding frequency; wherein the scale corresponds to the emission frequency of the ultrasonic emitter received by the ultrasonic receiver, and the translation amount corresponds to the time taken from the ultrasonic emitter to the ultrasonic receiver receiving the ultrasonic wave of the corresponding frequency; according to the transmitting frequency of the ultrasonic transmitter correspondingly received by the ultrasonic receiver and the time taken from the ultrasonic transmitter to the ultrasonic receiver receiving the ultrasonic wave with the corresponding frequency, calculating and obtaining the translational amount of the translational freedom degree of the ultrasonic receiver relative to the external reference point along the X axis, the Y axis and the Z axis of the space coordinate system; and calculating and obtaining the translation amount of the translational degree of freedom of the physical datum point of the experimenter in the 
virtual space along the X-axis, the Y-axis and the Z-axis of the space coordinate system relative to the external reference point according to the translation amount of the translational degree of freedom of the ultrasonic receiver along the X-axis, the Y-axis and the Z-axis of the space coordinate system relative to the external reference point and the coordinate of the physical datum point of the ultrasonic receiver relative to the experimenter in the virtual space.
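Deriving the X, Y and Z translation amounts from several per-receiver distances is, in effect, a trilateration problem. The following least-squares sketch assumes known anchor (external reference) positions and is an illustration, not the disclosure's actual computation.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares position of a receiver from its distances to known anchor
    points (the external reference objects), obtained by linearizing the
    sphere equations |x - p_i|^2 = d_i^2 against the first anchor."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0]**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With four or more non-coplanar anchors the 3D position is determined; extra anchors simply over-determine the least-squares system and average out measurement noise.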
In some embodiments, referring to fig. 7, which is a schematic diagram of the pose coordinate calculation of the experimenter's foot in the virtual space relative to the body reference point in an embodiment of the disclosure, the body part of the experimenter includes the feet. Treating the leg as a planar chain of homogeneous transformations, the pose of the foot o2 relative to the body reference point o1 can be written as

^{o1}T_{o2} = Trans(l2, −l1) · Rot(θ2) · Trans(0, −l3) · Rot(θ3) · Trans(0, −l4),

with the pose coordinate (rotation) matrix for an angle θ

Rot(θ) = [ cθ  −sθ ; sθ  cθ ],

so that the coordinates of the foot relative to the body reference point are

x = l2 + l3·sθ2 + l4·s(θ2+θ3),  y = −(l1 + l3·cθ2 + l4·c(θ2+θ3));

wherein o1 represents the body reference point and o2 the foot position; θ2 is the rotation angle of the hip joint and θ3 is the rotation angle of the knee joint; cθ2 is cos θ2 and sθ2 is sin θ2; l1 is the vertical distance from the body reference point to the hip joint; l2 is the horizontal distance from the body reference point to the hip joint; l3 is the distance from the hip joint to the knee joint; l4 is the distance from the knee joint to the ankle joint.
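The foot-pose calculation can be sketched as a planar leg chain: a hip offset (l2, −l1) from the body reference point, a thigh of length l3 rotated θ2 from the vertical, and a shank of length l4 rotated a further θ3. The function name and the downward-leg axis convention are assumptions of this sketch; angles are in radians.

```python
import math

def foot_position(theta2, theta3, l1, l2, l3, l4):
    """Foot (ankle) coordinates relative to the body reference point for a
    planar leg chain; y is the upward axis, so the hanging leg points to -y."""
    hip = (l2, -l1)
    knee = (hip[0] + l3*math.sin(theta2),
            hip[1] - l3*math.cos(theta2))
    ankle = (knee[0] + l4*math.sin(theta2 + theta3),
             knee[1] - l4*math.cos(theta2 + theta3))
    return ankle
```

With both joint angles at zero the leg hangs straight down; raising the thigh by 90 degrees swings both segments forward, as the assertions below illustrate.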
In some embodiments, the virtual display method further comprises: step S100: during the primary experience, initial body data of an experimenter are recorded through photographing, wherein the initial body data comprise initial pose coordinates of a body part of the experimenter relative to a body datum point in a virtual space and physical dimensions of the body part.
Referring to fig. 7, l1, l2, l3 and l4 are the initial body data of the experimenter obtained by photographing the experimenter's body. Let the hip joint position be a and the knee joint position be b; θ1 is the angle between the line segment o1a from the experimenter's physical datum point to the hip joint position and the vertical direction, while θ2 and θ3 are acquired by the action detection unit. That is, when the experimenter experiences for the first time, the initial body data of the experimenter are recorded by photographing, including the initial pose coordinates of each joint of the experimenter relative to the body datum point in the virtual space and the physical dimensions of the body parts; in later experiences, the pose of the experimenter's body parts can be detected in real time by the action detection unit 2, using the initial pose coordinates as the reference.
In this embodiment, referring to fig. 7, the tracking of the experimenter's foot position is taken as an example; fig. 7 uses the center point of the main control unit as the body reference point o1. The rotation angles of the three rotational degrees of freedom of the body reference point coordinate system about the X-axis, Y-axis and Z-axis of the spatial coordinate system, relative to an external reference point (such as a ground-fixed object or an external obstacle), can be obtained by the rotational degree of freedom detection unit in the virtual display device, and the translation amounts of the three translational degrees of freedom of the ultrasonic transceiver module along the X-axis, Y-axis and Z-axis relative to the external reference point can be obtained by the ultrasonic transceiver module. Because the relative positions of the main control unit (i.e., the body reference point) and the ultrasonic transceiver module are fixed, the translation amounts of the body reference point o1 coordinate system relative to the external reference point along the three translational degrees of freedom can be obtained by a coordinate translation transformation, giving body posture data in six degrees of freedom of the body reference point relative to the external reference point. In fig. 7, once the relative position of the foot position o2 coordinate system with respect to the body reference point o1 is known, the relative coordinates of the foot position o2 with respect to the external reference point can be obtained, realizing six-degree-of-freedom tracking of the human foot; six-degree-of-freedom tracking of the other parts of the human body is similar.
In some embodiments, the external reference points include points of locations where external obstructions are located; the virtual display method further comprises the following steps: step S105: according to the translational amount of the translational degrees of freedom of the body part of the experimenter along the X axis, the Y axis and the Z axis of the space coordinate system relative to the points on the surface of the external obstacle, calculating and obtaining the distance between the body part of the experimenter and the points on the surface of the external obstacle; enveloping the virtual outer surface of the external obstacle according to the distance between the body part of the experimenter and each point on the surface of the external obstacle; the virtual outer surface of the external obstacle is displayed into the virtual space.
In some embodiments, the virtual display method further comprises: step S106: and when the distance between the body part of the experimenter and each point on the surface of the external obstacle is less than or equal to the preset safety distance, alarming and informing the experimenter in the virtual space.
In this embodiment, on the basis of the real-time posture of the experimenter's body parts, the distances between the external obstacle and the body parts are detected, the virtual outer surface of the external obstacle is enveloped from the endpoint coordinates of this distance information, and the virtualized obstacle information is then displayed in the display unit of the virtual display device; that is, the virtualized external obstacle is placed in the virtual scene displayed by the virtual display device, realizing the obstacle avoidance function.
The virtual display method provided by the embodiment of the disclosure can track the relative pose coordinates of the body part of the experimenter with respect to an external reference object in three translational degrees of freedom and three rotational degrees of freedom, so that physical state changes within the 360-degree range around the experimenter can be tracked. Full six-degree-of-freedom tracking of the experimenter's whole body is thereby realized; at the same time, changes of dynamic objects around the experimenter can be captured in real time, the obstacle situation in the 360-degree space around the experimenter can be detected in real time, and an obstacle avoidance function during the experience is realized. Whole-body six-degree-of-freedom tracking of the experimenter provides a technical basis for virtual scenes requiring whole-body participation, such as football games and fighting games, and greatly promotes the popularization of virtual reality display technology.
The virtual display device may be any product or component with a VR display function, such as a VR panel, VR television, cell phone, tablet computer, notebook computer, monitor, digital photo frame or navigator.
It is to be understood that the above embodiments are merely illustrative of the application of the principles of the present invention, but not in limitation thereof. Various modifications and improvements may be made by those skilled in the art without departing from the spirit and substance of the invention, and are also considered to be within the scope of the invention.

Claims (15)

1. A virtual display device, comprising: the device comprises a main control unit, an action detection unit and a degree of freedom detection unit, wherein the action detection unit and the degree of freedom detection unit are respectively connected with the main control unit;
the motion detection unit is configured to detect pose coordinates of a body part of an experimenter relative to a body reference point in a virtual space;
the freedom degree detection unit is configured to detect translation amounts of translational freedom degrees of the experimenter along the X axis, the Y axis and the Z axis of the space coordinate system relative to the external reference point in the virtual space; the system is further configured to detect a rotation angle of the rotational degrees of freedom of the physical datum point of the experimenter in the virtual space relative to the external reference point around the X axis, the Y axis and the Z axis of the space coordinate system;
The main control unit is configured to calculate and obtain the translational amount of the translational degree of freedom of the body part of the experimenter along the X axis, the Y axis and the Z axis of the space coordinate system and the rotation angle of the rotational degree of freedom around the X axis, the Y axis and the Z axis of the space coordinate system relative to the external reference point according to the pose coordinates, the translational amount and the rotation angle of the experimenter.
2. The virtual display device according to claim 1, wherein the motion detection unit comprises an electromyographic signal electrode, a skin tension strain gauge and a main control module, the electromyographic signal electrode and the skin tension strain gauge being respectively connected to the main control module;
the electromyographic signal electrode and the skin tension strain gauge are attached to the surrounding skin around each joint part of the experimenter;
the electromyographic signal electrode is configured to detect an electromyographic current, the electromyographic current being indicative of the degree of contraction of the muscle;
the skin tension strain gauge is configured to detect skin stress changes during muscle contraction and relaxation and convert the stress changes into current signals for output;
the main control module is configured to calculate and obtain the translation amount of the translational degrees of freedom of each joint part of the experimenter along the space coordinate system X axis, Y axis and Z axis and the rotation angle of the rotational degrees of freedom around the space coordinate system X axis, Y axis and Z axis in the virtual space relative to the body datum point according to the current signals generated by the myoelectricity current and the skin stress change.
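Claim 2 leaves open how the myoelectric and skin-stress currents map to joint angles. Purely as an illustrative assumption (the patent specifies no model), a per-joint linear calibration fitted by least squares could look like:

```python
import numpy as np

def fit_joint_model(emg, strain, angles):
    """Fit angle ~ w0 + w1*emg + w2*strain for one joint by least squares.
    The linear form is an illustrative assumption, not the patent's model."""
    X = np.column_stack([np.ones_like(emg), emg, strain])
    w, *_ = np.linalg.lstsq(X, angles, rcond=None)
    return w

def predict_angle(w, emg, strain):
    """Map a new (emg, strain) current pair to a joint rotation angle."""
    return w[0] + w[1] * emg + w[2] * strain
```

In practice the calibration data would come from the photographed initial body data mentioned in claim 15.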
3. The virtual display device according to claim 2, wherein the degree-of-freedom detecting unit includes a translational degree-of-freedom detecting unit and a rotational degree-of-freedom detecting unit,
the rotational freedom degree detection unit is configured to detect rotational angles of rotational degrees of freedom of the experimenter around an X axis, a Y axis and a Z axis of a space coordinate system relative to an external reference point in a virtual space;
the translational degree-of-freedom detection unit comprises a plurality of groups of ultrasonic transceiver modules distributed around the experimenter's waist;
an acute angle is formed between any two adjacent groups of ultrasonic transceiver modules;
each group comprises at least three ultrasonic transceiver modules arranged sequentially from the top to the bottom of the waist, with an acute angle formed between any two adjacent ultrasonic transceiver modules;
the ultrasonic transceiver module includes an ultrasonic transmitter configured to transmit ultrasonic waves of a set frequency and an ultrasonic receiver configured to receive the ultrasonic waves.
4. The virtual display device according to claim 3, wherein the ultrasonic transceiver module further comprises a main control part, a wavelet function generator, a multiplier and an integrator,
the ultrasonic receiver, the multiplier and the integrator are respectively connected with the main control part;
the wavelet function generator, the multiplier and the integrator are connected in sequence;
the wavelet function generator is configured to generate a wavelet function

ψ_{α,τ}(t) = (1/√α) · ψ((t − τ)/α),

wherein α and τ are the two variables of the wavelet function, α representing scale, τ representing translation, and t representing time;
the multiplier is configured to multiply the waveform function f(t) of the ultrasonic wave received by the ultrasonic receiver with the wavelet function ψ_{α,τ}(t), wherein the independent variable of the waveform function f(t) is time and the dependent variable is waveform amplitude;
the integrator is configured to integrate the product of the waveform function f(t) and the wavelet function ψ_{α,τ}(t),

W(α, τ) = ∫ f(t) · ψ_{α,τ}(t) dt,

obtaining the transmission frequency of the ultrasonic transmitter corresponding to the wave received by the ultrasonic receiver and the time taken from transmission by the ultrasonic transmitter to reception of the ultrasonic wave of that frequency; wherein the scale α corresponds to the transmission frequency of the ultrasonic transmitter, and the translation τ corresponds to the time taken from transmission to reception of the ultrasonic wave of that frequency;
the main control part is configured to receive the transmission frequency of the ultrasonic transmitter corresponding to the wave received by the ultrasonic receiver and the time taken from transmission to reception of the ultrasonic wave of that frequency, to calculate therefrom the translation amounts of the translational degrees of freedom of the ultrasonic receiver relative to the external reference point along the X axis, Y axis and Z axis of the spatial coordinate system, and to calculate the translation amounts of the translational degrees of freedom of the experimenter's body reference point in the virtual space relative to the external reference point along the X axis, Y axis and Z axis from the translation amounts of the ultrasonic receiver and the coordinates of the ultrasonic receiver relative to the experimenter's body reference point.
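The multiply-and-integrate pipeline of claim 4 is a continuous wavelet transform evaluated over candidate scales and shifts. A minimal numerical sketch follows; the real Morlet-style mother wavelet and the grid search are assumptions, since the patent names no mother wavelet:

```python
import numpy as np

def wavelet(t, alpha, tau):
    """psi_{alpha,tau}(t) = psi((t - tau)/alpha) / sqrt(alpha), with a
    real Morlet-style mother wavelet (an assumption; the claim names none)."""
    u = (t - tau) / alpha
    return np.exp(-u ** 2 / 2.0) * np.cos(5.0 * u) / np.sqrt(alpha)

def best_scale_shift(f, t, alphas, taus):
    """Grid-search the (scale, shift) pair maximising |integral of f * psi|.
    The winning alpha identifies the emitter frequency, tau the arrival time."""
    dt = t[1] - t[0]
    best, best_w = None, -1.0
    for a in alphas:
        for tau in taus:
            # Discretised version of W(alpha, tau) = integral f(t)*psi(t) dt.
            w = abs(np.sum(f * wavelet(t, a, tau)) * dt)
            if w > best_w:
                best, best_w = (a, tau), w
    return best
```

With one grid per emitter frequency, the recovered τ values give the per-emitter times of flight that the main control part converts into distances.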
5. The virtual display device according to any one of claims 1-4, wherein the external reference point comprises a location point where an external obstacle is located.
6. The virtual display device according to claim 5, wherein the main control unit is further configured to calculate and obtain the distance between the body part of the experimenter and each point of the external obstacle surface according to the translation amount of the translational degrees of freedom of the body part of the experimenter along the X-axis, the Y-axis and the Z-axis of the spatial coordinate system relative to each point of the external obstacle surface;
The virtual display device further comprises a virtual construction unit connected with the main control unit and configured to envelope the virtual outer surface of the external obstacle according to the distance between the body part of the experimenter and each point on the surface of the external obstacle.
7. The virtual display device according to claim 6, further comprising a display unit connecting the main control unit and the virtual construction unit, configured to display the virtual space, and further configured to display a virtual outer surface of the external obstacle into the virtual space.
8. The virtual display device according to claim 7, further comprising a safety alarm unit connected to the main control unit and configured to issue an alarm signal to notify the experimenter when it is determined that the distance between the body part of the experimenter and each point on the surface of the external obstacle is less than or equal to a preset safety distance.
9. The virtual display device of claim 7, wherein the body reference point is the central location of the main control unit.
10. A virtual display method, comprising:
detecting pose coordinates of a body part of an experimenter relative to a body reference point in a virtual space;
detecting translation amounts of the translational degrees of freedom of the experimenter's body reference point in the virtual space relative to an external reference point along the X axis, Y axis and Z axis of a spatial coordinate system;
detecting rotation angles of the rotational degrees of freedom of the experimenter's body reference point in the virtual space relative to the external reference point about the X axis, Y axis and Z axis of the spatial coordinate system;
calculating, from the pose coordinates, the translation amounts and the rotation angles, the translation amounts of the translational degrees of freedom of the experimenter's body part relative to the external reference point along the X axis, Y axis and Z axis of the spatial coordinate system and the rotation angles of its rotational degrees of freedom about the X axis, Y axis and Z axis of the spatial coordinate system.
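The final step of claim 10 composes the body-relative pose with the body reference point's own six degrees of freedom. A minimal sketch, under the assumption that the three rotation angles are applied as X-then-Y-then-Z rotation matrices (the claim does not fix a rotation order):

```python
import numpy as np

def rot_xyz(rx, ry, rz):
    """Rotation matrix for successive rotations about X, then Y, then Z
    (an assumed order; the claim only names the three axes)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def body_part_in_world(part_rel_body, body_translation, body_rotation):
    """Compose the part's position relative to the body reference point with
    the body reference point's translation/rotation relative to the external
    reference point, giving the part's position in the external frame."""
    R = rot_xyz(*body_rotation)
    return R @ np.asarray(part_rel_body) + np.asarray(body_translation)
```

For example, a hand at (1, 0, 0) relative to the waist, with the waist translated by (1, 2, 3) and rotated 90° about Z, lands at (1, 3, 3) in the external frame.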
11. The virtual display method of claim 10, wherein detecting the amount of translation of the physical datum point of the experimenter in the virtual space relative to the external datum point along the translational degrees of freedom of the spatial coordinate system X-axis, Y-axis, Z-axis comprises:
the ultrasonic transmitter transmits an ultrasonic wave of a set frequency;
the ultrasonic receiver receives a superimposed wave in which the ultrasonic wave of the set frequency is mixed with ultrasonic waves of other frequencies;
multiplying the waveform function of the superimposed wave by a wavelet function having two variables, scale and translation;
integrating the product of the waveform function of the superimposed wave and the wavelet function to complete the wavelet decomposition, and obtaining the transmission frequency of the ultrasonic transmitter corresponding to the wave received by the ultrasonic receiver and the time taken from transmission by the ultrasonic transmitter to reception of the ultrasonic wave of that frequency; wherein the scale corresponds to the transmission frequency and the translation corresponds to the time of flight;
calculating, from the transmission frequency and the time of flight, the translation amounts of the translational degrees of freedom of the ultrasonic receiver relative to the external reference point along the X axis, Y axis and Z axis of the spatial coordinate system;
calculating the translation amounts of the translational degrees of freedom of the experimenter's body reference point in the virtual space relative to the external reference point along the X axis, Y axis and Z axis of the spatial coordinate system from the translation amounts of the ultrasonic receiver relative to the external reference point and the coordinates of the ultrasonic receiver relative to the body reference point of the experimenter in the virtual space.
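Claim 11 converts per-emitter times of flight into XYZ translation amounts but does not spell out the geometry. One conventional approach (an assumption here, not taken from the claims) is multilateration against emitters at known positions, linearising the sphere equations by subtracting the first:

```python
import numpy as np

def trilaterate(emitters, distances):
    """Solve for the receiver position from distances (time of flight times
    the speed of sound) to four or more emitters at known positions, by
    subtracting the first sphere equation from the others and solving the
    resulting linear system in least squares."""
    p0 = np.asarray(emitters[0], dtype=float)
    d0 = distances[0]
    A, b = [], []
    for p, d in zip(emitters[1:], distances[1:]):
        p = np.asarray(p, dtype=float)
        # |x - p|^2 = d^2 minus |x - p0|^2 = d0^2 gives a linear equation in x.
        A.append(2.0 * (p - p0))
        b.append(d0 ** 2 - d ** 2 + p.dot(p) - p0.dot(p0))
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol
```

Using at least four non-coplanar emitters makes the linear system fully determined, which is consistent with the claim-3 requirement of several transceiver groups at mutual angles.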
12. The virtual display method of claim 10, wherein the body part of the experimenter comprises a foot;
the pose coordinates of the experimenter's foot relative to the body reference point in the virtual space are calculated by a formula built from a chain of homogeneous transformation matrices (the matrices appear as equation images FDA0003972938050000051 and FDA0003972938050000061 to FDA0003972938050000064 in the original filing and are not reproduced here; one of them is the pose coordinate matrix for θ2 relative to the body reference point);
wherein o1 represents the body reference point and o2 represents the foot position; θ2 is the rotation angle of the hip joint; θ3 is the rotation angle of the knee joint; c2 denotes cos θ2 and s2 denotes sin θ2;
l1 is the vertical distance from the body reference point to the hip joint;
l2 is the horizontal distance from the body reference point to the hip joint;
l3 is the distance from the hip joint to the knee joint;
l4 is the distance from the knee joint to the ankle joint.
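The transformation matrices referenced in claim 12 survive only as image placeholders. Purely as an illustrative sketch, and not the patent's actual matrices, a generic two-link planar reading of the hip-knee chain using the l1-l4, θ2, θ3 definitions above could be:

```python
import math

def foot_position(theta2, theta3, l1, l2, l3, l4):
    """Planar sketch of the o1 -> foot chain: the hip sits at (l2, -l1) from
    the body reference point; the thigh l3 rotates by theta2 from straight
    down, and the shank l4 by theta3 relative to the thigh. This planar
    model is an assumption standing in for the patent's matrix chain."""
    hip = (l2, -l1)
    knee = (hip[0] + l3 * math.sin(theta2),
            hip[1] - l3 * math.cos(theta2))
    foot = (knee[0] + l4 * math.sin(theta2 + theta3),
            knee[1] - l4 * math.cos(theta2 + theta3))
    return foot
```

With both joints at zero the foot hangs straight down at (l2, -(l1 + l3 + l4)), which matches the intuition behind the l1-l4 definitions.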
13. The virtual display method according to claim 11, wherein the external reference point comprises a position point of an external obstacle;
the virtual display method further comprises the following steps:
according to the translational amount of the translational degrees of freedom of the body part of the experimenter along the X axis, the Y axis and the Z axis of the space coordinate system relative to the points on the surface of the external obstacle, calculating and obtaining the distance between the body part of the experimenter and the points on the surface of the external obstacle;
enveloping the virtual outer surface of the external obstacle according to the distance between the body part of the experimenter and each point on the surface of the external obstacle;
Displaying the virtual outer surface of the external obstacle into the virtual space.
14. The virtual display method of claim 13, further comprising: when the distance between the experimenter's body part and any point on the surface of the external obstacle is less than or equal to the preset safety distance, issuing an alarm signal to notify the experimenter.
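The threshold test of claim 14 (and step S106 of the description) reduces to a nearest-point distance check between the tracked body points and the enveloped obstacle surface points; an illustrative sketch:

```python
def check_safety(body_points, obstacle_points, safe_dist):
    """Return True (raise the alarm) if any tracked body point lies within
    safe_dist of any point on the obstacle's virtual outer surface."""
    for bx, by, bz in body_points:
        for ox, oy, oz in obstacle_points:
            d = ((bx - ox) ** 2 + (by - oy) ** 2 + (bz - oz) ** 2) ** 0.5
            if d <= safe_dist:
                return True
    return False
```

A real implementation would use a spatial index rather than the brute-force double loop, but the claim only requires the threshold comparison itself.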
15. The virtual display method of claim 10, further comprising: during the experimenter's first experience, recording initial body data of the experimenter by photographing, wherein the initial body data comprise initial pose coordinates of the experimenter's body parts relative to the body reference point in the virtual space and the physical dimensions of those body parts.
CN202211525455.6A 2022-11-30 2022-11-30 Virtual display device and virtual display method Pending CN116009687A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211525455.6A CN116009687A (en) 2022-11-30 2022-11-30 Virtual display device and virtual display method
PCT/CN2023/121571 WO2024114071A1 (en) 2022-11-30 2023-09-26 Virtual display device and virtual display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211525455.6A CN116009687A (en) 2022-11-30 2022-11-30 Virtual display device and virtual display method

Publications (1)

Publication Number Publication Date
CN116009687A true CN116009687A (en) 2023-04-25

Family

ID=86036184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211525455.6A Pending CN116009687A (en) 2022-11-30 2022-11-30 Virtual display device and virtual display method

Country Status (2)

Country Link
CN (1) CN116009687A (en)
WO (1) WO2024114071A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024114071A1 (en) * 2022-11-30 2024-06-06 京东方科技集团股份有限公司 Virtual display device and virtual display method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102085180B1 (en) * 2013-10-08 2020-03-05 삼성전자주식회사 Method of estimating body's orientation, Computer readable storage medium of recording the method and an device
CN107992189A (en) * 2017-09-22 2018-05-04 深圳市魔眼科技有限公司 A kind of virtual reality six degree of freedom exchange method, device, terminal and storage medium
KR102299936B1 (en) * 2019-12-30 2021-09-09 주식회사 버넥트 Method and system for posture estimation about object tracking taken by camera
JP6881635B2 (en) * 2020-02-27 2021-06-02 株式会社リコー Information processing equipment, systems and programs
CN115202471A (en) * 2022-06-21 2022-10-18 京东方科技集团股份有限公司 Whole body posture tracking and touch equipment and virtual reality system
CN116009687A (en) * 2022-11-30 2023-04-25 北京京东方显示技术有限公司 Virtual display device and virtual display method

Also Published As

Publication number Publication date
WO2024114071A9 (en) 2024-08-02
WO2024114071A1 (en) 2024-06-06

Similar Documents

Publication Publication Date Title
EP3545385B1 (en) Wearable motion tracking system
US11262841B2 (en) Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing
JP6526026B2 (en) Method and system for determining motion of object
CN105608746B (en) A method of reality is subjected to Virtual Realization
WO2015180497A1 (en) Motion collection and feedback method and system based on stereoscopic vision
CN206497423U (en) A kind of virtual reality integrated system with inertia action trap setting
US20090303179A1 (en) Kinetic Interface
WO2024114071A1 (en) Virtual display device and virtual display method
CN102755745A (en) Whole-body simulation game equipment
CN113268141A (en) Motion capture method and device based on inertial sensor and fabric electronics
Jovanov et al. Avatar—A multi-sensory system for real time body position monitoring
Frey et al. Off-the-shelf, real-time, human body motion capture for synthetic environments
WO2017061890A1 (en) Wireless full body motion control sensor
CN105824432A (en) Motion capturing system
CN211381367U (en) Brain rehabilitation integrated equipment
Moiz et al. A wearable motion tracker
CN115410276A (en) Yoga demonstration method, system, device and medium
CN211019070U (en) Three-dimensional video acquisition system
JP2004340882A (en) Three-dimensional coordinate measuring apparatus for entertainment using ultrasonic wave
JPH0747011Y2 (en) Eyepiece display
Ding et al. Integration of sensing and feedback components for human motion replication
CN116625213A (en) Real-time measurement system and method for three-dimensional position and posture of VR helmet tracking handle
Wright et al. Leap Motion Performance in an Augmented Reality Workspace

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination