
CN107329263A - Method and device for displaying the depth of field of a virtual reality helmet - Google Patents

Method and device for displaying the depth of field of a virtual reality helmet

Info

Publication number
CN107329263A
Authority
CN
China
Prior art keywords
virtual reality helmet
observation
depth of field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710543925.4A
Other languages
Chinese (zh)
Inventor
姜燕冰
党少军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Virtual Reality Technology Co Ltd
Original Assignee
Shenzhen Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Virtual Reality Technology Co Ltd filed Critical Shenzhen Virtual Reality Technology Co Ltd
Publication of CN107329263A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G01M11/0242 Testing optical properties by measuring geometrical properties or aberrations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G01M11/0242 Testing optical properties by measuring geometrical properties or aberrations
    • G01M11/0257 Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0012 Optical design, e.g. procedures, algorithms, optimisation routines
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0176 Head mounted characterised by mechanical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/011 Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0127 Head-up displays characterised by optical features comprising devices increasing the depth of field
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0141 Head-up displays characterised by optical features characterised by the informative content of the display
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0149 Head-up displays characterised by mechanical features
    • G02B2027/0161 Head-up displays characterised by mechanical features characterised by the relative positioning of the constitutive elements
    • G02B2027/0163 Electric or electronic control thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Eye Examination Apparatus (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Eyeglasses (AREA)

Abstract

The present invention provides a method and device for displaying the depth of field of a virtual reality helmet, comprising a test unit, an observation unit, an image unit and a processing unit. The test unit includes a virtual reality helmet to be placed and a fixing structure; the virtual reality helmet to be placed includes a display screen; the fixing structure includes a clamping mechanism and a position-limiting mechanism, and the clamping mechanism can be opened so that the virtual reality helmet can be inserted. Compared with the prior art, the present invention uses the combination of the test unit, observation unit, image unit and processing unit to solve the problem of depth-of-field display simply and effectively. The observation unit is driven by a motor along an eyepiece track, so that observation can conveniently be carried out from multiple angles and multiple observation points can easily be set.

Description

Method and device for displaying the depth of field of a virtual reality helmet
Technical field
The present invention relates to the field of virtual reality, and more specifically to a method and device for displaying the depth of field of a virtual reality helmet.
Background technology
Distortion lenses are used in many fields. In a virtual reality system, for example, in order to give the user a truly immersive visual experience, the virtual reality device must cover as much of the visual range of the human eye as possible, so the device is fitted with a lens of a specific spherical curvature. When a conventional image is projected into the eye through such a curved lens, however, the image is distorted and the eye has no way of establishing its position in the virtual space; in other words, everything around you in the virtual world is a distorted image. To solve this problem, the image must first be pre-distorted: a distorted image matched to the lens is generated by a specific algorithm, and when these pre-distorted images are projected into the eye through the distortion lens they appear normal again, giving the viewer a true sense of position and a wide field of view. Lens manufacturers can produce lenses to given distortion parameters, and virtual reality helmet manufacturers assemble these lenses into their helmets. Ordinary users and software developers of virtual reality helmets, however, have no instrument for measuring the distortion parameters of a lens, and apart from asking the lens manufacturer there is no intuitive way to obtain them, which greatly hinders the development and use of virtual reality software. Moreover, because the distortion parameters cannot be obtained, the depth-of-field display of the virtual reality helmet cannot be configured either.
Summary of the Invention
In order to overcome the defect that the depth of field cannot be set on current virtual reality devices, the present invention provides a method for displaying the depth of field of a virtual reality helmet, comprising the following steps (sketched in code after the list):
S100: the distortion parameters of the virtual reality helmet to be tested are stored in the processing unit;
S200: the angular position of the corresponding line of sight is calculated according to the depth-of-field relation;
S300: the position of the light spot on the screen is calculated back from the distortion parameters of the virtual reality helmet to be placed and the angular position of the line of sight.
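As a rough illustration only of how steps S200 and S300 might be carried out in software (the patent itself specifies no code), the following Python sketch assumes the fitted distortion of the helmet takes the form of a function mapping a spot position on the display screen to the viewing angle of the refracted ray; the spot for a requested gaze angle is then found by numerically inverting that function. The polynomial form, coefficient values and function names are all assumptions made for the example.

```python
import numpy as np

def viewing_angle(x, coeffs):
    """Assumed fitted distortion function (cf. S100): maps a spot offset x (m)
    on the display screen to the angle (deg) of the ray seen through the lens."""
    return sum(c * x**i for i, c in enumerate(coeffs))

def spot_for_angle(target_angle, coeffs, x_range=(-0.05, 0.05)):
    """Cf. S300: back-calculate the screen spot whose refracted ray leaves at
    target_angle, by sampling the fitted function and picking the closest match."""
    xs = np.linspace(x_range[0], x_range[1], 10001)
    return xs[np.argmin(np.abs(viewing_angle(xs, coeffs) - target_angle))]

# Assumed example distortion: roughly linear with a weak cubic term
coeffs = [0.0, 600.0, 0.0, -5.0e4]
print(spot_for_angle(12.0, coeffs))   # spot offset (m) that appears at 12 degrees
```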
Preferably, the method further comprises step S10: before the display is configured, the distortion parameters of the virtual reality helmet are first measured.
Preferably, the measuring method for the virtual reality helmet to be tested comprises the following steps (a sketch of the fitting step follows the list):
S11: the observation unit is moved to an observation point to observe the virtual reality helmet to be detected; a dot image is displayed point by point in the helmet to be detected, and the image unit processes the image observed by the observation unit;
S12: when the image unit detects that the light-spot image observed by the observation unit meets a preset condition, the image unit transmits detection information to the processing unit;
S13: after receiving the detection information transmitted by the image unit, the processing unit records the correspondence between the light-spot position and the position of the observation unit, and the observation unit is moved to the next observation point for observation;
S14: the processing unit fits the distortion functions in a database according to the recorded groups of correspondences between light-spot positions and observation-unit positions, and records the result of the fitting.
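By way of illustration only, the fitting and fallback behaviour of S14 and S15 might look like the sketch below, which assumes the recorded correspondences are pairs of spot position and observed angle and that the database's candidate distortion functions are plain polynomials; the function names, candidate degrees and acceptance threshold are invented for the example.

```python
import numpy as np

def fit_distortion(spot_positions, observed_angles, degrees=(3, 5), tol=0.5):
    """Cf. S14/S15: try each candidate distortion model (here simple polynomials);
    accept the first whose worst-case residual is below tol, otherwise fall back
    to storing the raw correspondences as a point-by-point function."""
    x = np.asarray(spot_positions, dtype=float)
    y = np.asarray(observed_angles, dtype=float)
    for deg in degrees:
        coeffs = np.polyfit(x, y, deg)
        if np.max(np.abs(np.polyval(coeffs, x) - y)) < tol:
            return "fitted", coeffs                                  # fitting result recorded (S14)
    return "point_function", dict(zip(x.tolist(), y.tolist()))       # fallback (S15)

# Illustrative correspondences only: spot abscissa (mm) vs. observed angle (deg)
kind, model = fit_distortion([0, 2, 4, 6, 8, 10], [0.0, 2.4, 4.9, 7.6, 10.5, 13.8])
print(kind)
```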
Preferably, the light emitted by the virtual reality helmet to be detected is refracted by an optical lens, and the observation unit observes the light emitted by the helmet to be detected by simulating the viewing angle of the human eye.
Preferably, the method further comprises the following step:
S15: when the data fitting is unsuccessful, the processing unit stores the correspondences in the form of a point-by-point function.
A device for displaying the depth of field of a virtual reality helmet is also provided, comprising a test unit, an observation unit, an image unit and a processing unit. The test unit includes a virtual reality helmet to be placed and a fixing structure; the virtual reality helmet to be placed includes a display screen; the fixing structure includes a clamping mechanism and a position-limiting mechanism, and the clamping mechanism can be opened so that the virtual reality helmet can be inserted.
Preferably, the clamping mechanism includes a torsion spring; after the clamping mechanism is opened, the torsion spring acts on the clamping mechanism to close it and fix the virtual reality helmet.
Preferably, the observation unit includes an observation eyepiece, an eyepiece track and a motor, and the observation eyepiece can move along the eyepiece track driven by the motor.
Preferably, the observation unit includes a movable plate, an observation eyepiece, a light-shielding plate, an eyepiece track and a motor; the observation eyepiece can move along the eyepiece track driven by the motor; the eyepiece track is mounted on the movable plate, and the movable plate can carry the observation eyepiece, the motor and the eyepiece track so that they move together.
Preferably, the light-shielding plate includes a light-transmitting hole.
Compared with the prior art, the present invention calculates the viewing direction from the depth-of-field relation and uses the distortion data of the virtual reality helmet to calculate the corresponding light spot on the display screen, thereby achieving the purpose of setting the depth of field of the virtual reality helmet and providing a method for setting the depth of field of a virtual reality helmet. At the same time, the device for displaying the depth of field of the virtual reality helmet according to the present invention can also measure the distortion data of the helmet, so that the depth of field can be set even when no distortion data is available. The combination of the test unit, observation unit, image unit and processing unit solves the problem of setting the depth of field simply and effectively. The observation unit is driven by the motor along the eyepiece track, so that observation can conveniently be carried out from multiple angles and multiple observation points can easily be set.
Brief description of the drawings
The invention will be further described below in conjunction with the drawings and embodiments. In the drawings:
Fig. 1 is a block diagram of a first embodiment of the depth-of-field display device for a virtual reality helmet according to the present invention;
Fig. 2 is a block diagram of the test unit of the first embodiment;
Fig. 3 is a schematic view of the first embodiment of the depth-of-field display device for a virtual reality helmet according to the present invention;
Fig. 4 is a schematic side view of the first embodiment of the depth-of-field display device for a virtual reality helmet according to the present invention;
Fig. 5 is a schematic diagram of the depth-of-field display principle of the virtual reality helmet according to the present invention;
Fig. 6 is a schematic structural view of a second embodiment of the depth-of-field display device for a virtual reality helmet according to the present invention;
Fig. 7 is a schematic view of the light shield of the second embodiment of the present invention.
Embodiment
In order to overcome the defect that the depth of field cannot be set on current virtual reality devices, the present invention provides a method and device for displaying the depth of field of a virtual reality helmet.
In order that the technical features, objects and effects of the present invention may be understood more clearly, embodiments of the present invention are now described in detail with reference to the accompanying drawings.
Referring to Fig. 1 and Fig. 2, the depth-of-field display device for a virtual reality helmet according to the present invention includes a test unit 1, an observation unit 2, an image unit 3 and a processing unit 4. The test unit 1 includes a lens 12 to be tested and a fixing structure 14, and the lens 12 to be tested can be detachably fixed to the fixing structure 14. The image unit 3 is electrically connected with the observation unit 2, and the processing unit 4 is electrically connected with the image unit 3. The observation unit 2 observes the test unit 1 by capturing images: it photographs the test unit 1 and transmits the captured images to the image unit 3 for processing; the image unit 3 processes the images captured by the observation unit 2 and transfers the results to the processing unit 4, which processes the data transmitted by the image unit 3.
Fig. 3 and Fig. 4 show, by way of example, the first embodiment of the depth-of-field display device for a virtual reality helmet. A display screen 16 is fixedly mounted on the fixing structure 14, and a lens mounting portion 18 is provided on the fixing structure 14 for mounting the lens 12 to be tested. The observation unit 2 includes an observation eyepiece 23, an eyepiece track 25, an eyepiece motor 271, a lifting motor 272 and a lifting rod 273. The observation eyepiece 23 can translate along the eyepiece track 25 driven by the eyepiece motor 271, and can also rotate under the drive of the eyepiece motor 271 to change the viewing angle. The observation eyepiece 23 is connected with the lifting rod 273 and rises and falls together with it; the lifting rod 273 is raised and lowered in the vertical direction under the control of the lifting motor 272. In use, the eyepiece motor 271 and the lifting motor 272 coordinate translation, rotation and lifting so that the observation eyepiece 23 reaches different observation positions and observes the light emitted by the display screen 16 along a simulated line of sight.
When the distortion data are first fitted, the fixing structure 14 is removed, the lens 12 to be tested is mounted at the lens mounting portion 18, and the fixing structure 14 is then mounted on the base 21. The eyepiece motor 271 is reset so that it reaches the initial position at one end of the eyepiece track 25; the preparations before detection are then complete. After the processing unit 4 receives the command to start detection, the eyepiece motor 271 and the lifting motor 272 drive the observation eyepiece 23 to the first observation point, and at the same time the processing unit 4 commands the display screen 16 to show the detection information: the display screen 16 displays a vertical line of light column by column, in units of pixel columns, from a first end of the display screen 16 to a second end. The first end and the second end are opposite to each other and can be designated as needed; usually, seen from the direction of the observation unit 2 towards the fixed test unit 1, the left end of the display screen 16 is taken as the first end and the right end as the second end. When the image unit 3 detects that the information displayed by the display screen 16 reaches, after distortion, the calibrated position of the observation unit 2, the image unit 3 transmits this information to the processing unit 4, and the processing unit 4 records the current position of the observation unit 2 together with the abscissa of the light on the display screen 16. The observation unit 2 then moves to the next observation point, the processing unit 4 commands the test unit 1 to show the detection information again, and the above detection process is repeated. The more observation points are set, the finer the lens measurement result and the easier the data fitting. After all observation points have been detected, the processing unit 4 collects all the correspondences and fits the distortion functions stored in the database according to the stored correspondences. If the processing unit 4 successfully fits one or several of the distortion functions, it records and stores the fitting result; if it cannot fit the measured correspondences to any distortion function in the database, it stores the correspondences in the form of a point-by-point function.
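The column-by-column scan just described can be pictured as a simple control loop; the sketch below is only a schematic of that procedure, with move_to, show_column and column_detected standing in for whatever motor, display and image-unit interfaces the real device exposes.

```python
def scan_observation_points(observation_points, columns, move_to, show_column, column_detected):
    """Schematic of the first-embodiment scan: at each observation point, light the
    display column by column from the first end to the second end and record which
    column the image unit sees at the calibrated position of the observation unit."""
    correspondences = []
    for point in observation_points:
        move_to(point)                       # eyepiece motor + lifting motor
        for x in columns:                    # pixel columns, first end to second end
            show_column(x)                   # display screen 16 shows one vertical line
            if column_detected():            # image unit 3 reports the calibration hit
                correspondences.append((point, x))
                break
    return correspondences

# Toy run with stub callables, purely to show the data it would collect
print(scan_observation_points([(0, 0), (5, 0)], range(0, 1920, 8),
                              lambda p: None, lambda x: None, lambda: True))
```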
Referring to Fig. 5, which shows the principle of the depth-of-field display method of the virtual reality helmet according to the present invention: when an observer forms a visual image, the left and right eyes must cooperate. In Fig. 5, the light emitted by the display screen 16 is refracted by the optical lenses and reaches the left and right eyes separately, so that the two eyes perceive an image at point A, while the corresponding light spots on the display screen 16 are A1 and A2 respectively; this is what creates the depth-of-field effect.
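As a rough numerical illustration of this geometry (the values are assumed, not taken from the patent): for an interpupillary distance b and a virtual point A straight ahead at distance d, each line of sight must converge inward by roughly arctan((b/2)/d), and the spots A1 and A2 are whatever screen positions the fitted distortion function maps to those two angles.

```python
import math

def convergence_angle(ipd_mm, depth_mm):
    """Depth-of-field relation of the kind used in S200: inward rotation (degrees)
    of each eye needed to fixate a point straight ahead at depth_mm."""
    return math.degrees(math.atan2(ipd_mm / 2.0, depth_mm))

# Assumed example: 63 mm interpupillary distance, virtual point 2 m away
print(convergence_angle(63.0, 2000.0))   # about 0.9 degrees per eye
```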
Referring to Fig. 6 and Fig. 7, Fig. 6 shows a second embodiment of the present invention, which is mainly used to configure the displayed depth of field of a virtual reality helmet. It includes a virtual reality helmet 13 to be placed and a fixing structure 14; the helmet 13 to be placed is detachably mounted on the fixing structure 14. The fixing structure 14 includes a clamping mechanism 142, a position-limiting mechanism 141 and a bottom plate 143. The clamping mechanism 142 includes a torsion spring (not shown); the clamping mechanism 142 can be opened, and after the helmet 13 to be placed has been inserted the torsion spring acts on the clamping mechanism 142 to close it, thereby fixing the helmet 13. The position-limiting mechanism 141 precisely limits the position of the helmet 13 to be placed, preventing it from sitting too far forward or backward and affecting the optimization results; the position-limiting mechanism 141 and the clamping mechanism 142 are fixed on the bottom plate 143. The observation unit 2 includes two sets of observation facilities, which observe the distorted images corresponding to the left eye and the right eye respectively. The observation unit 2 includes an observation eyepiece 23, an eyepiece track 25, a motor 27 and a light shield 29; the observation eyepiece 23 can rotate along the eyepiece track 25 driven by the motor 27 to change the viewing angle. In use, the motor 27 can rotate about a virtual left observation point 26 and a virtual right observation point 28, so that the observation eyepiece 23 reaches different observation positions and observes, along a simulated line of sight, the light emitted by the helmet 13 to be placed. Fig. 7 shows the light shield 29 as an example: a slit 291 passes through the light shield 29, about 1 mm in diameter and of a certain depth, to ensure a thin-ray imaging condition, so that the observation eyepiece 23 can accurately observe the light transmitted in the corresponding direction while light from other directions does not affect the observation result. The light shield 29 is detachably mounted on the observation eyepiece 23.
When the depth-of-field display is configured, a calculation-based setting can be used. For the calculation-based setting, the distortion parameters of the virtual reality helmet may first be measured before the depth-of-field display is set; using the fitted distortion function obtained by this measurement, the correspondence between the viewing angle of the observation unit 2 and the light spot on the display screen 16, i.e. the correspondence between a person's line of sight and the light spot on the display screen 16, is determined. The angles of the left and right lines of sight are then calculated from the depth-of-field data, and the light-spot positions on the display screen 16 corresponding to those angles are obtained from the distortion function. Repeating this process iteratively allows the display of all depth-of-field positions in the image to be displayed to be set effectively.
Compared with the prior art, the present invention calculates the viewing direction from the depth-of-field relation and uses the distortion data of the virtual reality helmet to calculate the corresponding light spot on the display screen, thereby achieving the purpose of setting the depth of field of the virtual reality helmet and providing a method for setting the depth of field of a virtual reality helmet. At the same time, the device for displaying the depth of field of the virtual reality helmet according to the present invention can also measure the distortion data of the helmet, so that the depth of field can be set even when no distortion data is available. The combination of the test unit 1, the observation unit 2, the image unit 3 and the processing unit 4 solves the problem of setting the depth of field simply and effectively. The observation unit 2 is driven by the motor 27 along the eyepiece track 25, so that observation can conveniently be carried out from multiple angles and multiple observation points can easily be set.
Embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the specific embodiments described, which are illustrative rather than restrictive. Under the teaching of the present invention, a person of ordinary skill in the art may devise many further forms without departing from the spirit of the invention and the scope of the claims, and all of these fall within the protection of the present invention.

Claims (10)

1. A method for displaying the depth of field of a virtual reality helmet, characterised in that it comprises the following steps:
S100: the distortion parameters of the virtual reality helmet to be tested are stored in a processing unit;
S200: the angular position of the corresponding line of sight is calculated according to the depth-of-field relation;
S300: the position of the light spot on the screen is calculated back from the distortion parameters of the virtual reality helmet to be placed and the angular position of the line of sight.
2. The method for displaying the depth of field of a virtual reality helmet according to claim 1, characterised in that it further comprises step S10: the distortion parameters of the virtual reality helmet are first measured before the display is configured.
3. The method for displaying the depth of field of a virtual reality helmet according to claim 2, characterised in that the measuring method for the virtual reality helmet to be tested comprises the following steps:
S11: an observation unit is moved to an observation point to observe the virtual reality helmet to be detected; a dot image is displayed point by point in the helmet to be detected, and an image unit processes the image observed by the observation unit;
S12: when the image unit detects that the light-spot image observed by the observation unit meets a preset condition, the image unit transmits detection information to the processing unit;
S13: after receiving the detection information transmitted by the image unit, the processing unit records the correspondence between the light-spot position and the position of the observation unit, and the observation unit is moved to the next observation point for observation;
S14: the processing unit fits the distortion functions in a database according to the recorded groups of correspondences between light-spot positions and observation-unit positions, and records the result of the fitting.
4. The method for displaying the depth of field of a virtual reality helmet according to claim 3, characterised in that the light emitted by the virtual reality helmet to be detected is refracted by an optical lens, and the observation unit observes the light emitted by the helmet to be detected by simulating the viewing angle of the human eye.
5. The method for displaying the depth of field of a virtual reality helmet according to claim 4, characterised in that it further comprises the following step:
S15: when the data fitting is unsuccessful, the processing unit stores the correspondences in the form of a point-by-point function.
6. A device for displaying the depth of field of a virtual reality helmet, which sets the depth of field using the method of claim 1, characterised in that it comprises a test unit, an observation unit, an image unit and a processing unit; the test unit includes a virtual reality helmet to be placed and a fixing structure; the virtual reality helmet to be placed includes a display screen; the fixing structure includes a clamping mechanism and a position-limiting mechanism, and the clamping mechanism can be opened so that the virtual reality helmet can be inserted.
7. The device for displaying the depth of field of a virtual reality helmet according to claim 6, characterised in that the clamping mechanism includes a torsion spring; after the clamping mechanism is opened, the torsion spring acts on the clamping mechanism to close it and fix the virtual reality helmet.
8. The device for displaying the depth of field of a virtual reality helmet according to claim 7, characterised in that the observation unit includes an observation eyepiece, an eyepiece track and a motor, and the observation eyepiece can move along the eyepiece track driven by the motor.
9. The device for displaying the depth of field of a virtual reality helmet according to claim 7, characterised in that the observation unit includes a movable plate, an observation eyepiece, a light-shielding plate, an eyepiece track and a motor; the observation eyepiece can move along the eyepiece track driven by the motor; the eyepiece track is mounted on the movable plate, and the movable plate can carry the observation eyepiece, the motor and the eyepiece track so that they move together.
10. The device for displaying the depth of field of a virtual reality helmet according to claim 9, characterised in that the light-shielding plate includes a light-transmitting hole.
CN201710543925.4A 2016-11-30 2017-07-05 Method and device for displaying the depth of field of a virtual reality helmet Pending CN107329263A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2016213083149 2016-11-30
CN201621308314 2016-11-30

Publications (1)

Publication Number Publication Date
CN107329263A true CN107329263A (en) 2017-11-07

Family

ID=60100336

Family Applications (35)

Application Number Title Priority Date Filing Date
CN201710543920.1A Pending CN108121068A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field laser sets the method and device of optimization display
CN201710544203.0A Pending CN107340595A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field region laser corresponding to scale
CN201710544213.4A Withdrawn CN107478412A (en) 2016-11-30 2017-07-05 Virtual implementing helmet distortion checking and the method and device of adjustment
CN201710543924.XA Pending CN107357037A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet laser assisted depth of field optimization
CN201710544195.XA Pending CN107329266A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region is set
CN201710544210.0A Pending CN107544151A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field zone approach and device corresponding to scale
CN201710544199.8A Pending CN107544150A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field laser corresponding to scale
CN201710544196.4A Pending CN107315251A (en) 2016-11-30 2017-07-05 Based on the corresponding virtual implementing helmet interpupillary distance of scale and depth of field method to set up and device
CN201710544200.7A Pending CN107479188A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization
CN201710543941.3A Pending CN107390364A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser is set
CN201710543925.4A Pending CN107329263A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is shown
CN201710543944.7A Pending CN107544147A (en) 2016-11-30 2017-07-05 The method and device that depth of field laser based on image scale is set
CN201710544198.3A Pending CN107544149A (en) 2016-11-30 2017-07-05 Region depth of field method to set up and device based on image scale
CN201710544211.5A Pending CN107300775A (en) 2016-11-30 2017-07-05 The depth of field based on image scale sets the method and device of optimization
CN201710544204.5A Withdrawn CN107464221A (en) 2016-11-30 2017-07-05 Based on the method and device of virtual reality eyeglass distortion checking and adjustment corresponding to scale
CN201710543942.8A Pending CN107329264A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with the depth of field
CN201710544212.XA Pending CN107300776A (en) 2016-11-30 2017-07-05 Interpupillary distance depth of field method to set up and device based on image scale
CN201710543923.5A Pending CN107688387A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet dispersion detection
CN201710544194.5A Pending CN107329265A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance optimizes with depth of field laser
CN201710544208.3A Pending CN107290854A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance optimizes the method and device of display with the depth of field
CN201710544202.6A Pending CN107402448A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with depth of field laser
CN201710543936.2A Pending CN107462991A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is set
CN201710543939.6A Pending CN107526167A (en) 2016-11-30 2017-07-05 The method and device optimized based on depth of field laser corresponding to scale
CN201710544197.9A Pending CN107505708A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field method to set up and device based on image scale
CN201710544192.6A Pending CN107544148A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser based on image scale is set
CN201710543937.7A Pending CN107490861A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization display
CN201710544189.4A Withdrawn CN107357039A (en) 2016-11-30 2017-07-05 Virtual reality eyeglass distortion checking and the method and device of adjustment
CN201710544201.1A Pending CN107291246A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field measurement based on image scale
CN201710543938.1A Pending CN107357038A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance and the method and device of depth of field adjustment
CN201710543918.4A Pending CN107687936A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual implementing helmet dispersion corresponding to scale
CN201710543922.0A Pending CN107462400A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual reality eyeglass dispersion corresponding to scale
CN201710543865.6A Pending CN107702894A (en) 2016-11-30 2017-07-05 The method and device of virtual reality eyeglass dispersion detection
CN201710543919.9A Pending CN107422479A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field method to set up and device corresponding to scale
CN201710544205.XA Pending CN107315252A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region laser is set
CN201710543921.6A Pending CN107300774A (en) 2016-11-30 2017-07-05 Method and device based on the corresponding virtual implementing helmet distortion checking of scale and adjustment

Family Applications Before (10)

Application Number Title Priority Date Filing Date
CN201710543920.1A Pending CN108121068A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field laser sets the method and device of optimization display
CN201710544203.0A Pending CN107340595A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field region laser corresponding to scale
CN201710544213.4A Withdrawn CN107478412A (en) 2016-11-30 2017-07-05 Virtual implementing helmet distortion checking and the method and device of adjustment
CN201710543924.XA Pending CN107357037A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet laser assisted depth of field optimization
CN201710544195.XA Pending CN107329266A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region is set
CN201710544210.0A Pending CN107544151A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field zone approach and device corresponding to scale
CN201710544199.8A Pending CN107544150A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field laser corresponding to scale
CN201710544196.4A Pending CN107315251A (en) 2016-11-30 2017-07-05 Based on the corresponding virtual implementing helmet interpupillary distance of scale and depth of field method to set up and device
CN201710544200.7A Pending CN107479188A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization
CN201710543941.3A Pending CN107390364A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser is set

Family Applications After (24)

Application Number Title Priority Date Filing Date
CN201710543944.7A Pending CN107544147A (en) 2016-11-30 2017-07-05 The method and device that depth of field laser based on image scale is set
CN201710544198.3A Pending CN107544149A (en) 2016-11-30 2017-07-05 Region depth of field method to set up and device based on image scale
CN201710544211.5A Pending CN107300775A (en) 2016-11-30 2017-07-05 The depth of field based on image scale sets the method and device of optimization
CN201710544204.5A Withdrawn CN107464221A (en) 2016-11-30 2017-07-05 Based on the method and device of virtual reality eyeglass distortion checking and adjustment corresponding to scale
CN201710543942.8A Pending CN107329264A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with the depth of field
CN201710544212.XA Pending CN107300776A (en) 2016-11-30 2017-07-05 Interpupillary distance depth of field method to set up and device based on image scale
CN201710543923.5A Pending CN107688387A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet dispersion detection
CN201710544194.5A Pending CN107329265A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance optimizes with depth of field laser
CN201710544208.3A Pending CN107290854A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance optimizes the method and device of display with the depth of field
CN201710544202.6A Pending CN107402448A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with depth of field laser
CN201710543936.2A Pending CN107462991A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is set
CN201710543939.6A Pending CN107526167A (en) 2016-11-30 2017-07-05 The method and device optimized based on depth of field laser corresponding to scale
CN201710544197.9A Pending CN107505708A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field method to set up and device based on image scale
CN201710544192.6A Pending CN107544148A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser based on image scale is set
CN201710543937.7A Pending CN107490861A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization display
CN201710544189.4A Withdrawn CN107357039A (en) 2016-11-30 2017-07-05 Virtual reality eyeglass distortion checking and the method and device of adjustment
CN201710544201.1A Pending CN107291246A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field measurement based on image scale
CN201710543938.1A Pending CN107357038A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance and the method and device of depth of field adjustment
CN201710543918.4A Pending CN107687936A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual implementing helmet dispersion corresponding to scale
CN201710543922.0A Pending CN107462400A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual reality eyeglass dispersion corresponding to scale
CN201710543865.6A Pending CN107702894A (en) 2016-11-30 2017-07-05 The method and device of virtual reality eyeglass dispersion detection
CN201710543919.9A Pending CN107422479A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field method to set up and device corresponding to scale
CN201710544205.XA Pending CN107315252A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region laser is set
CN201710543921.6A Pending CN107300774A (en) 2016-11-30 2017-07-05 Method and device based on the corresponding virtual implementing helmet distortion checking of scale and adjustment

Country Status (1)

Country Link
CN (35) CN108121068A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108510549A (en) * 2018-03-27 2018-09-07 京东方科技集团股份有限公司 Distortion parameter measurement method and its device, the measuring system of virtual reality device

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107977076B (en) * 2017-11-17 2018-11-27 国网山东省电力公司泰安供电公司 A kind of wearable virtual reality device
CN108008535A (en) * 2017-11-17 2018-05-08 国网山东省电力公司 A kind of augmented reality equipment
CN107942517B (en) * 2018-01-02 2020-03-06 京东方科技集团股份有限公司 VR head-mounted display device and display method thereof
CN108303798B (en) * 2018-01-15 2020-10-09 海信视像科技股份有限公司 Virtual reality helmet, virtual reality helmet interpupillary distance adjusting method and device
CN108426702B (en) * 2018-01-19 2020-06-02 华勤通讯技术有限公司 Dispersion measurement device and method of augmented reality equipment
CN108399606B (en) * 2018-02-02 2020-06-26 北京奇艺世纪科技有限公司 Image adjusting method and device
CN109186957B (en) * 2018-09-17 2024-05-10 浙江晶正光电科技有限公司 High-precision automatic detection equipment for diffusion angle of laser diffusion sheet
CN109557669B (en) * 2018-11-26 2021-10-12 歌尔光学科技有限公司 Method for determining image drift amount of head-mounted display equipment and head-mounted display equipment
US11513346B2 (en) * 2019-05-24 2022-11-29 Beijing Boe Optoelectronics Technology Co., Ltd. Method and apparatus for controlling virtual reality display device
CN110320009A (en) * 2019-06-25 2019-10-11 歌尔股份有限公司 Optical property detection method and detection device
CN113822104B (en) * 2020-07-07 2023-11-03 湖北亿立能科技股份有限公司 Artificial intelligence surface of water detecting system based on virtual scale of many candidates
CN113768240A (en) * 2021-08-30 2021-12-10 航宇救生装备有限公司 Method for adjusting imaging position of display protection helmet
CN114089508B (en) * 2022-01-19 2022-05-03 茂莱(南京)仪器有限公司 Wide-angle projection lens for detecting optical waveguide AR lens
DE102022207774A1 (en) 2022-07-28 2024-02-08 Robert Bosch Gesellschaft mit beschränkter Haftung Method for an automated calibration of a virtual retinal display for data glasses, calibration device and virtual retinal display
CN117214025B (en) * 2023-11-08 2024-01-12 广东德鑫体育产业有限公司 Helmet lens detection device

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5619373A (en) * 1995-06-07 1997-04-08 Hasbro, Inc. Optical system for a head mounted display
CN102967473B (en) * 2012-11-30 2015-04-29 奇瑞汽车股份有限公司 Driver front-view measuring device
US10228562B2 (en) * 2014-02-21 2019-03-12 Sony Interactive Entertainment Inc. Realtime lens aberration correction from eye tracking
CN104363986B (en) * 2014-10-31 2017-06-13 华为技术有限公司 A kind of image processing method and equipment
CN104808342B (en) * 2015-04-30 2017-12-12 杭州映墨科技有限公司 The optical lens structure of the wearable virtual implementing helmet of three-dimensional scenic is presented
US10271042B2 (en) * 2015-05-29 2019-04-23 Seeing Machines Limited Calibration of a head mounted eye tracking system
CN105979243A (en) * 2015-12-01 2016-09-28 乐视致新电子科技(天津)有限公司 Processing method and device for displaying stereo images
CN105979252A (en) * 2015-12-03 2016-09-28 乐视致新电子科技(天津)有限公司 Test method and device
CN105867606A (en) * 2015-12-15 2016-08-17 乐视致新电子科技(天津)有限公司 Image acquisition method and apparatus in virtual reality helmet, and virtual reality helmet
CN105869142A (en) * 2015-12-21 2016-08-17 乐视致新电子科技(天津)有限公司 Method and device for testing imaging distortion of virtual reality helmets
CN105787980B (en) * 2016-03-17 2018-12-25 北京牡丹视源电子有限责任公司 A kind of detection virtual reality shows the method and system of equipment field angle
CN106028013A (en) * 2016-04-28 2016-10-12 努比亚技术有限公司 Wearable device, display device, and display output adjusting method
CN105791789B (en) * 2016-04-28 2019-03-19 努比亚技术有限公司 The method of helmet, display equipment and adjust automatically display output
CN106441212B (en) * 2016-09-18 2020-07-28 京东方科技集团股份有限公司 Device and method for detecting field angle of optical instrument
CN106527733A (en) * 2016-11-30 2017-03-22 深圳市虚拟现实技术有限公司 Virtual-reality helmet distortion fitting-detecting method and device
CN106651954A (en) * 2016-12-27 2017-05-10 天津科技大学 Laser simulation method and device for space sight line benchmark

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108510549A (en) * 2018-03-27 2018-09-07 京东方科技集团股份有限公司 Distortion parameter measurement method and its device, the measuring system of virtual reality device
US10922795B2 (en) 2018-03-27 2021-02-16 Beijing Boe Optoelectronics Technology Co., Ltd. Method and device for measuring distortion parameter of visual reality device, and measuring system

Also Published As

Publication number Publication date
CN107390364A (en) 2017-11-24
CN107544149A (en) 2018-01-05
CN107315251A (en) 2017-11-03
CN107687936A (en) 2018-02-13
CN107526167A (en) 2017-12-29
CN107479188A (en) 2017-12-15
CN107422479A (en) 2017-12-01
CN107329264A (en) 2017-11-07
CN107315252A (en) 2017-11-03
CN107357038A (en) 2017-11-17
CN107340595A (en) 2017-11-10
CN107490861A (en) 2017-12-19
CN107544150A (en) 2018-01-05
CN107357039A (en) 2017-11-17
CN107329265A (en) 2017-11-07
CN107300776A (en) 2017-10-27
CN107329266A (en) 2017-11-07
CN107462991A (en) 2017-12-12
CN107300774A (en) 2017-10-27
CN107544147A (en) 2018-01-05
CN107464221A (en) 2017-12-12
CN107544151A (en) 2018-01-05
CN107291246A (en) 2017-10-24
CN107462400A (en) 2017-12-12
CN107290854A (en) 2017-10-24
CN107357037A (en) 2017-11-17
CN107544148A (en) 2018-01-05
CN107402448A (en) 2017-11-28
CN107478412A (en) 2017-12-15
CN107505708A (en) 2017-12-22
CN108121068A (en) 2018-06-05
CN107702894A (en) 2018-02-16
CN107300775A (en) 2017-10-27
CN107688387A (en) 2018-02-13

Similar Documents

Publication Publication Date Title
CN107329263A (en) The method and device that the virtual implementing helmet depth of field is shown
Rolland et al. Towards quantifying depth and size perception in virtual environments
US10277893B1 (en) Characterization of optical distortion in a head mounted display
CN108171673A (en) Image processing method, device, vehicle-mounted head-up-display system and vehicle
CN105828699B (en) For measuring the device and method of subjective dioptric
CN108989794B (en) Virtual image information measuring method and system based on head-up display system
CN106441822A (en) Virtual reality headset distortion detection method and device
CN106644404A (en) Virtual reality helmet distortion complete machine detection method and device
CN107888906A (en) The detecting system of crosstalk, the detection method of crosstalk, storage medium and processor
CN106644403A (en) Lens distortion detection method and apparatus
CN106527733A (en) Virtual-reality helmet distortion fitting-detecting method and device
CN106768878A (en) Optical mirror slip distortion fitting and the method and device for detecting
CN106445174A (en) Virtual reality helmet distortion verification method and device
CN206378270U (en) The device of virtual implementing helmet distortion complete machine detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20171107