CN107688387A - Method and device for virtual reality helmet dispersion detection - Google Patents

Method and device for virtual reality helmet dispersion detection

Info

Publication number
CN107688387A
CN107688387A (application CN201710543923.5A)
Authority
CN
China
Prior art keywords
observation
virtual reality helmet
unit
eyepiece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710543923.5A
Other languages
Chinese (zh)
Inventor
党少军
姜燕冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Virtual Reality Technology Co Ltd
Original Assignee
Shenzhen Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Virtual Reality Technology Co Ltd filed Critical Shenzhen Virtual Reality Technology Co Ltd
Publication of CN107688387A
Legal status: Pending (current)


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G01M11/0242 Testing optical properties by measuring geometrical properties or aberrations
    • G01M11/0257 Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0012 Optical design, e.g. procedures, algorithms, optimisation routines
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B27/0176 Head mounted characterised by mechanical features
    • G02B27/0149 Head-up displays characterised by mechanical features
    • G02B2027/011 Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • G02B2027/0127 Head-up displays characterised by optical features comprising devices increasing the depth of field
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02B2027/0141 Head-up displays characterised by optical features characterised by the informative content of the display
    • G02B2027/0161 Head-up displays characterised by mechanical features characterised by the relative positioning of the constitutive elements
    • G02B2027/0163 Electric or electronic control thereof
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/80
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery

Abstract

The present invention provides a method and device for detecting the chromatic dispersion of a virtual reality helmet, including a test unit, an observation unit, an image unit and a processing unit. The test unit includes the virtual reality helmet to be tested, a fixing structure and a display screen, and the image unit is electrically connected to the observation unit and the processing unit. Compared with the prior art, the present invention solves the problem of dispersion detection simply and effectively through the combination of the test unit, the observation unit, the image unit and the processing unit.

Description

Method and device for virtual reality helmet dispersion detection
Technical field
The present invention relates to the field of virtual reality, and more specifically to a method and device for detecting the chromatic dispersion of a virtual reality helmet.
Background technology
Distortion lenses are used in many fields. In a virtual reality system, for example, in order to give the user a visually realistic sense of immersion, the virtual reality device must cover as much of the visual range of the human eye as possible, so the device is fitted with a lens of a specific spherical curvature. However, when a conventional image is projected into the human eye through such a curved lens, the image is distorted and the eye has no way to establish its position in the virtual space; that is, everything around the viewer in virtual reality is a distorted image. To solve this problem, the image must first be pre-distorted: a distorted image corresponding to the distortion lens is generated by a specific algorithm, and after these distorted images are projected into the human eye through the distortion lens they become normal images, giving the viewer a true sense of position and a wide field of view. At present, lens manufacturers make lenses according to certain distortion parameters, and the manufacturers of virtual reality helmets assemble these lenses into their helmets. For ordinary users and software developers of virtual reality helmets, there is no instrument for measuring the distortion parameters of the lenses; apart from asking the lens manufacturer for them, the distortion parameters cannot be obtained directly, which greatly hinders the development and use of virtual reality software.
Content of the invention
To overcome the inability of current virtual reality devices to measure the monochromatic dispersion distortion parameters of a virtual reality helmet, the present invention provides a method and device for virtual reality helmet dispersion detection.
The technical solution adopted by the present invention to solve this technical problem is to provide a method for virtual reality helmet dispersion detection, comprising the following steps:
S1: Move the observation unit to an observation point to observe the virtual reality helmet under test, select one monochromatic light, and display a test image, color block by color block, in the virtual reality helmet under test; the image unit processes the image observed by the observation unit;
S2: When the image unit detects that the color block image observed by the observation unit meets a preset condition, the image unit transmits detection information to the processing unit;
S3: After the processing unit receives the detection information transmitted by the image unit, it records the correspondence between the color block position and the observation unit position, and the observation unit moves to the next observation point to continue observing;
S4: The processing unit fits the distortion functions in a database according to the recorded groups of correspondences between color block positions and observation unit positions, and records the fitting result.
Preferably, the observation unit observes the light emitted by the virtual reality helmet under test at an angle that simulates the viewing angle of the human eye.
Preferably, the method further comprises the following step:
S5: When the data fitting is unsuccessful, the processing unit stores the correspondence as a point-wise function.
Preferably, a bullseye is set at the center of the image captured by the observation unit; when the image unit detects that the color block image observed by the observation unit lies within the bullseye, the image unit transmits detection information to the processing unit.
Preferably, the virtual reality helmet under test displays monochromatic color blocks, block by block, along the horizontal center line of the display screen from the first end of the display screen to the second end.
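For illustration only, the following minimal Python sketch walks through the measurement loop of steps S1-S4 and the fitting of step S4. The driver objects (observation_unit, helmet, image_unit) and their methods are hypothetical placeholders for the motors, display and camera described above, and the polynomial model is just one plausible form a stored distortion function could take; this is a sketch under those assumptions, not the patented implementation.

```python
import numpy as np

def measure_correspondences(observation_points, screen_positions,
                            observation_unit, helmet, image_unit,
                            color="green"):
    """Collect (screen position, observation position) pairs for one monochromatic light."""
    records = []
    for obs_point in observation_points:                  # S1: move to an observation point
        observation_unit.move_to(obs_point)
        for screen_x in screen_positions:                 # step a color block along the center line
            helmet.show_color_block(screen_x, color)
            frame = observation_unit.capture()
            if image_unit.meets_preset_condition(frame):  # S2: block image meets the preset condition
                records.append((screen_x, obs_point))     # S3: record the correspondence
                break                                     # move on to the next observation point
    return records

def fit_distortion(records, degree=3):
    """S4: fit a simple polynomial distortion function screen_x -> observation_x."""
    screen_x, obs_x = np.array(records, dtype=float).T
    coeffs = np.polyfit(screen_x, obs_x, degree)
    max_error = float(np.max(np.abs(np.polyval(coeffs, screen_x) - obs_x)))
    return coeffs, max_error
```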
A device for virtual reality helmet dispersion detection is also provided, including a test unit, an observation unit, an image unit and a processing unit. The test unit includes the virtual reality helmet under test and a fixing structure; the image unit is electrically connected to the observation unit and the processing unit; the virtual reality helmet under test includes a display screen and an optical lens arranged opposite each other; the observation unit includes a light shield arranged between the observation unit and the fixing structure, and a light-transmitting hole is provided in the light shield.
Preferably, the fixing structure includes a clamping mechanism, a limiting mechanism and a base plate; the clamping mechanism can be opened and, after the virtual reality helmet under test is put in place, closed to fix the virtual reality helmet under test.
Preferably, the observation unit includes an observation eyepiece, an eyepiece track and a motor; the observation eyepiece can translate along the eyepiece track driven by the motor and can rotate under the drive of the motor to change the viewing angle.
Preferably, the observation unit includes a base, a moving plate, an observation eyepiece, an eyepiece track and a motor; the observation eyepiece can move along the eyepiece track driven by the motor; the eyepiece track is mounted on the moving plate, and the moving plate can carry the observation eyepiece, the motor and the eyepiece track together.
Compared with the prior art, the present invention uses the observation unit to observe, in a way that simulates human vision, the images produced by the virtual reality helmet under test, establishes a one-to-one correspondence between the position of a color block on the display screen of the helmet under test and the observation position of the observation eyepiece, and fits a monochromatic distortion function from that correspondence, thereby providing a method for detecting the dispersion of the virtual reality helmet under test. Because the observation unit observes the light emitted by the helmet under test at angles that simulate the viewing angle of the human eye, the direction of observation of the eye is reproduced more faithfully, the detected result is closer to the image the eye actually sees, and accuracy and adaptability are improved. Displaying color blocks one by one provides a way to establish the functional correspondence, so the screen coordinate corresponding to each observation point can be obtained easily. Setting a bullseye in the image captured by the observation unit increases the accuracy and efficiency of the detection. The combination of the test unit, observation unit, image unit and processing unit solves the problem of monochromatic optical distortion detection simply and effectively. It helps to prevent image deformation and secondary chromatic aberration caused by differences between the actual monochromatic distortion data of the helmet under test and the distortion data for natural light, saving considerable production cost. Providing a clamping mechanism on the fixing structure makes it easy to exchange the helmet under test and to reuse the device. Driving the observation unit along the eyepiece track by the eyepiece motor makes it convenient to observe from multiple angles and positions and to set multiple observation points. The moving plate allows the observation eyepiece to be carried along the moving-plate track, so that after one position has been measured the device can conveniently move on to the next position to be measured. Two sets of observation facilities can measure separately, which improves efficiency and accuracy.
Brief description of the drawings
The invention is further described below with reference to the accompanying drawings and embodiments, in which:
Fig. 1 is a module diagram of the first embodiment of the virtual reality helmet dispersion detection device of the present invention;
Fig. 2 is a module diagram of the test unit of the first embodiment;
Fig. 3 is a schematic diagram of the first embodiment of the virtual reality helmet dispersion detection device of the present invention;
Fig. 4 is a schematic side view of the first embodiment of the virtual reality helmet dispersion detection device of the present invention;
Fig. 5 is a module diagram of the second embodiment of the virtual reality helmet dispersion detection device of the present invention;
Fig. 6 is a module diagram of the test unit of the second embodiment;
Fig. 7 is a schematic diagram of the detailed structure of the display screen;
Fig. 8 is a schematic diagram of the chromatic dispersion principle of a virtual reality helmet;
Fig. 9 is a schematic diagram of the second embodiment of the virtual reality helmet dispersion detection device of the present invention;
Figure 10 is a schematic diagram of the third embodiment of the virtual reality helmet dispersion detection device of the present invention.
Embodiment
To overcome the inability of current virtual reality devices to measure the dispersion distortion parameters of a virtual reality helmet, the present invention provides a method and device for virtual reality helmet dispersion detection.
For a clearer understanding of the technical features, objects and effects of the present invention, embodiments of the present invention are now described in detail with reference to the accompanying drawings.
Referring to Fig. 1 and Fig. 2, the virtual reality lens distortion verification and adjustment device of the present invention includes a test unit 1, an observation unit 2, an image unit 3 and a processing unit 4. The test unit 1 includes a lens under test 12 and a fixing structure 14; the lens under test 12 is detachably mounted on the fixing structure 14. The image unit 3 is electrically connected to the observation unit 2, and the processing unit 4 is electrically connected to the image unit 3. The observation unit 2 observes the test unit 1 by capturing images: it photographs the test unit 1 and transmits the captured images to the image unit 3 for processing; the image unit 3 processes the images captured by the observation unit 2 and transfers the results to the processing unit 4, which processes the data transmitted by the image unit 3.
Figs. 3 and 4 show the first embodiment of the exemplary virtual reality lens distortion verification and adjustment device. A display screen 16 is fixed to the fixing structure 14, and a lens mounting portion 18 is provided on the fixing structure 14 for mounting the lens under test 12. The observation unit 2 includes an observation eyepiece 23, an eyepiece track 25, an eyepiece motor 271, a lifting motor 272 and a lifting rod 273. The observation eyepiece 23 can translate along the eyepiece track 25 driven by the eyepiece motor 271, and can rotate under the drive of the eyepiece motor 271 to change the viewing angle. The observation eyepiece 23 is connected to the lifting rod 273 and rises and falls with it; the lifting rod 273 is raised and lowered in the vertical direction under the control of the lifting motor 272. In use, the eyepiece motor 271 and the lifting motor 272 coordinate translation, rotation and lifting so that the observation eyepiece 23 reaches different observation positions and observes the light emitted by the display screen 16 along simulated gaze directions.
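As a hedged illustration of how such an observation pose could be computed, the short sketch below derives the rotation needed for an eyepiece at a given track position and lifting height to aim at the center of display screen 16. The coordinate convention, the screen-center location and the function name are assumptions made for this example only.

```python
import math

def eyepiece_pose(track_x, lift_z, screen_center=(0.0, 0.25, 0.0)):
    """Aim the eyepiece, located at (track_x, 0, lift_z), at the screen center.

    Returns (track_x, lift_z, yaw, pitch); yaw rotates about the vertical axis,
    pitch tilts the eyepiece up or down, both in radians.
    """
    sx, sy, sz = screen_center                      # screen center in the same frame as the track
    dx, dy, dz = sx - track_x, sy - 0.0, sz - lift_z
    yaw = math.atan2(dx, dy)                        # horizontal angle away from straight ahead
    pitch = math.atan2(dz, math.hypot(dx, dy))      # vertical angle toward the screen center
    return track_x, lift_z, yaw, pitch
```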
When initially fitting the distortion data, the fixing structure 14 is first removed, the lens under test 12 is mounted at the lens mounting portion 18, and the fixing structure 14 is then mounted on the base 21. The eyepiece motor 271 is reset so that it reaches the initial position at one end of the eyepiece track 25. Preparation before detection is now complete. After the processing unit 4 receives the command to start detection, the eyepiece motor 271 and the lifting motor 272 drive the observation eyepiece 23 to the first observation point; at the same time, the processing unit 4 commands the display screen 16 to show the detection pattern. First, the display screen 16 displays a vertical line of light column by column, in units of pixel columns, from its first end to its second end. The first end and the second end are opposite ends and may be designated as required; usually, viewed from the observation unit 2 towards the fixed test unit 1, the left end of the display screen 16 is the first end and the right end is the second end. When the image unit 3 detects that the displayed information of the display screen 16, after distortion, reaches the calibration position of the observation unit 2, the image unit 3 transmits this information to the processing unit 4, and the processing unit 4 records the current position of the observation unit 2 and the abscissa of the light line on the display screen 16. The observation unit 2 then moves to the next observation point, the processing unit 4 commands the test unit 1 to display the detection pattern again, and the above detection process is repeated. The more observation points are set, the finer the lens measurement result and the better the data fitting. After all observation points have been measured, the processing unit 4 collects all the correspondences and fits the distortion functions stored in the database according to the stored correspondences. If the processing unit 4 successfully fits one or several of the distortion functions, it records and stores the fitting result; if it cannot fit any distortion function in the database from the measured correspondences, it stores the correspondence as a point-wise function.
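For illustration, the sketch below shows one way the processing unit's fit-or-fallback logic could look: each candidate in a hypothetical database of stored distortion models is tried against the recorded correspondences, the best acceptable fit is kept, and if none fits the correspondences are stored as a point-wise function. The polynomial candidates and the tolerance are assumptions, not the models actually stored in the patented database.

```python
import numpy as np

def try_database_fit(records, candidate_degrees=(1, 3, 5), tolerance=1.0):
    """records: list of (screen_x, observation_x) pairs measured for one light."""
    screen_x, obs_x = np.array(records, dtype=float).T
    best = None
    for degree in candidate_degrees:                # each degree stands in for one stored model
        coeffs = np.polyfit(screen_x, obs_x, degree)
        err = float(np.max(np.abs(np.polyval(coeffs, screen_x) - obs_x)))
        if err <= tolerance and (best is None or err < best[1]):
            best = (coeffs, err)
    if best is not None:                            # a stored distortion function fits: record it
        return {"type": "distortion_function", "coeffs": best[0], "max_error": best[1]}
    # No stored model fits well enough: keep the raw correspondence as a point-wise function.
    return {"type": "point_function", "table": list(zip(screen_x.tolist(), obs_x.tolist()))}
```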
Because the three monochromatic lights, red, green and blue, are refracted at slightly different angles when passing through the lens under test 12, chromatic dispersion appears. After the lens has been installed in a virtual reality helmet, the distortion of each monochromatic light can be detected further. To perform monochromatic distortion detection on the virtual reality helmet, the device used in the first embodiment needs to be modified.
Referring to Figs. 5-8, the second embodiment of the virtual reality helmet dispersion detection device of the present invention includes a test unit 1, an observation unit 2, an image unit 3 and a processing unit 4. The test unit 1 includes a virtual reality helmet under test 13 and a fixing structure 14; the helmet under test 13 is detachably mounted on the fixing structure 14. The image unit 3 is electrically connected to the observation unit 2, and the processing unit 4 is electrically connected to the image unit 3. The observation unit 2 observes the test unit 1 by capturing images: it photographs the test unit 1 and transmits the captured images to the image unit 3 for processing; the image unit 3 processes the images captured by the observation unit 2 and transfers the results to the processing unit 4, which processes the data transmitted by the image unit 3 and fits the distortion function from the processed results. Since the vast majority of current virtual reality helmets use axisymmetric optical systems, and for an axisymmetric optical system the distortion parameters of the whole system can be calculated mathematically once the distortion parameters along its horizontal axis are known, it is sufficient to measure the distortion parameters along the horizontal center axis. The processing unit 4 is electrically connected to the test unit 1. In use, the processing unit 4 commands the display screen 16 to display monochromatic color blocks 161, pixel by pixel, along the center axis of the display screen 16 from its first end to its second end. The first end and the second end are opposite ends and may be designated as required; usually, with the helmet under test 13 worn, the end of the display screen 16 on the wearer's left is designated the first end and the end on the wearer's right the second end. When the image unit 3 detects that the displayed information of the helmet under test 13, after distortion, reaches the calibration position of the observation unit 2, the image unit 3 transmits this information to the processing unit 4, and the processing unit 4 stores the correspondence between the position of the color block 161 and the position of the observation unit 2. The observation unit 2 then moves to the next observation position and observes, and the image unit 3 delivers the correspondence for that point to the processing unit 4. From the multiple groups of observations, the processing unit 4 fits the distortion functions stored in the database according to the correspondences; if the fitting is unsuccessful, the correspondence is stored as a point-wise function. The calibration position of the observation unit 2 can be chosen as required; for convenience of measurement, it is usually placed at the center of the image captured by the observation unit 2, with a bullseye of a certain width set around this position. When the image of the color block 161 falls inside the bullseye, the displayed information of the helmet under test 13 is considered to have reached, after distortion, the calibration position of the observation unit 2.
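The bullseye test itself can be pictured with the following sketch, which treats the captured frame as an RGB array and accepts the color block once the centroid of the lit pixels falls within a fixed radius of the image center. The channel index, radius and intensity threshold are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np

def block_in_bullseye(frame, channel=0, radius_px=10, intensity_threshold=128):
    """frame: H x W x 3 uint8 image from the observation unit; channel selects R, G or B."""
    mask = frame[:, :, channel] >= intensity_threshold    # pixels lit by the selected monochromatic light
    if not mask.any():
        return False                                      # no color block visible in this frame
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                         # centroid of the color block image
    h, w = mask.shape
    return (cy - h / 2) ** 2 + (cx - w / 2) ** 2 <= radius_px ** 2
```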
Fig. 7 shows the structure of the display screen 16. The display screen 16 is made up of a large number of color blocks 161, each of which can display only one of the colors red, green and blue, and adjacent color blocks 161 generally display different colors. As shown in Fig. 8, because of the refraction of light, rays of different frequencies have different refractive indices, so the rays emitted by two adjacent color blocks 161 showing different colors may enter the human eye in noticeably different directions after refraction. Consequently, for the virtual reality helmet under test 13, it is not enough to provide only the conventional distortion parameters; the distortion parameters for each monochromatic light must also be provided, establishing the correspondence between a color block 161 and its display direction. Considering only the conventional distortion parameters easily produces chromatic aberration, whether or not it is visually perceptible. Therefore, not only the conventional distortion parameters but also the distortion parameters of each monochromatic light must be measured, to prevent display problems caused by dispersion.
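Once a distortion function has been fitted separately for red, green and blue, the dispersion at any screen position can be expressed as the spread between the per-channel mappings, which is essentially the lateral chromatic aberration. The sketch below assumes polynomial coefficients (for example from np.polyfit) for each channel; the example coefficients are made up for illustration.

```python
import numpy as np

def lateral_chromatic_aberration(per_channel_coeffs, screen_x):
    """per_channel_coeffs: {'r': coeffs, 'g': coeffs, 'b': coeffs}; screen_x: positions to evaluate."""
    mapped = {c: np.polyval(k, screen_x) for c, k in per_channel_coeffs.items()}
    spread = np.max(list(mapped.values()), axis=0) - np.min(list(mapped.values()), axis=0)
    return mapped, spread     # a non-zero spread means the three colors land in different directions

# Example with made-up, slightly different per-channel coefficients:
coeffs = {"r": [1.02, 0.0], "g": [1.00, 0.0], "b": [0.98, 0.0]}
_, fringe = lateral_chromatic_aberration(coeffs, np.linspace(-1.0, 1.0, 5))
```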
Fig. 9 shows the second embodiment of the exemplary virtual reality helmet distortion fitting and detection device. The virtual reality helmet under test 13 is detachably mounted on the fixing structure 14. The fixing structure 14 includes a clamping mechanism 142, a limiting mechanism 141 and a base plate 143. The clamping mechanism 142 includes a torsion spring (not shown) and can be opened; after the helmet under test 13 is put in place, the torsion spring acts on the clamping mechanism 142 to close it, fixing the helmet under test 13. The limiting mechanism 141 precisely limits the position of the helmet under test 13, preventing it from sitting too far forward or backward and affecting the measurement result; the limiting mechanism 141 and the clamping mechanism 142 are fixed to the base plate 143. The helmet under test 13 includes a display screen 16, which displays the relevant image information according to the instructions of the processing unit 4; the light emitted by the display screen 16 is refracted after passing through the optical lens. The observation unit 2 includes an observation eyepiece 23, an eyepiece track 25 and a motor 27; the observation eyepiece 23 can translate along the eyepiece track 25 driven by the motor 27 and can rotate under the drive of the motor 27 to change the viewing angle. In use, the motor 27 coordinates translation and rotation so that the observation eyepiece 23 reaches different observation positions and observes, along simulated gaze directions, the light emitted by the helmet under test 13. A light shield 28 is arranged between the fixing structure 14 and the observation unit 2, and a first light-transmitting hole 280 and a second light-transmitting hole 282 are provided in the light shield 28; the light shield 28 ensures that the light emitted by the display screen 16 enters the observation unit 2 through the first light-transmitting hole 280 or the second light-transmitting hole 282 in the manner of pinhole imaging, forming a clear image and preventing blurring.
Figure 10 shows the third embodiment of the exemplary virtual reality helmet distortion fitting and detection device. In the third embodiment, the test unit 1 is essentially the same as in the first embodiment. The virtual reality helmet under test 13 is detachably mounted on the fixing structure 14. The observation unit 2 includes a moving plate 22, an observation eyepiece 23, a moving-plate track 24, an eyepiece track 25 and a motor 27; the observation eyepiece 23 can move along the eyepiece track 25 driven by the motor 27 to change the viewing angle. A light shield 28 is arranged between the fixing structure 14 and the observation unit 2, with a first light-transmitting hole 280 and a second light-transmitting hole 282 provided in it; the light shield 28 ensures that the light emitted by the display screen 16 enters the observation unit 2 through the first or second light-transmitting hole in the manner of pinhole imaging, forming a clear image and preventing blurring. The eyepiece track 25 is mounted on the moving plate 22, and the moving plate 22 can carry the observation eyepiece 23, the motor 27 and the eyepiece track 25 together; the moving plate 22 can be fixed at the two observation positions corresponding to the first light-transmitting hole 280 and the second light-transmitting hole 282.
In use, the clamping mechanism 142 is first opened and the virtual reality helmet under test 13 is put in place. The motor 27 is reset so that it reaches the initial position at one end of the eyepiece track 25. Preparation before detection is now complete. After the processing unit 4 receives the command to start detection, one monochromatic light is detected first: the motor 27 drives the observation eyepiece 23 to the first observation point, and at the same time the processing unit 4 commands the helmet under test 13 to display color blocks 161 of the same color, one monochromatic color block 161 at a time, along the horizontal center line of the display screen 16 from its first end to its second end. When the image unit 3 detects that the displayed information of the helmet under test 13, after distortion, reaches the calibration position of the observation unit 2, the image unit 3 transmits this information to the processing unit 4, which records the current position of the observation unit 2 and the position of the color block 161 on the helmet under test 13, forming a correspondence and storing it. The observation unit 2 then moves to the next observation point and the above detection process is repeated. The more observation points are set, the finer the lens measurement result and the better the data fitting. After all observation points have been measured, the processing unit 4 collects all the correspondences and fits the distortion functions stored in the database according to the stored correspondences. If the processing unit 4 successfully fits one or several of the distortion functions, it records and stores the fitting result; if it cannot fit any distortion function in the database from the measured correspondences, it stores the correspondence as a point-wise function. After the detection of one of the red, green and blue monochromatic lights is completed, the other two monochromatic lights are detected one by one; the adjustment method for the other monochromatic lights is the same as described above. The above fitting method applies to the horizontal center axis of the virtual reality helmet under test 13 and is suitable for axisymmetric optical systems. Once the distortion fitting result for the horizontal center axis has been determined, for an axisymmetric optical system the position on the display screen 16 corresponding to every point in the observed image can easily be derived by mathematical calculation. This is existing, commonly used technology and is not described further here.
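As a hedged sketch of that last step, the function below extends a distortion function fitted along the horizontal center axis to an arbitrary point of the observed image by exploiting the assumed axial symmetry: the fitted mapping is applied to the radial distance from the optical axis while the polar angle is kept unchanged. The coefficient format follows the earlier polynomial sketches and is an assumption.

```python
import numpy as np

def map_image_point_to_screen(obs_x, obs_y, coeffs, center=(0.0, 0.0)):
    """Map a point of the observed image back to display-screen coordinates."""
    dx, dy = obs_x - center[0], obs_y - center[1]
    r_obs = np.hypot(dx, dy)                 # radial distance from the optical axis
    theta = np.arctan2(dy, dx)               # polar angle, unchanged by a radially symmetric system
    r_screen = np.polyval(coeffs, r_obs)     # apply the distortion fitted on the horizontal axis
    return center[0] + r_screen * np.cos(theta), center[1] + r_screen * np.sin(theta)
```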
Compared with the prior art, the present invention uses the observation unit 2 to observe, in a way that simulates human vision, the images produced by the virtual reality helmet under test 13, establishes a one-to-one correspondence between the position of a color block 161 on the display screen 16 of the helmet under test 13 and the observation position of the observation eyepiece 23, and fits a monochromatic distortion function from that correspondence, thereby providing a method for detecting the dispersion of the virtual reality helmet under test 13. Because the observation unit 2 observes the light emitted by the helmet under test 13 at angles that simulate the viewing angle of the human eye, the direction of observation of the eye is reproduced more faithfully, the detected result is closer to the image the eye actually sees, and accuracy and adaptability are improved. Displaying color blocks one by one provides a way to establish the functional correspondence, so the screen coordinate corresponding to each observation point can be obtained easily. Setting a bullseye in the image captured by the observation unit 2 increases the accuracy and efficiency of the detection. The combination of the test unit 1, observation unit 2, image unit 3 and processing unit 4 solves the problem of monochromatic optical distortion detection simply and effectively. It helps to prevent image deformation and secondary chromatic aberration caused by differences between the actual monochromatic distortion data of the helmet under test 13 and the distortion data for natural light, saving considerable production cost. Providing the clamping mechanism 142 on the fixing structure 14 makes it easy to exchange the helmet under test 13 and to reuse the device. Driving the observation unit 2 along the eyepiece track 25 by the motor 27 makes it convenient to observe from multiple angles and positions and to set multiple observation points. The moving plate 22 allows the observation eyepiece 23 to be carried along the moving-plate track 24, so that after one position has been measured the device can conveniently move on to the next position to be measured. Two sets of observation facilities 20 can measure separately, which improves efficiency and accuracy.
Embodiments of the invention have been described above with reference to the accompanying drawings, but the invention is not limited to the specific embodiments described. The embodiments above are merely illustrative, not restrictive; under the teaching of the present invention, a person of ordinary skill in the art can devise many further forms without departing from the concept of the invention and the scope of protection of the claims, and all of these fall within the protection of the present invention.

Claims (9)

  1. A method for virtual reality helmet dispersion detection, characterised in that it comprises the following steps:
    S1: Move the observation unit to an observation point to observe the virtual reality helmet under test, select one monochromatic light, and display a test image, color block by color block, in the virtual reality helmet under test; the image unit processes the image observed by the observation unit;
    S2: When the image unit detects that the color block image observed by the observation unit meets a preset condition, the image unit transmits detection information to the processing unit;
    S3: After the processing unit receives the detection information transmitted by the image unit, it records the correspondence between the color block position and the observation unit position, and the observation unit moves to the next observation point to continue observing;
    S4: The processing unit fits the distortion functions in a database according to the recorded groups of correspondences between color block positions and observation unit positions, and records the fitting result.
  2. The method for virtual reality helmet dispersion detection according to claim 1, characterised in that the observation unit observes the light emitted by the virtual reality helmet under test at an angle that simulates the viewing angle of the human eye.
  3. The method for virtual reality helmet dispersion detection according to claim 2, characterised in that it further comprises the following step:
    S5: When the data fitting is unsuccessful, the processing unit stores the correspondence as a point-wise function.
  4. The method for virtual reality helmet dispersion detection according to claim 3, characterised in that a bullseye is set at the center of the image captured by the observation unit; when the image unit detects that the color block image observed by the observation unit lies within the bullseye, the image unit transmits detection information to the processing unit.
  5. The method for virtual reality helmet dispersion detection according to claim 4, characterised in that the virtual reality helmet under test displays monochromatic color blocks, block by block, along the horizontal center line of the display screen from the first end of the display screen to the second end.
  6. A device for virtual reality helmet dispersion detection, characterised in that it includes a test unit, an observation unit, an image unit and a processing unit; the test unit includes the virtual reality helmet under test and a fixing structure; the image unit is electrically connected to the observation unit and the processing unit; the virtual reality helmet under test includes a display screen and an optical lens arranged opposite each other; the observation unit includes a light shield arranged between the observation unit and the fixing structure, and a light-transmitting hole is provided in the light shield.
  7. The device for virtual reality helmet dispersion detection according to claim 6, characterised in that the fixing structure includes a clamping mechanism, a limiting mechanism and a base plate; the clamping mechanism can be opened and, after the virtual reality helmet under test is put in place, closed to fix the virtual reality helmet under test.
  8. The device for virtual reality helmet dispersion detection according to claim 7, characterised in that the observation unit includes an observation eyepiece, an eyepiece track and a motor; the observation eyepiece can translate along the eyepiece track driven by the motor and can rotate under the drive of the motor to change the viewing angle.
  9. The device for virtual reality helmet dispersion detection according to claim 7, characterised in that the observation unit includes a base, a moving plate, an observation eyepiece, an eyepiece track and a motor; the observation eyepiece can move along the eyepiece track driven by the motor; the eyepiece track is mounted on the moving plate, and the moving plate can carry the observation eyepiece, the motor and the eyepiece track together.
CN201710543923.5A 2016-11-30 2017-07-05 Method and device for virtual reality helmet dispersion detection Pending CN107688387A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2016213083149 2016-11-30
CN201621308314 2016-11-30

Publications (1)

Publication Number Publication Date
CN107688387A true CN107688387A (en) 2018-02-13

Family

ID=60100336

Family Applications (35)

Application Number Title Priority Date Filing Date
CN201710544213.4A Withdrawn CN107478412A (en) 2016-11-30 2017-07-05 Virtual implementing helmet distortion checking and the method and device of adjustment
CN201710544203.0A Pending CN107340595A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field region laser corresponding to scale
CN201710543918.4A Pending CN107687936A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual implementing helmet dispersion corresponding to scale
CN201710543944.7A Pending CN107544147A (en) 2016-11-30 2017-07-05 The method and device that depth of field laser based on image scale is set
CN201710544197.9A Pending CN107505708A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field method to set up and device based on image scale
CN201710544211.5A Pending CN107300775A (en) 2016-11-30 2017-07-05 The depth of field based on image scale sets the method and device of optimization
CN201710544200.7A Pending CN107479188A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization
CN201710543919.9A Pending CN107422479A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field method to set up and device corresponding to scale
CN201710543942.8A Pending CN107329264A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with the depth of field
CN201710544192.6A Pending CN107544148A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser based on image scale is set
CN201710543937.7A Pending CN107490861A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization display
CN201710543925.4A Pending CN107329263A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is shown
CN201710544202.6A Pending CN107402448A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with depth of field laser
CN201710543865.6A Pending CN107702894A (en) 2016-11-30 2017-07-05 The method and device of virtual reality eyeglass dispersion detection
CN201710544201.1A Pending CN107291246A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field measurement based on image scale
CN201710544210.0A Pending CN107544151A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field zone approach and device corresponding to scale
CN201710543938.1A Pending CN107357038A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance and the method and device of depth of field adjustment
CN201710544205.XA Pending CN107315252A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region laser is set
CN201710543920.1A Pending CN108121068A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field laser sets the method and device of optimization display
CN201710543924.XA Pending CN107357037A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet laser assisted depth of field optimization
CN201710544208.3A Pending CN107290854A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance optimizes the method and device of display with the depth of field
CN201710543936.2A Pending CN107462991A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is set
CN201710543939.6A Pending CN107526167A (en) 2016-11-30 2017-07-05 The method and device optimized based on depth of field laser corresponding to scale
CN201710543923.5A Pending CN107688387A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet dispersion detection
CN201710544195.XA Pending CN107329266A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region is set
CN201710544204.5A Withdrawn CN107464221A (en) 2016-11-30 2017-07-05 Based on the method and device of virtual reality eyeglass distortion checking and adjustment corresponding to scale
CN201710544196.4A Pending CN107315251A (en) 2016-11-30 2017-07-05 Based on the corresponding virtual implementing helmet interpupillary distance of scale and depth of field method to set up and device
CN201710544212.XA Pending CN107300776A (en) 2016-11-30 2017-07-05 Interpupillary distance depth of field method to set up and device based on image scale
CN201710543921.6A Pending CN107300774A (en) 2016-11-30 2017-07-05 Method and device based on the corresponding virtual implementing helmet distortion checking of scale and adjustment
CN201710544198.3A Pending CN107544149A (en) 2016-11-30 2017-07-05 Region depth of field method to set up and device based on image scale
CN201710543922.0A Pending CN107462400A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual reality eyeglass dispersion corresponding to scale
CN201710544199.8A Pending CN107544150A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field laser corresponding to scale
CN201710543941.3A Pending CN107390364A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser is set
CN201710544194.5A Pending CN107329265A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance optimizes with depth of field laser
CN201710544189.4A Withdrawn CN107357039A (en) 2016-11-30 2017-07-05 Virtual reality eyeglass distortion checking and the method and device of adjustment

Family Applications Before (23)

Application Number Title Priority Date Filing Date
CN201710544213.4A Withdrawn CN107478412A (en) 2016-11-30 2017-07-05 Virtual implementing helmet distortion checking and the method and device of adjustment
CN201710544203.0A Pending CN107340595A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field region laser corresponding to scale
CN201710543918.4A Pending CN107687936A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual implementing helmet dispersion corresponding to scale
CN201710543944.7A Pending CN107544147A (en) 2016-11-30 2017-07-05 The method and device that depth of field laser based on image scale is set
CN201710544197.9A Pending CN107505708A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field method to set up and device based on image scale
CN201710544211.5A Pending CN107300775A (en) 2016-11-30 2017-07-05 The depth of field based on image scale sets the method and device of optimization
CN201710544200.7A Pending CN107479188A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization
CN201710543919.9A Pending CN107422479A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field method to set up and device corresponding to scale
CN201710543942.8A Pending CN107329264A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with the depth of field
CN201710544192.6A Pending CN107544148A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser based on image scale is set
CN201710543937.7A Pending CN107490861A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization display
CN201710543925.4A Pending CN107329263A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is shown
CN201710544202.6A Pending CN107402448A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with depth of field laser
CN201710543865.6A Pending CN107702894A (en) 2016-11-30 2017-07-05 The method and device of virtual reality eyeglass dispersion detection
CN201710544201.1A Pending CN107291246A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field measurement based on image scale
CN201710544210.0A Pending CN107544151A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field zone approach and device corresponding to scale
CN201710543938.1A Pending CN107357038A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance and the method and device of depth of field adjustment
CN201710544205.XA Pending CN107315252A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region laser is set
CN201710543920.1A Pending CN108121068A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field laser sets the method and device of optimization display
CN201710543924.XA Pending CN107357037A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet laser assisted depth of field optimization
CN201710544208.3A Pending CN107290854A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance optimizes the method and device of display with the depth of field
CN201710543936.2A Pending CN107462991A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is set
CN201710543939.6A Pending CN107526167A (en) 2016-11-30 2017-07-05 The method and device optimized based on depth of field laser corresponding to scale

Family Applications After (11)

Application Number Title Priority Date Filing Date
CN201710544195.XA Pending CN107329266A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region is set
CN201710544204.5A Withdrawn CN107464221A (en) 2016-11-30 2017-07-05 Based on the method and device of virtual reality eyeglass distortion checking and adjustment corresponding to scale
CN201710544196.4A Pending CN107315251A (en) 2016-11-30 2017-07-05 Based on the corresponding virtual implementing helmet interpupillary distance of scale and depth of field method to set up and device
CN201710544212.XA Pending CN107300776A (en) 2016-11-30 2017-07-05 Interpupillary distance depth of field method to set up and device based on image scale
CN201710543921.6A Pending CN107300774A (en) 2016-11-30 2017-07-05 Method and device based on the corresponding virtual implementing helmet distortion checking of scale and adjustment
CN201710544198.3A Pending CN107544149A (en) 2016-11-30 2017-07-05 Region depth of field method to set up and device based on image scale
CN201710543922.0A Pending CN107462400A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual reality eyeglass dispersion corresponding to scale
CN201710544199.8A Pending CN107544150A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field laser corresponding to scale
CN201710543941.3A Pending CN107390364A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser is set
CN201710544194.5A Pending CN107329265A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance optimizes with depth of field laser
CN201710544189.4A Withdrawn CN107357039A (en) 2016-11-30 2017-07-05 Virtual reality eyeglass distortion checking and the method and device of adjustment

Country Status (1)

Country Link
CN (35) CN107478412A (en)


Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107977076B (en) * 2017-11-17 2018-11-27 国网山东省电力公司泰安供电公司 A kind of wearable virtual reality device
CN108008535A (en) * 2017-11-17 2018-05-08 国网山东省电力公司 A kind of augmented reality equipment
CN107942517B (en) * 2018-01-02 2020-03-06 京东方科技集团股份有限公司 VR head-mounted display device and display method thereof
CN108303798B (en) * 2018-01-15 2020-10-09 海信视像科技股份有限公司 Virtual reality helmet, virtual reality helmet interpupillary distance adjusting method and device
CN108426702B (en) * 2018-01-19 2020-06-02 华勤通讯技术有限公司 Dispersion measurement device and method of augmented reality equipment
CN108399606B (en) * 2018-02-02 2020-06-26 北京奇艺世纪科技有限公司 Image adjusting method and device
CN108510549B (en) 2018-03-27 2022-01-04 京东方科技集团股份有限公司 Distortion parameter measuring method, device and system of virtual reality equipment
CN110320009A (en) * 2019-06-25 2019-10-11 歌尔股份有限公司 Optical property detection method and detection device
CN113822104B (en) * 2020-07-07 2023-11-03 湖北亿立能科技股份有限公司 Artificial intelligence surface of water detecting system based on virtual scale of many candidates
CN113768240A (en) * 2021-08-30 2021-12-10 航宇救生装备有限公司 Method for adjusting imaging position of display protection helmet
CN114089508B (en) * 2022-01-19 2022-05-03 茂莱(南京)仪器有限公司 Wide-angle projection lens for detecting optical waveguide AR lens
DE102022207774A1 (en) 2022-07-28 2024-02-08 Robert Bosch Gesellschaft mit beschränkter Haftung Method for an automated calibration of a virtual retinal display for data glasses, calibration device and virtual retinal display

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5619373A (en) * 1995-06-07 1997-04-08 Hasbro, Inc. Optical system for a head mounted display
CN102967473B (en) * 2012-11-30 2015-04-29 奇瑞汽车股份有限公司 Driver front-view measuring device
US10228562B2 (en) * 2014-02-21 2019-03-12 Sony Interactive Entertainment Inc. Realtime lens aberration correction from eye tracking
CN104363986B (en) * 2014-10-31 2017-06-13 华为技术有限公司 A kind of image processing method and equipment
CN104808342B (en) * 2015-04-30 2017-12-12 杭州映墨科技有限公司 The optical lens structure of the wearable virtual implementing helmet of three-dimensional scenic is presented
US10271042B2 (en) * 2015-05-29 2019-04-23 Seeing Machines Limited Calibration of a head mounted eye tracking system
CN105979243A (en) * 2015-12-01 2016-09-28 乐视致新电子科技(天津)有限公司 Processing method and device for displaying stereo images
CN105979252A (en) * 2015-12-03 2016-09-28 乐视致新电子科技(天津)有限公司 Test method and device
CN105867606A (en) * 2015-12-15 2016-08-17 乐视致新电子科技(天津)有限公司 Image acquisition method and apparatus in virtual reality helmet, and virtual reality helmet
CN105869142A (en) * 2015-12-21 2016-08-17 乐视致新电子科技(天津)有限公司 Method and device for testing imaging distortion of virtual reality helmets
CN105787980B (en) * 2016-03-17 2018-12-25 北京牡丹视源电子有限责任公司 A kind of detection virtual reality shows the method and system of equipment field angle
CN106028013A (en) * 2016-04-28 2016-10-12 努比亚技术有限公司 Wearable device, display device, and display output adjusting method
CN105791789B (en) * 2016-04-28 2019-03-19 努比亚技术有限公司 The method of helmet, display equipment and adjust automatically display output
CN106441212B (en) * 2016-09-18 2020-07-28 京东方科技集团股份有限公司 Device and method for detecting field angle of optical instrument
CN106527733A (en) * 2016-11-30 2017-03-22 深圳市虚拟现实技术有限公司 Virtual-reality helmet distortion fitting-detecting method and device
CN106651954A (en) * 2016-12-27 2017-05-10 天津科技大学 Laser simulation method and device for space sight line benchmark

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
《数码照相机可换镜头使用完全手册》 (Complete Manual for Using Interchangeable Lenses with Digital Cameras), 31 January 2015 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109557669A (en) * 2018-11-26 2019-04-02 歌尔股份有限公司 It wears the image drift method for determination of amount of display equipment and wears display equipment
CN109557669B (en) * 2018-11-26 2021-10-12 歌尔光学科技有限公司 Method for determining image drift amount of head-mounted display equipment and head-mounted display equipment
CN117214025A (en) * 2023-11-08 2023-12-12 广东德鑫体育产业有限公司 Helmet lens detection device
CN117214025B (en) * 2023-11-08 2024-01-12 广东德鑫体育产业有限公司 Helmet lens detection device

Also Published As

Publication number Publication date
CN107329266A (en) 2017-11-07
CN107357038A (en) 2017-11-17
CN107402448A (en) 2017-11-28
CN107505708A (en) 2017-12-22
CN107544151A (en) 2018-01-05
CN107544148A (en) 2018-01-05
CN107422479A (en) 2017-12-01
CN107329263A (en) 2017-11-07
CN107478412A (en) 2017-12-15
CN107462400A (en) 2017-12-12
CN107340595A (en) 2017-11-10
CN107329264A (en) 2017-11-07
CN107290854A (en) 2017-10-24
CN107357037A (en) 2017-11-17
CN107702894A (en) 2018-02-16
CN107526167A (en) 2017-12-29
CN108121068A (en) 2018-06-05
CN107490861A (en) 2017-12-19
CN107315251A (en) 2017-11-03
CN107291246A (en) 2017-10-24
CN107390364A (en) 2017-11-24
CN107357039A (en) 2017-11-17
CN107300776A (en) 2017-10-27
CN107544149A (en) 2018-01-05
CN107687936A (en) 2018-02-13
CN107479188A (en) 2017-12-15
CN107315252A (en) 2017-11-03
CN107300774A (en) 2017-10-27
CN107544150A (en) 2018-01-05
CN107329265A (en) 2017-11-07
CN107462991A (en) 2017-12-12
CN107464221A (en) 2017-12-12
CN107544147A (en) 2018-01-05
CN107300775A (en) 2017-10-27

Similar Documents

Publication Publication Date Title
CN107688387A (en) The method and device of virtual implementing helmet dispersion detection
CN106264441A (en) A kind of novel myopia degree tester and application process
CN107884159A (en) virtual image display device photoelectric measuring device
CN108403078A (en) A kind of eye eyesight check device
CN108324239A (en) Portable intelligent optometry unit
CN106441822A (en) Virtual reality headset distortion detection method and device
CN107018398B (en) A method of for the quantization calibration of light field Three-dimensional Display
CN106527733A (en) Virtual-reality helmet distortion fitting-detecting method and device
CN106768878A (en) Optical mirror slip distortion fitting and the method and device for detecting
CN106644403A (en) Lens distortion detection method and apparatus
CN206330725U (en) Device is verified in virtual reality optical distortion
CN106445174A (en) Virtual reality helmet distortion verification method and device
CN208988837U (en) A kind of eye eyesight check device
CN208851459U (en) Portable intelligent optometry unit
CN106073695A (en) A kind of Novel astigmatic number of degrees tester and application process

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180213