CN206594361U - A head-mounted virtual reality device - Google Patents

A head-mounted virtual reality device

Info

Publication number
CN206594361U
CN206594361U (application CN201720113906.3U)
Authority
CN
China
Prior art keywords
user
virtual reality
wear
module
reality device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201720113906.3U
Other languages
Chinese (zh)
Inventor
王铁存
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Goertek Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Technology Co Ltd
Priority to CN201720113906.3U
Application granted
Publication of CN206594361U
Legal status: Active

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model discloses a head-mounted virtual reality device comprising a displacement detection module, a judgment module and an alarm module. The displacement detection module is configured to detect the movement displacement of a user; the judgment module is configured to judge, according to the initial position of the user and the movement displacement, whether the user is within a safe range; and the alarm module is configured to issue an alarm to prompt the user when the judgment result of the judgment module is negative. In this way, by monitoring the movement displacement of the user in real time to ensure that the user remains within the safe range, the user's safety can be protected and collisions with surrounding objects that would cause personal injury can be prevented.

Description

A head-mounted virtual reality device
Technical field
The utility model relates to the technical field of virtual reality devices, and more particularly to a head-mounted virtual reality device capable of protecting user safety.
Background technology
Virtual reality (VR) technology is a key technology for an integrated multi-information space that combines the qualitative with the quantitative and perceptual understanding with rational knowledge. With the increase of network speeds, an Internet era based on virtual reality technology is quietly arriving and will greatly change the way people work and live. Its specific meaning is: a technology that comprehensively uses computer graphics systems together with various real-world and control interface devices to provide an immersive experience in an interactive three-dimensional environment generated on a computer.
The sense of immersion of a virtual reality device comes from isolation from the outside world, especially visual and auditory isolation, so that the brain is deceived and produces a virtual sense of immersion detached from the real world.
After a user puts on a head-mounted virtual reality device, the user is completely immersed in virtual reality and cannot see the real world at all, so collisions easily occur, which in turn cause personal injury.
Utility model content
An object of the utility model is to provide a head-mounted virtual reality device capable of protecting user safety.
According to a first aspect of the utility model, a head-mounted virtual reality device is provided, comprising a displacement detection module, a judgment module and an alarm module. The displacement detection module is configured to detect a movement displacement of a user; the judgment module is configured to judge, according to an initial position of the user and the movement displacement, whether the user is within a safe range; and the alarm module is configured to issue an alarm to prompt the user when the judgment result of the judgment module is negative.
Optionally, the displacement detection module comprises a panoramic camera and a first processing unit, and the first processing unit is configured to detect the movement displacement of the user according to panoramic images collected by the panoramic camera.
Optionally, the displacement detection module further comprises an inertial measurement unit, and the first processing unit is further configured to correct the movement displacement according to data measured by the inertial measurement unit.
Optionally, the first processing unit and the judgment module are provided by one processor chip.
Optionally, the inertial measurement unit comprises a three-axis acceleration sensor and a three-axis gyroscope sensor.
Optionally, the head-mounted virtual reality device further comprises an initial distance detection module and a safe range determination module; the initial distance detection module is configured to detect initial distances between the initial position and the objects in the environment, and the safe range determination module is configured to determine the safe range according to the initial distances.
Optionally, the initial distance detection module comprises a depth camera and a second processing unit, and the second processing unit is configured to determine the initial distances between the objects in the environment and the initial position according to grayscale images collected by the depth camera.
Optionally, the initial distance detection module further comprises an RGB camera, and the second processing unit is further configured to correct the initial distances according to RGB images collected by the RGB camera.
Optionally, the safe range determination module and the second processing unit are provided by one processor chip.
Optionally, the alarm module comprises a loudspeaker.
A beneficial effect of the utility model is that, by monitoring the movement displacement of the user in real time to ensure that the user remains within the safe range, the user's safety can be protected and collisions with surrounding objects that would cause personal injury can be prevented. Moreover, the utility model requires neither peripheral devices, such as a computer, nor real-time map building, so the amount of calculation can be effectively reduced while it is still accurately identified whether the current position is safe, increasing the flexibility of using the device.
Other features of the utility model and their advantages will become apparent from the following detailed description of exemplary embodiments of the utility model with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the utility model and, together with the description, serve to explain the principles of the utility model.
Fig. 1 is a block diagram of one implementation structure of a head-mounted virtual reality device according to the utility model;
Fig. 2 is a block diagram of another implementation structure of a head-mounted virtual reality device according to the utility model;
Fig. 3 is a flowchart of an embodiment of a safety protection method for a head-mounted virtual reality device according to the utility model.
Description of reference numerals:
U1 - displacement detection module; U11 - panoramic camera;
U12 - first processing unit; U13 - inertial measurement unit;
U2 - judgment module; U3 - alarm module;
U4 - initial distance detection module; U41 - depth camera;
U42 - second processing unit; U43 - RGB camera;
U5 - safe range determination module.
Detailed description of embodiments
Various exemplary embodiments of the utility model will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the utility model.
The following description of at least one exemplary embodiment is merely illustrative and in no way serves as a limitation on the utility model or on its application or use.
Techniques, methods and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but, where appropriate, such techniques, methods and apparatus should be regarded as part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely exemplary rather than limiting. Other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following figures; therefore, once an item is defined in one figure, it need not be discussed further in subsequent figures.
In order to solve the problem in the prior art that a user wearing a head-mounted virtual reality device cannot see the real world and therefore easily collides with objects and suffers personal injury, a head-mounted virtual reality device is provided. As shown in Fig. 1, it comprises a displacement detection module U1, a judgment module U2 and an alarm module U3. The displacement detection module U1 is configured to detect the movement displacement of the user; the judgment module U2 is configured to judge, according to the initial position of the user and the movement displacement, whether the user is within the safe range; and the alarm module U3 is configured to issue an alarm to prompt the user when the judgment result of the judgment module U2 is negative.
Specifically, the safe range may be a circular closed region centered on the initial position with a radius equal to a set value, where the set value is smaller than the distance between the initial position and the object nearest to it. The safe range may also be a closed region that contains the initial position but excludes all objects in the environment, where the minimum distance from each object to the boundary of the safe range is greater than a preset distance; the minimum distances from the objects to the boundary may be equal. It can therefore be determined from the initial position and the movement displacement whether the user has crossed the boundary of the safe range, that is, whether the user is within the safe range. If the user goes beyond the boundary of the safe range, there is a possibility of the user colliding with surrounding objects, and the user is alerted to ensure the user's safety.
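For illustration only, the circular variant of the safe range can be sketched as follows. This is not part of the claimed device; the function names, the two-dimensional floor-plane coordinates and the 0.8 margin factor are assumptions made for the sketch.

    import math

    def build_circular_safe_range(initial_position, object_positions, margin=0.8):
        # Radius is a set value smaller than the distance to the nearest object.
        nearest = min(math.dist(initial_position, p) for p in object_positions)
        return initial_position, margin * nearest   # (center, radius)

    def is_within_safe_range(center, radius, current_position):
        # The user is safe as long as the current position stays inside the circle.
        return math.dist(center, current_position) <= radius

With the nearest object 1 metre away and a margin of 0.8, this reproduces the 0.8 metre radius used in the example further below.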
Further, the displacement detection module U1 may send the movement displacement to the judgment module U2 in real time, so that whether the user is within the safe range is judged in real time; alternatively, the displacement detection module U1 may send the movement displacement to the judgment module U2 at a set interval, for example every 1 second, so that whether the user is within the safe range is judged at that interval.
On this basis, the judgment module U2 may update the initial position in real time according to the movement displacement. For example, a first position of the user can be obtained from the initial position and a first movement displacement; the first position is then used as the initial position when a second movement displacement is received, so that a second position of the user can be obtained from this initial position (i.e. the first position) and the second movement displacement; the second position is in turn used as the initial position when a third movement displacement is received, and so on. Here, the judgment module U2 receives the first movement displacement, the second movement displacement and the third movement displacement in sequence, and the three movement displacements are consecutive.
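The iterative update of the initial position described above might be organised as in the following sketch; this is one illustrative assumption about how the judgment module could be implemented, not the only possibility.

    import math

    class JudgeModule:
        def __init__(self, initial_position, center, radius):
            self.position = initial_position       # updated as displacements arrive
            self.center, self.radius = center, radius

        def on_displacement(self, dx, dy):
            # The previously computed position serves as the "initial position"
            # for the newly received movement displacement.
            self.position = (self.position[0] + dx, self.position[1] + dy)
            return math.dist(self.center, self.position) <= self.radius

Each call returns whether the user is still within the safe range, so the alarm module can be triggered when the result is negative.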
In this way, by monitoring the movement displacement of the user to ensure that the user remains within the safe range, the user's safety can be protected and collisions with surrounding objects that would cause personal injury can be prevented. Moreover, the utility model requires neither peripheral devices, such as a computer, nor real-time map building, so the amount of calculation can be effectively reduced while it is still accurately identified whether the current position is safe, increasing the flexibility of using the device.
In a specific embodiment of the utility model, as shown in Fig. 2, the displacement detection module U1 comprises a panoramic camera U11 and a first processing unit U12, and the first processing unit U12 is configured to detect the movement displacement of the user according to the panoramic images collected by the panoramic camera U11.
Specifically, the panoramic camera U11 can collect panoramic images containing all the objects around the user. The first processing unit U12 may, for example, calculate the user's movement displacement in real time from the position change of each object between every two consecutive panoramic frames collected by the panoramic camera U11; alternatively, the first processing unit U12 may calculate the user's movement displacement over an interval from the position change of each object between two panoramic frames collected by the panoramic camera U11 a set time apart.
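The patent does not prescribe a particular image-processing algorithm for this step. As a hedged sketch only, the displacement could for instance be estimated from the apparent shift of tracked feature points between two frames using OpenCV; the pixel-to-metre scale factor and the neglect of panoramic lens distortion are simplifying assumptions.

    import cv2
    import numpy as np

    def estimate_displacement(prev_frame, curr_frame, scale=1.0):
        # Track feature points of the surrounding objects from one frame to the next.
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=7)
        if pts is None:
            return (0.0, 0.0)                      # no trackable features found
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
        good_old = pts[status.flatten() == 1]
        good_new = nxt[status.flatten() == 1]
        # Surrounding objects appear to move opposite to the user, hence the negation.
        mean_shift = (good_new - good_old).reshape(-1, 2).mean(axis=0)
        return tuple(-scale * mean_shift)          # (dx, dy); scale maps pixels to metres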
In another specific embodiment of the utility model, the panoramic camera U11 may be replaced by a plurality of ordinary cameras arranged in the same horizontal plane whose fields of view can be stitched together to cover 360 degrees, so that the images collected by the ordinary cameras can be stitched into a panoramic image.
Further, the displacement detection module U1 also comprises an inertial measurement unit U13, and the first processing unit U12 is further configured to correct the movement displacement according to data measured by the inertial measurement unit.
On this basis, the inertial measurement unit U13 may comprise a three-axis acceleration sensor and a three-axis gyroscope sensor. The three-axis acceleration sensor can collect the user's spatial acceleration data and the three-axis gyroscope sensor can collect the user's spatial angular velocity data, from which the user's movement displacement can also be calculated by an inertial navigation algorithm. Correcting the movement displacement obtained from the panoramic images with the three-axis acceleration sensor and the three-axis gyroscope sensor makes the measured movement displacement more accurate.
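A minimal sketch of this correction, assuming a simple complementary-filter style blend of the camera-based and IMU-based estimates (the patent does not specify the fusion or inertial-navigation algorithm actually used, and the weight alpha is an assumption):

    def imu_displacement(accel_samples, dt):
        # Rough inertial-navigation step: double-integrate the spatial acceleration
        # over a short window (gravity assumed to have been removed already).
        vx = vy = x = y = 0.0
        for ax, ay, _az in accel_samples:
            vx += ax * dt
            vy += ay * dt
            x += vx * dt
            y += vy * dt
        return (x, y)

    def fuse_displacement(camera_disp, imu_disp, alpha=0.8):
        # Weight the camera estimate by alpha and correct it with the IMU estimate.
        return tuple(alpha * c + (1.0 - alpha) * i
                     for c, i in zip(camera_disp, imu_disp))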
According to Fig. 2, the head-mounted virtual reality device may also comprise an initial distance detection module U4 and a safe range determination module U5. The initial distance detection module U4 is configured to detect the initial distances between the user's initial position and the objects in the environment; the safe range determination module U5 is configured to determine the safe range according to the initial distances.
Here, the initial position specifically means the position of the user before the user's movement displacement is detected; the objects in the environment specifically mean the objects in the user's actual environment, in front of, behind, beside or above the user, that the user may touch while moving.
For example, when the initial distance detection module U4 detects that the distance between the user's initial position and the nearest object in the environment is 1 metre, the safe range may be determined as a circular region centered on the user's initial position with a radius of 0.8 metres.
For another example, suppose the detected distance between a first object and the initial position is A, the distance between a second object and the initial position is B, the distance between a third object and the initial position is C, the distance between a fourth object and the initial position is D, the distance between a fifth object and the initial position is E, the distance between a sixth object and the initial position is F, and the initial position is surrounded by the first to sixth objects. Six calibration points can then be set, the n-th calibration point lying on the line connecting the n-th object and the initial position, on the side close to the initial position. The distance between each calibration point and its corresponding object (the first calibration point and the first object, the second calibration point and the second object, and so on up to the sixth calibration point and the sixth object) is a set distance, for example 0.1 metre. The closed region formed by connecting adjacent calibration points is the safe range.
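As a purely illustrative sketch of this construction (the names, the planar coordinates and the sorting-by-bearing step are assumptions; the utility model itself only requires that adjacent calibration points be connected into a closed region):

    import math

    def calibration_points(initial_position, object_positions, setpoint=0.1):
        # Place one calibration point on the segment between each object and the
        # initial position, `setpoint` metres away from the object.
        x0, y0 = initial_position
        points = []
        for ox, oy in object_positions:
            d = math.dist(initial_position, (ox, oy))
            t = setpoint / d                       # fraction of the way back toward the user
            points.append((ox + (x0 - ox) * t, oy + (y0 - oy) * t))
        # Order the points around the initial position so neighbours are adjacent;
        # connecting them in this order yields the closed safe-range polygon.
        points.sort(key=lambda p: math.atan2(p[1] - y0, p[0] - x0))
        return points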
In a specific embodiment of the utility model, the initial distance detection module U4 may comprise a depth camera U41 and a second processing unit U42, and the second processing unit U42 is configured to determine the initial distances between the objects in the environment and the user according to the grayscale images collected by the depth camera U41.
Specifically, when the user is at the initial position, the user may turn a full circle about the initial position, so that the depth camera U41 collects images of all the objects in the environment.
Because the depth camera collects depth images, the grey value of a pixel in a depth image is related only to the distance from the viewing-window plane to the object surface. Depth images therefore, first, have spatial colour independence and are not affected by factors such as illumination and shadow; second, combining the grey value of the depth image with the horizontal and vertical coordinates of the image can, within a certain spatial range, represent the coordinates of an object in 3D space, so occlusion and overlap problems in 3D pattern recognition can be overcome. More importantly, the imaging principle of the depth camera ensures good robustness of camera calibration, adapts to various environmental changes, and allows easy self-adjustment and re-calibration without measuring a calibration target.
Therefore, from the grey value of each pixel in the current grayscale image, the initial distance between each object and the user can be calculated. For example, when the distances between different parts of an object and the user's initial position differ, the minimum distance between the object and the initial position may be taken as the initial distance between that object and the initial position.
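As an illustrative sketch only, taking the minimum per-object depth could look as follows; the object mask and the grey-value-to-metre scale factor are assumptions, since the patent does not specify how objects are segmented or how grey values are scaled.

    import numpy as np

    def object_initial_distance(depth_image, object_mask, depth_scale=0.001):
        # Grey values encode distance from the sensor plane; convert the object's
        # pixels to metres and take the minimum as the initial distance.
        depths = depth_image[object_mask > 0].astype(np.float32) * depth_scale
        depths = depths[depths > 0]            # ignore invalid (zero) readings
        return float(depths.min()) if depths.size else None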
Further, the initial distance detection module U4 also comprises an RGB camera U43, and the second processing unit U42 is further configured to correct the initial distances according to the RGB images collected by the RGB camera.
The depth camera U41 is used to measure the depth and contour information of objects within the visual range, and the RGB camera is used to obtain the texture information of object surfaces. By processing the images collected by the two cameras in combination, the second processing unit U42 can judge the distance and shape of objects well, so that the obtained initial distances are more accurate.
The initial distance detection module U4 may be provided by a distance sensor. In another specific embodiment of the utility model, the initial distance detection module U4 is provided by an ordinary camera and a third processing unit, and the third processing unit is configured to calculate the user's movement displacement according to the position change of each object between every two consecutive frames of images collected by the ordinary camera. Specifically, the judgment module U2, the first processing unit U12, the safe range determination module U5 and the second processing unit U42 described above may be provided by one processor chip, which may for example be a CPU chip or a microprocessor (MCU) chip.
Further, the alarm module U3 may comprise a loudspeaker: when the judgment module U2 judges that the user goes beyond the safe range, the loudspeaker can be controlled to sound, for example a repeated beep. The alarm module U3 may also comprise a display screen: when the judgment module U2 judges that the user goes beyond the safe range, the display screen can be controlled to show a prompt such as "beyond the safe zone". The alarm module U3 may also be a motor: when the judgment module U2 judges that the user goes beyond the safe range, the motor can be controlled to vibrate.
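A minimal dispatch sketch for these three alarm options; the speaker, display and motor objects and their methods are hypothetical driver interfaces used only for illustration.

    def raise_alarm(speaker=None, display=None, motor=None):
        # Use whichever output devices the headset actually provides.
        if speaker is not None:
            speaker.play_tone()                        # e.g. a repeated "beep"
        if display is not None:
            display.show_text("Beyond the safe zone")  # on-screen prompt
        if motor is not None:
            motor.vibrate()                            # haptic warning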
Fig. 3 is a flowchart of an embodiment of the safety protection method for a head-mounted virtual reality device according to the utility model.
According to Fig. 3, the safety protection method comprises the following steps:
Step S301, detecting the movement displacement of the user.
Further, the method for detecting the movement displacement of the user may specifically be:
collecting two frames of panoramic images of the user's environment;
calculating the movement displacement of the user according to the position change of each object between the two panoramic frames.
Step S302, judging, according to the movement displacement and the initial position of the user, whether the user goes beyond the safe range; if so, executing step S303; if not, continuing to execute step S301.
Step S303, issuing an alarm to remind the user (an illustrative sketch of the overall procedure is given after this list).
In a specific embodiment of the utility model, the safety protection method also includes:
detecting the initial distances between the initial position and the objects in the environment;
determining the safe range according to the initial distances.
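Putting the steps together, one illustrative way to organise the method as a loop is sketched below; the callables are placeholders for the modules described above, not interfaces defined by the utility model.

    def safety_protection_loop(detect_initial_distances, determine_safe_range,
                               detect_displacement, within_safe_range, alarm):
        distances = detect_initial_distances()        # optional preparatory step
        safe_range = determine_safe_range(distances)
        position = (0.0, 0.0)                         # take the initial position as the origin
        while True:
            dx, dy = detect_displacement()            # step S301
            position = (position[0] + dx, position[1] + dy)
            if not within_safe_range(safe_range, position):   # step S302
                alarm()                               # step S303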
The above embodiments are described with emphasis on their differences from the other embodiments; for the same or similar parts among the embodiments, reference may be made to one another.
Although some specific embodiments of the utility model have been described in detail by way of example, those skilled in the art should understand that the above examples are only for illustration and are not intended to limit the scope of the utility model. Those skilled in the art should understand that the above embodiments may be modified without departing from the scope and spirit of the utility model. The scope of the utility model is defined by the appended claims.

Claims (10)

1. A head-mounted virtual reality device, characterised by comprising a displacement detection module, a judgment module and an alarm module, wherein the displacement detection module is configured to detect a movement displacement of a user; the judgment module is configured to judge, according to an initial position of the user and the movement displacement, whether the user is within a safe range; and the alarm module is configured to issue an alarm to prompt the user when the judgment result of the judgment module is negative.
2. The head-mounted virtual reality device according to claim 1, characterised in that the displacement detection module comprises a panoramic camera and a first processing unit, and the first processing unit is configured to detect the movement displacement of the user according to panoramic images collected by the panoramic camera.
3. The head-mounted virtual reality device according to claim 2, characterised in that the displacement detection module further comprises an inertial measurement unit, and the first processing unit is further configured to correct the movement displacement according to data measured by the inertial measurement unit.
4. The head-mounted virtual reality device according to claim 2, characterised in that the first processing unit and the judgment module are provided by one processor chip.
5. The head-mounted virtual reality device according to claim 3, characterised in that the inertial measurement unit comprises a three-axis acceleration sensor and a three-axis gyroscope sensor.
6. The head-mounted virtual reality device according to claim 1, characterised in that the head-mounted virtual reality device further comprises an initial distance detection module and a safe range determination module; the initial distance detection module is configured to detect initial distances between the initial position and objects in the environment; and the safe range determination module is configured to determine the safe range according to the initial distances.
7. The head-mounted virtual reality device according to claim 6, characterised in that the initial distance detection module comprises a depth camera and a second processing unit, and the second processing unit is configured to determine the initial distances between the objects in the environment and the initial position according to grayscale images collected by the depth camera.
8. The head-mounted virtual reality device according to claim 7, characterised in that the initial distance detection module further comprises an RGB camera, and the second processing unit is further configured to correct the initial distances according to RGB images collected by the RGB camera.
9. The head-mounted virtual reality device according to claim 7, characterised in that the safe range determination module and the second processing unit are provided by one processor chip.
10. The head-mounted virtual reality device according to claim 1, characterised in that the alarm module comprises a loudspeaker.
CN201720113906.3U 2017-02-06 2017-02-06 A head-mounted virtual reality device Active CN206594361U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720113906.3U CN206594361U (en) 2017-02-06 2017-02-06 A head-mounted virtual reality device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201720113906.3U CN206594361U (en) 2017-02-06 2017-02-06 A head-mounted virtual reality device

Publications (1)

Publication Number Publication Date
CN206594361U true CN206594361U (en) 2017-10-27

Family

ID=60128843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720113906.3U Active CN206594361U (en) 2017-02-06 2017-02-06 A head-mounted virtual reality device

Country Status (1)

Country Link
CN (1) CN206594361U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113920688A (en) * 2021-11-24 2022-01-11 青岛歌尔声学科技有限公司 Collision early warning method and device, VR head-mounted equipment and storage medium
WO2023092641A1 (en) * 2021-11-24 2023-06-01 歌尔股份有限公司 Collision early warning method and apparatus, and head-mounted vr device, and storage medium

Similar Documents

Publication Publication Date Title
CN106919254A (en) A kind of wear-type virtual reality device and method for security protection
CN107836012B (en) Projection image generation method and device, and mapping method between image pixel and depth value
CN109255813B (en) Man-machine cooperation oriented hand-held object pose real-time detection method
CN102448681B (en) Operating space presentation device, operating space presentation method, and program
US9696859B1 (en) Detecting tap-based user input on a mobile device based on motion sensor data
KR102209873B1 (en) Perception based predictive tracking for head mounted displays
CN106295581A (en) Obstacle detection method, device and virtual reality device
EP3662350A1 (en) Transitioning into a vr environment and warning hmd users of real-world physical obstacles
TW201911133A (en) Controller tracking for multiple degrees of freedom
WO2015123774A1 (en) System and method for augmented reality and virtual reality applications
US9547412B1 (en) User interface configuration to avoid undesired movement effects
EP3644826A1 (en) A wearable eye tracking system with slippage detection and correction
CN107485100A (en) A kind of intelligent helmet and its rescue air navigation aid for being rescued in building
WO2019171557A1 (en) Image display system
CN108789500B (en) Human-machine safety protection system and safety protection method
CN109813317A (en) A kind of barrier-avoiding method, electronic equipment and virtual reality device
CN109164802A (en) A kind of robot maze traveling method, device and robot
CN107346174A (en) A kind of exchange method and system of actual environment and virtual environment
CN206594361U (en) A kind of wear-type virtual reality device
CN108629842A (en) A kind of unmanned equipment moving information provides and motion control method and equipment
CN105701963A (en) Hazard warning method and mobile device with application of hazard warning method
KR101141686B1 (en) rotation angle estimation apparatus and method for rotation angle estimation thereof
KR101438514B1 (en) Robot localization detecting system using a multi-view image and method thereof
CN115514885A (en) Monocular and binocular fusion-based remote augmented reality follow-up perception system and method
CN107526566A (en) The display control method and device of a kind of mobile terminal

Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201022

Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Patentee after: GoerTek Optical Technology Co.,Ltd.

Address before: Room 308, Investment Service Center, North House Street, Laoshan District, Qingdao, Shandong, 266104

Patentee before: GOERTEK TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20221227

Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong

Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee before: GoerTek Optical Technology Co.,Ltd.
