CN114511655B - VR/AR somatosensory device with odor reproduction and odor forming system - Google Patents


Publication number
CN114511655B
CN114511655B (application CN202210141087.9A)
Authority
CN
China
Prior art keywords
odor
image
smell
eye
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210141087.9A
Other languages
Chinese (zh)
Other versions
CN114511655A (en)
Inventor
Lei Ming (雷鸣)
Liu Jianxi (刘建曦)
Wang Farong (王发容)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hengbida Electronic Technology Co ltd
Original Assignee
Shenzhen Hengbida Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hengbida Electronic Technology Co ltd filed Critical Shenzhen Hengbida Electronic Technology Co ltd
Priority to CN202210141087.9A priority Critical patent/CN114511655B/en
Publication of CN114511655A publication Critical patent/CN114511655A/en
Application granted granted Critical
Publication of CN114511655B publication Critical patent/CN114511655B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61L - METHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L9/00 - Disinfection, sterilisation or deodorisation of air
    • A61L9/14 - Disinfection, sterilisation or deodorisation of air using sprayed or atomised substances including air-liquid contact processes

Abstract

The invention provides a VR/AR somatosensory device with odor reproduction and an odor forming system, comprising a head display device, an image recognition module, an odor recognition module and an odor generator. The head display device displays perspective images on the left-eye and right-eye screens to form a three-dimensional animation. The image recognition module is wirelessly connected to the head display device via Bluetooth, performs image recognition on the three-dimensional animation in real time, judges whether an image requires virtual olfaction generation, and acquires the virtual olfactory image. The odor recognition module is electrically connected to the image recognition module, analyzes the virtual olfactory image qualitatively and quantitatively through an odor algorithm, and generates an odor formula. After receiving the odor formula, the odor generator produces the corresponding odor and emits it through an ultrasonic atomization device.

Description

VR/AR somatosensory device with odor reproduction and odor forming system
Technical Field
The invention relates to the technical field of VR/AR somatosensory devices, and in particular to a VR/AR somatosensory device with odor reproduction and an odor forming system.
Background
A VR/AR device constructs a virtual stereoscopic environment by occluding human vision, so that people can participate in the environment and experience the feeling of being personally on the scene. The technology of VR/AR devices is not yet fully mature, although it can construct a stereoscopic animation that allows virtual scene design from the visual aspect, as described in the article by Huang Xin-yuan, Sun Wei and Qi Dong-xu.
However, a virtual reality scene constructed by considering only the visual aspect is incomplete; multiple senses need to be considered together. Reproducing the odor of objects in the virtual scene when it is constructed through a VR/AR device, so as to build a more realistic virtual scene, is therefore a direction that needs to be explored.
Disclosure of Invention
The invention provides a VR/AR somatosensory device with odor reproduction and an odor forming system, which solve the problem that odors in a virtual scene cannot be reproduced when the scene is constructed.
The invention provides a VR/AR somatosensory device with odor reproduction, comprising a head display device, an image recognition module, an odor recognition module and an odor generator:
the head display device displays perspective images on the left-eye and right-eye screens to form a three-dimensional animation;
the image recognition module is wirelessly connected to the head display device via Bluetooth, performs image recognition on the three-dimensional animation in real time, judges whether an image requires virtual olfaction generation, and acquires the virtual olfactory image;
the odor recognition module is electrically connected to the image recognition module, analyzes the virtual olfactory image qualitatively and quantitatively through an odor algorithm, and generates an odor formula;
after receiving the odor formula, the odor generator produces the corresponding odor and emits it through an ultrasonic atomization device.
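The four-module flow above (recognize an image, build an odor formula, dispense it) can be sketched end to end; every name and data structure below is an illustrative assumption, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class OdorFormula:
    compound_ids: list   # odor compound numbers
    ratios: list         # mixing proportion of each compound
    amount: float        # total amount to dispense

def recognize_image(frame, olfactory_library):
    """Image recognition module: return the object name when the frame
    matches an entry of the virtual olfactory image library."""
    for name, signature in olfactory_library.items():
        if signature in frame:  # stand-in for real image matching
            return name
    return None

def build_formula(object_name, concentration, odor_library):
    """Odor recognition module: qualitative lookup plus quantitative scaling."""
    compound_ids, ratios = odor_library[object_name]
    return OdorFormula(compound_ids, ratios, amount=concentration * sum(ratios))

def emit(formula):
    """Odor generator: per-compound doses the atomizer would release."""
    return {cid: formula.amount * r
            for cid, r in zip(formula.compound_ids, formula.ratios)}
```

A usage run with an invented "rose" entry: recognizing the object, scaling its formula by the computed concentration, and reading off the per-compound doses.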
As an embodiment of the invention: the head display device includes: the device comprises an aspheric lens, a display screen, a position sensor, wireless connection equipment and a battery module;
the aspheric lenses visually superpose the perspective images to form a three-dimensional animation; the aspheric lenses comprise a left aspheric lens and a right aspheric lens;
the display screen projects the three-dimensional animation on the display screen through a micro-projection technology;
the position sensor is used for tracking the position of the head action of the user to acquire position information;
and the battery module is used for supplying power to the whole head display equipment.
As an embodiment of the invention: the head display device further comprises: eye tracker and wireless connection device:
the eye tracker comprises a camera, an eye parameter extraction device and eye tracking prejudgment equipment;
the camera tracks the eyes of the user in real time and sends the shot eye images to the eye parameter extraction device in real time;
the eye parameter extraction device is electrically connected with the camera, and eye movement parameters are extracted through the eye image; wherein the eye movement parameters include: pupil parameters, purkinje spot parameters;
the eye tracking prejudging equipment is electrically connected with the eye parameter extracting device and estimates the eye fixation point of the user in real time according to the eye movement parameters;
the wireless connection equipment is connected with the position sensor, the display screen and the eye tracker and is used for transmitting real-time information of the position sensor, the display screen and the eye tracker.
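The pupil and Purkinje-spot parameters extracted above are the inputs of the classic pupil/corneal-reflection gaze method; a minimal sketch, assuming a pre-computed linear calibration (the gain and offset values are placeholders, not taken from the patent):

```python
def gaze_point(pupil_center, purkinje_center, gain=(1.0, 1.0), offset=(0.0, 0.0)):
    """Estimate the fixation point from the pupil-to-glint vector.

    The eye tracking prejudgment device maps this vector to screen
    coordinates through calibration coefficients (gain, offset)."""
    dx = pupil_center[0] - purkinje_center[0]
    dy = pupil_center[1] - purkinje_center[1]
    return (gain[0] * dx + offset[0], gain[1] * dy + offset[1])
```

In practice the gain and offset would be fitted per user during a calibration phase in which known screen points are fixated.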
As an embodiment of the invention: the image recognition module includes: image discriminator, image analyzer:
the image discriminator synchronously receives the three-dimensional animation, compares its images against a virtual olfactory image library in real time, and divides them into virtual olfactory images and non-virtual olfactory images;
the image analyzer calculates the odor concentration of the virtual image from the image depth of the virtual olfactory image.
As an embodiment of the invention: characterized in that said odor recognition module comprises:
an odor confirmation unit, used for determining the name of the object in the image from the virtual olfactory image and retrieving the formula code of that object name from an odor library;
an odor formula composition unit, used for obtaining the odor compounds that compose the gas and the proportion of each compound according to the formula code, obtaining the amount of each compound according to the odor concentration of the virtual image, and generating the odor formula; the odor formula comprises the odor compound numbers, the compound proportions and the amounts of compound used.
As an embodiment of the present invention: the smell confirmation unit includes:
an odor code determination subunit, used for identifying the object name of the virtual olfactory image and acquiring the formula code of that object name in the odor library;
an odor library, used for storing the odor formulas of all objects, encoded and stored by odor name.
As an embodiment of the invention: the scent generator includes: the device comprises a smell adjusting device, a smell box and an ultrasonic atomization device;
the odor regulating device is used for receiving an odor formula and driving the odor box to perform odor mixing according to the odor formula;
the odor box is electrically connected with the odor regulating device and used for storing odor compounds and mixing the odor compounds according to the odor formula to generate odor;
the ultrasonic atomization device is communicated with the smell box, and the smell is uniformly dispersed through the ultrasonic atomization device.
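The "mixing according to the odor formula" step performed by the regulating device and the odor box amounts to splitting a total dose over the compounds in proportion; the function signature is an assumption for illustration:

```python
def mix(compounds, ratios, amount):
    """Split the total dispense amount over the compounds by their ratios.

    compounds: compound identifiers stored in the odor box
    ratios:    mixing proportions from the odor formula
    amount:    total quantity to hand to the atomizer
    """
    total = sum(ratios)
    return {c: amount * r / total for c, r in zip(compounds, ratios)}
```

Normalizing by the ratio sum means the ratios need not already sum to one.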
As an embodiment of the invention: the ultrasonic atomization device comprises: nose tracker, nebulizer and odor sprayer:
the nose tracker is used for identifying and tracking the nose position of the user;
the atomizer generates acoustic waves through a surface acoustic wave device, and these waves atomize the odor;
the smell sprayer is respectively connected with the nose tracker and the atomizer and used for spraying atomized smell to the nose position of a user.
A scent-forming system with scent reproduction comprising:
an odor regulating module: the odor box is used for receiving odor formulas and driving the odor box to perform odor mixing according to the odor formulas; wherein the odor formulation comprises: odorous compounds, proportion of odorous compounds, amount of odorous compounds used;
a smell box module: the odor compound storage device is used for storing odor compounds and mixing the odor compounds according to the odor formula to generate odor;
ultrasonic atomization module: used for uniformly dispersing the smell by an ultrasonic atomization device.
As an embodiment of the present invention: the ultrasonic atomization module comprises: nose tracking unit, atomizing unit and smell injection unit:
the nose tracking unit is used for identifying and tracking the nose position of the user;
the atomization unit generates acoustic waves through a surface acoustic wave device, and these waves atomize the odor;
the odor spraying unit is used for spraying atomized odor to the nose position of a user.
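The nose tracking and spraying units above reduce to aiming the sprayer along the vector from the nozzle to the tracked nose position; a geometric sketch with invented coordinates:

```python
import math

def aim(nozzle, nose):
    """Unit direction vector from the sprayer nozzle to the user's nose.

    Both arguments are (x, y, z) positions in the headset's frame; the
    coordinate convention is an assumption for illustration."""
    dx, dy, dz = (nose[i] - nozzle[i] for i in range(3))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)
```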
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a block diagram of a VR/AR somatosensory device with odor reproduction in an embodiment of the invention;
FIG. 2 is a block diagram of an odor generating system with odor reproduction in accordance with an embodiment of the present invention;
fig. 3 is a schematic diagram of a VR head display apparatus according to an embodiment of the present invention.
In fig. 3, 1-charging data interface, 2-battery box, 3-loudspeaker, 4-on-off key, 5-camera, 6-head-wearing elastic regulating wheel, 7-battery box and 8-smell generator.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it should be understood that they are presented herein only to illustrate and explain the present invention and not to limit the present invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly or indirectly connected to the other element.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the device or element so referred to must be in a particular orientation, constructed or operated in a particular orientation, and is not to be construed as limiting the invention.
Moreover, it is noted that, in this document, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions, and "a plurality" means two or more unless specifically limited otherwise. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Example 1:
the embodiment of the invention provides a VR/AR body feeling device with odor reproduction, which comprises a head display device, an image recognition module, an odor recognition module and an odor generator, wherein the image recognition module comprises:
the head display device displays perspective images on the left-eye and right-eye screens to form a three-dimensional animation;
the image recognition module is wirelessly connected to the head display device via Bluetooth, performs image recognition on the three-dimensional animation in real time, judges whether an image requires virtual olfaction generation, and acquires the virtual olfactory image;
the odor recognition module is electrically connected to the image recognition module, analyzes the virtual olfactory image qualitatively and quantitatively through an odor algorithm, and generates an odor formula;
after receiving the odor formula, the odor generator produces the corresponding odor and emits it through the ultrasonic atomization device.
The working principle of the technical scheme is as follows: in the prior art, VR/AR devices are widely applied in many areas of society, particularly in film viewing and three-dimensional gaming experiences, and VR/AR technology is developing rapidly. However, although existing VR/AR technology constructs a virtual three-dimensional visual environment by sealing off human vision, so that people can participate in the environment and feel personally on the scene, these experiences lack the restoration of real natural sensations such as smell, temperature, humidity and force feedback.
The beneficial effects of the above technical scheme are: through the head display device, the image recognition module, the odor recognition module and the odor generator, the invention not only establishes a virtual stereoscopic visual environment but also restores multiple environmental indexes of the natural environment, forming a combined visual and olfactory experience so that the experiencer can perceive the odor of the environment more immersively.
Example 2:
in one embodiment, the head display apparatus includes: the device comprises an aspheric lens, a display screen, a position sensor, wireless connection equipment and a battery module;
the aspheric lens is used for visually superposing the perspective images through the aspheric lens to form a three-dimensional animation; wherein the aspheric lens includes: a left aspheric lens and a right aspheric lens;
the display screen projects the three-dimensional animation on the display screen through a micro-projection technology;
the position sensor is used for tracking the position of the head action of the user to acquire position information;
and the battery module is used for supplying power to the whole head display equipment.
The working principle of the technical scheme is as follows: in the prior art, most VR glasses use spherical lenses; although spherical lenses are cheap, their aberration is too large, and the virtual stereoscopic environment displayed after wearing easily blurs, degrading the user experience. The position sensor in the head display device determines and motion-captures the positions of the participant's head, hands and trunk. When the battery is exhausted, the head display device can be charged wirelessly by electromagnetic induction, or the battery in the battery box can be charged through the charging data interface. In addition, as shown in Fig. 3, the head display device is switched on by the on-off key, the loudspeaker conveys the sound of the VR video, the camera is used for light recognition and tracking to prevent light leakage while the headset is worn, the fixing band is adjusted by the head-wearing elastic regulating wheel, and the odor generator is fixed on the head display device.
The beneficial effects of the above technical scheme are: aspheric lenses are adopted for left-right visual coincidence; they can correct blurring and double images of the three-dimensional animation and solve problems such as distortion of the virtual stereoscopic visual field. The position of the participant's head is determined and captured in real time by the position sensor of the head display device, so the user's actions can be judged; the head display device is charged wirelessly through electromagnetic induction, which makes charging convenient and rapid.
Example 3:
in one embodiment, the head display apparatus further comprises: eye tracker and wireless connection device:
the eye tracker comprises a camera, an eye parameter extraction device and eye tracking prejudgment equipment;
the camera tracks the eyes of the user in real time and sends the shot eye images to the eye parameter extraction device in real time;
the eye parameter extraction device is electrically connected with the camera and is used for extracting eye movement parameters through the eye image; wherein the eye movement parameters include: pupil parameter, purkinje spot parameter;
the eye tracking prejudging equipment is electrically connected with the eye parameter extracting device and estimates the eye fixation point of the user in real time according to the eye movement parameters;
the wireless connection equipment is connected with the position sensor, the display screen and the eye tracker and is used for transmitting real-time information of the position sensor, the display screen and the eye tracker.
The working principle of the technical scheme is as follows: in the prior art, because eye spacing differs from person to person, after the VR/AR equipment is worn the lens positions must be adjusted appropriately before a clear virtual reality scene can be seen.
The beneficial effects of the above technical scheme are: according to the method, the left and right lenses are automatically moved to appropriate positions based on the eye distance parameter, so that the lenses are adjusted correctly, quickly and effectively, avoiding the wasted time of a user who cannot adjust the lens positions. The user's emotion and degree of fright are judged from the pupil parameter and the Purkinje spot parameter: some games contain thrilling or gory scenes, and observing the user's pupil parameters gives a measure of their emotion and fright, so the user can be alerted at an emotional threshold and excessive fright avoided.
As an embodiment of the present invention: when eye tracking is performed, the user's eye parameters, including the eye distance parameter, the pupil parameter and the Purkinje spot parameter, are extracted by a horizontal projection method, and the user's emotion and degree of fright are judged from these parameters:
the method comprises the following steps: capturing a user eye state diagram shot by a camera, and determining the external width of two eyes and the interpupillary distance of the left eye and the right eye:
Q_X = X1 - X0
K_Y = Y1 - Y0
where X denotes the horizontal eye-measurement axis, Q_X the outer width of both eyes, X0 the starting position of the left-eye measurement, X1 the end position of the right-eye measurement, K_Y the distance between the two pupils, Y0 the starting position of the left-pupil measurement, and Y1 the end position of the right-pupil measurement;
step two: acquiring pupil parameters of a user:
G(O) = (1 / (N1 × N2)) Σ_(X,Y) U_G(X, Y)
where O denotes the eye parameter, G(O) the pupil parameter of the user, N1 the length of the pupil region, N2 the width of the pupil region (N2 × N1 being the area of the user's pupil), (X, Y) a position coordinate within the pupil, and U_G(X, Y) the gray value at that coordinate.
Step three: acquiring a Purkinje spot parameter of a user:
V(O) = (1 / (M1 × M2)) Σ_(X,Y) U_G(X, Y)
where O denotes the eye parameter, V(O) the Purkinje spot parameter of the user, M1 the length of the eye-image light spot, M2 the width of the light spot (M2 × M1 being the area of the user's eye-image light spot), (X, Y) a position coordinate within the light spot, and U_G(X, Y) the gray value at that coordinate.
The beneficial effects of the above technical scheme are: the method comprises the steps of capturing a user eye state diagram shot by a camera, determining the outer width of two eyes and the distance between pupils of left and right eyes, automatically adjusting a lens according to the outer width of the two eyes and the distance between the pupils, adjusting the position of the lens to the optimal position of the vision of a user, obtaining pupil parameters and Purkinje spot parameters of the user through a horizontal projection method, and judging the emotion and the panic degree of the user according to the parameters.
Example 4:
in one embodiment, the image recognition module comprises: image discriminator, image analyzer:
the image discriminator synchronously receives the three-dimensional animation, compares its images against a virtual olfactory image library in real time, and divides them into virtual olfactory images and non-virtual olfactory images;
the image analyzer calculates the odor concentration of the virtual image from the image depth of the virtual olfactory image.
The working principle of the technical scheme is as follows: in the prior art, a three-dimensionally displayed virtual reality scene is generated by an image generation and display system, and a series of supporting technologies make the virtual reality scene more realistic and immersive; however, the realism is usually achieved through vision alone, and the artificial surface of the virtual reality scene is easily exposed to the other senses.
The beneficial effects of the above technical scheme are: in the invention, objects in the virtual reality scene are quickly screened by the image discriminator, and the virtual olfactory images are identified and sent to the image analyzer to calculate the concentration of the object's odor, since not only must the object's odor be generated, its concentration must also be controlled.
As an embodiment of the present invention: in the image discriminator, according to the image features of virtual olfactory images, every image in the three-dimensional animation is judged to be either a virtual olfactory image or a non-virtual olfactory image, and the virtual olfactory images are determined:
the method comprises the following steps: acquiring the pixel of each point in the image:
P = (1 / (M × N)) Σ_(x=1..M) Σ_(y=1..N) f(x, y)
where M is the length of the image, N is the width of the image, and f(x, y) represents the element value of the pixel at (x, y) in the image neighborhood;
step two: extracting image characteristic parameters in the image:
T_n(x, y) = σ (Q_n / Q) e(x, y)
where (x, y) denotes the coordinates of a point in the image, n indexes the image, T_n(x, y) is the image characteristic parameter of the nth image at the (x, y) coordinate, Q is the number of standard pixels, Q_n is the number of pixels in the nth image, σ represents the image weight value, e(x, y) represents the image characteristic value at point (x, y), and n ∈ {1, 2, 3, …, N};
step three: comparing the image characteristic parameters of the image with those in a standard virtual olfactory image library, and judging whether the image is a virtual olfactory image:
U = T_n(x, y) / T_Bn(x, y)
where U represents the ratio of the image's characteristic parameter to the corresponding characteristic parameter in the standard virtual olfactory image library, Bn denotes the nth standard virtual olfactory image, N the number of standard virtual olfactory images, and T_Bn(x, y) the image characteristic parameter of the nth standard image at the (x, y) coordinate. When U = 1 the image is judged to be a virtual olfactory image; when U ≠ 1 it is a non-virtual olfactory image.
The beneficial effects of the above technical scheme are: in the invention, the image of each object is extracted from the three-dimensional animation, its image characteristic parameters are extracted and compared with those in the standard virtual olfactory image library, and the virtual olfactory images are thereby identified among all the images.
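A sketch of the discrimination steps, under the assumption that the characteristic parameter has the form T_n(x, y) = sigma * (Q_n / Q) * e(x, y) and that a ratio U = 1 against the standard library marks a virtual olfactory image (the formula images in the source are not reproduced, so this form is inferred from the variable definitions):

```python
def feature_parameter(sigma, q_n, q_std, e_xy):
    """Image characteristic parameter of one image at one coordinate.

    sigma: image weight value, q_n: pixel count of the nth image,
    q_std: standard pixel count, e_xy: characteristic value at (x, y)."""
    return sigma * (q_n / q_std) * e_xy

def is_virtual_olfactory(t_image, t_standard, tol=1e-9):
    """Compare against the standard library: U = T / T_Bn, match when U == 1."""
    return abs(t_image / t_standard - 1.0) <= tol
```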
Example 5:
in one embodiment, the odor recognition module comprises:
an odor confirmation unit, used for determining the name of the object in the image from the virtual olfactory image and retrieving the formula code of that object name from an odor library;
an odor formula composition unit, used for obtaining the odor compounds that compose the gas and the proportion of each compound according to the formula code, obtaining the amount of each compound according to the odor concentration of the virtual image, and generating the odor formula; the odor formula comprises the odor compound numbers, the compound proportions and the amounts of compound used.
The working principle of the technical scheme is as follows: in the prior art, an odor formula is usually simply called from an odor library without adjusting the odor concentration, which is normally a fixed value; as a result, when a user sees a distant flower in the virtual reality scene, they may smell a strong flower fragrance, which makes the experience unreal.
The beneficial effects of the above technical scheme are: with these two units, the invention generates the corresponding odor according to the actual virtual reality scene, so that not only is the odor reproduced, but an odor of appropriate concentration is generated according to the distance between the object and the user.
Example 6:
in one embodiment, the scent confirmation unit includes:
an odor code determination subunit, used for identifying the object name of the virtual olfactory image and acquiring the formula code of that object name in the odor library;
an odor library: the scent storage device is used for storing scent formulas of all objects and storing the scent formulas in a coding mode based on scent names.
The working principle of the technical scheme is as follows: in the invention, the odor formula of each object is stored in the odor library, the odor library collects most of plant odors, each odor is named and coded to generate a formula code, and the odor formula in the odor library is called through the formula code.
The beneficial effects of the above technical scheme are: according to the invention, the odor library is used for recording the vast majority of plant tastes in nature, and the odor formula can be quickly called through the formula code generated in the odor code determining subunit.
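The name-based coding scheme could be sketched as follows. The hash-derived code format and the `OdorBank` class are purely illustrative assumptions, since the patent only states that formulas are stored in coded form based on odor names:

```python
import hashlib

def make_formula_code(odor_name):
    """Derive a stable formula code from an odor name. This hash-based
    scheme is an invented example of "coding based on odor names"."""
    digest = hashlib.sha1(odor_name.encode("utf-8")).hexdigest()[:6].upper()
    return "F-" + digest

class OdorBank:
    """Minimal odor library keyed by formula code."""
    def __init__(self):
        self._formulas = {}

    def store(self, odor_name, formula):
        # Name the odor, code it, and file the formula under the code.
        code = make_formula_code(odor_name)
        self._formulas[code] = formula
        return code

    def lookup(self, formula_code):
        # Retrieval path used by the odor code determination subunit.
        return self._formulas[formula_code]
```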
Example 7:
In one embodiment, the odor generator includes: an odor adjusting device, an odor box, and an ultrasonic atomization device;
the odor adjusting device is used for receiving an odor formula and driving the odor box to mix odors according to the odor formula;
the odor box is electrically connected with the odor adjusting device and is used for storing odor compounds and mixing them according to the odor formula to generate the odor;
the ultrasonic atomization device communicates with the odor box, and the odor is uniformly dispersed through the ultrasonic atomization device.
The working principle of the technical scheme is as follows: in the prior art, odor cards are placed on a semiconductor, and their emission is controlled by heating the semiconductor module. This approach has several drawbacks: the odor is not accurate enough; the temperature is difficult to raise and lower quickly, so residual odor degrades the user's experience; the number of odor cards is limited and each card carries only one odor, so most object odors in a virtual reality scene are difficult to reproduce; and the cards have weight, so the more cards there are, the heavier the device becomes, making it unsuitable for wearing on the user's head.
The beneficial effects of the above technical scheme are: in the invention, the odor adjusting device receives the final odor formula via Bluetooth wireless transmission, odor compounds are mixed according to this formula so that a new odor is generated quickly, and the resulting odor is uniformly sprayed around the user's nose by the ultrasonic atomization device.
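The receive-mix-atomize control flow might be sketched as below. The `OdorBox` and `Atomizer` classes are stand-in assumptions; a real device would drive valves, pumps, and the ultrasonic transducer:

```python
class OdorBox:
    """Stand-in for the hardware odor box: mixes stored compounds."""
    def mix(self, formula):
        # Sum the per-compound usage amounts to describe the mixture.
        return {"total_ul": sum(c["amount_ul"] for c in formula.values())}

class Atomizer:
    """Stand-in for the ultrasonic atomization device."""
    def __init__(self):
        self.dispersed = []

    def disperse(self, mixture):
        self.dispersed.append(mixture)

class ScentGenerator:
    """Receive an odor formula (e.g. over Bluetooth), drive the odor
    box to mix it, then hand the mixture to the atomizer."""
    def __init__(self, odor_box, atomizer):
        self.odor_box = odor_box
        self.atomizer = atomizer

    def on_formula_received(self, formula):
        mixture = self.odor_box.mix(formula)
        self.atomizer.disperse(mixture)
        return mixture
```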
Example 8:
In one embodiment, the ultrasonic atomization apparatus comprises: a nose tracker, an atomizer, and an odor sprayer:
the nose tracker is used for identifying and tracking the position of the user's nose;
the atomizer is used for generating radio waves through a surface acoustic wave device, and the radio waves atomize the odor;
the odor sprayer is connected with the nose tracker and the atomizer respectively, and is used for spraying the atomized odor toward the position of the user's nose.
The working principle of the technical scheme is as follows: in the prior art, the odor generator generally sprays odor through a direct-transmission odor device, but the sprayed amount is difficult to control precisely with direct spraying. In the invention, the position of the user's nose is identified and tracked by the nose tracker, radio waves generated by the surface acoustic wave device atomize the odor, and the atomized gas is sprayed around the user's nose in the form of an air cannon through the odor sprayer.
The beneficial effects of the above technical scheme are: in the invention, the nose tracker identifies and tracks the position of the user's nose, so the gas is sprayed toward the determined nose position, avoiding the situation where the sprayed gas ends up too far from the user's nose to be smelled; the atomized gas is sprayed around the user's nose in the form of an air cannon through the odor sprayer, forming a uniformly distributed burst of gas.
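Aiming the sprayer at the tracked nose position reduces, at minimum, to computing a direction vector from the sprayer toward the nose. A small illustrative sketch follows; the patent does not specify the aiming geometry, so the coordinate convention is assumed:

```python
import math

def spray_vector(sprayer_pos, nose_pos):
    """Unit direction from the odor sprayer toward the tracked nose
    position, both given as (x, y, z) tuples (assumed convention)."""
    delta = [n - s for s, n in zip(sprayer_pos, nose_pos)]
    norm = math.sqrt(sum(d * d for d in delta))
    if norm == 0.0:
        raise ValueError("sprayer and nose positions coincide")
    return [d / norm for d in delta]
```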
Example 9:
The embodiment of the invention provides an odor forming system with odor reproduction, which comprises:
an odor adjustment module: used for receiving odor formulas and driving the odor box to mix odors according to the formulas; the odor formula comprises: odor compounds, odor compound proportions, and odor compound usage amounts;
an odor box module: used for storing odor compounds and mixing them according to the odor formula to generate the odor;
an ultrasonic atomization module: used for uniformly dispersing the odor through an ultrasonic atomization device.
The beneficial effects of the above technical scheme are: in the prior art, the odor card is placed on a semiconductor and its emission is realized by controlling the heating of the semiconductor module, so the odor is not accurate enough and residual odor degrades the user's experience.
Example 10:
In one embodiment, the ultrasonic atomization module comprises: a nose tracking unit, an atomization unit, and an odor spraying unit:
the nose tracking unit is used for identifying and tracking the position of the user's nose;
the atomization unit is used for generating radio waves through a surface acoustic wave device, and the radio waves atomize the odor;
the odor spraying unit is used for spraying the atomized odor toward the position of the user's nose.
The beneficial effects of the above technical scheme are: in the prior art, the odor generator generally sprays odor through a direct-transmission device, but the sprayed amount is difficult to control precisely with direct spraying.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. A VR/AR somatosensory device with odor reproduction, characterized by comprising a head display device, an image recognition module, an odor recognition module, and an odor generator:
the head display device is used for displaying perspective images on the left eye screen and the right eye screen to form a three-dimensional animation;
the image recognition module is wirelessly connected with the head display device through Bluetooth technology, performs image recognition on the three-dimensional animation in real time, judges whether virtual olfaction needs to be generated for an image, and acquires a virtual olfactory image;
the odor recognition module is electrically connected with the image recognition module, qualitatively and quantitatively analyzes the virtual olfactory image through an odor algorithm, and generates an odor formula;
after the odor generator receives the odor formula, it generates the odor of the odor formula and emits the odor through an ultrasonic atomization device;
wherein the head display device further comprises an eye tracker and a wireless connection device:
the eye tracker comprises a camera, an eye parameter extraction device and eye tracking prejudgment equipment;
the camera tracks the eyes of the user in real time and sends the shot eye image to the eye parameter extraction device in real time;
the eye parameter extraction device is electrically connected with the camera, and eye movement parameters are extracted through the eye image; wherein the eye movement parameters include: pupil parameter, purkinje spot parameter;
the eye tracking prejudging equipment is electrically connected with the eye parameter extracting device and estimates the eye fixation point of the user in real time according to the eye movement parameters;
the wireless connection equipment is connected with the position sensor, the display screen and the eye tracker and is used for transmitting real-time information of the position sensor, the display screen and the eye tracker;
wherein the scent generator comprises: the device comprises a smell adjusting device, a smell box and an ultrasonic atomization device;
the odor regulating device is used for receiving an odor formula and driving the odor box to perform odor mixing according to the odor formula;
the odor box is electrically connected with the odor regulating device and used for storing odor compounds and mixing the odor compounds according to the odor formula to generate odor;
the ultrasonic atomization device is communicated with the smell box, and the smell is uniformly dispersed through the ultrasonic atomization device.
2. The VR/AR somatosensory device with odor reproduction of claim 1, wherein the head display device comprises: an aspheric lens, a display screen, a position sensor, a wireless connection device, and a battery module;
the aspheric lens is used for visually superposing the perspective images through the aspheric lens to form a three-dimensional animation; wherein the aspheric lens includes: a left aspheric lens and a right aspheric lens;
the display screen projects the three-dimensional animation on the display screen through a micro-projection technology;
the position sensor is used for tracking the position of the head action of the user and acquiring position information;
and the battery module is used for supplying power to the whole head display equipment.
3. The VR/AR somatosensory device with odor reproduction of claim 1, wherein the image recognition module comprises an image discriminator and an image analyzer:
the image discriminator is used for synchronously receiving the three-dimensional animation, carrying out real-time contrast identification on the images in the three-dimensional animation according to a virtual olfactory image library, and dividing the images into virtual olfactory images and non-virtual olfactory images;
the image analyzer is used for calculating the concentration of the smell of the virtual image according to the image depth of the virtual olfactory image.
4. The VR/AR somatosensory device with scent reproduction of claim 1, wherein the scent recognition module comprises:
an odor confirmation unit: used for determining the name of the object in the image according to the virtual olfactory image and retrieving the corresponding formula code from the odor library;
an odor formula composition unit: used for obtaining the odor compounds that constitute the gas and their proportions according to the formula code, and obtaining the usage amount of the odor compounds according to the odor concentration of the virtual olfactory image, thereby generating the odor formula, wherein the odor formula comprises: odor compound numbers, odor compound proportions, and odor compound usage amounts.
5. The VR/AR somatosensory device with scent reproduction of claim 4, wherein the scent confirmation unit comprises:
an odor code determination subunit: used for identifying the object name of the virtual olfactory image and acquiring the formula code for that object name from the odor library;
an odor library: used for storing the odor formulas of all objects, the formulas being stored in coded form based on odor names.
6. The VR/AR somatosensory device with odor reproduction of claim 1, wherein the ultrasonic atomizing device comprises: nose tracker, nebulizer and odor sprayer:
the nose tracker is used for identifying and tracking the nose position of a user;
the atomizer is used for generating radio waves through a surface acoustic wave device, and the radio waves atomize the smell;
the smell sprayer is respectively connected with the nose tracker and the atomizer and used for spraying atomized smell to the nose position of a user.
7. The VR/AR somatosensory device with odor reproduction of claim 1, comprising:
an odor regulating module: the odor box is used for receiving odor formulas and driving the odor box to perform odor mixing according to the odor formulas; wherein the odor formulation comprises: odorous compounds, proportion of odorous compounds, amount of odorous compounds used;
an odor box module: the odor compound storage device is used for storing odor compounds and mixing the odor compounds according to the odor formula to generate odor;
ultrasonic atomization module: used for uniformly dispersing the odor through an ultrasonic atomization device.
8. The VR/AR somatosensory device with odor reproduction of claim 7, wherein the ultrasonic atomization module comprises: nose tracking unit, atomizing unit and smell injection unit:
the nose tracking unit is used for identifying and tracking the nose position of the user;
the atomization unit is used for generating radio waves through a surface acoustic wave device, and the radio waves atomize the odor;
the odor spraying unit is used for spraying atomized odor to the nose position of a user.
CN202210141087.9A 2022-02-16 2022-02-16 Take VR AR of smell reproduction to feel device and smell formation system Active CN114511655B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210141087.9A CN114511655B (en) 2022-02-16 2022-02-16 Take VR AR of smell reproduction to feel device and smell formation system


Publications (2)

Publication Number Publication Date
CN114511655A CN114511655A (en) 2022-05-17
CN114511655B true CN114511655B (en) 2022-11-04

Family

ID=81551747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210141087.9A Active CN114511655B (en) 2022-02-16 2022-02-16 Take VR AR of smell reproduction to feel device and smell formation system

Country Status (1)

Country Link
CN (1) CN114511655B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104330975A (en) * 2014-10-20 2015-02-04 浙江理工大学 Portable virtual odor generation device
CN104881123A (en) * 2015-06-06 2015-09-02 深圳市虚拟现实科技有限公司 Virtual reality-based olfactory simulation method, device and system
CN206224387U (en) * 2016-11-22 2017-06-06 包磊 The smell generating means of proprioceptive simulation apparatus
CN106980278A (en) * 2017-04-10 2017-07-25 陈柳华 A kind of virtual smell implementation method based on virtual reality
CN109799909A (en) * 2019-01-26 2019-05-24 温州大学 A kind of olfactory analog system and method based on Virtual Reality
CN110673498A (en) * 2019-09-25 2020-01-10 赵尹龙 Smell analogue means, air conditioning equipment, multimedia equipment and terminal equipment
CN210091103U (en) * 2019-09-05 2020-02-18 福建师范大学协和学院 Virtual reality system with real smell sense and device thereof
CN113420696A (en) * 2021-07-01 2021-09-21 四川邮电职业技术学院 Odor generation control method and system and computer readable storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105589551A (en) * 2014-10-22 2016-05-18 褚秀清 Eye tracking method for human-computer interaction of mobile device
WO2016098406A1 (en) * 2014-12-17 2016-06-23 ソニー株式会社 Information processing apparatus, information processing method and program
CN105892053A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Virtual helmet lens interval adjusting method and device
CN106056092B (en) * 2016-06-08 2019-08-20 华南理工大学 The gaze estimation method for headset equipment based on iris and pupil
EP3422146A1 (en) * 2017-06-28 2019-01-02 Nokia Technologies Oy An apparatus and associated methods for presenting sensory scenes
CN108010402B (en) * 2018-01-16 2020-05-08 重庆工程学院 Teaching training device based on VR system
CN108681399B (en) * 2018-05-11 2020-07-10 北京七鑫易维信息技术有限公司 Equipment control method, device, control equipment and storage medium
CN109828663A (en) * 2019-01-14 2019-05-31 北京七鑫易维信息技术有限公司 Determination method and device, the operating method of run-home object of aiming area
CN113395438B (en) * 2020-03-12 2023-01-31 Oppo广东移动通信有限公司 Image correction method and related device for eyeball tracking technology
CN114049525A (en) * 2021-11-29 2022-02-15 中国科学技术大学 Fusion neural network system, device and method for identifying gas types and concentrations


Also Published As

Publication number Publication date
CN114511655A (en) 2022-05-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant