CN113552943A - AR interaction system and device for health and health science popularization - Google Patents
- Publication number
- CN113552943A (application CN202110821330.7A)
- Authority
- CN
- China
- Prior art keywords
- human body
- module
- unit
- interaction
- health
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Abstract
The invention relates to the technical field of AR interaction, in particular to an AR interaction system and an AR interaction device for health and health science popularization. The system comprises a human body capturing unit, an image processing unit, a diversified interaction unit and a sharing unit. The human body capturing unit captures the body shape and motion of the human body, so that the system superimposes a corresponding virtual object on the human body image according to that motion, improving the operator's sense of immersion and engagement when using AR and imperceptibly improving the public's knowledge of hygiene and epidemic prevention. Meanwhile, the movable motion sensing camera improves the flexibility of use in different scenes: the device is not restricted to the physical form of any particular screen, image and camera viewing-angle deviations caused by the venue or equipment are avoided, and the user's sense of immersion and engagement is further improved.
Description
Technical Field
The invention relates to the technical field of AR interaction, in particular to an AR interaction system and an AR interaction device for health and health science popularization.
Background
Epidemic prevention is the charge of food-sanitation and disease-prevention organizations. Since its founding, China has adhered to building its health and epidemic prevention system and has achieved great results, in particular the control and monitoring of various infectious diseases, gradually eliminating their emergence and spread.
With the continuous improvement of living standards, the need to strengthen public awareness of health and epidemic prevention grows by the day. At present, health and epidemic prevention science popularization mostly relies on books and videos, which cannot influence people in a subtle, imperceptible way; hence, AR games have more recently been adopted for such popularization to improve this subtle educational effect.
However, in existing all-in-one machines built on the AR concept the camera is fixed at the top of the screen, so the image viewing angle looks down excessively or otherwise deviates from a normal viewing angle; the user experience is weak and later image processing is difficult. Meanwhile, when an operator uses existing AR, the scene props and system commands inside the AR cannot interact with the operator, which diminishes the operator's AR experience.
Disclosure of Invention
The present invention is directed to an AR interactive system and an AR interactive device for health science popularization, so as to solve the problems mentioned in the background art.
In order to achieve the above objects, one of the objects of the present invention is to provide an AR interactive system for health and wellness science popularization, comprising a human body capturing unit, an image processing unit, a diversified interactive unit and a sharing unit;
the human body capturing unit is used for capturing the motion and the body form of a human body and forming a virtual object corresponding to the motion of the human body;
the human body capturing unit comprises a camera shooting unit, a somatosensory sensing module, an acquisition module and a reality augmentation data module;
the camera shooting unit is used for shooting and recording the actions of the human body;
the motion sensing module is used for sensing the body shape edge of the human body and the motion of the human body;
the motion sensing module describes the motion of a human body by adopting infrared sensing in optical sensing, and determines saturated pixel points on the human body in the process of sensing the motion of the human body by adopting the infrared sensing;
the acquisition module is used for acquiring and capturing the human body actions and body shape edges sensed by the somatosensory sensing module;
the reality augmentation data module reads the human body actions and body types captured by the acquisition module by adopting a three-view and multi-view tracking technology, and virtual objects are superposed according to the human body actions and body types;
the image processing unit is used for matting the portrait captured by the human body capturing unit out of the original scene;
the diversified interaction unit is used for controlling commands in the system through the human body actions captured by the human body capturing unit;
the sharing unit is used for storing the interaction process of the human body in the diversified interaction unit.
As a further improvement of the technical solution, the formula by which the somatosensory sensing module describes the saturated pixel points on the human body is as follows:
N = ηqSPt′/(hνQth) + wi
wherein i is the number of rings through which carriers diffuse outward, q is the electronic charge, P is the incident light power, t′ is the time for the infrared light to reach the human body, Qth is the carrier overflow limit, wi is the number of saturated pixels on the i-th crosstalk ring, N is the total charge count of the saturated pixels, S is the area of the object irradiated by the infrared light, η is the photoelectric conversion efficiency, h is the Planck constant and ν is the frequency of the infrared light.
As a further improvement of the technical solution, the three-view and multi-view tracking technology of the reality augmentation data module adopts the Harris corner detection algorithm, whose corner response takes the standard form:
cim = Ix²·Iy² − (Ix·Iy)² − k·(Ix² + Iy²)²
wherein cim is the corner measure of pixel I, Ix and Iy are the amounts of change of pixel I in the horizontal and vertical directions respectively, the squared terms are accumulated over a local window, and k is an empirical constant (typically 0.04-0.06).
As a further improvement of the technical solution, the image processing unit includes an image adjusting module and a portrait matting module;
the image adjusting module is used for reducing the influence of the human body's background and of ambient light, so that the portrait matting module can matte the human body out of the streaming media under ordinary conditions;
the portrait matting module is used for extracting the human body captured by the human body capturing unit.
As a further improvement of the technical scheme, the diversified interaction unit comprises a scene establishing module, a human-computer interaction module and an air-separating operation module;
the scene establishing module is used for setting up the selected scene and placing the portrait matted out by the image processing unit into that scene;
the human-computer interaction module is used for controlling a virtual object formed by the human body capturing unit by the human body;
and the air-separating operation module is used for controlling the virtual object to operate the object appearing in the system by the human body.
As a further improvement of the technical scheme, the diversified interaction unit further comprises a scene customization module, and the scene customization module is used for customizing and inputting 2D/3D scene resources.
As a further improvement of the technical scheme, the sharing unit comprises an automatic screen capturing module, a real-time sharing module and a picture saving module;
the automatic screen capturing module is used for capturing the screen of the human body in the diversified interaction unit;
the real-time sharing module is used for sharing the pictures captured by the automatic screen capturing module to social media;
the picture storage module is used for storing the picture captured by the automatic screen capturing module.
The invention also aims to provide an AR interaction device for health and health science popularization, which comprises the AR interaction system for health and health science popularization, a control host, a motion sensing camera and a large screen display.
As a further improvement of the technical scheme, the human body capturing unit, the image processing unit, the diversified interaction unit and the sharing unit are all arranged inside the control host.
As a further improvement of the technical scheme, the motion sensing camera is hinged to one side of the control host with its circuit connected to the control host, and the signal line of the motion sensing camera is connected with the control host.
Compared with the prior art, the invention has the beneficial effects that:
1. In the AR interaction system and device for health and health science popularization, the human body capturing unit captures the body shape and motion of the human body, so that the system superimposes a corresponding virtual object on the human body image according to that motion, improving the operator's sense of immersion and engagement when using AR and imperceptibly improving the public's knowledge of hygiene and epidemic prevention.
2. In this AR interaction system and device for health science popularization, the image processing unit mattes the human body out of the original environment and places the extracted human image into the scene, deepening the operator's sense of presence; meanwhile, infrared sensing is adopted to describe the human body's actions, reducing the influence of light and background when matting from the original scene and lowering the difficulty of later image processing.
3. In this AR interaction system and device for health science popularization, the movable motion sensing camera means the device is not restricted to the physical form of any particular screen, improving the flexibility with which the human body can be accommodated in different scenes, avoiding image and camera viewing-angle deviations caused by the venue or equipment, and improving the user's sense of immersion and engagement.
Drawings
FIG. 1 is a block diagram of system modules according to embodiment 1 of the present invention;
FIG. 2 is a schematic view of the placement of the apparatus according to embodiment 1 of the present invention;
FIG. 3 is a schematic view of the placement of the apparatus according to embodiment 2 of the present invention;
fig. 4 is a schematic view of the placement of the device in embodiment 3 of the present invention.
The various reference numbers in the figures mean:
1. a human body capturing unit; 11. an image pickup unit; 12. a somatosensory induction module; 13. an acquisition module; 14. a reality augmentation data module;
2. an image processing unit; 21. an image adjustment module; 22. a portrait matting module;
3. a diversified interaction unit; 31. a scene establishing module; 32. a human-computer interaction module; 33. an air-separating operation module; 34. a scene customization module;
4. a sharing unit; 41. an automatic screen capture module; 42. a real-time sharing module; 43. a picture saving module;
5. controlling a host; 6. a motion sensing camera; 7. a large screen display.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like, indicate orientations and positional relationships based on those shown in the drawings, and are used only for convenience of description and simplicity of description, and do not indicate or imply that the equipment or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be considered as limiting the present invention.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Example 1
Referring to fig. 1-2, the AR interaction system for health science popularization according to the present invention comprises a human body capturing unit 1, an image processing unit 2, a diversified interaction unit 3 and a sharing unit 4, wherein:
the human body capturing unit 1 is used for capturing the motion and the body shape of a human body and forming a virtual object corresponding to the motion of the human body;
the human body capturing unit 1 comprises a camera unit 11, a somatosensory induction module 12, an acquisition module 13 and a reality augmentation data module 14;
the camera unit 11 is used for shooting and recording the action of the human body;
the somatosensory sensing module 12 is used for sensing the body shape edges of the human body and the actions of the human body;
the motion sensing module 12 describes the motion of the human body by using infrared sensing in optical sensing, and determines saturated pixel points on the human body in the process of sensing the motion of the human body by using infrared sensing;
the formula by which the somatosensory sensing module 12 describes the saturated pixel points on the human body is as follows:
N = ηqSPt′/(hνQth) + wi
wherein i is the number of rings through which carriers diffuse outward, q is the electronic charge, P is the incident light power, t′ is the time for the infrared light to reach the human body, Qth is the carrier overflow limit, wi is the number of saturated pixels on the i-th crosstalk ring, N is the total charge count of the saturated pixels, S is the area of the object irradiated by the infrared light, η is the photoelectric conversion efficiency, h is the Planck constant and ν is the frequency of the infrared light;
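The relation above can be sketched directly in code. This is a minimal illustration rather than part of the patent: the function name is invented here, and treating hν as the infrared photon energy and η as a conversion efficiency are assumptions of the sketch.

```python
# Hedged sketch of N = η·q·S·P·t′/(h·ν·Q_th) + w_i, the somatosensory
# module's saturated-pixel relation. Interpreting h·ν as the infrared
# photon energy and η as a conversion efficiency is an assumption.

H_PLANCK = 6.62607015e-34     # Planck constant, J·s
Q_ELECTRON = 1.602176634e-19  # elementary charge, C

def saturated_pixel_total(eta, S, P, t_prime, nu, Q_th, w_i):
    """Total charge count N of saturated pixels on crosstalk ring i."""
    return eta * Q_ELECTRON * S * P * t_prime / (H_PLANCK * nu * Q_th) + w_i
```

As the formula implies, with no incident power (P = 0) the count reduces to the crosstalk term w_i.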
the acquisition module 13 is used for acquiring and capturing the human body actions and body shape edges sensed by the somatosensory sensing module 12;
the reality augmentation data module 14 reads the human body actions and body types captured by the acquisition module 13 by adopting a three-view and multi-view tracking technology, and virtual objects are superposed according to the human body actions and body types to realize real-time interaction of people and the virtual objects;
the three-view and multi-view tracking technique of the reality augmentation data module 14 adopts the Harris corner detection algorithm, whose corner response takes the standard form:
cim = Ix²·Iy² − (Ix·Iy)² − k·(Ix² + Iy²)²
wherein cim is the corner measure of pixel I, Ix and Iy are the amounts of change of pixel I in the horizontal and vertical directions respectively, the squared terms are accumulated over a local window, and k is an empirical constant;
when cim is larger than the set threshold and is simultaneously a local maximum in a certain neighborhood of the response matrix, the pixel is determined to be a corner point;
the method comprises the steps of adopting multi-view tracking, extracting images of markers from all angles of a real environment, extracting image characteristic points by applying a Harris angular point detection algorithm, calculating depth information of a real object in a scene, displaying a scene image combining virtual and real after determining the shielding relation between a virtual object and the real environment, enabling the real scene and the virtual object to realize fusion in a more natural space-time range, and enabling a user to freely interact with the virtual object.
The image processing unit 2 is used for matting the portrait captured by the human body capturing unit 1 out of the original scene;
the image processing unit 2 comprises an image adjusting module 21 and a portrait matting module 22;
the image adjusting module 21 is used for reducing the influence of the human body's background and of ambient light, so that the portrait matting module 22 can matte the human body out of the streaming media under ordinary conditions;
the portrait matting module 22 is used for extracting the human body captured by the human body capturing unit 1 so as to bring it into a designated environment.
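A minimal sketch of these two image-processing steps, brightness adjustment followed by matting against an empty-scene reference frame, might look like the following. The reference-frame differencing approach and all names and thresholds are assumptions of this sketch, not the patent's stated method.

```python
import numpy as np

def normalize_brightness(frame, target_mean=128.0):
    """Image-adjustment step: rescale so ambient-light changes matter less."""
    m = frame.mean()
    return frame if m == 0 else np.clip(frame * (target_mean / m), 0, 255)

def matte_foreground(frame, background, thresh=30.0):
    """Matting step: pixels that differ enough from the empty-scene
    reference frame are kept as the human-body foreground mask."""
    diff = np.abs(normalize_brightness(frame) - normalize_brightness(background))
    return diff > thresh
```

Because both frames are normalized first, a uniform lighting change between the reference and the live frame largely cancels out before the difference is thresholded.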
The diversified interaction unit 3 is used for controlling commands in the system through the human body actions captured by the human body capturing unit 1;
the diversified interaction unit 3 comprises a scene establishing module 31, a man-machine interaction module 32 and an air-separating operation module 33;
the scene establishing module 31 is configured to set up the selected scene and place the portrait matted out by the image processing unit 2 into that scene;
the human-computer interaction module 32 is used for controlling the virtual object formed by the human body capturing unit 1 by the human body;
the spaced operation module 33 is used for human body control of virtual objects to operate on objects appearing in the system.
As a further improvement of the technical solution, the diversified interaction unit 3 further includes a scene customization module 34 configured for customized input of 2D/3D scene resources, so that, in addition to standard experience content, customized scenes and personalized interaction solutions developed according to customer requirements can be fused and displayed for the characteristics of different industries.
By deeply improving and optimizing the user interface and interaction modes in terms of usability and convenience, and on the basis of a distinctive design style and natural human-computer interaction, the system supports touchless operation through a series of simple gestures and actions, breaking the functional limits of contact-based interaction with a traditional mouse, keyboard or touch screen, and achieving a clear experience effect with a more complete and fluent visual design and operation flow.
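The touchless gesture-to-command flow described above can be sketched as a simple dispatch table. The gesture names and commands below are illustrative assumptions; the patent does not enumerate specific gestures.

```python
# Hedged sketch: routing recognized gestures to system commands, as the
# diversified interaction unit does. Gesture and command names are invented
# for illustration only.

GESTURE_COMMANDS = {
    "wave": "next_scene",
    "push": "select",
    "raise_hand": "screenshot",
}

def dispatch(gesture, handlers):
    """Map a recognized gesture to its command and invoke that handler."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return None  # unrecognized gestures are ignored
    return handlers[command]()
```

A table of this shape keeps the recognition layer decoupled from the command layer, so new gestures can be added without touching the handlers.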
The sharing unit 4 is used for storing the interaction process of the human body in the diversified interaction unit 3.
The sharing unit 4 comprises an automatic screen capture module 41, a real-time sharing module 42 and a picture saving module 43;
the automatic screen capturing module 41 is used for capturing the screen of the human body in the diversified interaction unit 3;
the real-time sharing module 42 is configured to share the picture captured by the automatic screen capturing module 41 in social media;
the picture saving module 43 is used for saving the pictures captured by the automatic screen capturing module 41; the saved pictures can be printed instantly through external equipment, and can also be retrieved and interacted with in real time by scanning the associated official-account code.
The invention also provides an AR interaction device for health and health science popularization, which comprises the above AR interaction system for health and epidemic prevention science popularization, a control host 5, a motion sensing camera 6 and a large screen display 7. The human body capturing unit 1, the image processing unit 2, the diversified interaction unit 3 and the sharing unit 4 are all arranged inside the control host 5; the motion sensing camera 6 is hinged to one side of the control host 5 with its circuit connected to the control host 5, and the signal line of the motion sensing camera 6 is connected with the control host 5.
When the motion sensing camera 6 and the control host 5 are placed at the bottom of the large screen display 7, the large screen display 7 is mounted 1350 mm above the ground so that the motion sensing camera 6 can photograph the human body. The reason the large screen display 7 sits 1350 mm above the ground is as follows: when a person stands in front of the screen, the experience is best when the person's image occupies 2/3 of the screen height. According to the Report on Chinese Residents' Nutrition and Chronic Disease Status issued by the state authorities on 30 June 2015, the average height of Chinese adults is 161 centimeters (167.1 centimeters for men and 155.8 centimeters for women). In actual tests, when a person 161 cm tall stands 210 cm from the screen, the imaged height of the person occupies 2/3 of the total screen height. From a product design angle, the image on the screen should have a mirror effect: by optical principle, when the camera is placed at the horizontal height of the human eyes, the collected picture is close to a mirror image. Since the eyes lie 12-14 cm below the vertex, the average eye height of Chinese adults is 147-149 cm. Based on the above data, we tested multiple sets of height data, taking samples every 5 cm from 130 cm to 160 cm, and finally selected the height of 135 cm.
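The height reasoning above reduces to simple arithmetic, sketched below with constants taken directly from the passage; the names are this sketch's own.

```python
# Arithmetic behind the 135 cm camera placement described above.
AVG_ADULT_HEIGHT_CM = 161        # average Chinese adult height (2015 report)
EYE_TO_VERTEX_CM = (12, 14)      # the eyes lie 12-14 cm below the vertex

def eye_level_range(height_cm, offsets=EYE_TO_VERTEX_CM):
    """Eye level = standing height minus the vertex-to-eye distance."""
    return tuple(height_cm - d for d in offsets)

# candidate camera heights sampled every 5 cm from 130 cm to 160 cm
CANDIDATE_HEIGHTS_CM = list(range(130, 161, 5))
CHOSEN_HEIGHT_CM = 135           # the height finally selected in testing
```

The chosen 135 cm is the sampled candidate closest to the bottom of the range while still approximating the mirror-image placement the passage argues for.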
Example 2
In order to leave space on the wall for installing the large screen display 7, and differing from embodiment 1 (refer to fig. 3), the control host 5 and the motion sensing camera 6 are placed at the upper end of the large screen display 7, with the motion sensing camera 6 kept 1350 mm from the ground; the large screen display 7 is then closer to the ground, which makes its installation convenient.
Example 3
In order to enrich the game experience from different visual angles, and differing from embodiment 1 (refer to fig. 4), the large screen display 7 is installed on the wall and the control host 5 and the motion sensing camera 6 are installed on one side of the large screen display 7 through a bracket, adding different visual angles of the human body in the game and yielding a better viewing experience.
The foregoing shows and describes the general principles, essential features and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the embodiments and the description above merely illustrate the invention and are not intended to limit it. The scope of the invention is defined by the appended claims and their equivalents.
Claims (10)
1. An AR interaction system for health and wellness science popularization, characterized by: the device comprises a human body capturing unit (1), an image processing unit (2), a diversified interaction unit (3) and a sharing unit (4);
the human body capturing unit (1) is used for capturing the motion and the body form of a human body and forming a virtual object corresponding to the motion of the human body;
the human body capturing unit (1) comprises a camera unit (11), a somatosensory induction module (12), an acquisition module (13) and a reality augmentation data module (14);
the camera shooting unit (11) is used for shooting and recording the action of a human body;
the somatosensory induction module (12) is used for inducing body shape edges of a human body and actions of the human body;
the motion sensing module (12) describes the motion of a human body by adopting infrared sensing in optical sensing, and determines saturated pixel points on the human body in the process of sensing the motion of the human body by adopting the infrared sensing;
the acquisition module (13) is used for acquiring and capturing human body actions and body type edges sensed by the somatosensory sensing module (12);
the reality augmentation data module (14) reads the human body actions and body types captured by the acquisition module (13) by adopting a three-view and multi-view tracking technology, and superimposes virtual objects according to the human body actions and body types;
the image processing unit (2) is used for matting the portrait captured by the human body capturing unit (1) out of the original scene;
the diversified interaction unit (3) is used for controlling commands in the system through human body actions captured by the human body capturing unit (1);
the sharing unit (4) is used for storing the interaction process of the human body in the diversified interaction unit (3).
2. The AR interaction system for health and wellness science popularization of claim 1, wherein: the somatosensory induction module (12) describes saturated pixel points of a human body according to the following formula:
N = ηqSPt′/(hνQth) + wi
wherein i is the number of rings through which carriers diffuse outward, q is the electronic charge, P is the incident light power, t′ is the time for the infrared light to reach the human body, Qth is the carrier overflow limit, wi is the number of saturated pixels on the i-th crosstalk ring, N is the total charge count of the saturated pixels, S is the area of the object irradiated by the infrared light, η is the photoelectric conversion efficiency, h is the Planck constant and ν is the frequency of the infrared light.
3. The AR interaction system for health care science popularization of claim 2, wherein: the three-view and multi-view tracking technique of the reality augmentation data module (14) adopts the Harris corner detection algorithm, whose corner response takes the standard form:
cim = Ix²·Iy² − (Ix·Iy)² − k·(Ix² + Iy²)²
wherein cim is the corner measure of pixel I, Ix and Iy are the amounts of change of pixel I in the horizontal and vertical directions respectively, the squared terms are accumulated over a local window, and k is an empirical constant.
4. The AR interaction system for health and wellness science popularization of claim 1, wherein: the image processing unit (2) comprises an image adjusting module (21) and a portrait matting module (22);
the image adjusting module (21) is used for reducing the influence of the human body's background and of ambient light, so that the portrait matting module (22) can matte the human body out of the streaming media under ordinary conditions;
the portrait matting module (22) is used for extracting the human body captured by the human body capturing unit (1).
5. The AR interaction system for health and wellness science popularization of claim 1, wherein: the diversified interaction unit (3) comprises a scene establishing module (31), a man-machine interaction module (32) and an air-separating operation module (33);
the scene establishing module (31) is used for setting up the selected scene and placing the portrait matted out by the image processing unit (2) into that scene;
the human-computer interaction module (32) is used for controlling a virtual object formed by the human body capturing unit (1) by a human body;
the air-separating operation module (33) is used for controlling the virtual object to operate the object appearing in the system by the human body.
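A minimal sketch of the scene-establishing step, compositing the matted portrait into a selected scene image (the array shapes, paste position, and function name are illustrative, not from the patent):

```python
import numpy as np


def place_in_scene(scene, portrait, mask, top, left):
    """Paste the cut-out portrait into the scene at (top, left), copying only
    pixels where mask is True. A sketch of what the scene establishing
    module (31) does with the matte from the image processing unit (2)."""
    out = scene.copy()
    h, w = mask.shape
    region = out[top:top + h, left:left + w]  # view into the output image
    region[mask] = portrait[mask]             # boolean assignment writes through
    return out
```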
6. The AR interaction system for health and wellness science popularization of claim 5, wherein: the diversified interaction unit (3) further comprises a scene customization module (34), and the scene customization module (34) is used for importing custom 2D/3D scene resources.
7. The AR interaction system for health and wellness science popularization of claim 1, wherein: the sharing unit (4) comprises an automatic screen capturing module (41), a real-time sharing module (42) and a picture saving module (43);

the automatic screen capturing module (41) is used for automatically capturing screenshots of the human body within the diversified interaction unit (3);

the real-time sharing module (42) is used for sharing the screenshots captured by the automatic screen capturing module (41) to social media;

the picture saving module (43) is used for saving the screenshots captured by the automatic screen capturing module (41).
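The saving step can be as simple as serializing the captured frame to an image file. A stdlib-only sketch using the binary PPM format (the format choice and function name are illustrative, not from the patent):

```python
def save_screenshot_ppm(pixels, path):
    """Write an RGB frame (list of rows of (r, g, b) tuples) as binary PPM.
    A minimal stand-in for the picture saving module (43)."""
    height, width = len(pixels), len(pixels[0])
    with open(path, "wb") as f:
        # PPM header: magic, width, height, max channel value
        f.write(f"P6 {width} {height} 255\n".encode("ascii"))
        for row in pixels:
            for r, g, b in row:
                f.write(bytes((r, g, b)))
```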
8. An apparatus for AR interaction for health and wellness science popularization, characterized by comprising: the AR interaction system for health and wellness science popularization as claimed in any one of claims 1 to 7, a control host (5), a somatosensory camera (6) and a large-screen display (7).
9. The apparatus for AR interaction for health and wellness science popularization of claim 8, wherein: the human body capturing unit (1), the image processing unit (2), the diversified interaction unit (3) and the sharing unit (4) are all arranged inside the control host (5).
10. The apparatus for AR interaction for health and wellness science popularization of claim 8, wherein: the somatosensory camera (6) is hinged to one side of the control host (5), and the signal line of the somatosensory camera (6) is connected to the control host (5).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110821330.7A CN113552943A (en) | 2021-07-20 | 2021-07-20 | AR interaction system and device for health and health science popularization |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113552943A true CN113552943A (en) | 2021-10-26 |
Family
ID=78103612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110821330.7A Pending CN113552943A (en) | 2021-07-20 | 2021-07-20 | AR interaction system and device for health and health science popularization |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113552943A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090111434A1 (en) * | 2007-10-31 | 2009-04-30 | Motorola, Inc. | Mobile virtual and augmented reality system |
CN103177468A (en) * | 2013-03-29 | 2013-06-26 | 渤海大学 | Three-dimensional motion object augmented reality registration method based on no marks |
CN104503686A (en) * | 2014-12-25 | 2015-04-08 | 宋小波 | System device for supporting large screen interaction |
CN104731343A (en) * | 2015-04-14 | 2015-06-24 | 上海云富网络科技有限公司 | Virtual reality man-machine interaction children education experience system based on mobile terminal |
CN106097435A (en) * | 2016-06-07 | 2016-11-09 | 北京圣威特科技有限公司 | A kind of augmented reality camera system and method |
CN108492633A (en) * | 2018-03-26 | 2018-09-04 | 山东英才学院 | A method of realizing children's complementary education using AR |
CN108845670A (en) * | 2018-06-27 | 2018-11-20 | 苏州馨镜家园软件科技有限公司 | A kind of online virtual fitness entertainment systems and method based on somatosensory device |
CN110362209A (en) * | 2019-07-23 | 2019-10-22 | 辽宁向日葵教育科技有限公司 | A kind of MR mixed reality intelligent perception interactive system |
CN110427100A (en) * | 2019-07-03 | 2019-11-08 | 武汉子序科技股份有限公司 | A kind of movement posture capture system based on depth camera |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102473041B (en) | Image recognition device, operation determination method, and program | |
TWI534661B (en) | Image recognition device and operation determination method and computer program | |
JP5896578B2 (en) | Data input device | |
TWI701941B (en) | Method, apparatus and electronic device for image processing and storage medium thereof | |
KR101815020B1 (en) | Apparatus and Method for Controlling Interface | |
US20160325096A1 (en) | System and method for processing sensor data for the visually impaired | |
EP2950180B1 (en) | Method for determining screen display mode and terminal device | |
WO2014075418A1 (en) | Man-machine interaction method and device | |
CN106101687A (en) | VR image capturing device and VR image capturing apparatus based on mobile terminal thereof | |
US20150033157A1 (en) | 3d displaying apparatus and the method thereof | |
JP5341126B2 (en) | Detection area expansion device, display device, detection area expansion method, program, and computer-readable recording medium | |
CN110308832A (en) | Display control apparatus and its control method and storage medium | |
US20150009123A1 (en) | Display apparatus and control method for adjusting the eyes of a photographed user | |
CN113552943A (en) | AR interaction system and device for health and health science popularization | |
WO2006097722A2 (en) | Interface control | |
CN113552944B (en) | Wisdom propaganda system | |
CN103995586B (en) | Non- wearing based on virtual touch screen refers to gesture man-machine interaction method | |
CN116129526A (en) | Method and device for controlling photographing, electronic equipment and storage medium | |
CN110858095A (en) | Electronic device capable of being controlled by head and operation method thereof | |
US20230010947A1 (en) | Electronic apparatus, and method for displaying image on display device | |
CN205946041U (en) | A mobile terminal for taking VR image and VR image imaging system thereof | |
KR101720607B1 (en) | Image photographing apparuatus and operating method thereof | |
KR102138620B1 (en) | 3d model implementation system using augmented reality and implementation method thereof | |
CN105975165A (en) | Display control method for fisheye menu | |
CN114898505A (en) | Ultra-thin contact-free WYSIWYG intelligent image acquisition equipment and acquisition method |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |