CN105334959B - Gesture-based motion control system and method in a virtual reality environment - Google Patents
- Publication number
- CN105334959B CN105334959B CN201510695303.4A CN201510695303A CN105334959B CN 105334959 B CN105334959 B CN 105334959B CN 201510695303 A CN201510695303 A CN 201510695303A CN 105334959 B CN105334959 B CN 105334959B
- Authority
- CN
- China
- Prior art keywords
- data
- gesture motion
- reality environment
- gesture
- action
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a gesture-based motion control system and method for a virtual reality environment. The system includes a gesture capture module, a data transfer module, a parsing and recognition module, and an interaction control module. In the disclosed solution, motion data are collected at the major joints of the hand and transferred to the parsing and recognition module over a combination of wired and wireless connections, improving gesture recognition speed and accuracy. A mapping table between gesture motions and action commands in the virtual reality environment is built by combining the Android system with the Unity game engine; the table is queried to obtain the action command corresponding to a recognized gesture, enabling gesture-based interactive control of the virtual reality environment. Users can thus control the virtual environment as if they were physically present, for a realistic, comfortable, and accurate operating experience.
Description
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a gesture-based motion control system and method in a virtual reality environment.
Background technique
In recent years, as virtual reality technology has gradually matured, various virtual-reality-related hardware and software products have appeared in succession. However, the operating techniques for virtual reality remain immature: the main approach still relies on traditional input devices such as keyboards, mice, and handheld controllers, none of which achieves a good interaction effect or user experience in a virtual reality environment. Although new interaction technologies for virtual reality environments, such as gesture recognition, are being developed, there is as yet no mature interactive control system on the market that tightly integrates gesture recognition with virtual reality.
Summary of the invention
In view of the lack in the prior art of a mature interactive control system that tightly integrates gesture recognition with virtual reality, the present invention proposes a gesture-based motion control system and method in a virtual reality environment to overcome, or at least partially solve, the above problem.
According to one aspect of the present invention, a gesture-based motion control system in a virtual reality environment is provided. The system includes a gesture capture module, a data transfer module, a parsing and recognition module, and an interaction control module.
The gesture capture module uses motion-sensing capture devices mounted at the major joints of the user's hand to capture, in real time, gesture motions within an effective region, and records the motion data.
The data transfer module transfers the motion data recorded in real time to the parsing and recognition module over a combination of wired and wireless connections.
The parsing and recognition module parses the motion data recorded in real time and recognizes the corresponding gesture motion.
The interaction control module is preset with a mapping table between gesture motions and action commands in the virtual reality environment; it queries the table to obtain the action command corresponding to the gesture motion and interacts with the virtual reality environment according to that command.
Optionally, the parsing and recognition module includes a culling unit. Before the motion data recorded in real time are parsed, the culling unit rejects redundant and invalid data, including duplicate data, data that deviate greatly from the previous data curve, erroneous data generated during transmission, and data whose timestamps do not match.
Optionally, the parsing and recognition module further includes a grouping unit and a parsing unit. The grouping unit groups the data output by the culling unit according to the available computing capacity and places the groups into a gesture-data buffer sequence. The parsing unit parses the gesture data in the buffer sequence group by group and recognizes one valid gesture motion.
Optionally, the gesture motion is obtained under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
Optionally, the mapping table between gesture motions and action commands in the virtual reality environment can be modified for different virtual reality environments.
According to another aspect of the present invention, a gesture-based motion control method in a virtual reality environment is provided. The method includes:
using motion-sensing capture devices mounted at the major joints of the user's hand to capture, in real time, gesture motions within an effective region, and recording the motion data;
transferring the motion data recorded in real time over a combination of wired and wireless connections;
parsing the motion data recorded in real time and recognizing the corresponding gesture motion;
querying a preset mapping table between gesture motions and action commands in the virtual reality environment to obtain the action command corresponding to the gesture motion, and interacting with the virtual reality environment according to that command.
Optionally, parsing the motion data recorded in real time and recognizing the corresponding gesture motion includes: before parsing, rejecting redundant and invalid data, including duplicate data, data that deviate greatly from the previous data curve, erroneous data generated during transmission, and data whose timestamps do not match.
Optionally, parsing the motion data recorded in real time and recognizing the corresponding gesture motion further includes: grouping the culled data according to the available computing capacity and placing the groups into a gesture-data buffer sequence; and parsing the gesture data in the buffer sequence group by group to recognize one valid gesture motion.
Optionally, the gesture motion is obtained under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
Optionally, the mapping table between gesture motions and action commands in the virtual reality environment is modified for different virtual reality environments.
In conclusion technical solution of the present invention acquires the action data at hand major joint by body-sensing catcher,
The action data recorded in real time is transmitted in such a way that wired connection and wireless connection combine to improve gesture motion identification speed
Degree and precision;By the mapping table of preset gesture motion and action command in inquiry reality environment, gesture is obtained
The action command in reality environment is acted, interacts control according to the action command and reality environment, allowing makes
User can it is on the spot in person as operated in reality environment, bring true, comfortable, accurate operation to user
Experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of a gesture-based motion control system in a virtual reality environment provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the parsing and recognition module in a gesture-based motion control system in a virtual reality environment provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a gesture-based motion control method in a virtual reality environment provided by an embodiment of the present invention.
Specific embodiment
To make the objectives, technical solutions, and advantages of the present invention clearer, embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a gesture-based motion control system in a virtual reality environment provided by an embodiment of the present invention. As shown in Fig. 1, the gesture-based motion control system 100 includes a gesture capture module 110, a data transfer module 120, a parsing and recognition module 130, and an interaction control module 140.
The gesture capture module 110 uses motion-sensing capture devices mounted at the major joints of the user's hand to capture, in real time, gesture motions within the effective region, and records the motion data.
Because human hand motion is highly complex, with every part of the hand changing greatly in position, angle, size, and shape within a short time, it is impractical at the present stage to take the data of the entire hand as gesture data. The present invention instead samples at key points: capture devices are mounted on several major joints of the hand. Since the segments between the joints of the human body are rigid bodies, the overall motion of the entire hand can be derived from the data changes at the joints. In addition, a hand motion is treated as valid only within an effective, operable region, and only valid hand motions are recorded. In this way the volume of collected data is reduced without degrading action recognition, so that more authentic and valid data can be collected in a short time, improving the speed and precision of gesture capture.
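The key-point sampling and effective-region filtering described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the joint names, the `EFFECTIVE_REGION` bounds, and the frame format are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical set of major hand joints carrying capture devices.
MAJOR_JOINTS = ("wrist", "thumb_mcp", "index_mcp",
                "middle_mcp", "ring_mcp", "pinky_mcp")

# Assumed effective region, in metres, relative to the sensor origin.
EFFECTIVE_REGION = {"x": (-0.5, 0.5), "y": (-0.5, 0.5), "z": (0.1, 1.0)}

@dataclass
class JointSample:
    joint: str
    position: tuple       # (x, y, z)
    timestamp_ms: int

def in_effective_region(pos):
    """A motion is recorded only when it occurs inside the effective region."""
    return all(lo <= c <= hi
               for c, (lo, hi) in zip(pos, EFFECTIVE_REGION.values()))

def record_frame(samples):
    """Keep only samples from major joints inside the effective region.

    Because the segments between joints are rigid, these few samples are
    enough for downstream stages to reconstruct the overall hand motion.
    """
    return [s for s in samples
            if s.joint in MAJOR_JOINTS and in_effective_region(s.position)]
```

Recording only a handful of joint samples per frame, rather than a full hand mesh, is what keeps the data volume small enough for real-time transfer and parsing.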
The data transfer module 120 transfers the motion data recorded in real time to the parsing and recognition module over a combination of wired and wireless connections.
Wired transmission over a physical cable is fast, stable, and resistant to interference; it raises the data transfer rate and can handle the concurrent transmission of large amounts of data. However, a cable has a limited length and occupies a certain physical region, which restricts the user's range of movement. The invention therefore also provides wireless transmission, for example Bluetooth or ZigBee. Compared with a cable, wireless transmission sacrifices some stability and speed, but it greatly improves flexibility of use: the operable and movable range is much larger, which provides a better experience in particular virtual reality application scenarios.
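The combined wired/wireless strategy could be arbitrated as in the sketch below. This is a hedged illustration assuming a simple policy (prefer the cable when connected, otherwise fall back to Bluetooth or ZigBee); the patent does not specify the exact arbitration logic, and the `Link` class is a stand-in invented for the example.

```python
class Link:
    """Minimal stand-in for a transport channel (cable, Bluetooth, ZigBee)."""
    def __init__(self, name, connected, bandwidth_kbps):
        self.name = name
        self.connected = connected
        self.bandwidth_kbps = bandwidth_kbps

def choose_link(wired, wireless):
    """Prefer the fast, stable, interference-resistant cable; fall back to a
    wireless channel so the user can move freely when no cable is attached."""
    if wired.connected:
        return wired
    for link in wireless:
        if link.connected:
            return link
    raise ConnectionError("no transport available")
```

Under this policy the cable carries the high-volume concurrent traffic whenever it is present, and the wireless links only trade away speed and stability when mobility requires it.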
The parsing and recognition module 130 parses the motion data recorded in real time and recognizes the corresponding gesture motion.
The interaction control module 140 is preset with a mapping table between gesture motions and action commands in the virtual reality environment; it queries the table to obtain the action command corresponding to the gesture motion and interacts with the virtual reality environment according to that command.
The gesture motion captured in real time is modeled as a specific action of a character or object in the virtual reality environment, so that users can interact with the environment as if they were physically present, for a realistic, comfortable, and accurate operating experience.
Fig. 2 is a schematic diagram of the parsing and recognition module in a gesture-based motion control system in a virtual reality environment provided by an embodiment of the present invention. As shown in Fig. 2, the parsing and recognition module 130 includes a culling unit 131, a grouping unit 132, and a parsing unit 133.
The culling unit 131 rejects redundant and invalid data before the motion data recorded in real time are parsed. Redundant and invalid data include duplicate data, data that deviate greatly from the previous data curve, erroneous data generated during transmission, and data whose timestamps do not match.
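A minimal sketch of the culling step, assuming each sample is a (timestamp, value) pair; the deviation threshold and the monotonic-timestamp check are assumptions made for the example, since the patent does not define them numerically.

```python
def cull(samples, max_jump=0.5):
    """Reject redundant and invalid samples before parsing.

    Drops: exact duplicates of the previously kept sample, samples that
    deviate too far from the existing data curve (here: a jump larger
    than max_jump from the last kept value), and samples whose timestamps
    run backwards (taken as a sign of transmission errors).
    """
    kept = []
    for t, v in samples:
        if kept:
            last_t, last_v = kept[-1]
            if t == last_t and v == last_v:   # duplicate sample
                continue
            if t < last_t:                    # timestamp does not match
                continue
            if abs(v - last_v) > max_jump:    # deviates from the data curve
                continue
        kept.append((t, v))
    return kept
```

Because this filter runs before any gesture parsing, the expensive recognition stage only ever sees the surviving, plausible samples.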
The grouping unit 132 groups the data output by the culling unit 131 according to the available computing capacity and places the groups into a gesture-data buffer sequence.
The parsing unit 133 parses the gesture data in the buffer sequence group by group and recognizes one valid gesture motion.
Because large amounts of redundant and invalid data are removed before parsing and recognition, the computational load is greatly reduced and efficiency improved. Moreover, only the motion data in the buffer sequence need to be processed at a time, which ensures that the current motion can be parsed and recognized quickly and efficiently. As computing capacity grows, only the maximum recognition time for a single gesture needs to be adjusted, and the system can be extended to recognize increasingly complex gesture motions, combined motions, multi-user collaborative motions, and so on, making it easy for the control system of the invention to realize more accurate and refined interactive control.
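The grouping-and-buffering step can be sketched as follows, assuming the "computing capacity" is expressed as a group size in frames; the gesture classifier here is a placeholder argument, not the patent's recognizer.

```python
from collections import deque

def group_into_buffer(frames, group_size):
    """Split culled frames into groups sized to the available computing
    capacity and place them into the gesture-data buffer sequence."""
    buffer_seq = deque()
    for i in range(0, len(frames), group_size):
        buffer_seq.append(frames[i:i + group_size])
    return buffer_seq

def parse_next(buffer_seq, classify):
    """Parse one group from the buffer and recognize one valid gesture.

    Only the data in the buffer sequence are touched, so the current
    motion is parsed quickly regardless of the total history length.
    """
    if not buffer_seq:
        return None
    return classify(buffer_seq.popleft())
```

Raising `group_size` as computing capacity grows corresponds to lengthening the maximum recognition window for a single gesture, which is all that needs adjusting to support more complex or combined motions.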
In one embodiment of the invention, the gesture motion is obtained under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
Unity, developed by Unity Technologies, is a multi-platform, integrated game development tool that lets creators easily build interactive content such as 3D video games, architectural visualizations, and real-time 3D animations; it is a comprehensive, fully integrated professional game engine. Combining the Android system with the Unity game engine realizes the correspondence between gesture motions and action commands in the virtual reality environment, improves the usability of the invention, and makes its interactive control easy to understand and flexible to use.
On this basis, in one embodiment of the invention, the mapping table between gesture motions and action commands in the virtual reality environment can be modified for different virtual reality environments. The interactive control system of the invention can therefore map the same gesture to different action commands in different virtual reality environments, and the set of gestures can conveniently be extended so that correspondingly more interactive actions become available in the virtual reality environment.
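The modifiable mapping table could be as simple as one dictionary per environment, as in this hedged sketch; the gesture and command names are invented for illustration, and the plain-Python table below does not attempt to reproduce the Unity-based construction the patent describes.

```python
# One mapping table per virtual reality environment; the same gesture
# can map to a different action command in each environment.
MAPPING_TABLES = {
    "shooter_demo": {"fist": "grab_weapon", "swipe_left": "turn_left"},
    "gallery_demo": {"fist": "select_artwork", "swipe_left": "previous_room"},
}

def lookup_command(environment, gesture):
    """Query the preset mapping table for the action command that the
    recognized gesture corresponds to in the given environment."""
    table = MAPPING_TABLES.get(environment, {})
    return table.get(gesture)  # None if the gesture is unmapped here

def extend_table(environment, gesture, command):
    """Extending the gesture set extends the available interactions."""
    MAPPING_TABLES.setdefault(environment, {})[gesture] = command
```

Swapping or editing a single dictionary is all it takes to give the same gesture a different meaning in a different application, which matches the per-environment modifiability the embodiment describes.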
Fig. 3 is a flowchart of a gesture-based motion control method in a virtual reality environment provided by an embodiment of the present invention. As shown in Fig. 3, the method comprises:
Step S310: use motion-sensing capture devices mounted at the major joints of the user's hand to capture, in real time, gesture motions within the effective region, and record the motion data.
Step S320: transfer the motion data recorded in real time over a combination of wired and wireless connections.
Step S330: parse the motion data recorded in real time and recognize the corresponding gesture motion.
Step S340: query the preset mapping table between gesture motions and action commands in the virtual reality environment, obtain the action command corresponding to the gesture motion, and interact with the virtual reality environment according to the action command.
In one embodiment of the invention, parsing the motion data recorded in real time and recognizing the corresponding gesture motion includes:
Step S331: before parsing the motion data recorded in real time, reject redundant and invalid data, including duplicate data, data that deviate greatly from the previous data curve, erroneous data generated during transmission, and data whose timestamps do not match.
Step S332: group the culled data according to the available computing capacity and place the groups into the gesture-data buffer sequence.
Step S333: parse the gesture data in the buffer sequence group by group and recognize one valid gesture motion.
In one embodiment of the invention, the gesture motion is obtained under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
In one embodiment of the invention, the mapping table between gesture motions and action commands in the virtual reality environment is modified for different virtual reality environments.
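Steps S310 through S340 can be strung together as a minimal end-to-end sketch. Every function body here is a self-contained placeholder standing in for the corresponding step, not the patented implementation; the frame format and the `recognize` callable are assumptions made for the example.

```python
def run_pipeline(raw_frames, mapping_table, recognize):
    """S310 capture -> S320 transfer -> S330 parse/recognize -> S340 map.

    raw_frames:    motion data already captured at the hand's major joints
                   (None stands in for an invalid/corrupted sample).
    mapping_table: gesture name -> action command for one VR environment.
    recognize:     a callable that turns cleaned frames into a gesture name.
    """
    transferred = list(raw_frames)                       # S320: transfer (stand-in)
    cleaned = [f for f in transferred if f is not None]  # S330a: cull invalid data
    gesture = recognize(cleaned)                         # S330b: recognize gesture
    command = mapping_table.get(gesture)                 # S340: query mapping table
    return gesture, command
```

The returned command is what the interaction control stage would then apply to the virtual reality environment.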
It should be noted that the embodiments of the method shown in Fig. 3 correspond to the embodiments of the system shown in Fig. 1 and Fig. 2; since those have been described in detail above, they are not repeated here.
In conclusion technical solution of the present invention utilizes the body-sensing catcher being mounted at user's hand major joint,
Quick Acquisition gesture motion critical data, it is wired, wireless transmission combine by way of transmit data, to collected data into
Row rejecting processing, and gesture motion is identified by the parsing of remaining valid data, inquire gesture motion and reality environment
In action command mapping table, action command of the gesture motion in reality environment is obtained, with virtual reality ring
Border interacts control.
The present invention has the following advantages. 1. Collecting the key gesture data at the major joints of the hand improves the speed and precision of gesture capture. 2. Combining wired and wireless transmission accommodates the concurrent transmission of large amounts of data on the one hand and, on the other, enlarges the user's range of movement where transmission-speed requirements are low, improving the user experience. 3. Rejecting redundant and invalid data before parsing and recognition reduces the computational load and improves the speed and accuracy of recognition; only the maximum recognition time for a single gesture needs to be adjusted to handle increasingly complex gesture motions, making the system convenient to extend. 4. Accurately capturing hand motions and quickly transferring, parsing, and reflecting them in the virtual reality environment makes use of the environment smoother and the experience better; different gesture-interaction strategies can be provided for different virtual reality environments, improving the user's interactive experience, with strong and wide applicability.
The foregoing descriptions are merely preferred embodiments of the present invention and are not intended to limit its scope of protection. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.
Claims (8)
1. A gesture-based motion control system in a virtual reality environment, characterized in that the system includes: a gesture capture module, a data transfer module, a parsing and recognition module, and an interaction control module;
the gesture capture module uses motion-sensing capture devices mounted at the major joints of the user's hand to capture, in real time, gesture motions within an effective region, and records the motion data;
the data transfer module transfers the motion data recorded in real time to the parsing and recognition module over a combination of wired and wireless connections;
the parsing and recognition module parses the motion data recorded in real time and recognizes the corresponding gesture motion;
the interaction control module is preset with a mapping table between gesture motions and action commands in the virtual reality environment, queries the table to obtain the action command corresponding to the gesture motion in the virtual reality environment, and interacts with the virtual reality environment according to the action command;
the parsing and recognition module includes a culling unit; before the motion data recorded in real time are parsed, the culling unit rejects redundant and invalid data, including duplicate data, erroneous data generated during transmission, and data whose timestamps do not match.
2. The control system according to claim 1, characterized in that the parsing and recognition module further includes a grouping unit and a parsing unit;
the grouping unit groups the data output by the culling unit according to the available computing capacity and places the groups into a gesture-data buffer sequence;
the parsing unit parses the gesture data in the buffer sequence group by group and recognizes one valid gesture motion.
3. The control system according to claim 1, characterized in that the gesture motion is obtained under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
4. The control system according to any one of claims 1-3, characterized in that the mapping table between gesture motions and action commands in the virtual reality environment can be modified for different virtual reality environments.
5. A gesture-based motion control method in a virtual reality environment, characterized in that the method comprises:
using motion-sensing capture devices mounted at the major joints of the user's hand to capture, in real time, gesture motions within an effective region, and recording the motion data;
transferring the motion data recorded in real time over a combination of wired and wireless connections;
parsing the motion data recorded in real time and recognizing the corresponding gesture motion;
querying a preset mapping table between gesture motions and action commands in the virtual reality environment to obtain the action command corresponding to the gesture motion, and interacting with the virtual reality environment according to the action command;
before parsing the motion data recorded in real time, rejecting redundant and invalid data, including duplicate data, erroneous data generated during transmission, and data whose timestamps do not match.
6. The control method according to claim 5, characterized in that parsing the motion data recorded in real time and recognizing the corresponding gesture motion further comprises:
grouping the culled data according to the available computing capacity and placing the groups into a gesture-data buffer sequence;
parsing the gesture data in the buffer sequence group by group and recognizing one valid gesture motion.
7. The control method according to claim 5, characterized in that the gesture motion is obtained under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
8. The control method according to any one of claims 5-7, characterized in that the mapping table between gesture motions and action commands in the virtual reality environment is modified for different virtual reality environments.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510695303.4A CN105334959B (en) | 2015-10-22 | 2015-10-22 | Gesture motion control system and method in a kind of reality environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510695303.4A CN105334959B (en) | 2015-10-22 | 2015-10-22 | Gesture motion control system and method in a kind of reality environment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105334959A CN105334959A (en) | 2016-02-17 |
CN105334959B true CN105334959B (en) | 2019-01-15 |
Family
ID=55285558
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510695303.4A Active CN105334959B (en) | 2015-10-22 | 2015-10-22 | Gesture motion control system and method in a kind of reality environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105334959B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105955469A (en) * | 2016-04-26 | 2016-09-21 | 乐视控股(北京)有限公司 | Control method and device of virtual image |
CN106095068A (en) * | 2016-04-26 | 2016-11-09 | 乐视控股(北京)有限公司 | The control method of virtual image and device |
CN106371602A (en) * | 2016-09-14 | 2017-02-01 | 惠州Tcl移动通信有限公司 | Method and system for controlling virtual reality device based on intelligent wearable device |
CN107885316A (en) * | 2016-09-29 | 2018-04-06 | 阿里巴巴集团控股有限公司 | A kind of exchange method and device based on gesture |
CN106683528A (en) * | 2017-01-13 | 2017-05-17 | 北京黑晶科技有限公司 | Teaching method and system based on VR/AR |
CN107281750A (en) * | 2017-05-03 | 2017-10-24 | 深圳市恒科电子科技有限公司 | VR aobvious action identification methods and VR show |
CN110058673A (en) * | 2018-01-17 | 2019-07-26 | 广西米克尔森科技股份有限公司 | A kind of virtual reality and augmented reality show exchange technology |
CN110209451A (en) * | 2019-05-28 | 2019-09-06 | 南京南方电讯有限公司 | A kind of horse race lamp display system and method based on the superposition of different display engines |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6452584B1 (en) * | 1997-04-23 | 2002-09-17 | Modern Cartoon, Ltd. | System for data management based on hand gestures |
US7084884B1 (en) * | 1998-11-03 | 2006-08-01 | Immersion Corporation | Graphical object interactions |
CN104238738A (en) * | 2013-06-07 | 2014-12-24 | 索尼电脑娱乐美国公司 | Systems and Methods for Generating an Augmented Virtual Reality Scene Within A Head Mounted System |
CN104756045A (en) * | 2012-10-04 | 2015-07-01 | 微软公司 | Wearable sensor for tracking articulated body-parts |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3397772B2 (en) * | 2001-03-13 | 2003-04-21 | キヤノン株式会社 | Sensor mounting device, sensor or marker mounting device |
US20070132722A1 (en) * | 2005-12-08 | 2007-06-14 | Electronics And Telecommunications Research Institute | Hand interface glove using miniaturized absolute position sensors and hand interface system using the same |
US10585478B2 (en) * | 2013-09-13 | 2020-03-10 | Nod, Inc. | Methods and systems for integrating one or more gestural controllers into a head mounted wearable display or other wearable devices |
-
2015
- 2015-10-22 CN CN201510695303.4A patent/CN105334959B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6452584B1 (en) * | 1997-04-23 | 2002-09-17 | Modern Cartoon, Ltd. | System for data management based on hand gestures |
US7084884B1 (en) * | 1998-11-03 | 2006-08-01 | Immersion Corporation | Graphical object interactions |
CN104756045A (en) * | 2012-10-04 | 2015-07-01 | 微软公司 | Wearable sensor for tracking articulated body-parts |
CN104238738A (en) * | 2013-06-07 | 2014-12-24 | 索尼电脑娱乐美国公司 | Systems and Methods for Generating an Augmented Virtual Reality Scene Within A Head Mounted System |
Also Published As
Publication number | Publication date |
---|---|
CN105334959A (en) | 2016-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105334959B (en) | Gesture motion control system and method in a kind of reality environment | |
CN203941499U (en) | A kind of action collection and feedback system based on stereoscopic vision | |
CN104057450B (en) | A kind of higher-dimension motion arm teleoperation method for service robot | |
CN105555486B (en) | Position/force control device, position/force control method | |
CN108777081A (en) | A kind of virtual Dancing Teaching method and system | |
US20120223956A1 (en) | Information processing apparatus, information processing method, and computer-readable storage medium | |
CN105739525A (en) | System of matching somatosensory operation to realize virtual flight | |
CN103399637A (en) | Man-computer interaction method for intelligent human skeleton tracking control robot on basis of kinect | |
CN108983636A (en) | Human-machine intelligence's symbiosis plateform system | |
CN108983982A (en) | AR aobvious equipment and terminal device combined system | |
CN105148514A (en) | Device and method for controlling game view angle | |
WO2021003994A1 (en) | Control method for virtual character, and related product | |
CN106095094A (en) | The method and apparatus that augmented reality projection is mutual with reality | |
CN106406875A (en) | Virtual digital sculpture method based on natural gesture | |
CN107932510A (en) | NAO robot system based on action collection | |
Vu et al. | Emotion recognition based on human gesture and speech information using RT middleware | |
CN207676287U (en) | A kind of virtual reality experience system | |
CN107158659A (en) | A kind of long-range rehabilitation training of upper limbs system and method for game type based on Kinect | |
CN109806580A (en) | Mixed reality system and method based on wireless transmission | |
CN115699198A (en) | Digitization of operating rooms | |
CN106512391A (en) | Two-hand gesture recognition method, and simulation driving system and method based on two-hand gesture recognition method | |
CN111031577B (en) | Multi-node wireless motion capture node expansion method | |
CN111134974B (en) | Wheelchair robot system based on augmented reality and multi-mode biological signals | |
CN106385681A (en) | Virtual reality entertainment system and method thereof | |
CN106155328A (en) | A kind of wearable singly finger manipulates wireless mouse apparatus and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |