CN108282615A - Surrounding environment scanning method and system - Google Patents
Surrounding environment scanning method and system
- Publication number
- CN108282615A CN108282615A CN201810084872.9A CN201810084872A CN108282615A CN 108282615 A CN108282615 A CN 108282615A CN 201810084872 A CN201810084872 A CN 201810084872A CN 108282615 A CN108282615 A CN 108282615A
- Authority
- CN
- China
- Prior art keywords
- surrounding environment
- data
- capture
- scanning method
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
Abstract
The invention discloses a surrounding environment scanning method and system. The method includes: capturing surrounding environment data from multiple directions at the same time; processing the captured surrounding environment data, where the processing includes stitching the data from the multiple directions in real time; and generating composite surrounding environment information according to the result of the processing. The system includes a main frame, multiple data capture modules, and a processing module. Because the present invention captures surrounding environment data from multiple directions at the same time, it is not limited by scanning range or data acquisition, is more efficient, and can guarantee scanning accuracy; when processing the captured data it can stitch the data from the multiple directions in real time, integrating multiple scans with better real-time performance. The invention can be widely applied in the field of environmental scanning.
Description
Technical field
The present invention relates to the field of environmental scanning, and in particular to an indoor surrounding environment scanning method and system.
Background art
In recent years, laser and computer technologies have developed rapidly, and environmental scanning technology is being applied ever more widely in context awareness, navigation, positioning, and related fields. Taking Google Earth and Google Street View as examples, they provide high-precision 360-degree panoramic photographs based on GPS positioning information, greatly facilitating user operations such as navigation and route planning; their applications have expanded to all aspects related to spatial distribution, such as natural environment monitoring and analysis, resource survey and exploitation, and communication and navigation. However, current environmental scanning technology is mostly aimed at outdoor environments, and scanning schemes for indoor environments remain rare. Driven by enormous application demands in digital cities, emergency response simulation and training, digital cultural heritage, exhibitions, and the like, especially in typical scenarios such as counter-terrorism, fire fighting, exhibitions, and smart buildings, the indoor environment information obtained by indoor environment scanning is indispensable basic data. Unlike outdoor scenes, indoor environments are characterized by short distances, many corners, frequent occlusion, complex illumination, and a lack of absolute positioning, so obtaining indoor environment information efficiently and accurately is a challenging task. The commonly used solution is to scan the indoor environment point by point with a scanning device, but this manual scanning mode is inefficient (especially for large-scale indoor scenes, such as scanning a 10,000-square-metre museum) and its accuracy is hard to guarantee. Moreover, current point-by-point scanning is limited by scanning range and data acquisition: it cannot scan indoor environment data in multiple directions at the same time, nor can it integrate multiple scans, so its real-time performance is poor.
Summary of the invention
In order to solve the above technical problems, the object of the present invention is to provide an efficient, high-precision, real-time surrounding environment scanning method and system.
The first technical solution adopted by the present invention is:
A surrounding environment scanning method, including the following steps:
capturing surrounding environment data from multiple directions at the same time;
processing the captured surrounding environment data, where the processing includes stitching the surrounding environment data from the multiple directions in real time; and
generating composite surrounding environment information according to the result of the processing.
Further, the number of directions is 4.
Further, the surrounding environment data from the multiple directions are captured by an autonomous robot.
Further, the autonomous robot includes at least four cameras, and the at least four cameras are used to capture surrounding environment data from the 4 directions.
Further, each camera includes a fisheye lens, and the fisheye lens is used to capture surrounding environment data of non-spherical structure.
Further, the surrounding environment is an indoor environment.
Further, the surrounding environment data from the multiple directions include environmental data of overlap regions, where an overlap region is the common capture region shared when surrounding environment data are captured from different directions.
Further, the processing also includes the following step:
matching and identifying the environmental data of an overlap region, and re-capturing the environmental data of the overlap region when the environmental data of the overlap region differ.
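The three steps of the method (simultaneous multi-directional capture, real-time stitching, and composite generation) can be sketched in miniature as follows. This is an illustrative sketch only: the patent names no implementation, and the NumPy arrays, the fixed-width overlap averaging, and all function names here are stand-ins for real camera frames and a real stitching algorithm.

```python
import numpy as np

def capture_all(cameras):
    """Capture one frame from every direction at the same time step."""
    return [cam() for cam in cameras]

def stitch(frames, overlap=8):
    """Stitch frames horizontally, averaging the shared overlap columns."""
    result = frames[0]
    for frame in frames[1:]:
        blend = (result[:, -overlap:] + frame[:, :overlap]) / 2
        result = np.hstack([result[:, :-overlap], blend, frame[:, overlap:]])
    return result

# Four stub "cameras", one per direction, each yielding a 4x16 grayscale frame.
cameras = [lambda v=v: np.full((4, 16), v, dtype=float) for v in (10, 20, 30, 40)]
frames = capture_all(cameras)   # step 1: simultaneous multi-directional capture
panorama = stitch(frames)       # step 2: real-time stitching
composite = panorama.mean()     # step 3: composite environment info (toy summary)
print(panorama.shape)           # (4, 4*16 - 3*8) = (4, 40)
```

With four frames of width 16 and an 8-column overlap, the stitched width is 16 + 3 x 8 = 40; a real system would of course blend registered image content rather than constant frames.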
The second technical solution adopted by the present invention is:
A surrounding environment scanning system, including:
a main frame, for providing support;
multiple data capture modules mounted on the main frame, for capturing environmental data from multiple directions; and
a processing module, communicatively coupled to the multiple data capture modules, for stitching the environmental data from the multiple directions to generate composite surrounding environment information.
Further, the main frame, the multiple data capture modules, and the processing module form an autonomous robot.
Further, each of the multiple data capture modules is a camera.
Further, each camera includes a fisheye lens, and the fisheye lens is used to capture spherical images.
Further, the composite surrounding environment information is generated in real time.
The beneficial effects of the invention are: the surrounding environment scanning method and system of the present invention can capture surrounding environment data from multiple directions at the same time, without being limited by scanning range or data acquisition, and are therefore more efficient while guaranteeing scanning accuracy; when processing the captured data, the surrounding environment data from the multiple directions can be stitched in real time, integrating multiple scans with better real-time performance.
Description of the drawings
Fig. 1 is a structural schematic diagram of a preferred embodiment of the surrounding environment scanning system of the present invention;
Fig. 2 is a structural schematic diagram of another preferred embodiment of the surrounding environment scanning system of the present invention;
Fig. 3 is a block diagram of the internal structure of the processing module of the present invention;
Fig. 4 is an overall flowchart of an embodiment of the surrounding environment scanning method of the present invention.
Detailed description of the embodiments
The present invention is further explained and illustrated below through specific embodiments in conjunction with the accompanying drawings.
Referring to Fig. 1, the scanning system of this embodiment for generating real-time surrounding environment information may adopt a manual control mode, an autonomous robot control mode, or a combination of the two, and may also be controlled remotely through a mobile application. As shown in Fig. 1, the scanning system mainly includes a main frame 102 and multiple support legs 104. Multiple data capture modules 106 are mounted on the main frame 102, and each data capture module in the multiple data capture modules 106 captures the environmental data of a corresponding direction. The main frame 102 may be made of any one, or any combination, of wood, metal, alloy, plastic, rubber, and fiber. The shape of the main frame 102 shown in Fig. 1 is only for convenience of explanation; those skilled in the art will appreciate that the main frame 102 may have any shape and size. The multiple support legs 104 provide support for the main frame 102 and adjust the height of the multiple data capture modules 106, so that the multiple data capture modules 106 can complete multi-directional scanning of the entire environment area at a given height in a single pass.
As a further preferred embodiment, each data capture module in the multiple data capture modules is a camera.
As a further preferred embodiment, the camera includes a fisheye lens. The fisheye lens captures a spherical or non-spherical structural view of the region in the corresponding direction, enhancing the visual impact in order to highlight the region of interest.
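The patent does not specify a projection model for the fisheye lens; as background, one common model for such lenses is the equidistant fisheye projection, in which a ray at angle theta from the optical axis lands at image radius r = f * theta. A minimal sketch follows; the focal length and image-centre values are arbitrary assumptions for illustration:

```python
import numpy as np

def equidistant_fisheye_project(point, f=300.0, cx=640.0, cy=640.0):
    """Project a 3D point (camera frame, z forward) with the equidistant
    fisheye model r = f * theta, theta being the angle off the optical axis."""
    x, y, z = point
    theta = np.arctan2(np.hypot(x, y), z)   # angle off the optical axis
    phi = np.arctan2(y, x)                  # azimuth around the axis
    r = f * theta                           # equidistant mapping
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

# A point straight ahead lands at the image centre; a point 90 degrees
# off-axis lands at radius f * pi/2 from the centre.
u0, v0 = equidistant_fisheye_project((0.0, 0.0, 1.0))
u1, v1 = equidistant_fisheye_project((1.0, 0.0, 0.0))
print(u0, v0, u1 - 640.0)
```

This is why a single such lens can cover a near-hemispherical (or wider) field of view, which is what allows four of them to tile the surroundings.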
As a further preferred embodiment, the multiple support legs include at least one moving device for moving the entire scanning system. The moving device may be a wheel that can slide freely in any direction, so that the entire scanning system can move autonomously (corresponding to the autonomous robot control mode) or move under control (corresponding to the manual control mode) to a target position for real-time motion scanning, solving the problem that existing point-by-point scanning cannot achieve real-time motion scanning.
The scanning system of this embodiment further includes a processing module (not shown in Fig. 1). The processing module is connected to the multiple data capture modules 106, receives the multi-directional environmental data captured by the multiple data capture modules 106, and performs further processing (such as alignment and stitching).
Referring to Fig. 2, the scanning system of this embodiment for generating real-time surrounding environment information may use 4 data capture modules. As shown in Fig. 2, the 4 data capture modules 106 capture data from the indoor environment and send them to the processing module 204 for further processing.
As a further preferred embodiment, each of the 4 data capture modules 106 (data capture module 106A, data capture module 106B, data capture module 106C, and data capture module 106D) may have a corresponding scanning extension 202A-202D on its outer side (the 4 extensions are collectively referred to as the scanning extensions 202). The scanning extensions 202 extend the field of view of the shot by means of a camera with a wider field of view (such as a fisheye camera), preventing data of the indoor environment from being missed by the scan and improving scanning quality.
As a further preferred embodiment, each of the 4 scanning extensions 202 corresponding to the 4 data capture modules 106 may overlap the others, so that no detail data are missed in the indoor environment scan. The overlap of the scanning extensions 202 may be fixed manually, or may be completed automatically by the processing module 204 before scanning starts. The 4 data capture modules 106 can therefore start the data capture process together.
Referring to Fig. 3, the internal components of the processing module 204 of this embodiment include a camera control module 302, a camera data capture module 304, a data comparison module 306, a combined environment generator 308, and a memory 310. The processing module 204 of this embodiment may be an independent processor with various internal hardware processing components, or its internal components may be implemented in software. The processing module 204 of this embodiment may also be a set of multiple processors that together realize the same functions as an independent processor.
The camera control module 302 is used to control each of the multiple data capture modules, or each of the multiple cameras. The camera control module 302 can, to a certain extent, set camera parameters such as the field of view, angle, diffusion parameters, and motion parameters.
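The parameter set attributed to the camera control module 302 can be pictured as a per-camera configuration record. The field names below are illustrative assumptions chosen to mirror the parameters the text lists (field of view, angle, diffusion, and motion); they are not an interface defined by the patent:

```python
from dataclasses import dataclass, asdict

@dataclass
class CameraConfig:
    """Illustrative per-camera settings for a control module like 302."""
    camera_id: str
    fov_deg: float = 180.0      # fisheye field of view
    yaw_deg: float = 0.0        # mounting angle (direction the camera faces)
    diffusion: float = 0.0      # lens diffusion parameter
    move_speed: float = 0.0     # motion parameter for scanning while moving

def make_four_camera_rig():
    """One camera per direction, 90 degrees apart, as in the 4-module embodiment."""
    return [CameraConfig(camera_id=f"cam{i}", yaw_deg=90.0 * i) for i in range(4)]

rig = make_four_camera_rig()
print([c.yaw_deg for c in rig])   # [0.0, 90.0, 180.0, 270.0]
```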
The camera data capture module 304 receives the multi-directional environmental data captured by the multiple data capture modules 106, identifies which data capture module captured the received data, and stores the received data in the memory 310 in a time list under the corresponding data capture module.
The data comparison module 306 compares the data from the multiple data capture modules, identifies the differences among these data, and stores the identified differences in a corresponding difference list in the memory 310. The data comparison module 306 also checks and compares the captured data of the overlap regions of the scanning extensions: if data from the same overlap region but captured by different data capture modules differ, the overlap region is scanned again using the same data capture module, so that the data of the overlap region become consistent.
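The overlap-region consistency check performed by the data comparison module 306 can be sketched as follows. The mean-absolute-difference metric, the tolerance value, and the function names are illustrative assumptions; the patent only states that differing overlap data trigger a rescan:

```python
import numpy as np

def overlap_differs(patch_a, patch_b, tol=2.0):
    """Return True if two captures of the same overlap region disagree
    by more than `tol` in mean absolute difference."""
    return float(np.mean(np.abs(patch_a - patch_b))) > tol

def check_overlaps(patches, tol=2.0):
    """patches: list of (module_pair, patch_a, patch_b) tuples.
    Returns the module pairs whose overlap region must be re-captured."""
    return [pair for pair, a, b in patches if overlap_differs(a, b, tol)]

# Two toy overlap regions: one consistent pair, one stale pair.
consistent = (("106A", "106B"), np.full((4, 4), 50.0), np.full((4, 4), 51.0))
stale = (("106B", "106C"), np.full((4, 4), 50.0), np.full((4, 4), 80.0))
rescan = check_overlaps([consistent, stale])
print(rescan)   # [('106B', '106C')]
```

In the described system, each flagged pair would then be rescanned by the same data capture module until its overlap data agree.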
The combined environment generator 308 aligns and stitches together the various views of the indoor surrounding environment (the data from the multiple data capture modules), generates a composite three-dimensional map of the indoor surrounding environment region in real time, and stores it in the memory 310.
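The alignment step performed by the combined environment generator 308 before stitching can be illustrated with a brute-force overlap search: try each candidate overlap width between two neighbouring views and join them where they agree best. This is a toy stand-in for real feature-based alignment, under the assumption of purely horizontal overlap:

```python
import numpy as np

def best_overlap(left, right, max_overlap=12):
    """Find the overlap width at which the right edge of `left` best
    matches the left edge of `right` (smallest mean squared error)."""
    best_w, best_err = 1, np.inf
    for w in range(1, max_overlap + 1):
        err = np.mean((left[:, -w:] - right[:, :w]) ** 2)
        if err < best_err:
            best_w, best_err = w, err
    return best_w

def align_and_stitch(left, right, max_overlap=12):
    """Align two views by their best overlap width, then join them there."""
    w = best_overlap(left, right, max_overlap)
    return np.hstack([left, right[:, w:]])

# Two views of a shared gradient scene, truly overlapping by 5 columns.
scene = np.tile(np.arange(30.0), (4, 1))
left, right = scene[:, :20], scene[:, 15:]
stitched = align_and_stitch(left, right)
print(best_overlap(left, right), stitched.shape)   # 5 (4, 30)
```

The search recovers the true 5-column overlap, and the stitched result reproduces the original scene; production systems replace this exhaustive search with feature matching and homography estimation.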
Referring to Fig. 4, the detailed flow of one embodiment of the method for scanning the surrounding environment is as follows:
Step 402: Initialization. The processing module 204 starts the multiple data capture modules 106, which begin to scan and capture surrounding environment data. The instructions with which this embodiment controls the multiple data capture modules 106 can be stored in the memory 310. In addition, the processing module 204 of this embodiment may also receive instructions from a remote device (such as a mobile application) in order to control the multiple data capture modules 106.
Step 404: Process the data captured by the multiple data capture modules 106. Specifically, step 404 may include step 4042: matching/aligning the data captured from the multiple directions and stitching them together.
Step 406: Generate combined environment data from the stitched data; the combined environment data describe the indoor surrounding environment in the form of a three-dimensional view.
Step 408: Update the generated combined environment data according to new environment information captured by the multiple data capture modules 106.
The process composed of steps 402-408 above is repeated at a preset interval, until scanning is complete and a predetermined amount of the captured data no longer changes in subsequent periodic updates. In addition, as described above, scanning can also be stopped manually by an instruction from the remote device.
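The loop over steps 402-408, repeated at a preset interval until the captured data stop changing, can be sketched as a simple control loop. The stop criterion (a fixed number of consecutive unchanged composites) and the stub capture function are illustrative assumptions, not details fixed by the patent:

```python
def scan_loop(capture, stitch, max_rounds=100, stable_rounds=3):
    """Repeat capture -> stitch -> update until `stable_rounds` consecutive
    captures produce no change in the composite data (the 402-408 loop)."""
    composite, unchanged = None, 0
    for _ in range(max_rounds):
        new_composite = stitch(capture())         # steps 402/404/406
        if new_composite == composite:            # step 408: did anything change?
            unchanged += 1
            if unchanged >= stable_rounds:
                break                             # scan complete
        else:
            composite, unchanged = new_composite, 0
    return composite

# Stub capture: the environment changes for 4 rounds, then becomes static.
readings = iter([1, 2, 3, 4] + [4] * 10)
result = scan_loop(lambda: next(readings), stitch=lambda d: d)
print(result)   # 4
```

A manual stop from the remote device would simply break out of the same loop on an external signal.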
The flowcharts and/or block diagrams of the above method and system describe embodiments of the present invention in detail. Those skilled in the art will readily appreciate that each block in the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may implement the means for the actions specified in one or more blocks of the flowcharts and/or block diagrams, and may be provided to the processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce machine instructions executable by the processor of a computer or other programmable data processing apparatus. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to operate in a specific manner, so that the instructions stored in the computer-readable memory produce a device realizing the actions specified in one or more blocks of the flowcharts and/or block diagrams. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause the computer or other programmable apparatus to perform a series of operations, so that the computer or other programmable apparatus realizes, according to the loaded instructions, the actions or steps specified in one or more blocks of the flowcharts and/or block diagrams.
In addition, the step numbers and module numbers in the embodiments of the present invention are provided only for convenience of explanation; they place no restriction on the order of the steps or on the connection relationships between modules, and the execution order of the steps and the connection relationships between the modules in the embodiments can be adaptively adjusted according to the understanding of those skilled in the art.
The above describes the preferred implementation of the present invention, but the present invention is not limited to the above embodiments. Those skilled in the art can make various equivalent variations or replacements without departing from the spirit of the present invention, and these equivalent variations or replacements are all included within the scope defined by the claims of this application.
Claims (10)
1. A surrounding environment scanning method, characterized by including the following steps:
capturing surrounding environment data from multiple directions at the same time;
processing the captured surrounding environment data, where the processing includes stitching the surrounding environment data from the multiple directions in real time; and
generating composite surrounding environment information according to the result of the processing.
2. The surrounding environment scanning method according to claim 1, characterized in that the number of directions is 4.
3. The surrounding environment scanning method according to claim 2, characterized in that the surrounding environment data from the multiple directions are captured by an autonomous robot.
4. The surrounding environment scanning method according to claim 3, characterized in that the autonomous robot includes at least four cameras, and the at least four cameras are used to capture surrounding environment data from the 4 directions.
5. The surrounding environment scanning method according to claim 4, characterized in that the camera includes a fisheye lens, and the fisheye lens is used to capture surrounding environment data of non-spherical structure.
6. The surrounding environment scanning method according to claim 1, characterized in that the surrounding environment data from the multiple directions include environmental data of overlap regions, where an overlap region is the common capture region shared when surrounding environment data are captured from different directions.
7. The surrounding environment scanning method according to claim 6, characterized in that the processing further includes the following step: matching and identifying the environmental data of an overlap region, and re-capturing the environmental data of the overlap region when the environmental data of the overlap region differ.
8. A surrounding environment scanning system, characterized by including:
a main frame, for providing support;
multiple data capture modules mounted on the main frame, for capturing environmental data from multiple directions; and
a processing module, communicatively coupled to the multiple data capture modules, for stitching the environmental data from the multiple directions to generate composite surrounding environment information.
9. The surrounding environment scanning system according to claim 8, characterized in that each of the multiple data capture modules is a camera.
10. The surrounding environment scanning system according to claim 9, characterized in that the camera includes a fisheye lens, and the fisheye lens is used to capture spherical images.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762580468P | 2017-11-02 | 2017-11-02 | |
US62/580,468 | 2017-11-02 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108282615A true CN108282615A (en) | 2018-07-13 |
CN108282615B CN108282615B (en) | 2021-01-05 |
Family
ID=62805568
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810084872.9A Active CN108282615B (en) | 2017-11-02 | 2018-01-29 | Method and system for scanning surrounding environment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108282615B (en) |
WO (1) | WO2019085497A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112017407A (en) * | 2020-09-08 | 2020-12-01 | 中国科学院上海微系统与信息技术研究所 | Integrated wireless landmark monitoring device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5023725A (en) * | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US5137238A (en) * | 1991-09-09 | 1992-08-11 | Hutten Friedrich W | Fast access camera mounting device |
CN1745337A (en) * | 2002-12-05 | 2006-03-08 | 索尼株式会社 | Imaging device |
CN101577795A (en) * | 2009-06-17 | 2009-11-11 | 深圳华为通信技术有限公司 | Method and device for realizing real-time viewing of panoramic picture |
CN105678729A (en) * | 2016-02-24 | 2016-06-15 | 段梦凡 | Splicing method for panoramic images of fish-eye lenses |
CN106161937A (en) * | 2016-07-23 | 2016-11-23 | 徐荣婷 | A kind of panoramic shooting machine people |
CN106572313A (en) * | 2016-10-31 | 2017-04-19 | 腾讯科技(深圳)有限公司 | Video recording box and video recording method and device |
US20170187956A1 (en) * | 2015-12-29 | 2017-06-29 | VideoStitch Inc. | System for processing data from an omnidirectional camera with multiple processors and/or multiple sensors connected to each processor |
CN107027338A (en) * | 2014-05-06 | 2017-08-08 | 扎卡里亚·尼亚齐 | Imaging system, methods and applications |
CN107219615A (en) * | 2017-07-31 | 2017-09-29 | 武汉赫天光电股份有限公司 | Panoramic optical systems and electronic equipment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101437155A (en) * | 2007-11-12 | 2009-05-20 | 詹雨明 | Omnidirection circumstance video monitoring system |
US10531071B2 (en) * | 2015-01-21 | 2020-01-07 | Nextvr Inc. | Methods and apparatus for environmental measurements and/or stereoscopic image capture |
2018
- 2018-01-29 CN CN201810084872.9A patent/CN108282615B/en active Active
- 2018-06-15 WO PCT/CN2018/091535 patent/WO2019085497A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2019085497A1 (en) | 2019-05-09 |
CN108282615B (en) | 2021-01-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
PP01 | Preservation of patent right | ||
Effective date of registration: 20220704 | Granted publication date: 20210105