CN113052241A - Multi-sensor data fusion method and device and automobile


Info

Publication number
CN113052241A
Authority
CN
China
Prior art keywords
parameter
weight coefficient
parameter set
class weight
value
Prior art date
Legal status
Pending
Application number
CN202110329460.9A
Other languages
Chinese (zh)
Inventor
熊新立
任凡
王宽
陈剑斌
李涛
邓皓匀
Current Assignee
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd
Priority to CN202110329460.9A
Publication of CN113052241A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G06F18/251 - Fusion techniques of input or preprocessed data
    • G06F7/00 - Methods or arrangements for processing data by operating upon the order or content of the data handled
    • G06F7/38 - Methods or arrangements for performing computations using exclusively denominational number representation, e.g. using binary, ternary, decimal representation
    • G06F7/48 - Computations using non-contact-making devices, e.g. tube, solid state device; using unspecified devices
    • G06F7/544 - Computations using non-contact-making devices for evaluating functions by calculation
    • G06F7/5443 - Sum of products

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

This disclosure provides a multi-sensor data fusion method, a multi-sensor data fusion device, and an automobile. The method comprises: acquiring a first parameter set and a second parameter set, both collected for a target to be fused; determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to a first predetermined correspondence between those parameters and weight coefficients, and determining the second-class weight coefficient corresponding to each parameter in the second parameter set according to a second predetermined correspondence; multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient, and multiplying each parameter in the second parameter set by its corresponding second-class weight coefficient; and, for each parameter, adding the two products to obtain the fused value of that parameter for the target to be fused.

Description

Multi-sensor data fusion method and device and automobile
Technical Field
The invention relates to multi-sensor data fusion, and in particular to a multi-sensor data fusion method, a multi-sensor data fusion device, and an automobile.
Background
In recent years, the rapid development of integrated circuits and their design technology, computer technology, sensor technology, and artificial intelligence has brought the application of automatic driving technology into a new stage. Applying automatic driving requires solving key technologies such as multi-sensor fusion, decision making, and control, of which multi-sensor fusion is the foundation. At present, multi-sensor data fusion methods mainly include particle filtering, Kalman filtering, and similar approaches.
Research shows that Kalman filtering is affected by uncertain interference signals in the model; this influence can cause the Kalman filtering algorithm to lose optimality and can greatly reduce estimation precision. Moreover, when multiple sensors observe targets in the same scene simultaneously and data fusion is performed by repeatedly applying Kalman filtering, the time complexity can be high.
Disclosure of Invention
This disclosure provides a multi-sensor data fusion method, a multi-sensor data fusion device, and an automobile, aiming to address the low precision and high complexity of existing sensor target fusion.
The technical scheme of the invention is as follows:
An embodiment of the invention provides a multi-sensor data fusion method, which comprises the following steps:
step S1, acquiring a first parameter set collected by a forward-looking camera for a target to be fused and a second parameter set collected by a front radar for the same target;
step S2, determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to a first predetermined correspondence table between those parameters and weight coefficients, and determining the second-class weight coefficient corresponding to each parameter in the second parameter set according to a second predetermined correspondence table;
step S3, multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient, and multiplying each parameter in the second parameter set by its corresponding second-class weight coefficient;
step S4, for each parameter, adding the two products obtained in step S3 to obtain the fused value of that parameter for the target to be fused. In the first and second predetermined correspondence tables, the first-class and second-class weight coefficients corresponding to the same parameter sum to 1.
Before step S2, the method further comprises:
step S5, performing coordinate system transformation on the first parameter set and the second parameter set, so that both sets are expressed in the same coordinate system.
In the first and second predetermined correspondence tables of step S2, the first-class and second-class weight coefficients corresponding to each parameter are obtained from a preliminary experiment. In that experiment, under each test scenario:
first, the first value, the second value, and the standard value collected for each parameter of a test target by the forward-looking camera, the front radar, and a standard sensor, respectively, are acquired;
then, the first deviation between each parameter's first value and the standard value, and the second deviation between its second value and the standard value, are calculated;
finally, based on the comparison of the first and second deviations for each parameter, a first-class weight coefficient is assigned to each parameter acquired by the forward-looking camera and a second-class weight coefficient to each parameter acquired by the front radar, with the smaller deviation receiving the larger weight coefficient.
When assigning the first-class weight coefficients for the parameters collected by the forward-looking camera and the second-class weight coefficients for the parameters collected by the front radar, the following requirement is met: the sum of the squared differences between each parameter's fused value and its standard value must be smaller than a set error. The fused value of a parameter is the sum of the product of its first value and its assigned first-class weight coefficient and the product of its second value and its assigned second-class weight coefficient.
The embodiment of the invention also provides a multi-sensor data fusion device, which comprises:
the acquisition module is used for acquiring a first parameter set collected by the forward-looking camera for the target to be fused and a second parameter set collected by the front radar for the same target;
the determining module is used for determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to a first predetermined correspondence table between those parameters and weight coefficients, and the second-class weight coefficient corresponding to each parameter in the second parameter set according to a second predetermined correspondence table;
the multiplication module is used for multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient and each parameter in the second parameter set by its corresponding second-class weight coefficient;
the fusion module is used for adding, for each parameter, the two products to obtain the fused value of that parameter for the target to be fused; in the first and second predetermined correspondence tables, the first-class and second-class weight coefficients corresponding to the same parameter sum to 1.
The embodiment of the invention also provides an automobile which comprises the multi-sensor data fusion device.
The invention has the beneficial effects that:
the invention does not need Kalman filtering processing, and can avoid the problem of low precision caused by filtering processing. The invention realizes the fusion of the output attributes of the forward-looking camera and the forward radar based on scene division, and performs qualitative and quantitative comparative analysis on the fusion result, thereby showing the feasibility and reliability of the method provided by the invention. Compared with other methods, the method not only realizes the fusion of the data of the multiple sensors, but also has easy operation and relatively good effect. Based on the above description, the invention has high use value in the multi-sensor data fusion in the intelligent driving automobile.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a multi-sensor data fusion method, which is applied to a fusion processor, and specifically includes:
step S101, a first parameter set acquired by the forward-looking camera for the target to be measured and a second parameter set acquired by the forward radar for the target to be measured are acquired.
Specifically, a forward-looking camera and a front radar installed on the vehicle each collect data while the vehicle is running. After the forward-looking camera captures an image, it can determine from image analysis the type of each target in the image, the lateral and longitudinal distances between the camera and the target, and parameter information such as the lateral acceleration, longitudinal acceleration, and course angle of the target relative to the camera. Similarly, the front radar obtains the same kinds of information from its own processing.
After this processing, the forward-looking camera outputs all associated parameter information for each detected target. In this embodiment, all the parameter information output by the forward-looking camera for one target is referred to as the first parameter set, which contains all of the camera's parameter information for that target.
Similarly, the front radar outputs all associated parameter information for each detected target; all the parameter information output by the front radar for one target is referred to as the second parameter set, which contains all of the radar's parameter information for that target.
Because the forward-looking camera and the front radar may each detect multiple targets during acquisition, when the fusion processor receives information on several targets to be fused from the camera and the radar, it must determine which first parameter set and second parameter set belong to the same target to be fused; a sketch of one possible parameter-set shape follows.
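As a concrete illustration, the following Python sketch shows how one target's parameter sets might be represented; the field names and numeric values are illustrative assumptions, not an interface defined by this patent.

# Hypothetical parameter sets for one target (all field names and values assumed).
camera_params = {                      # first parameter set, from the forward-looking camera
    "target_type": 1,                  # classified target type (e.g. 1 = vehicle)
    "lateral_distance": 3.2,           # m, sensor to target
    "longitudinal_distance": 45.0,     # m
    "lateral_acceleration": 0.1,       # m/s^2, target relative to the sensor
    "longitudinal_acceleration": -0.3, # m/s^2
    "heading_angle": 0.02,             # rad, target course angle
}
radar_params = dict(camera_params, lateral_distance=3.5)  # second parameter set, from the front radar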
Step S102, performing coordinate system conversion on the first parameter set and the second parameter set of the target to be fused, so that both sets are expressed in the same coordinate system.
Specifically, in this embodiment either the first parameter set may be converted into the reference coordinate system of the front radar, or the second parameter set may be converted into the reference coordinate system of the forward-looking camera.
The front radar adopted in this embodiment takes the vehicle as its reference coordinate system, and to facilitate fitting, the first parameter set acquired by the forward-looking camera is transformed into the front radar coordinate system. Since the mounting angle and position of the forward-looking camera are known, the invention uses stereo vision techniques from computer vision to carry out the transformation between the two coordinate systems.
The transformation between any two coordinate systems in computer vision can be represented by a rotation matrix R and a translation vector T. Let the coordinates of a detected target in the forward-looking camera coordinate system and the front radar coordinate system be p and q, respectively; then p and q are related as follows:

p = qR + T

In this embodiment, the specific procedure for the coordinate conversion follows the prior art; a brief sketch is given below.
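As a minimal sketch of this transformation, assuming numpy and placeholder extrinsics (in practice R and T come from the camera's known mounting angle and position):

import numpy as np

# Placeholder extrinsics -- assumed values, not calibration data from the patent.
R = np.eye(3)                    # 3x3 rotation matrix
T = np.array([1.5, 0.0, 0.4])    # translation vector, in metres

def radar_to_camera(q):
    # p = qR + T (row-vector convention): front-radar frame -> camera frame.
    return q @ R + T

def camera_to_radar(p):
    # Inverse relation, used here to bring camera detections into the radar frame.
    return (p - T) @ np.linalg.inv(R)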
Step S103, determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to the first predetermined correspondence table between those parameters and weight coefficients, and determining the second-class weight coefficient corresponding to each parameter in the second parameter set according to the second predetermined correspondence table.
In the first and second predetermined correspondence tables, the first-class and second-class weight coefficients corresponding to each parameter are obtained from a preliminary experiment. Specifically, in that experiment, weight coefficients are first assigned to the parameters collected by the forward-looking camera and the front radar; fused values are then computed by weighting with the assigned coefficients and analysed qualitatively and quantitatively against the standard values collected by a standard sensor. The assigned weight coefficients are considered reasonable once the sum of squared errors between the fused values and the true values falls below a preset allowable error.
Specifically, the sum of squared errors between the calculated fused values and the true values obtained from the RT3K is computed as:

ε = Σ_{i=1..n} (f_i - r_i)²

where f_i is the fused value of the i-th parameter and r_i is the corresponding RT3K true value. When ε is smaller than a given error threshold T, the scene-based multi-sensor target fusion method is considered to realise the fusion of the forward-looking camera and the front radar well.
The standard sensor is a sensor verified in advance to meet the detection-precision requirements for the detected target. In this embodiment, the standard sensor is an RT3K sensor.
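A minimal sketch of this acceptance check, assuming the fused values and the RT3K ground-truth values are supplied as equal-length sequences and that T_err denotes the error threshold T:

import numpy as np

def weights_acceptable(fused_values, true_values, T_err):
    # epsilon = sum over all parameters of (fused value - true value)^2;
    # the assigned weights are accepted when epsilon < T_err.
    fused = np.asarray(fused_values, dtype=float)
    truth = np.asarray(true_values, dtype=float)
    epsilon = float(np.sum((fused - truth) ** 2))
    return epsilon < T_err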
Specifically, in the preliminary experiment, under each test scenario:
first, the first value, the second value, and the standard value collected for each parameter of a test target by the forward-looking camera, the front radar, and the standard sensor, respectively, are acquired;
then, the first deviation between each parameter's first value and the standard value, and the second deviation between its second value and the standard value, are calculated;
finally, based on the comparison of the first and second deviations for each parameter, a first-class weight coefficient is assigned to each parameter acquired by the forward-looking camera and a second-class weight coefficient to each parameter acquired by the front radar, with the smaller deviation receiving the larger weight coefficient.
For the same parameter collected by the forward-looking camera and the front radar, the corresponding first-class and second-class weight coefficients sum to 1; one simple allocation rule satisfying these constraints is sketched below.
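The text fixes only two properties of the allocation: the sensor with the smaller deviation receives the larger weight, and the two weights sum to 1. The sketch below uses inverse-deviation normalisation, one simple rule satisfying both; this particular formula is an assumption, not something the patent prescribes.

def assign_weights(camera_value, radar_value, standard_value):
    # Allocate (first-class, second-class) weight coefficients for one parameter.
    d_cam = abs(camera_value - standard_value)    # first deviation
    d_rad = abs(radar_value - standard_value)     # second deviation
    if d_cam + d_rad == 0.0:
        return 0.5, 0.5                           # both sensors exact: split evenly
    w_cam = d_rad / (d_cam + d_rad)               # smaller deviation -> larger weight
    return w_cam, 1.0 - w_cam                     # the two coefficients sum to 1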
Step S104, multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient, and multiplying each parameter in the second parameter set by its corresponding second-class weight coefficient. For example, the forward-looking camera acquires one lateral distance to a guardrail and the front radar acquires another lateral distance to the same guardrail; the camera's value is multiplied by the first-class weight coefficient for lateral distance, and the radar's value is multiplied by the second-class weight coefficient for lateral distance.
Step S105, for each parameter, adding the two products obtained in step S104 to obtain the fused value of that parameter for the target to be fused.
After the above processing, the fused parameter set required by this embodiment for the target to be fused is obtained; the sketch below puts the whole procedure together.
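Putting steps S101 to S105 together, the following sketch fuses one target's two parameter sets using predetermined weight tables; the table contents and sensor readings are illustrative assumptions only.

# Hypothetical predetermined correspondence tables (parameter -> weight coefficient);
# for each parameter the two coefficients sum to 1.
CAMERA_WEIGHTS = {"lateral_distance": 0.6, "longitudinal_distance": 0.3}  # first table
RADAR_WEIGHTS  = {"lateral_distance": 0.4, "longitudinal_distance": 0.7}  # second table

def fuse(camera_params, radar_params):
    # Weighted fusion of one target's first and second parameter sets:
    # fused value = first product + second product, per parameter.
    return {
        name: camera_params[name] * CAMERA_WEIGHTS[name]
              + radar_params[name] * RADAR_WEIGHTS[name]
        for name in CAMERA_WEIGHTS
    }

# Guardrail example from step S104 (numbers hypothetical): the camera reports a
# lateral distance of 3.2 m and the radar 3.5 m, so the fused lateral distance
# is 3.2*0.6 + 3.5*0.4 = 3.32 m.
fused = fuse({"lateral_distance": 3.2, "longitudinal_distance": 44.0},
             {"lateral_distance": 3.5, "longitudinal_distance": 45.0})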
The embodiment of the invention also provides a multi-sensor data fusion device, which comprises:
the acquisition module is used for acquiring a first parameter set collected by the forward-looking camera for the target to be fused and a second parameter set collected by the front radar for the same target;
the determining module is used for determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to a first predetermined correspondence table between those parameters and weight coefficients, and the second-class weight coefficient corresponding to each parameter in the second parameter set according to a second predetermined correspondence table;
the multiplication module is used for multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient and each parameter in the second parameter set by its corresponding second-class weight coefficient;
the fusion module is used for adding, for each parameter, the two products to obtain the fused value of that parameter for the target to be fused; in the first and second predetermined correspondence tables, the first-class and second-class weight coefficients corresponding to the same parameter sum to 1.
An embodiment of the invention further provides an automobile comprising the above multi-sensor data fusion device.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features equivalently replaced, without causing the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the invention.

Claims (6)

1. A multi-sensor data fusion method, comprising:
step S1, acquiring a first parameter set collected by a forward-looking camera for a target to be fused and a second parameter set collected by a front radar for the same target;
step S2, determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to a first predetermined correspondence table between those parameters and weight coefficients, and determining the second-class weight coefficient corresponding to each parameter in the second parameter set according to a second predetermined correspondence table;
step S3, multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient, and multiplying each parameter in the second parameter set by its corresponding second-class weight coefficient;
step S4, for each parameter, adding the two products obtained in step S3 to obtain the fused value of that parameter for the target to be fused; in the first and second predetermined correspondence tables, the first-class and second-class weight coefficients corresponding to the same parameter sum to 1.
2. The method according to claim 1, wherein before step S2, the method further comprises:
step S5, respectively performing coordinate system transformation on the first parameter set and the second parameter set, so that the first parameter set and the second parameter set are transformed into the same coordinate system.
3. The method according to claim 1, wherein in the first and second predetermined correspondence tables of step S2, the first-class and second-class weight coefficients corresponding to each parameter are obtained from a preliminary experiment; in that experiment, under each test scenario:
first, the first value, the second value, and the standard value collected for each parameter of a test target by the forward-looking camera, the front radar, and a standard sensor, respectively, are acquired;
then, the first deviation between each parameter's first value and the standard value, and the second deviation between its second value and the standard value, are calculated;
finally, based on the comparison of the first and second deviations for each parameter, a first-class weight coefficient is assigned to each parameter acquired by the forward-looking camera and a second-class weight coefficient to each parameter acquired by the front radar, with the smaller deviation receiving the larger weight coefficient.
4. The method according to claim 3, wherein when the first-class weight coefficients are assigned for the parameters collected by the forward-looking camera and the second-class weight coefficients are assigned for the parameters collected by the front radar, the following requirement is satisfied: the sum of the squared differences between each parameter's fused value and its standard value is smaller than a set error; the fused value is the sum of the product of the parameter's first value and its assigned first-class weight coefficient and the product of its second value and its assigned second-class weight coefficient.
5. A multi-sensor data fusion apparatus, comprising:
the acquisition module is used for acquiring a first parameter set collected by the forward-looking camera for the target to be fused and a second parameter set collected by the front radar for the same target;
the determining module is used for determining the first-class weight coefficient corresponding to each parameter in the first parameter set according to a first predetermined correspondence table between those parameters and weight coefficients, and the second-class weight coefficient corresponding to each parameter in the second parameter set according to a second predetermined correspondence table;
the multiplication module is used for multiplying each parameter in the first parameter set by its corresponding first-class weight coefficient and each parameter in the second parameter set by its corresponding second-class weight coefficient;
the fusion module is used for adding, for each parameter, the two products to obtain the fused value of that parameter for the target to be fused; in the first and second predetermined correspondence tables, the first-class and second-class weight coefficients corresponding to the same parameter sum to 1.
6. An automobile comprising the multi-sensor data fusion apparatus of claim 5.
CN202110329460.9A (priority 2021-03-28, filed 2021-03-28) - Multi-sensor data fusion method and device and automobile - status: Pending

Priority Applications (1)

Application Number: CN202110329460.9A | Priority Date: 2021-03-28 | Filing Date: 2021-03-28 | Title: Multi-sensor data fusion method and device and automobile

Publications (1)

Publication Number: CN113052241A | Publication Date: 2021-06-29

Family ID: 76515841



Patent Citations (3)

* Cited by examiner, † Cited by third party

CN107918386A * - priority 2017-10-25, published 2018-04-17 - Beijing Automotive Group Co., Ltd. - Multi-sensor information fusion method and device, and vehicle
CN111137283A * - priority 2019-12-27, published 2020-05-12 - Chery Automobile Co., Ltd. - Sensor data fusion method and device, advanced driving assistance system, and vehicle
CN111739063A * - priority 2020-06-23, published 2020-10-02 - Zhengzhou University - Electric power inspection robot positioning method based on multi-sensor fusion


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2021-06-29)