CN112351154B - Unmanned vehicle road condition identification system - Google Patents

Unmanned vehicle road condition identification system

Info

Publication number
CN112351154B
Authority
CN
China
Prior art keywords
unit
vehicle
automobile
brightness
central processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011174844.XA
Other languages
Chinese (zh)
Other versions
CN112351154A (en)
Inventor
程泊静 (Cheng Bojing)
胥刚 (Xu Gang)
兰新武 (Lan Xinwu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Automotive Engineering Vocational College
Original Assignee
Hunan Automotive Engineering Vocational College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Automotive Engineering Vocational College filed Critical Hunan Automotive Engineering Vocational College
Priority to CN202011174844.XA
Publication of CN112351154A
Application granted
Publication of CN112351154B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W 40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W 40/04 Traffic conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W 40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W 40/06 Road conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the technical field of intelligent driving and discloses a road condition identification system for an unmanned automobile, intended to improve the vehicle's intelligent response in complex and changing environments. The system comprises cameras and various sensors, together with a control device mounted on the center console of the unmanned automobile. The control device is provided with a central processing unit, an image acquisition unit, a sensing signal acquisition unit, an image optimization unit, a signal transmission unit, a voice navigation unit, a GPS (global positioning system) positioning unit, a lane line following unit, an automobile following unit, a brake control unit, a voice reminding unit and a wireless transmission unit. The invention adopts a simple distributed modular structure with a high degree of intelligence: it can acquire the environment and road conditions around the unmanned automobile in real time, safeguarding both the vehicle and surrounding pedestrians.

Description

Unmanned vehicle road condition identification system
Technical Field
The invention relates to the technical field of intelligent driving, in particular to a road condition identification system for an unmanned automobile.
Background
The unmanned automobile is an intelligent automobile which senses road environment through a vehicle-mounted sensing system, automatically plans a driving route and controls the automobile to reach a preset target.
Existing unmanned vehicles already exhibit a degree of intelligence, but their ability to respond to complex and changing road environments still leaves room for improvement.
Disclosure of Invention
The main aim of the invention is to disclose a road condition identification system for an unmanned automobile, so as to improve its intelligent response in complex and changing surroundings.
In order to achieve the above object, the present invention discloses a road condition recognition system for an unmanned vehicle, comprising:
a front camera, a rear camera, a left camera and a right camera mounted on the front side, rear side, left side and right side of the unmanned automobile, respectively;
a human body infrared pyroelectric sensor, an ultrasonic ranging sensor and a brightness sensor mounted on the outside of the unmanned automobile;
a control device mounted on the center console of the unmanned automobile, the control device being provided with a central processing unit, an image acquisition unit, a sensing signal acquisition unit, an image optimization unit, a signal transmission unit, a voice navigation unit, a GPS (global positioning system) positioning unit, a lane line following unit, an automobile following unit, a brake control unit, a voice reminding unit and a wireless transmission unit;
the input end of the image acquisition unit is connected with the front camera, the rear camera, the left camera and the right camera, and its output end is connected with the signal transmission unit through the image optimization unit; the input end of the sensing signal acquisition unit is connected with the human body infrared pyroelectric sensor, the ultrasonic ranging sensor and the brightness sensor, and its output end is connected with the signal transmission unit; the signal transmission unit is connected with the central processing unit; the central processing unit is connected with the voice navigation unit, the GPS positioning unit, the lane line following unit, the automobile following unit, the brake control unit and the voice reminding unit, and is connected with a background monitoring center through the wireless transmission unit;
the image optimization unit is used for splicing the images of the front camera, the rear camera, the left camera and the right camera collected by the image acquisition unit into a single image of the automobile's surroundings; during splicing, the pixels of each camera's image are divided into several layers according to their brightness values; the lowest-brightness layer and the highest-brightness layer are each histogram-equalized first, then background noise is removed, and finally isolated noise points are removed; the layers between the lowest and highest brightness first have noise points and background noise removed and are then histogram-equalized; finally, all processed layers are overlaid and spliced into an image-enhanced full view of the automobile's surroundings;
the human body infrared pyroelectric sensor is used for collecting surrounding human body signals;
the ultrasonic ranging sensor measures the distance between the automobile and surrounding obstacles based on ultrasonic waves;
the brightness sensor is used for acquiring brightness information;
the central processing unit is used for analyzing the state of the surrounding environment from the collected environment data: if the human body infrared pyroelectric sensor detects a human body signal, the ultrasonic ranging sensor is instructed to measure the distance between the automobile and the person; if that distance is within a preset range, the brake control unit is instructed to start the braking program to brake and decelerate, and if it is outside the preset range the automobile drives normally; if the ambient brightness collected by the brightness sensor is lower than a set threshold, the automobile's speed is limited and reduced accordingly; if the vehicle speed stays below a set value for a period of time, the automobile following unit is instructed to determine from the spliced surround-view image whether a vehicle is ahead, and if so the automobile follows the speed of the vehicle ahead so that the distance between them stays within a safe range; if the vehicle speed stays above a set value for a period of time, the lane line following unit is instructed to extract lane line information from the spliced surround-view image and keep the lane along the lane lines when no lane change is intended; if information requiring voice broadcast is identified, the voice reminding unit is instructed to broadcast it;
the GPS positioning unit is used for acquiring real-time position information of the automobile;
the voice navigation unit is used for planning a route according to the acquired destination position and the current position, and for providing voice navigation while driving along the planned route;
the central processing unit is also used for coordinating resource preemption conflicts between the voice reminding unit and the voice navigation unit.
In conclusion, the invention has the following beneficial effects:
the invention adopts a simple distributed modular structure, has high intelligent degree, can acquire the surrounding environment and road conditions of the unmanned automobile in real time, ensures the safety of driving and surrounding pedestrians, and has high safety.
The present invention will be described in further detail below with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic diagram of a frame of a road condition recognition system for an unmanned vehicle according to a preferred embodiment of the invention.
Detailed Description
Embodiments of the invention will be described in detail below with reference to the drawings, but the invention can be implemented in many different ways as defined and covered by the claims.
Example 1
The embodiment discloses a road condition identification system for an unmanned vehicle, as shown in fig. 1, including:
A front camera 1, a rear camera 2, a left camera 3 and a right camera 4 are mounted on the front side, rear side, left side and right side of the unmanned vehicle, respectively.
A human body infrared pyroelectric sensor 5, an ultrasonic ranging sensor 6 and a brightness sensor 7 are arranged on the outside of the unmanned automobile. Preferably, the ultrasonic ranging sensor 6 consists of several sensors distributed on the same horizontal plane of the vehicle body, at the front, rear, left side and right side of the vehicle.
The control device installed on the center console of the unmanned automobile is provided with a central processing unit 8, an image acquisition unit 9, a sensing signal acquisition unit 10, an image optimization unit 11, a signal transmission unit 12, a voice navigation unit 13, a GPS (global positioning system) positioning unit 14, a lane line following unit 15, an automobile following unit 16, a brake control unit 17, a voice reminding unit 18 and a wireless transmission unit 19.
The input end of the image acquisition unit 9 is connected with the front camera 1, the rear camera 2, the left camera 3 and the right camera 4, and its output end is connected with the signal transmission unit 12 through the image optimization unit 11. The input end of the sensing signal acquisition unit 10 is connected with the human body infrared pyroelectric sensor 5, the ultrasonic ranging sensor 6 and the brightness sensor 7, and its output end is connected with the signal transmission unit 12. The signal transmission unit 12 is connected with the central processing unit 8. The central processing unit 8 is connected with the voice navigation unit 13, the GPS positioning unit 14, the lane line following unit 15, the automobile following unit 16, the brake control unit 17 and the voice reminding unit 18, and is connected with the background monitoring center 20 through the wireless transmission unit 19.
The image optimization unit 11 is configured to splice the images of the front camera 1, the rear camera 2, the left camera 3 and the right camera 4 collected by the image acquisition unit 9 into a single image of the automobile's surroundings. During splicing, the pixels of each camera's image are divided into several layers according to their brightness values. The lowest-brightness layer and the highest-brightness layer are each histogram-equalized first, then background noise is removed, and finally isolated noise points are removed; the layers between the lowest and highest brightness first have noise points and background noise removed and are then histogram-equalized. Finally, all processed layers are overlaid and spliced into an image-enhanced full view of the automobile's surroundings. Treating the different layers differently effectively reduces the global brightness difference of the spliced image and improves the overall display effect, while also suppressing noise, which makes uniform downstream processing by the other functional units easier and preserves accuracy.
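As an illustration only, the following Python sketch shows one way the layer-wise order of operations described above could be realized. It assumes OpenCV and NumPy, and the layer count, the choice of filters standing in for "background noise" and "noise point" removal, and the naive side-by-side stitch are assumptions of this sketch rather than details taken from the patent.

    import cv2
    import numpy as np

    def enhance_by_brightness_layers(gray, n_layers=3):
        # Split one grayscale camera frame into brightness layers, process the
        # darkest and brightest layers in the order "equalize -> remove background
        # noise -> remove noise points", process the middle layers in the reverse
        # order, then merge the layers back into a single frame.
        edges = np.linspace(0, 256, n_layers + 1)
        merged = np.zeros_like(gray)
        for i in range(n_layers):
            mask = (gray >= edges[i]) & (gray < edges[i + 1])
            layer = np.where(mask, gray, 0).astype(np.uint8)
            if i in (0, n_layers - 1):
                layer = cv2.equalizeHist(layer)
                layer = cv2.GaussianBlur(layer, (3, 3), 0)  # background noise (assumed filter)
                layer = cv2.medianBlur(layer, 3)            # isolated noise points (assumed filter)
            else:
                layer = cv2.medianBlur(layer, 3)
                layer = cv2.GaussianBlur(layer, (3, 3), 0)
                layer = cv2.equalizeHist(layer)
            merged = np.where(mask, layer, merged).astype(np.uint8)
        return merged

    def stitch_surround(front, rear, left, right):
        # Placeholder stitch: enhance each view and tile the four views side by
        # side; a production system would warp and blend into a true surround view.
        views = [enhance_by_brightness_layers(cv2.cvtColor(v, cv2.COLOR_BGR2GRAY))
                 for v in (front, rear, left, right)]
        h = min(v.shape[0] for v in views)
        views = [cv2.resize(v, (v.shape[1], h)) for v in views]
        return np.hstack(views)

Equalizing the extreme layers before denoising, and the middle layers after, mirrors the processing order stated in the embodiment.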
The human body infrared pyroelectric sensor 5 is used for collecting human body signals around.
The ultrasonic ranging sensor 6 measures the distance between the automobile and surrounding obstacles based on ultrasonic waves.
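The ranging itself is the standard time-of-flight relation; a quick worked example follows, where the speed of sound used is the usual room-temperature approximation rather than a value from the patent.

    def ultrasonic_distance_m(echo_time_s, speed_of_sound_m_s=343.0):
        # The pulse travels to the obstacle and back, so the one-way distance
        # is half of (propagation speed x round-trip time).
        return speed_of_sound_m_s * echo_time_s / 2.0

    print(ultrasonic_distance_m(0.0058))  # a 5.8 ms echo corresponds to roughly 1 m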
The brightness sensor 7 is used for collecting brightness information. It is preferably arranged at the front (head) of the automobile so that the vehicle can adjust its speed in time under road conditions such as entering a tunnel.
The central processing unit 8 is used for analyzing the state of the surrounding environment from the collected environment data: if the human body infrared pyroelectric sensor 5 detects a human body signal, the ultrasonic ranging sensor 6 is instructed to measure the distance between the automobile and the person; if that distance is within a preset range, the brake control unit 17 is instructed to start the braking program to brake and decelerate, and if it is outside the preset range the automobile drives normally. If the ambient brightness collected by the brightness sensor 7 is lower than a set threshold, the automobile's speed is limited and reduced accordingly. If the vehicle speed stays below a set value for a period of time, the automobile following unit 16 is instructed to determine from the spliced surround-view image whether a vehicle is ahead, and if so the automobile follows the speed of the vehicle ahead so that the distance between them stays within a safe range. If the vehicle speed stays above a set value for a period of time, the lane line following unit 15 is instructed to extract lane line information from the spliced surround-view image and keep the lane along the lane lines when no lane change is intended. If information that should be broadcast by voice is identified, such as congestion on the current road section or a system failure requiring emergency braking, the voice reminding unit 18 is instructed to broadcast it.
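A condensed sketch of this decision logic follows; the dictionary keys, threshold values and action names are illustrative placeholders, not parameters disclosed by the patent.

    from dataclasses import dataclass

    @dataclass
    class Thresholds:
        pedestrian_brake_m: float = 5.0    # assumed braking distance
        low_light_lux: float = 50.0        # assumed ambient-brightness threshold
        follow_speed_kmh: float = 30.0     # below this, prefer car following
        lane_keep_speed_kmh: float = 60.0  # above this, prefer lane keeping

    def decide(env, th=Thresholds()):
        # Return the actions the central processing unit would issue for one
        # cycle of fused sensor readings, mirroring the if/else chain above.
        actions = []
        if env.get("human_detected") and env.get("human_distance_m", float("inf")) <= th.pedestrian_brake_m:
            actions.append("start_braking")
        if env.get("ambient_lux", float("inf")) < th.low_light_lux:
            actions.append("limit_speed_and_slow_down")
        speed = env.get("avg_speed_kmh", 0.0)
        if speed < th.follow_speed_kmh and env.get("front_vehicle_present"):
            actions.append("follow_front_vehicle_at_safe_gap")
        elif speed > th.lane_keep_speed_kmh and not env.get("lane_change_requested"):
            actions.append("keep_lane_along_lane_lines")
        if env.get("broadcast_message"):
            actions.append("voice_broadcast: " + env["broadcast_message"])
        return actions

    # Example: a pedestrian 3 m away while driving slowly through a dark tunnel.
    print(decide({"human_detected": True, "human_distance_m": 3.0, "ambient_lux": 10.0,
                  "avg_speed_kmh": 20.0, "front_vehicle_present": True}))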
The GPS positioning unit 14 is used to obtain real-time position information of the vehicle.
The voice navigation unit 13 is configured to plan a route according to the acquired destination location and the current location, and perform voice navigation during a driving process of the planned route.
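The straight-line distance between the current GPS fix and the destination, which a route planner might use as a lower bound or search heuristic, follows the standard great-circle formula; the coordinates in the example below are made up for illustration.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two GPS fixes on a 6371 km Earth radius.
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    print(round(haversine_km(28.19, 112.97, 28.23, 113.08), 1), "km")  # prints 11.7 km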
The central processing unit 8 is further configured to coordinate resource preemption conflicts between the voice reminding unit 18 and the voice navigation unit 13, so that the two functional units do not broadcast at the same time and degrade the user experience with overlapping audio.
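One possible realization of this coordination, assumed here rather than specified by the patent, is to serialize both units through a single priority queue so that safety reminders are played ahead of routine navigation prompts and the two never overlap.

    import queue, threading

    class VoiceChannel:
        # Single shared audio channel: reminder messages are assumed to take
        # priority over routine navigation prompts, and playback is serialized
        # so the two units never talk over each other.
        PRIORITY = {"remind": 0, "navigate": 1}  # lower value plays first (assumed scheme)

        def __init__(self, tts=print):
            self._q = queue.PriorityQueue()
            self._seq = 0                        # tie-breaker keeps FIFO order per priority
            self._tts = tts
            threading.Thread(target=self._worker, daemon=True).start()

        def say(self, source, text):
            self._seq += 1
            self._q.put((self.PRIORITY[source], self._seq, text))

        def wait(self):
            self._q.join()

        def _worker(self):
            while True:
                _, _, text = self._q.get()
                self._tts(text)                  # blocking "playback" keeps messages serialized
                self._q.task_done()

    channel = VoiceChannel()
    channel.say("navigate", "Turn left in 200 meters.")
    channel.say("remind", "Pedestrian ahead, braking.")
    channel.wait()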
In summary, the unmanned vehicle road condition identification system disclosed in this embodiment adopts a simple distributed modular structure with a high degree of intelligence; it can collect the environment and road conditions around the unmanned vehicle in real time, safeguarding both the vehicle and surrounding pedestrians.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (3)

1. A road condition recognition system for an unmanned vehicle, comprising:
the front camera (1), the rear camera (2), the left camera (3) and the right camera (4) are respectively arranged on the front side, the rear side, the left side and the right side of the unmanned automobile;
a human body infrared pyroelectric sensor (5), an ultrasonic ranging sensor (6) and a brightness sensor (7) which are arranged outside the unmanned automobile;
the control device is arranged on a center console of the unmanned automobile and is provided with a central processing unit (8), an image acquisition unit (9), a sensing signal acquisition unit (10), an image optimization unit (11), a signal transmission unit (12), a voice navigation unit (13), a GPS (global positioning system) positioning unit (14), a lane line following unit (15), an automobile following unit (16), a brake control unit (17), a voice reminding unit (18) and a wireless transmission unit (19);
the system comprises an image acquisition unit (9), a front camera (1), a rear camera (2), a left camera (3) and a right camera (4), wherein the input end of the image acquisition unit (9) is respectively connected with a signal transmission unit (12) through an image optimization unit (11), the input end of a sensing signal acquisition unit (10) is respectively connected with a human infrared pyroelectric sensor (5), an ultrasonic ranging sensor (6) and a brightness sensor (7), the output end of the sensing signal acquisition unit (10) is connected with the signal transmission unit (12), the signal transmission unit (12) is connected with a central processing unit (8), the central processing unit (8) is respectively connected with a voice navigation unit (13), a GPS positioning unit (14), a lane line following unit (15), an automobile following unit (16), a brake control unit (17) and a voice reminding unit (18), and the central processing unit (8) is connected with a background monitoring center (20) through a wireless transmission unit (19);
the image optimization unit (11) is used for splicing the images of the front camera (1), the rear camera (2), the left camera (3) and the right camera (4) collected by the image collection unit (9) into an image around the automobile; in the splicing process, pixels of images acquired by each camera are divided into a plurality of layers according to brightness values, and for the layer with the lowest brightness and the layer with the largest brightness, histogram equalization processing is carried out independently, then background noise is removed, and finally noise point removal is carried out; then removing noise points and background noise of the image layer between the lowest brightness and the highest brightness, and finally performing histogram equalization processing; finally, overlapping and splicing all the processed image layers into an image-enhanced full-face image around the automobile;
the human body infrared pyroelectric sensor (5) is used for collecting human body signals around;
the ultrasonic ranging sensor (6) measures the distance between the automobile and surrounding obstacles based on ultrasonic waves;
the brightness sensor (7) is used for collecting brightness information;
the central processing unit (8) is used for analyzing the state of the surrounding environment according to the acquired surrounding environment data, if a human body signal is acquired by the human body infrared pyroelectric sensor (5), the central processing unit instructs the ultrasonic ranging sensor (6) to acquire the distance between the automobile and the person, if the distance is within a preset range, the central processing unit instructs the brake control unit (17) to start a brake program to brake and decelerate, and if the distance is outside the preset range, the automobile drives normally; if the brightness acquired by the brightness sensor (7) is lower than a set threshold, the speed of the automobile is limited and the corresponding speed reduction is carried out; if the vehicle speed is lower than a set value within a period of time, the automobile following unit (16) is instructed to analyze whether a front vehicle exists from the spliced full-view images around the vehicle, and if the front vehicle exists, the vehicle follows the front vehicle so that the distance between the vehicle and the front vehicle is within a safe range; if the vehicle speed is higher than a set value within a period of time, the lane line following unit (15) is instructed to extract lane line information from the spliced full-view images around the vehicle and keep the lane along the lane lines under the condition of no lane change; if information needing voice broadcasting is identified, the voice reminding unit (18) is instructed to broadcast it;
the GPS positioning unit (14) is used for acquiring real-time position information of the automobile;
the voice navigation unit (13) is used for planning a route according to the acquired destination position and the current position, and performing voice navigation in the driving process of the planned route;
the central processing unit (8) is further configured to coordinate resource preemption conflicts between the voice prompt unit (18) and the voice navigation unit (13).
2. The unmanned vehicle road condition identification system according to claim 1, wherein the brightness sensor (7) is disposed at the vehicle head position.
3. The system for identifying road conditions of unmanned vehicles according to claim 1, wherein the ultrasonic ranging sensors (6) are uniformly distributed on the same horizontal plane of the vehicle body.
Application CN202011174844.XA · Priority date 2020-10-28 · Filing date 2020-10-28 · Unmanned vehicle road condition identification system · Granted as CN112351154B (en), Active

Priority Applications (1)

Application Number: CN202011174844.XA (granted as CN112351154B) · Priority Date: 2020-10-28 · Filing Date: 2020-10-28 · Title: Unmanned vehicle road condition identification system


Publications (2)

Publication Number · Publication Date
CN112351154A (en) · 2021-02-09
CN112351154B (en) · 2022-11-15

Family

ID=74355633

Family Applications (1)

Application CN202011174844.XA (Active, granted as CN112351154B) · Priority Date: 2020-10-28 · Filing Date: 2020-10-28 · Title: Unmanned vehicle road condition identification system

Country Status (1)

Country Link
CN (1) CN112351154B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113200052B (en) * 2021-05-06 2021-11-16 上海伯镭智能科技有限公司 Intelligent road condition identification method for unmanned driving

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108766004B (en) * 2018-04-27 2021-08-20 榛硕(武汉)智能科技有限公司 Overtaking control system and method for unmanned vehicle
US10816979B2 (en) * 2018-08-24 2020-10-27 Baidu Usa Llc Image data acquisition logic of an autonomous driving vehicle for capturing image data using cameras
CN109753073A (en) * 2019-01-25 2019-05-14 温州大学 A kind of unmanned intelligent vehicle speed management system
CN110580046B (en) * 2019-09-12 2022-08-16 吉利汽车研究院(宁波)有限公司 Control method and system for unmanned sightseeing vehicle
CN110662004A (en) * 2019-10-29 2020-01-07 怀化市联翰信息科技有限公司 Vehicle video monitoring system and method based on GPS
CN111583696A (en) * 2020-05-15 2020-08-25 咸阳师范学院 Unmanned vehicle control system and operation method thereof

Also Published As

Publication Number · Publication Date
CN112351154A (en) · 2021-02-09

Similar Documents

Publication Publication Date Title
US12055945B2 (en) Systems and methods for controlling an autonomous vehicle with occluded sensor zones
CN108883725B (en) Driving vehicle alarm system and method
CN111695546B (en) Traffic signal lamp identification method and device for unmanned vehicle
US8976040B2 (en) Intelligent driver assist system based on multimodal sensor fusion
JP7104651B2 (en) Vehicle control system
WO2021243710A1 (en) Intelligent transportation system-based automatic driving method and device, and intelligent transportation system
JP2021527903A (en) Vehicle control methods, devices, devices, programs and computer storage media
WO2018058958A1 (en) Road vehicle traffic alarm system and method therefor
WO2018058957A1 (en) Intelligent vehicle-road cooperation traffic control system
US20200250980A1 (en) Reuse of Surroundings Models of Automated Vehicles
CN104875681A (en) Dynamic vehicle-mounted camera control method based on application scenarios
KR20200128480A (en) Self-driving vehicle and pedestrian guidance system and method using the same
WO2022246852A1 (en) Automatic driving system testing method based on aerial survey data, testing system, and storage medium
WO2021241189A1 (en) Information processing device, information processing method, and program
KR101832224B1 (en) Appratus and method for assisting a driver based on difficulty level of parking
CN112896159A (en) Driving safety early warning method and system
KR102355431B1 (en) AI based emergencies detection method and system
CN114932902B (en) Ghost probe early warning avoiding method and system based on Internet of vehicles technology
CN111216718B (en) Collision avoidance method, device and equipment
CN108550218B (en) Comprehensive management and control system and management and control method for bicycle special lane
CN112351154B (en) Unmanned vehicle road condition identification system
CN109895694B (en) Lane departure early warning method and device and vehicle
CN111427063B (en) Mobile device traffic control method, device, equipment, system and medium
CN114822083B (en) Intelligent vehicle formation auxiliary control system
CN116552539A (en) Vehicle control device, vehicle control method, and computer program for vehicle control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Cheng Bojing

Inventor after: Xu Gang

Inventor after: Lan Xinwu

Inventor before: Xu Gang

Inventor before: Lan Xinwu

Inventor before: Cheng Bojing

GR01 Patent grant