CN111918029A - Intelligent motor vehicle night identification method and system based on cold light phenomenon - Google Patents

Intelligent motor vehicle night identification method and system based on cold light phenomenon

Info

Publication number
CN111918029A
Authority
CN
China
Prior art keywords: robot, controlling, motor vehicle, self, spraying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010662894.6A
Other languages
Chinese (zh)
Other versions
CN111918029B (en)
Inventor
夏牧谣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yantian Port International Information Co., Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202010662894.6A (granted as CN111918029B)
Publication of CN111918029A
Application granted
Publication of CN111918029B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B05: SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D: PROCESSES FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05D1/00: Processes for applying liquids or other fluent materials
    • B05D1/02: Processes for applying liquids or other fluent materials performed by spraying
    • B05D3/00: Pretreatment of surfaces to which liquids or other fluent materials are to be applied; after-treatment of applied coatings, e.g. intermediate treating of an applied coating preparatory to subsequent applications of liquids or other fluent materials
    • B05D3/06: Pretreatment or after-treatment by exposure to radiation
    • B05D3/061: Pretreatment or after-treatment by exposure to radiation using U.V.
    • B05D3/065: After-treatment

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Plasma & Fusion (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An intelligent motor vehicle night identification method and system based on the cold light phenomenon includes: if an identification patrol command is received, controlling a self-driven robot to start and controlling its robot camera to capture robot images in real time; after night falls, controlling the self-driven robot to move to a bound road section for safety patrol and extracting images of motor vehicles in a driving state; analyzing whether a motor vehicle is running without its lights on; if so, controlling the self-driven robot to accelerate and keep moving synchronously with the front end of the motor vehicle, controlling the fluorescent spraying mechanism to spray the vacant area at the front end of the motor vehicle, and controlling the ultraviolet lamp to start synchronously with the fluorescent spraying mechanism; after spraying is finished, controlling the self-driven robot to decelerate and keep moving synchronously with the rear end of the motor vehicle, controlling the fluorescent spraying mechanism to spray the vacant area at the rear end of the motor vehicle with the ultraviolet lamp again started synchronously; and, after spraying is finished, controlling the self-propelled robot to decelerate to patrol speed and continue the safety patrol.

Description

Intelligent motor vehicle night identification method and system based on cold light phenomenon
Technical Field
The invention relates to the field of motor vehicle driving protection, in particular to an intelligent motor vehicle night identification method and system based on a cold light phenomenon.
Background
As living standards rise, there are more and more motor vehicles on the road, and road accidents have become frequent, especially at night, when the street lamps on both sides of the road provide only limited illumination. If a motor vehicle's lamps are damaged, pedestrians and non-motor vehicles cannot see the vehicle in time while it is driven at night, and collision accidents are likely to occur.
How to combine the motor vehicle, a patrol robot and transparent fluorescent paint, so that a robot on night patrol that detects a motor vehicle driving without its lamps on can spray transparent fluorescent paint onto the front and rear ends of the vehicle, thereby warning and marking it for pedestrians, non-motor vehicles and other motor vehicles in the areas ahead of and behind it and reducing accidents caused by the vehicle not being noticed in time, is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The purpose of the invention is as follows: to overcome the defects described in the background art, the embodiments of the invention provide an intelligent motor vehicle night identification method and system based on the cold light phenomenon, which can effectively solve the problems discussed above.
The technical scheme is as follows:
an intelligent motor vehicle night identification method based on a cold light phenomenon comprises the following steps:
S1, upon receiving an identification patrol command, controlling a self-driven robot stored in a warehouse to start, and controlling a robot camera arranged on the outside of the started self-driven robot to start capturing robot images in real time;
S2, after the set night time begins, controlling the self-driven robot to move to the position of its bound road section according to the robot images for safety patrol, and extracting images of motor vehicles in a driving state in real time from the robot images;
S3, analyzing in real time, from the extracted motor vehicle images, whether a motor vehicle is not in a light-on state;
S4, if so, controlling the self-driven robot closest to the motor vehicle to move in synchronization with the front end of the motor vehicle according to the robot images, and controlling a fluorescent spraying mechanism arranged on the side of the robot to start spraying transparent fluorescent paint onto the vacant area at the front end of the motor vehicle;
S5, controlling an ultraviolet lamp arranged on the side of the self-driven robot to irradiate the sprayed transparent fluorescent paint in real time and, after the fluorescent spraying mechanism finishes spraying, controlling the self-driven robot to decelerate and keep moving synchronously with the rear end of the motor vehicle according to the robot images;
S6, controlling the fluorescent spraying mechanism to spray transparent fluorescent paint onto the vacant area at the rear end of the motor vehicle according to the robot images, and controlling the ultraviolet lamp to irradiate the sprayed transparent fluorescent paint in real time;
and S7, after the fluorescent spraying mechanism finishes spraying, controlling the self-driven robot to decelerate to patrol speed according to the robot images and continue the safety patrol on the bound road section.
As a preferred mode of the present invention, after S2, the method further includes the steps of:
s20, when the self-propelled robot patrols, controlling a monitoring camera arranged in a road area to start to capture a monitoring image in real time and analyzing whether a motor vehicle is close to a road ground marking line or not in real time according to the monitoring image;
s21, if yes, analyzing whether the motor vehicle is in a light starting state or not in real time according to the monitoring image;
s22, if not, controlling a lifting spraying mechanism which is arranged at the position of the road ground marking line and corresponds to the motor vehicle to lift in real time to keep synchronous with the motor vehicle, and spraying transparent fluorescent paint to the front end, the wheels and the rear end area of the motor vehicle in real time according to the monitoring image;
and S23, after the motor vehicle moves away from the road ground marking line area where the lifting spraying mechanism is located, controlling the corresponding lifting spraying mechanism to stop spraying and then descend, contract and reset.
As a preferable mode of the present invention, in S22, the method further includes the steps of:
s220, when the lifting spraying mechanism carries out spraying, controlling an auxiliary ultraviolet lamp arranged at the position of a street lamp at the side of a road to start in real time to irradiate ultraviolet rays to an area of the motor vehicle sprayed with the transparent fluorescent paint in real time, and analyzing whether pedestrians are in the irradiation area of the auxiliary ultraviolet lamp in real time according to a monitoring image;
s221, if yes, controlling an auxiliary ultraviolet lamp with a human body in an irradiation area to enter a closed state, and analyzing whether the motor vehicle leaves the irradiation area of the auxiliary ultraviolet lamp in real time according to the monitoring image;
and S222, if so, controlling the auxiliary ultraviolet lamp for the motor vehicle leaving the irradiation area to enter a closed state.
As a preferred mode of the present invention, after S20, the method further includes the steps of:
S200, analyzing in real time, according to the monitoring image, whether a non-motor vehicle that has not turned on its light is close to the road ground identification line;
s201, if yes, controlling a loudspeaker arranged on the side of the road and corresponding to the position of the motor vehicle to play the parking spraying information, and analyzing whether the non-motor vehicle is parked or not and a driver leaves in real time according to the monitoring image;
s202, if so, controlling a lifting spraying mechanism arranged at the position of the road ground identification line and corresponding to the non-motor vehicle to lift and controlling the lifting spraying mechanism to spray transparent fluorescent paint to the non-motor vehicle vacant area in real time according to the monitoring image;
and S203, controlling an auxiliary ultraviolet lamp arranged at the roadside street lamp corresponding to the non-motor vehicle's position to start synchronously with the lifting spraying mechanism and keep irradiating ultraviolet light onto the non-motor vehicle area.
As a preferred mode of the present invention, after S2, the method further includes the steps of:
s24, analyzing whether an accident occurs in real time according to the robot image;
s25, if yes, controlling the self-propelled robot to move around the preset range of the accident occurrence area according to the robot image and controlling a fluorescent spraying mechanism at the side of the self-propelled robot to start spraying transparent fluorescent paint to the ground area;
s26, controlling ultraviolet lamplight on the side of the self-propelled robot to irradiate the sprayed transparent fluorescent paint in real time and extracting an accident image contained in the robot image;
and S27, transmitting the extracted accident image to a road accident rescue center closest to the self-propelled robot.
An intelligent motor vehicle night identification system based on cold light phenomenon, which uses the intelligent motor vehicle night identification method based on cold light phenomenon of any one of claims 1-5, and comprises a robot identification device, a ground identification device and a server;
the robot identification device comprises a self-driving robot, a robot camera, a fluorescent spraying mechanism, a coating cavity and an ultraviolet lamp, wherein the self-driving robot is stored in a warehouse position and used for patrolling and assisting identification operation in a road area; the robot camera is arranged at the position outside the self-driven robot and is used for shooting an environment image outside the self-driven robot; the fluorescent spraying mechanism is arranged at the side position of the self-driven robot, is connected with the coating cavity and is used for spraying the transparent fluorescent coating to a specified position; the coating cavity is arranged in the inner position of the self-driven robot and used for storing the transparent fluorescent coating; the ultraviolet lamp is arranged at the peripheral position of the spraying end of the fluorescent spraying mechanism and is used for irradiating ultraviolet light to the transparent fluorescent paint when the fluorescent spraying mechanism is sprayed;
the ground identification device comprises a monitoring camera, a lifting spraying mechanism, a ground storage cavity, a spraying conduit, an auxiliary ultraviolet lamp and a loudspeaker, wherein the monitoring camera is arranged in a road area and used for shooting an environment image of the road area; the lifting spraying mechanism is arranged at the position of the road ground marking line, is connected with the ground storage cavity and is used for spraying the transparent fluorescent paint to a specified position; the ground storage cavity is arranged at the position of the road ground marking line, is provided with an adding port and is used for storing the transparent fluorescent paint; the spraying conduit is respectively connected with the lifting spraying mechanism and the ground storage cavity and is used for supplying the transparent fluorescent paint stored in the ground storage cavity to the lifting spraying mechanism; the auxiliary ultraviolet lamp is arranged at the upper end of the road street lamp and is used for irradiating ultraviolet light onto the transparent fluorescent paint while the lifting spraying mechanism is spraying; the loudspeaker is arranged at the side end of the road street lamp and used for playing the set voice information;
the server is arranged at the planned placing position of the urban road management department, and comprises:
the wireless module is used for being respectively in wireless connection with the self-driven robot, the robot camera, the fluorescent spraying mechanism, the ultraviolet lamp, the monitoring camera, the lifting spraying mechanism, the auxiliary ultraviolet lamp, the loudspeaker, the urban road management department, the road accident rescue center and the network;
the information receiving module is used for receiving information and/or instructions and/or requests;
the self-driven control module is used for controlling the self-driven robot to execute the set moving operation according to the set steps;
the robot shooting module is used for controlling the starting or closing of the robot camera;
the information analysis module is used for processing and analyzing the information according to the specified information;
the information extraction module is used for extracting specified content contained in received information and/or instructions and/or requests;
the acceleration control module is used for controlling the self-driven robot to execute set acceleration operation according to the set steps;
the mobile spraying module is used for controlling the fluorescent spraying mechanism to execute the set transparent fluorescent paint spraying operation according to the set steps;
the ultraviolet module is used for controlling the starting or the closing of the ultraviolet lamp;
and the deceleration control module is used for controlling the self-driven robot to execute the set deceleration operation according to the set steps.
As a preferred aspect of the present invention, the server further includes:
the monitoring shooting module is used for controlling the starting or closing of the monitoring camera;
the ground lifting module is used for controlling the lifting spraying mechanism to execute set lifting operation according to set steps;
and the ground spraying module is used for controlling the lifting spraying mechanism to execute the set transparent fluorescent paint spraying operation according to the set steps.
As a preferred aspect of the present invention, the server further includes:
and the auxiliary irradiation module is used for controlling the auxiliary ultraviolet lamp to be started or stopped.
As a preferred aspect of the present invention, the server further includes:
and the voice playing module is used for controlling the loudspeaker to play the set voice information.
As a preferred aspect of the present invention, the server further includes:
and the information sending module is used for sending specified information and/or instructions and/or requests to a specified object.
The invention realizes the following beneficial effects:
1. After the system is started and night time begins, the self-driven robot is controlled to go to its bound road section in real time to patrol and to identify motor vehicle information during the patrol. If a motor vehicle is detected to be in a driving state with its lamps off and not yet in a fluorescent state, the self-driven robot is controlled to keep moving synchronously with the front end of the motor vehicle, the fluorescent spraying mechanism sprays transparent fluorescent paint onto the vacant areas at the front end and then the rear end of the motor vehicle, and the ultraviolet lamp is simultaneously controlled to irradiate ultraviolet light onto the transparent fluorescent paint, so that pedestrians, non-motor vehicles and other motor vehicles in the areas in front of and behind the motor vehicle are warned, and accidents caused by the motor vehicle not being noticed in time are reduced.
2. After a motor vehicle approaches a road ground marking line with its lamps off and not yet in a fluorescent state, the corresponding lifting spraying mechanism is controlled to extend in synchronization with the motor vehicle and to spray transparent fluorescent paint onto the vacant areas at the front end, rear end and wheels of the motor vehicle, while the auxiliary ultraviolet lamp in the corresponding area is controlled to start and irradiate ultraviolet light onto the transparent fluorescent paint. If a non-motor vehicle is detected without its light on, the corresponding lifting spraying mechanism is controlled to spray transparent fluorescent paint onto the vacant area of the non-motor vehicle, and the auxiliary ultraviolet lamp in the corresponding area is controlled to start and irradiate ultraviolet light onto the paint, so as to increase the recognizability of the non-motor vehicle at night.
3. When the self-propelled robot is on patrol and an accident occurs, the robot moves around the accident area, sprays transparent fluorescent paint onto the ground, and controls the ultraviolet lamp to start and irradiate ultraviolet light onto the paint, so as to warn other vehicles and pedestrians nearby.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of a method for identifying a smart vehicle at night according to one embodiment of the present invention;
FIG. 2 is a flow chart of a ground lifting-spraying method according to one embodiment of the present invention;
FIG. 3 is a flow chart of an auxiliary ultraviolet irradiation method according to one embodiment of the present invention;
FIG. 4 is a flow chart of a non-motor vehicle spraying method according to one embodiment of the present invention;
FIG. 5 is a flow chart of an accident identification and processing method according to one embodiment of the present invention;
FIG. 6 is a connection diagram of an intelligent motor vehicle night identification system according to one embodiment of the present invention;
FIG. 7 is a schematic view of a self-propelled robot according to one embodiment of the present invention;
FIG. 8 is a schematic cross-sectional view of an area where a lifting spraying mechanism according to one embodiment of the present invention is provided.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Example one
Referring to fig. 1, fig. 6-7.
Specifically, the embodiment provides an intelligent motor vehicle night identification method based on a cold light phenomenon, and the method includes the following steps:
and S1, controlling the self-driving robot 10 stored in the warehouse to start according to the received identification patrol command, and controlling the robot camera 11 arranged at the external position of the started self-driving robot 10 to start to shoot the robot image in real time.
Wherein the identification patrol command is sent by an urban road management department.
In S1, specifically, after the information receiving module 31 included in the server 3 receives the identification patrol command sent by the urban road management department maintaining the long connection relationship, the self-propelled control module 32 included in the server 3 controls the self-propelled robot 10 stored in the warehouse to start, wherein the started self-propelled robot 10 is the self-propelled robot 10 stored in the warehouse and in the idle and dormant state; after the self-driven robot 10 is started, the robot shooting module 33 included in the server 3 controls the robot camera 11 arranged at the external position of the self-driven robot 10 in the starting state to start shooting the robot image in real time; the robot image is an image of an environment around the self-propelled robot 10 captured by the robot camera 11.
And S2, after the set night time begins, controlling the self-driven robot 10 to move to the position of the bound road section according to the robot image to perform safety patrol, and extracting the motor vehicle image in the driving state in real time according to the robot image.
In S2, specifically, after the information analysis module 34 included in the server 3 analyzes that the current time is within the set night time, where the night time is set by the urban road management department and is preferably 21:00 to 5:00 in this embodiment, the self-driven control module 32 controls the self-driven robot 10 to move to the position of its bound road section for safety patrol according to the robot image, that is, controls the self-driven robot 10 to move cyclically along the bound road section, wherein each self-driven robot is bound with at least one road section; while the self-propelled robot 10 is on safety patrol, the information extraction module 35 included in the server 3 extracts the motor vehicle image in a driving state in real time from the robot image.
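By way of a non-limiting illustration (not part of the claimed method), the check that the current time falls inside a night window that crosses midnight, such as the 21:00 to 5:00 window preferred in this embodiment, can be sketched in Python as follows; the window constants and the print placeholder are assumptions introduced only for this example:

    from datetime import datetime, time

    # Assumed window from this embodiment: 21:00 to 5:00 (it crosses midnight).
    NIGHT_START = time(21, 0)
    NIGHT_END = time(5, 0)

    def in_night_window(now: datetime) -> bool:
        t = now.time()
        # A window that crosses midnight is the union of [21:00, 24:00) and [00:00, 05:00).
        return t >= NIGHT_START or t < NIGHT_END

    if in_night_window(datetime.now()):
        print("night window active: dispatch idle robots to their bound road sections")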
And S3, analyzing whether the motor vehicle is not in a light starting state in real time according to the extracted motor vehicle image.
In S3, specifically, after the information extraction module 35 extracts the vehicle image in the driving state in real time, the information analysis module 34 analyzes whether the vehicle is not in the light-on state in real time according to the vehicle image extracted by the information extraction module 35, that is, whether the light of the vehicle is off.
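The disclosure does not specify the image-analysis algorithm used to decide that the lights are off; a minimal brightness-threshold sketch in Python with OpenCV is given below, where the lamp regions of interest, the threshold value of 200 and the synthetic test image are assumptions introduced only for this example:

    import cv2
    import numpy as np

    def lights_off(vehicle_img: np.ndarray, lamp_rois, thresh: float = 200.0) -> bool:
        """Return True when no assumed lamp region of the BGR vehicle crop is bright."""
        gray = cv2.cvtColor(vehicle_img, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in lamp_rois:               # lamp_rois: assumed (x, y, w, h) boxes
            roi = gray[y:y + h, x:x + w]
            if roi.size and float(roi.mean()) > thresh:
                return False                         # a lamp region is bright: lights are on
        return True                                  # no bright lamp region: lights treated as off

    # Tiny synthetic check: a dark 100x100 crop with one bright "lamp" patch.
    img = np.zeros((100, 100, 3), dtype=np.uint8)
    img[40:60, 10:30] = 255
    assert lights_off(img, [(10, 40, 20, 20)]) is False
    assert lights_off(img, [(70, 70, 20, 20)]) is True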
And S4, if so, controlling the self-driven robot 10 closest to the motor vehicle to move in synchronization with the front end of the motor vehicle according to the robot image and controlling the fluorescent spraying mechanism 12 arranged at the side position of the robot to start spraying the transparent fluorescent paint to the vacant area at the front end of the motor vehicle.
In S4, specifically, after the information analysis module 34 determines that the lights of the motor vehicle are not on, the acceleration control module 36 included in the server 3 controls the self-driven robot 10 closest to the motor vehicle to accelerate and keep moving synchronously with the front end of the motor vehicle according to the robot image; the maximum speed of the self-driven robot 10 is the highest speed allowed on the road section where it is located, and if the motor vehicle is detected to be speeding, its license plate is extracted and the overspeed image and license plate are uploaded to the motor vehicle management center. After the self-propelled robot 10 keeps moving synchronously with the motor vehicle, the mobile spraying module 37 included in the server 3 controls the fluorescent spraying mechanism 12 arranged on the side of the robot to start spraying transparent fluorescent paint onto the vacant area at the front end of the motor vehicle, where the front-end vacant area excludes the front lamp area and the front windshield area; during spraying, the fluorescent spraying mechanism 12 uses a high-pressure spraying technique, and the spraying pressure is lowered and aimed mainly at the lower half area of the motor vehicle, so as to prevent wind from blowing the paint onto the front windshield and obstructing the driver's sight.
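As an illustrative sketch only, the speed-matching and overspeed-reporting behaviour described above could be arranged as follows; the Robot class and its set_speed, read_plate and upload_overspeed methods are stand-ins assumed for this example and are not part of the disclosed system:

    class Robot:
        """Minimal stand-in for the self-driven robot's assumed control interface."""
        def set_speed(self, kmh: float) -> None:
            print(f"robot speed -> {kmh} km/h")
        def read_plate(self) -> str:
            return "EXAMPLE-PLATE"
        def upload_overspeed(self, plate: str) -> None:
            print(f"overspeed report uploaded for {plate}")

    def sync_with_vehicle(robot: Robot, vehicle_speed_kmh: float, limit_kmh: float) -> None:
        # Match the vehicle's speed but never exceed the section's speed limit.
        robot.set_speed(min(vehicle_speed_kmh, limit_kmh))
        if vehicle_speed_kmh > limit_kmh:
            robot.upload_overspeed(robot.read_plate())

    sync_with_vehicle(Robot(), vehicle_speed_kmh=72.0, limit_kmh=60.0)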
And S5, controlling the ultraviolet lamp 14 arranged at the side position of the self-propelled robot 10 to irradiate the sprayed transparent fluorescent paint in real time, and controlling the self-propelled robot 10 to decelerate and keep the rear end of the motor vehicle to move synchronously according to the robot image after the spraying of the fluorescent paint mechanism 12 is finished.
In S5, specifically, when the fluorescent spraying mechanism 12 is started, the ultraviolet module 38 included in the server 3 controls the ultraviolet lamp 14, arranged on the side of the self-propelled robot 10 and corresponding to the fluorescent spraying mechanism 12, to irradiate the sprayed transparent fluorescent paint in real time; the ultraviolet lamp 14 and the fluorescent spraying mechanism 12 operate synchronously, that is, when the fluorescent spraying mechanism 12 starts the ultraviolet lamp 14 is turned on, and when the fluorescent spraying mechanism 12 stops the ultraviolet lamp 14 is turned off. After the information analysis module 34 determines in real time from the robot image that spraying of the front end of the motor vehicle is completed, the mobile spraying module 37 turns off the fluorescent spraying mechanism 12 and the ultraviolet module 38 simultaneously turns off the ultraviolet lamp 14; the deceleration control module 39 included in the server 3 then controls the self-driven robot 10 to decelerate and keep moving synchronously with the rear end of the motor vehicle according to the robot image.
And S6, controlling the fluorescent spraying mechanism 12 to spray the transparent fluorescent paint to the vacant area at the rear end of the motor vehicle according to the robot image and controlling the ultraviolet lamp 14 to irradiate the sprayed transparent fluorescent paint in real time.
In S6, specifically after the self-propelled robot 10 and the rear end of the vehicle keep moving synchronously, the mobile spraying module 37 controls the fluorescent spraying mechanism 12 to start spraying the transparent fluorescent paint to the vacant area at the rear end of the vehicle, where the vacant area at the rear end does not include the rear lamp area and the rear windshield area, and when spraying, the fluorescent spraying mechanism 12 adopts a high-pressure spraying technique, and the spraying pressure is lowered mainly for spraying the lower half area of the vehicle, so as to prevent wind from blowing the paint to the rear windshield position to affect the rear view of the driver; meanwhile, the ultraviolet module 38 controls the ultraviolet lamp 14 to irradiate the sprayed transparent fluorescent paint in real time, and the ultraviolet lamp 14 and the fluorescent spraying mechanism 12 keep a synchronous operation state.
And S7, after the fluorescent spraying mechanism 12 finishes spraying, controlling the self-propelled robot 10 to decelerate to a patrol speed according to the robot image and continuously go to a binding road section position for safety patrol.
In S7, specifically, after the information analysis module 34 analyzes that the motorized rear end spraying is completed in real time according to the robot image, the mobile spraying module 37 controls the fluorescent spraying mechanism 12 to be turned off, and the ultraviolet module 38 controls the ultraviolet lamp 14 to be turned off at the same time; the self-driving control module 32 controls the self-driving robot 10 to decelerate to a patrol speed according to the robot image and continuously return to the position of the binding road section for safety patrol.
The areas of the motor vehicle sprayed by the fluorescent spraying mechanism 12 are all areas at the side end of the self-propelled robot 10, for example, the transparent fluorescent paint is sprayed on the left half part of the motor vehicle when the self-propelled robot 10 is located at the left side of the motor vehicle.
Example two
Referring to fig. 2-4, fig. 6-8.
This embodiment is substantially identical to the first embodiment, except that, in this embodiment, after S2, the method further includes the following steps:
and S20, controlling the monitoring camera 20 arranged in the road area to start to capture a monitoring image in real time when the self-driven robot 10 patrols, and analyzing whether the motor vehicle is close to the road ground marking line or not in real time according to the monitoring image.
Specifically, when the self-propelled robot 10 is on patrol, the monitoring shooting module 40 included in the server 3 controls the monitoring camera 20 arranged in the road area to start capturing a monitoring image in real time, where the monitoring image is the image of the road-area environment captured by the monitoring camera 20; after the monitoring camera 20 is started, the information analysis module 34 analyzes in real time, according to the monitoring image, whether a motor vehicle is close to the road ground marking line, that is, whether the distance between the motor vehicle and the marking line is smaller than the effective spraying distance of the lifting spraying mechanism 21 located at the marking line.
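A minimal sketch of this proximity test, assuming the lateral offset to the marking line is already available in meters and taking 1.5 m as a purely illustrative effective spraying distance (the disclosure does not give a value):

    def near_marking_line(lateral_offset_m: float, effective_spray_m: float = 1.5) -> bool:
        """A vehicle counts as 'close' when its distance to the line is within spray reach."""
        return abs(lateral_offset_m) < effective_spray_m

    assert near_marking_line(0.8) is True
    assert near_marking_line(2.4) is False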
And S21, if yes, analyzing whether the motor vehicle is in a light starting state or not in real time according to the monitoring image.
Specifically, after the information analysis module 34 analyzes that a motor vehicle is close to a road ground identification line, the information analysis module 34 analyzes whether the motor vehicle is in a light on state in real time according to the monitoring image.
And S22, if not, controlling the lifting spraying mechanism 21 which is arranged at the position of the road ground marking line and corresponds to the motor vehicle to lift in real time to keep synchronous with the motor vehicle, and spraying transparent fluorescent paint to the front end, the wheels and the rear end area of the motor vehicle in real time according to the monitoring image.
Specifically, after the information analysis module 34 analyzes that the motor vehicle is not in the light-on state, the server 3 extracts the corresponding motor vehicle spatial coordinate data in the monitoring image, and then the ground lifting module 41 included in the server 3 controls the lifting spraying mechanism 21 arranged at the road ground marking line position and corresponding to the motor vehicle to lift in real time and keep synchronous with the motor vehicle according to the real-time motor vehicle spatial coordinate data, that is, the lifting spraying mechanism 21 corresponding to the real-time position of the motor vehicle is controlled to lift, and the lifting spraying mechanism 21 not corresponding to the motor vehicle automatically descends after lifting; when the lifting spraying mechanism 21 corresponds to the front end, the wheels and the rear end region of the motor vehicle, the ground spraying module 42 included in the server 3 sprays the transparent fluorescent paint to the front end, the wheels and the rear end region of the motor vehicle in real time according to the monitoring image, and the front and rear lamps and the front and rear windshields are avoided during spraying.
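A minimal sketch of selecting which lifting sprayer to raise, assuming the sprayers sit at known, fixed positions along the marking line and that the vehicle's longitudinal coordinate has been extracted from the monitoring image (the positions and spacing below are illustrative only); every sprayer other than the selected one would be lowered:

    def sprayer_to_raise(positions_m: list[float], vehicle_x_m: float) -> int:
        """Index of the lifting sprayer nearest the vehicle's real-time coordinate."""
        return min(range(len(positions_m)), key=lambda i: abs(positions_m[i] - vehicle_x_m))

    # Assumed layout: sprayers every 10 m along the line; a vehicle at 23 m raises sprayer 2.
    assert sprayer_to_raise([0.0, 10.0, 20.0, 30.0], 23.0) == 2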
And S23, after the motor vehicle moves away from the road ground marking line area where the lifting spraying mechanism 21 is located, controlling the corresponding lifting spraying mechanism 21 to stop spraying and then descend, contract and reset.
Specifically, after the information analysis module 34 analyzes that the motor vehicle moves away from the road ground identification line area where the lifting spraying mechanism 21 is located according to the monitoring image, the ground spraying module 42 controls the corresponding lifting spraying mechanism 21 to stop spraying, and meanwhile, the ground lifting module 41 controls the corresponding lifting spraying mechanism 21 to descend, contract and reset.
As a preferable mode of the present invention, in S22, the method further includes the steps of:
s220, when the lifting spraying mechanism 21 is used for spraying, the auxiliary ultraviolet lamp 24 arranged at the position of the street lamp at the side of the road is controlled to be started in real time to irradiate ultraviolet rays to the area of the motor vehicle sprayed with the transparent fluorescent paint, and whether pedestrians are in the irradiation area of the auxiliary ultraviolet lamp 24 is analyzed in real time according to the monitoring image.
Specifically, when the ground spraying module 42 controls the lifting spraying mechanism 21 to spray, the auxiliary irradiation module 43 included in the server 3 controls the auxiliary ultraviolet lamp 24 arranged at the street lamp position on the side of the road where the motor vehicle is located to start in real time to irradiate ultraviolet rays to the region where the transparent fluorescent paint is sprayed on the motor vehicle, and when the auxiliary ultraviolet lamp 24 is started, the information analysis module 34 analyzes in real time whether a pedestrian is in the irradiation region of the auxiliary ultraviolet lamp 24 according to the monitoring image.
S221, if yes, controlling the auxiliary ultraviolet lamp 24 with the human body in the irradiation area to enter a closed state, and analyzing whether the motor vehicle leaves the irradiation area of the auxiliary ultraviolet lamp 24 in real time according to the monitoring image.
Specifically, after the information analysis module 34 analyzes that a pedestrian is in the irradiation area of the auxiliary ultraviolet lamp 24, the auxiliary irradiation module 43 controls the auxiliary ultraviolet lamp 24 with a human body in the irradiation area to enter a closed state; when the auxiliary ultraviolet lamp 24 is in an irradiation state, the information analysis module 34 analyzes whether the vehicle leaves the irradiation area of the auxiliary ultraviolet lamp 24 in real time according to the monitoring image.
And S222, if so, controlling the auxiliary ultraviolet lamp 24 for the motor vehicle leaving the irradiation area to enter a closed state.
Specifically, after the information analysis module 34 analyzes that the vehicle leaves the irradiation area of the auxiliary ultraviolet lamp 24, the auxiliary irradiation module 43 controls the auxiliary ultraviolet lamp 24 that the vehicle leaves the irradiation area to enter a closed state.
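The pedestrian-safety logic of S220 to S222 reduces to a simple condition on the lamp state, sketched below for illustration only (the boolean inputs are assumed to come from the monitoring-image analysis):

    def aux_uv_should_be_on(vehicle_in_area: bool, pedestrian_in_area: bool) -> bool:
        """The auxiliary UV lamp stays on only while the sprayed vehicle is in its area and no pedestrian is."""
        return vehicle_in_area and not pedestrian_in_area

    assert aux_uv_should_be_on(True, False) is True    # normal irradiation
    assert aux_uv_should_be_on(True, True) is False    # pedestrian present: lamp off (S221)
    assert aux_uv_should_be_on(False, False) is False  # vehicle has left: lamp off (S222)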
As a preferred mode of the present invention, after S20, the method further includes the steps of:
and S200, analyzing whether a non-motor vehicle is close to the road ground identification line and the light is not turned on in real time according to the monitoring image.
Specifically, while the information analysis module 34 analyzes in real time whether motor vehicles near the road ground identification line have their lights off, it also analyzes in real time, according to the monitoring image, whether a non-motor vehicle that has not turned on its light is close to the road ground identification line.
S201, if yes, controlling a loudspeaker 25 arranged at the side of the road and corresponding to the position of the motor vehicle to play the parking spraying information, and analyzing whether the non-motor vehicle is parked or not and a driver leaves in real time according to the monitoring image.
Specifically, after the information analysis module 34 analyzes that a non-motor vehicle is close to the road ground identification line, the voice playing module 44 included in the server 3 controls the speaker 25 arranged at the side of the road and corresponding to the motor vehicle to play the stop spraying information so as to remind the driver of the non-motor vehicle to get off the vehicle and go to a safe area to wait for the spraying of the fluorescent paint; meanwhile, the information analysis module 34 analyzes whether the non-motor vehicle is parked or not and a driver leaves according to the monitoring image in real time.
S202, if so, controlling the lifting spraying mechanism 21 arranged at the position of the road ground identification line corresponding to the non-motor vehicle to lift and controlling the lifting spraying mechanism 21 to spray the transparent fluorescent paint to the vacant area of the non-motor vehicle in real time according to the monitoring image.
Specifically, after the information analysis module 34 analyzes that the non-motor vehicle is parked and the driver leaves, the ground lifting module 41 controls the lifting spraying mechanism 21 arranged at the position of the road ground identification line corresponding to the non-motor vehicle to lift, and after the lifting spraying mechanism 21 lifts, the ground spraying module 42 controls the lifting spraying mechanism 21 to spray the transparent fluorescent paint to the non-motor vehicle vacant area in real time according to the monitoring image.
And S203, controlling the auxiliary ultraviolet lamp 24 arranged at the roadside street lamp corresponding to the non-motor vehicle's position to start synchronously with the lifting spraying mechanism 21 and keep irradiating ultraviolet light onto the non-motor vehicle area.
Specifically, while the lifting spraying mechanism 21 is in the spraying state, the auxiliary irradiation module 43 controls the auxiliary ultraviolet lamp 24 arranged at the roadside street lamp corresponding to the non-motor vehicle's position to start synchronously with the lifting spraying mechanism 21 and irradiate ultraviolet light onto the non-motor vehicle area; after spraying is finished, the ground spraying module 42 stops the lifting spraying mechanism 21, the auxiliary irradiation module 43 turns off the auxiliary ultraviolet lamp 24, and the ground lifting module 41 then controls the lifting spraying mechanism 21 to descend and reset.
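For illustration, the whole non-motor-vehicle branch of S200 to S203 can be summarised as a small decision table; the flag names below are assumptions standing in for the monitoring-image analysis results:

    def next_action(announced: bool, parked: bool, rider_left: bool, sprayed: bool) -> str:
        """One step of the non-motor-vehicle branch, expressed as a decision table."""
        if not announced:
            return "play the parking-spray notice over the roadside loudspeaker"
        if not (parked and rider_left):
            return "wait: keep analyzing the monitoring image"
        if not sprayed:
            return "raise the lifting sprayer, start the auxiliary UV lamp, spray the vacant areas"
        return "stop spraying, turn the auxiliary UV lamp off, lower and reset the sprayer"

    assert next_action(True, True, False, False).startswith("wait")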
EXAMPLE III
As shown with reference to fig. 5-7.
Specifically, this embodiment is substantially the same as the first embodiment, except that in this embodiment, after S2, the method further includes the following steps:
and S24, analyzing whether an accident occurs in real time according to the robot image.
Specifically, when the self-propelled robot 10 is in a patrol state, the information analysis module 34 analyzes whether an accident occurs around the self-propelled robot 10 in real time according to the robot image.
And S25, if yes, controlling the self-propelled robot 10 to move around the preset range of the accident occurrence area according to the robot image and controlling the fluorescent spraying mechanism 12 at the side of the self-propelled robot 10 to start spraying the transparent fluorescent paint to the ground area.
Specifically, after the information analysis module 34 determines that an accident has occurred around the self-propelled robot 10, the self-propelled control module 32 controls the self-propelled robot 10 to move around a preset range of the accident area according to the robot image. The preset range is set by the urban road management department and, in this embodiment, is preferably 3 meters outward from the edge of the accident area; for example, if the accident area is 15 meters in diameter, the robot moves along a circle 21 meters in diameter. While the self-propelled robot 10 moves around this preset range, the mobile spraying module 37 controls the fluorescent spraying mechanism 12 on the side of the self-propelled robot 10 to start spraying transparent fluorescent paint onto the ground area.
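The preset-range arithmetic of this embodiment is simple enough to state directly: circling 3 m outside the accident area adds the margin on both sides of the diameter, so a 15 m accident diameter gives a 21 m patrol diameter, as the short sketch below confirms:

    def patrol_diameter_m(accident_diameter_m: float, margin_m: float = 3.0) -> float:
        """Diameter of the robot's patrol circle: accident diameter plus the margin on each side."""
        return accident_diameter_m + 2 * margin_m

    assert patrol_diameter_m(15.0) == 21.0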
And S26, controlling the ultraviolet lamp 14 at the side of the self-propelled robot 10 to irradiate the sprayed transparent fluorescent paint in real time and extracting an accident image contained in the robot image.
Specifically, when the fluorescent spraying mechanism 12 is started, the ultraviolet module 38 included in the server 3 controls the ultraviolet lamp 14 at the side of the self-propelled robot 10 to irradiate the sprayed transparent fluorescent paint in real time; the information extraction module 35 extracts an accident area image included in the robot image while the self-propelled robot 10 moves.
And S27, transmitting the extracted accident image to a road accident rescue center closest to the self-propelled robot 10.
Specifically, while the information extraction module 35 extracts the image of the accident area, the information transmission module 45 included in the server 3 synchronously transmits the image of the accident area extracted by the information extraction module 35 to the road accident rescue center closest to the self-propelled robot 10.
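Choosing the road accident rescue center closest to the robot is, in the simplest reading, a nearest-point query; the sketch below uses straight-line distance and assumed example coordinates for illustration only:

    import math

    def nearest_center(robot_xy: tuple[float, float],
                       centers: dict[str, tuple[float, float]]) -> str:
        """Name of the rescue center closest to the robot (straight-line distance)."""
        return min(centers, key=lambda name: math.dist(robot_xy, centers[name]))

    centers = {"center A": (0.0, 0.0), "center B": (5.0, 1.0)}  # assumed coordinates
    assert nearest_center((4.0, 1.0), centers) == "center B"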
Example four
As shown with reference to fig. 6-8.
Specifically, this embodiment provides an intelligent motor vehicle night identification system based on the cold light phenomenon, which uses the intelligent motor vehicle night identification method based on the cold light phenomenon described above and comprises a robot identification device 1, a ground identification device 2 and a server 3;
the robot identification device 1 comprises a self-driving robot 10, a robot camera 11, a fluorescent spraying mechanism 12, a coating cavity 13 and an ultraviolet lamp 14, wherein the self-driving robot 10 is stored in a warehouse position and used for patrolling and assisting identification operation in a road area; the robot camera 11 is disposed at a position outside the self-propelled robot 10, and is configured to capture an image of an environment outside the self-propelled robot 10; the fluorescent spraying mechanism 12 is arranged at the side position of the self-driven robot 10, is connected with the coating cavity 13 and is used for spraying transparent fluorescent coating to a specified position; the coating cavity 13 is arranged in the internal position of the self-propelled robot 10 and is used for storing transparent fluorescent coating; the ultraviolet lamp 14 is arranged at the peripheral position of the spraying end of the fluorescent spraying mechanism 12 and is used for irradiating ultraviolet light to the transparent fluorescent paint when the fluorescent spraying mechanism 12 is used for spraying;
the ground identification device 2 comprises a monitoring camera 20, a lifting spraying mechanism 21, a ground storage cavity 22, a spraying conduit 23, an auxiliary ultraviolet lamp 24 and a loudspeaker 25, wherein the monitoring camera 20 is arranged in a road area and is used for shooting an environment image of the road area; the lifting spraying mechanism 21 is arranged at the position of a road ground marking line, is connected with the ground storage cavity 22 and is used for spraying transparent fluorescent paint to a specified position; the ground storage cavity 22 is arranged at the position of the road ground marking line, is provided with an adding port and is used for storing transparent fluorescent paint; the spraying conduit 23 is respectively connected with the lifting spraying mechanism 21 and the ground storage cavity 22 and is used for supplying the transparent fluorescent paint stored in the ground storage cavity 22 to the lifting spraying mechanism 21; the auxiliary ultraviolet lamp 24 is arranged at the upper end of the road street lamp and is used for irradiating ultraviolet light onto the transparent fluorescent paint while the lifting spraying mechanism 21 is spraying; the loudspeaker 25 is arranged at the side end of the road street lamp and used for playing set voice information;
the server 3 is arranged at a placement position planned by an urban road management department, and the server 3 comprises:
the wireless module 30 is used for being respectively in wireless connection with the self-driven robot 10, the robot camera 11, the fluorescent spraying mechanism 12, the ultraviolet lamp 14, the monitoring camera 20, the lifting spraying mechanism 21, the auxiliary ultraviolet lamp 24, the loudspeaker 25, the urban road management department, the road accident rescue center and the network;
an information receiving module 31, configured to receive information and/or instructions and/or requests;
a self-driven control module 32 for controlling the self-driven robot 10 to perform a set moving operation according to the set steps;
the robot shooting module 33 is used for controlling the robot camera 11 to start or close;
an information analysis module 34 for processing and analyzing information according to the specified information;
an information extraction module 35, configured to extract specified content contained in received information and/or instructions and/or requests;
an acceleration control module 36 for controlling the self-propelled robot 10 to perform a set acceleration operation according to the set steps;
the mobile spraying module 37 is used for controlling the fluorescent spraying mechanism 12 to execute the set transparent fluorescent paint spraying operation according to the set steps;
an ultraviolet module 38 for controlling the ultraviolet lamp 14 to be turned on or off;
and a deceleration control module 39 for controlling the self-propelled robot 10 to perform a set deceleration operation according to the set steps.
As a preferred aspect of the present invention, the server 3 further includes:
the monitoring shooting module 40 is used for controlling the monitoring camera 20 to start or close;
the ground lifting module 41 is used for controlling the lifting spraying mechanism 21 to execute the set lifting operation according to the set steps;
and the ground spraying module 42 is used for controlling the lifting spraying mechanism 21 to execute the set transparent fluorescent paint spraying operation according to the set steps.
As a preferred aspect of the present invention, the server 3 further includes:
and the auxiliary irradiation module 43 is used for controlling the auxiliary ultraviolet lamp 24 to be started or stopped.
As a preferred aspect of the present invention, the server 3 further includes:
and a voice playing module 44 for controlling the speaker 25 to play the set voice message.
As a preferred aspect of the present invention, the server 3 further includes:
and an information sending module 45, configured to send specified information and/or instructions and/or requests to a specified object.
The lifting spraying mechanism 21 comprises a lifting channel, a telescopic hydraulic cylinder, a telescopic rod, a circular lifting shell and a high-pressure sprayer.
It should be understood that, in the fourth embodiment, the specific implementation process of each module described above may correspond to the description of the above method embodiments (the first to the third embodiments), and is not described in detail here.
The system provided in the fourth embodiment is only illustrated by dividing the functional modules, and in practical applications, the above-mentioned functions may be distributed by different functional modules according to needs, that is, the internal structure of the system is divided into different functional modules to complete all or part of the functions described above.
The above embodiments are merely illustrative of the technical ideas and features of the present invention, and are intended to enable those skilled in the art to understand the contents of the present invention and implement the present invention, and not to limit the scope of the present invention. All equivalent changes or modifications made according to the spirit of the present invention should be covered within the protection scope of the present invention.

Claims (10)

1. An intelligent motor vehicle night identification method based on a cold light phenomenon is characterized by comprising the following steps:
S1, upon receiving an identification patrol command, controlling a self-driven robot stored in a warehouse to start, and controlling a robot camera arranged on the outside of the started self-driven robot to start capturing robot images in real time;
S2, after the set night time begins, controlling the self-driven robot to move to the position of its bound road section according to the robot images for safety patrol, and extracting images of motor vehicles in a driving state in real time from the robot images;
S3, analyzing in real time, from the extracted motor vehicle images, whether a motor vehicle is not in a light-on state;
S4, if so, controlling the self-driven robot closest to the motor vehicle to move in synchronization with the front end of the motor vehicle according to the robot images, and controlling a fluorescent spraying mechanism arranged on the side of the robot to start spraying transparent fluorescent paint onto the vacant area at the front end of the motor vehicle;
S5, controlling an ultraviolet lamp arranged on the side of the self-driven robot to irradiate the sprayed transparent fluorescent paint in real time and, after the fluorescent spraying mechanism finishes spraying, controlling the self-driven robot to decelerate and keep moving synchronously with the rear end of the motor vehicle according to the robot images;
S6, controlling the fluorescent spraying mechanism to spray transparent fluorescent paint onto the vacant area at the rear end of the motor vehicle according to the robot images, and controlling the ultraviolet lamp to irradiate the sprayed transparent fluorescent paint in real time;
and S7, after the fluorescent spraying mechanism finishes spraying, controlling the self-driven robot to decelerate to patrol speed according to the robot images and continue the safety patrol on the bound road section.
2. The intelligent night marking method for motor vehicles based on cold light phenomenon as claimed in claim 1, wherein after S2, the method further comprises the following steps:
s20, when the self-propelled robot patrols, controlling a monitoring camera arranged in a road area to start to capture a monitoring image in real time and analyzing whether a motor vehicle is close to a road ground marking line or not in real time according to the monitoring image;
s21, if yes, analyzing whether the motor vehicle is in a light starting state or not in real time according to the monitoring image;
s22, if not, controlling a lifting spraying mechanism which is arranged at the position of the road ground marking line and corresponds to the motor vehicle to lift in real time to keep synchronous with the motor vehicle, and spraying transparent fluorescent paint to the front end, the wheels and the rear end area of the motor vehicle in real time according to the monitoring image;
and S23, after the motor vehicle moves away from the road ground marking line area where the lifting spraying mechanism is located, controlling the corresponding lifting spraying mechanism to stop spraying and then descend, contract and reset.
3. The intelligent night marking method for motor vehicles based on cold light phenomenon as claimed in claim 2, wherein in S22, the method further comprises the following steps:
s220, when the lifting spraying mechanism carries out spraying, controlling an auxiliary ultraviolet lamp arranged at the position of a street lamp at the side of a road to start in real time to irradiate ultraviolet rays to an area of the motor vehicle sprayed with the transparent fluorescent paint in real time, and analyzing whether pedestrians are in the irradiation area of the auxiliary ultraviolet lamp in real time according to a monitoring image;
s221, if yes, controlling an auxiliary ultraviolet lamp with a human body in an irradiation area to enter a closed state, and analyzing whether the motor vehicle leaves the irradiation area of the auxiliary ultraviolet lamp in real time according to the monitoring image;
and S222, if so, controlling the auxiliary ultraviolet lamp for the motor vehicle leaving the irradiation area to enter a closed state.
4. The intelligent night marking method for motor vehicles based on cold light phenomenon as claimed in claim 2, wherein after S20, the method further comprises the following steps:
S200, analyzing in real time, according to the monitoring image, whether a non-motor vehicle that has not turned on its light is close to the road ground identification line;
s201, if yes, controlling a loudspeaker arranged on the side of the road and corresponding to the position of the motor vehicle to play the parking spraying information, and analyzing whether the non-motor vehicle is parked or not and a driver leaves in real time according to the monitoring image;
s202, if so, controlling a lifting spraying mechanism arranged at the position of the road ground identification line and corresponding to the non-motor vehicle to lift and controlling the lifting spraying mechanism to spray transparent fluorescent paint to the non-motor vehicle vacant area in real time according to the monitoring image;
and S203, controlling an auxiliary ultraviolet lamp arranged at the side of the road and corresponding to the position of the motor vehicle on the road and the lifting spraying mechanism to synchronously keep starting to irradiate ultraviolet lamp light on the non-motor vehicle area.
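A hedged sketch of steps S200 to S203 follows; the speaker, sprayer, and lamp collections and their methods are placeholders chosen for this example and are not taken from the claims.

```python
# Hypothetical handling of unlit non-motor vehicles (S200-S203); interfaces assumed.
def handle_unlit_non_motor_vehicles(frame, analyzer, speakers, sprayers, uv_lamps):
    for bike in analyzer.unlit_non_motor_vehicles_near_line(frame):    # S200
        speaker = speakers.nearest_to(bike)
        speaker.play("parking-and-spraying prompt")                    # S201: voice prompt
        if not analyzer.parked_and_driver_left(bike, frame):           # S201: confirmation
            continue
        sprayer = sprayers.nearest_to(bike)
        sprayer.raise_up()                                             # S202: raise and spray
        sprayer.spray(target=bike.vacant_area, guided_by=frame)        #       the vacant area
        lamp = uv_lamps.nearest_to(bike)
        lamp.keep_on_while(sprayer.is_spraying)                        # S203: UV synced with spraying
```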
5. The intelligent motor vehicle night identification method based on the cold light phenomenon as claimed in claim 1, wherein after S2, the method further comprises the following steps:
S24, analyzing in real time, according to the robot image, whether a traffic accident has occurred;
S25, if so, controlling the self-driven robot, according to the robot image, to move around a preset range of the accident area and controlling the fluorescent spraying mechanism at the side of the self-driven robot to start spraying transparent fluorescent paint onto the ground area;
S26, controlling the ultraviolet lamp at the side of the self-driven robot to irradiate the sprayed transparent fluorescent paint in real time and extracting an accident image contained in the robot image;
and S27, transmitting the extracted accident image to the road accident rescue center closest to the self-driven robot (an illustrative sketch of this accident branch follows this claim).
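As an illustration of the accident branch in steps S24 to S27, the sketch below strings the detection, circling, spraying, and reporting actions together. The robot, analyzer, and rescue-center objects, the 10-metre circling radius, and every method name are assumptions made for this example.

```python
# Hypothetical accident branch (S24-S27); every interface and constant is assumed.
def accident_branch(robot, sprayer, uv_lamp, analyzer, rescue_centers):
    frame = robot.camera.capture()
    if not analyzer.accident_detected(frame):                 # S24: detect an accident
        return
    accident_area = analyzer.accident_area(frame)
    robot.circle(area=accident_area, radius_m=10.0)           # S25: circle a preset range
    sprayer.spray(target=accident_area.perimeter, guided_by=frame)
    uv_lamp.on()                                              # S26: irradiate the sprayed paint
    accident_image = analyzer.extract_accident_image(frame)
    nearest = rescue_centers.nearest_to(robot.position)
    nearest.send(accident_image)                              # S27: notify the nearest rescue center
```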
6. An intelligent motor vehicle night identification system based on the cold light phenomenon, which applies the intelligent motor vehicle night identification method based on the cold light phenomenon as claimed in any one of claims 1 to 5, comprising a robot identification device, a ground identification device, and a server, characterized in that:
the robot identification device comprises a self-driven robot, a robot camera, a fluorescent spraying mechanism, a coating cavity, and an ultraviolet lamp, wherein the self-driven robot is stored at a warehouse position and is used for patrolling the road area and assisting identification operations; the robot camera is arranged on the exterior of the self-driven robot and is used for shooting images of the environment outside the self-driven robot; the fluorescent spraying mechanism is arranged at the side of the self-driven robot, is connected with the coating cavity, and is used for spraying the transparent fluorescent paint to a specified position; the coating cavity is arranged inside the self-driven robot and is used for storing the transparent fluorescent paint; the ultraviolet lamp is arranged around the spraying end of the fluorescent spraying mechanism and is used for irradiating ultraviolet light onto the transparent fluorescent paint while the fluorescent spraying mechanism sprays;
the ground identification device comprises a monitoring camera, a lifting spraying mechanism, a ground storage cavity, a spraying guide pipe, an auxiliary ultraviolet lamp, and a loudspeaker, wherein the monitoring camera is arranged in the road area and is used for shooting images of the road area environment; the lifting spraying mechanism is arranged at the road ground marking line, is connected with the ground storage cavity, and is used for spraying the transparent fluorescent paint to a specified position; the ground storage cavity is arranged at the road ground marking line, is provided with a filling port, and is used for storing the transparent fluorescent paint; the spraying guide pipe is connected with the lifting spraying mechanism and the ground storage cavity respectively and is used for supplying the transparent fluorescent paint stored in the ground storage cavity to the lifting spraying mechanism; the auxiliary ultraviolet lamp is arranged at the upper end of the road street lamp and is used for irradiating ultraviolet light onto the transparent fluorescent paint while the lifting spraying mechanism sprays; the loudspeaker is arranged at the side end of the road street lamp and is used for playing set voice information;
the server is arranged at a position designated by the urban road management department, and comprises:
the wireless module is used for respectively establishing wireless connections with the self-driven robot, the robot camera, the fluorescent spraying mechanism, the ultraviolet lamp, the monitoring camera, the lifting spraying mechanism, the auxiliary ultraviolet lamp, the loudspeaker, the urban road management department, the road accident rescue center, and the network;
the information receiving module is used for receiving information and/or instructions and/or requests;
the self-driven control module is used for controlling the self-driven robot to execute the set moving operation according to the set steps;
the robot shooting module is used for turning the robot camera on or off;
the information analysis module is used for performing processing and analysis according to specified information;
the information extraction module is used for extracting the information and/or instructions and/or requests contained in specified information and/or instructions and/or requests;
the acceleration control module is used for controlling the self-driven robot to execute set acceleration operation according to the set steps;
the mobile spraying module is used for controlling the fluorescent spraying mechanism to execute the set transparent fluorescent paint spraying operation according to the set steps;
the ultraviolet module is used for turning the ultraviolet lamp on or off;
and the deceleration control module is used for controlling the self-driven robot to execute the set deceleration operation according to the set steps (an illustrative sketch of this module layout follows this claim).
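Purely to visualize the server composition recited in claim 6, the dataclass sketch below groups the listed modules into one object. All field names, the use of a dataclass, and the extras mapping are choices made for this example only; claims 7 to 10 would add further modules.

```python
# Hypothetical server layout for claim 6; every name here is a placeholder.
from dataclasses import dataclass, field

@dataclass
class IdentificationServer:
    wireless: object              # links robot, cameras, sprayers, lamps, loudspeaker, network
    receiving: object             # information receiving module
    self_driven_control: object   # moves the self-driven robot through the set steps
    robot_shooting: object        # turns the robot camera on or off
    analysis: object              # information analysis module
    extraction: object            # information extraction module
    acceleration_control: object  # set acceleration operations of the robot
    mobile_spraying: object       # drives the fluorescent spraying mechanism
    ultraviolet: object           # turns the ultraviolet lamp on or off
    deceleration_control: object  # set deceleration operations of the robot
    extras: dict = field(default_factory=dict)  # optional modules from claims 7-10
```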
7. The intelligent motor vehicle night identification system based on the cold light phenomenon as claimed in claim 6, wherein the server further comprises:
the monitoring shooting module is used for turning the monitoring camera on or off;
the ground lifting module is used for controlling the lifting spraying mechanism to execute set lifting operation according to set steps;
and the ground spraying module is used for controlling the lifting spraying mechanism to execute the set transparent fluorescent paint spraying operation according to the set steps.
8. The intelligent motor vehicle night identification system based on the cold light phenomenon as claimed in claim 6, wherein the server further comprises:
and the auxiliary irradiation module is used for turning the auxiliary ultraviolet lamp on or off.
9. The intelligent motor vehicle night identification system based on the cold light phenomenon as claimed in claim 6, wherein the server further comprises:
and the voice playing module is used for controlling the loudspeaker to play the set voice information.
10. The intelligent motor vehicle night identification system based on the cold light phenomenon as claimed in claim 6, wherein the server further comprises:
and the information sending module is used for sending specified information and/or instructions and/or requests to a specified object (a short sketch of these optional modules follows).
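To close the loop on claims 7 to 10, the placeholder classes below show how the optional server modules could be modelled; the class and method names are invented for this example and carry no weight in the claims.

```python
# Hypothetical stand-ins for the optional modules added by claims 7-10.
class MonitoringShootingModule:       # claim 7: turn the monitoring camera on or off
    def start(self): ...
    def stop(self): ...

class GroundLiftingModule:            # claim 7: raise or lower the lifting spraying mechanism
    def lift(self): ...
    def descend(self): ...

class GroundSprayingModule:           # claim 7: ground-side transparent fluorescent paint spraying
    def spray(self, target): ...

class AuxiliaryIrradiationModule:     # claim 8: turn the auxiliary ultraviolet lamp on or off
    def on(self): ...
    def off(self): ...

class VoicePlayingModule:             # claim 9: play set voice information on the loudspeaker
    def play(self, message): ...

class InformationSendingModule:       # claim 10: send information, instructions, or requests
    def send(self, payload, recipient): ...
```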
CN202010662894.6A 2020-07-10 2020-07-10 Intelligent motor vehicle night identification method and system based on cold light phenomenon Active CN111918029B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010662894.6A CN111918029B (en) 2020-07-10 2020-07-10 Intelligent motor vehicle night identification method and system based on cold light phenomenon

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010662894.6A CN111918029B (en) 2020-07-10 2020-07-10 Intelligent motor vehicle night identification method and system based on cold light phenomenon

Publications (2)

Publication Number Publication Date
CN111918029A true CN111918029A (en) 2020-11-10
CN111918029B CN111918029B (en) 2021-12-17

Family

ID=73227778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010662894.6A Active CN111918029B (en) 2020-07-10 2020-07-10 Intelligent motor vehicle night identification method and system based on cold light phenomenon

Country Status (1)

Country Link
CN (1) CN111918029B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06325834A (en) * 1993-05-11 1994-11-25 Sumitomo Wiring Syst Ltd Charging connector in electric automobile
JPH1046529A (en) * 1996-08-06 1998-02-17 Washiro Densetsu:Kk Zebra crossing
JP2000027128A (en) * 1998-07-07 2000-01-25 Nec Home Electron Ltd Illumination system, ultraviolet output device, and irradiator for ultraviolet output device
CN2390780Y (en) * 1999-10-15 2000-08-09 王振喜 Protection instrument for vehicle
CN1410173A (en) * 2002-05-20 2003-04-16 万金林 Holographic luminescent and reflecting safety alarming device of bicycle
CN2820343Y (en) * 2005-08-22 2006-09-27 孟祥涛 Non-motor vehicle fluorescence coating
CN2913147Y (en) * 2006-06-06 2007-06-20 陈朝华 Caution device for locomotive engine starting pole
CN206485266U (en) * 2017-01-18 2017-09-12 北京汽车股份有限公司 Automobile safety prompt device and automobile
CN107436610A (en) * 2017-07-31 2017-12-05 中南大学 A kind of vehicle and robot delivery navigation methods and systems of intelligent outdoor environment
CN207864461U (en) * 2018-02-24 2018-09-14 安徽全一科技有限公司 A kind of automobile chassis transmission shaft
CN208585166U (en) * 2018-06-28 2019-03-08 深圳市三本软件有限公司 A kind of new automobile safety reminding device
CN210912676U (en) * 2019-11-26 2020-07-03 唐山学院 Frame structure of unmanned vehicle is patrolled and examined safely

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Ge Qianglin et al.: "Driving Command and Control under Complex Electromagnetic Environments and Night Light-Control Conditions", Traffic Engineering and Technology for National Defence *

Also Published As

Publication number Publication date
CN111918029B (en) 2021-12-17

Similar Documents

Publication Publication Date Title
CN106274898A (en) A kind of vehicle-mounted remote-control parking equipment and system thereof
CN111173339A (en) Intelligent parking method and system
CN110580046A (en) Control method and system for unmanned sightseeing vehicle
CN110195387A (en) A kind of electronics road system of energy conservation and intelligence
CN112566079A (en) Parking guidance system in automatic parking system
WO2023155283A1 (en) Automatic driving information auxiliary system based on intelligent lamp pole
CN111918029B (en) Intelligent motor vehicle night identification method and system based on cold light phenomenon
CN110054115A (en) A kind of vehicle automatic transporting guide transport lorry and guiding transportation system
CN109816946B (en) Tunnel inspection device
CN108791054B (en) Intelligent automobile safety protection method and system based on big data analysis
CN209591092U (en) A kind of intelligent transportation device for crossing
CN108513417B (en) Low-energy-consumption underground parking garage illumination system and control method
CN110700138B (en) Intelligent monitoring method and system based on parking space identification
CN110820449A (en) Intelligent expressway emergency protection method and system based on cloud computing
CN203716584U (en) Automatic garage tire positioning device
CN111580510A (en) Intelligent road emergency rescue method and system based on big data and service platform
CN209397982U (en) A kind of parking robot based on hydraulic system
CN112991772A (en) Unmanned vehicle scheduling method and system
CN113192352A (en) Automatic driving method and system for receiving instructions of traffic management personnel
CN210420945U (en) Intelligent recognition's parking area railing device
CN214335955U (en) Intelligent parking device
CN112634656A (en) Intelligent height limit control method and system based on urban road
CN113034965A (en) Intelligent parking device and method
CN216118878U (en) Toll station gate for laser radar identification
CN214942917U (en) Safety identification device for vehicle parking area

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211129

Address after: 518083 floor 11, Yantian international administrative office building, Yantian District, Shenzhen City, Guangdong Province

Applicant after: Yantian Port International Information Co.,Ltd.

Address before: 215400 No.46, Huiyang second village, Loudong street, Taicang City, Suzhou City, Jiangsu Province

Applicant before: Xia Muyao

GR01 Patent grant