CN116908828B - Distance induction control method and device for automobile tail lamp - Google Patents
- Publication number
- CN116908828B (application number CN202311168947.9A)
- Authority
- CN
- China
- Prior art keywords
- controllable light
- vehicle
- tail lamp
- distance
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/26—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
- B60Q1/30—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating rear of vehicle, e.g. by means of reflecting surfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/002—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
- B60Q9/004—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
- B60Q9/005—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a video camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/11—Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/155—Coordinated control of two or more light sources
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/16—Controlling the light source by timing means
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Abstract
The application belongs to the technical field of intelligent vehicle control, and particularly relates to a distance sensing control method and device for a vehicle tail lamp. The method comprises the following steps: detecting a reversing instruction and turning on the tail lamps; acquiring the detection value of a brightness sensor, judging whether it reaches a set threshold value, and if so: lighting each controllable light-emitting unit of the left and right tail lamps in turn, holding each on for a set duration; controlling a camera to acquire one frame of the vehicle's rear view while each controllable light-emitting unit is held on; then simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps (one unit from each side) and holding them on for a set duration; controlling the camera to acquire one frame of the rear view while each group is held on; and outputting the shortest distance between the vehicle and an object according to the acquired images. With this method, the distance between the vehicle and an object can be obtained from tail-lamp illumination and image acquisition, thereby realizing distance sensing.
Description
Technical Field
The application belongs to the technical field of intelligent vehicle control, and particularly relates to a distance sensing control method and device for a vehicle tail lamp.
Background
Vehicle distance measurement refers to measuring the distance between a vehicle and other vehicles, pedestrians, or objects with various sensors; common approaches include ultrasonic, millimeter-wave, laser, and camera ranging. Apart from camera ranging, these approaches rely on dedicated ranging sensors and are costly, so measuring and calculating distance from camera images has long been a research and development hotspot.
When reversing in a garage, the distances between vehicles, and between a vehicle and other objects, are often small, so the distance to surrounding objects must be watched at all times to avoid scratches. Dedicated ranging sensors are accurate, but each covers only a narrow field, so several sensors are needed to cover one face of the vehicle (such as the front, rear, or side). A camera, by contrast, covers a wide field at low cost; how to measure distance more accurately with a camera is the problem to be solved.
Disclosure of Invention
In view of this, the embodiments of the present application provide a distance sensing control method and device for a vehicle tail lamp, which address the problem of how to measure distance more accurately using a camera.
A first aspect of an embodiment of the present application provides a distance sensing control method for a vehicle tail lamp, including:
S1, detecting a reversing instruction and turning on the tail lamps;
S2, acquiring the detection value of the brightness sensor, judging whether the detection value reaches a set threshold value, and if it does:
S3, lighting each controllable light-emitting unit of the left and right tail lamps in turn, holding each on for a set duration;
S4, controlling the camera to acquire one frame of the vehicle's rear view while each controllable light-emitting unit is held on;
S5, simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps (one unit from each side) and holding them on for a set duration;
S6, controlling the camera to acquire one frame of the vehicle's rear view while each group of controllable light-emitting units is held on;
S7, cyclically executing S3-S6, outputting in each cycle the shortest distance between the vehicle and an object according to the acquired images, and outputting prompt information according to the output shortest distance.
A second aspect of the embodiments of the present application provides a distance sensing control device for a vehicle tail lamp, the device including:
the reversing detection module, used for detecting a reversing instruction and turning on the tail lamps;
the brightness detection module, used for acquiring the detection value of the brightness sensor, judging whether the detection value reaches a set threshold value, and if it does:
the first lighting module, used for lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
the first acquisition module, used for controlling the camera to acquire one frame of the vehicle's rear view while each controllable light-emitting unit is held on;
the second lighting module, used for simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps and holding them on for a set duration;
the second acquisition module, used for controlling the camera to acquire one frame of the vehicle's rear view while each group of controllable light-emitting units is held on;
and the cyclic output module, used for cyclically executing S3-S6, outputting in each cycle the shortest distance between the vehicle and an object according to the acquired images, and outputting prompt information according to the output shortest distance.
Compared with the prior art, the present application has the following beneficial effects: when the vehicle reverses, the controllable light-emitting units of the tail lamps are switched on and off so that the illumination point of each controllable light-emitting unit appears in the acquired image, and the distance between the vehicle and an object is calculated from the deviation between each illumination point and its projection point. The scheme can be used in combination with existing image-ranging methods or on its own, and solves the problem of the low accuracy of existing image ranging.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for controlling distance sensing of a tail lamp for a vehicle according to an embodiment of the present application;
fig. 2 is a block diagram of a distance sensing control device for a tail lamp for a vehicle according to an embodiment of the present application;
fig. 3 is a schematic diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
Fig. 1 shows a distance sensing control method for a vehicle tail lamp according to an embodiment of the present application, described in detail below. The method comprises:
S1, detecting a reversing instruction and turning on the tail lamps;
S2, acquiring the detection value of the brightness sensor, judging whether the detection value reaches a set threshold value, and if it does:
S3, lighting each controllable light-emitting unit of the left and right tail lamps in turn, holding each on for a set duration;
S4, controlling the camera to acquire one frame of the vehicle's rear view while each controllable light-emitting unit is held on;
S5, simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps (one unit from each side) and holding them on for a set duration;
S6, controlling the camera to acquire one frame of the vehicle's rear view while each group of controllable light-emitting units is held on;
S7, cyclically executing S3-S6, outputting in each cycle the shortest distance between the vehicle and an object according to the acquired images, and outputting prompt information according to the output shortest distance.
In the present application, the reversing instruction is generated by the vehicle's gear-shifting process and belongs to the parameters conventionally detected by the on-board system, so the detection of the reversing signal is not particularly limited here; similarly, turning on the tail lamps in response to the reversing signal is a conventional vehicle arrangement and is not described further in this application.
In the present application, the brightness sensor detects ambient brightness. When the detected ambient brightness is below the set value, the environment is judged to be dark, and the method provided by this application can then assist rear-distance detection; when the brightness is high, tail-lamp-assisted distance detection works poorly. The specific brightness value can be set by the user: different values do not affect the realization of the scheme provided by this application, only the difficulty of recognizing the acquired images, which can still be used for assisted detection. As a reference setting, the brightness threshold of this scheme can be made consistent with the value used by the vehicle's automatic lighting, reducing the number of control variables of the vehicle.
In the present application, unlike a general tail-lamp control process, each light-emitting unit of each side tail lamp is set up for independent control, so that each direction can be probed by independently lighting each controllable unit when the method provided by this application is used. Optionally, a controllable light-emitting unit may be a lamp group formed by one or several light-emitting elements, the elements being specifically light bulbs or LED light points, and each lamp group can be turned on independently.
In the present application, besides lighting a single controllable light-emitting unit alone, combinations of units from the left and right sides can also be lit. This mode mainly exploits the different positions of the different light points: the different controllable light-emitting units can be cross-referenced, and multiple points can be probed within a short time while the distance between the vehicle and an object changes continuously, while preventing controllable light-emitting units on the same side, or units on different sides illuminating the same position, from influencing one another and interfering with subsequent image detection.
In the present application, the distance between the vehicle and a rear object can be identified from the acquired images by image processing, so that a distance prompt is output to remind the driver to pay attention to objects behind the vehicle.
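As a minimal illustration of the S1-S7 loop, the sketch below assumes hypothetical interfaces (`ambient.read()`, `in_reverse()`, `show_prompt()`, and the two sweep routines, which are sketched in code after the corresponding embodiments below); none of these are a real vehicle API:

```python
def reverse_assist(ambient, threshold, sweep_single, sweep_paired,
                   shortest_distance, in_reverse, show_prompt):
    # S2: tail-lamp-assisted ranging only helps when the scene is dark enough
    if ambient.read() >= threshold:
        return
    while in_reverse():                              # S7: repeat S3-S6
        frames = sweep_single() + sweep_paired()     # S3-S4, then S5-S6
        d_min = shortest_distance(frames)            # per-cycle minimum
        show_prompt(d_min)                           # prompt the driver
```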
Compared with the prior art, the present application has the following beneficial effects: when the vehicle reverses, the controllable light-emitting units of the tail lamps are switched on and off so that the illumination point of each controllable light-emitting unit appears in the acquired image, and the distance between the vehicle and an object is calculated from the deviation between each illumination point and its projection point. The scheme can be used in combination with existing image-ranging methods or on its own, and solves the problem of the low accuracy of existing image ranging.
In one embodiment of the present application, lighting each controllable light-emitting unit of the left and right tail lamps in turn, each held on for a set duration, includes:
lighting the first controllable light-emitting unit of the first-side tail lamp, in the arrangement order of the controllable light-emitting units, and holding it on for the set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the first side and holding it on for the set duration;
repeating the previous step until all the controllable light-emitting units on the first side have been lit;
lighting the first controllable light-emitting unit of the second-side tail lamp, in the arrangement order of the controllable light-emitting units, and holding it on for the set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the second side and holding it on for the set duration;
and repeating the previous step until all the controllable light-emitting units on the second side have been lit.
In the present application, each controllable light-emitting unit is turned on individually for detection, which avoids illumination from the other controllable light-emitting units and yields the best detection result; and since the irradiation directions of the units differ, directional detection in different directions can be realized, making it easy to quickly find objects that are relatively close. The set duration here can be at the millisecond level, with an overall effect similar to flash photography, so that all controllable light-emitting units can be lit within a short time. When the interval between two controllable light-emitting units is set short enough, persistence of vision also weakens the flashing effect of the tail lamp, reducing the impact of the flicker on the human eye and presenting a visual effect of smooth transition between the different controllable light-emitting units, without affecting the illumination of the area behind the vehicle.
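A minimal sketch of this single-unit sweep, assuming hypothetical `unit.on()`/`unit.off()` hardware hooks and a `grab_frame()` camera callable (none of which are a real vehicle API):

```python
import time

HOLD_S = 0.005  # assumed millisecond-level set duration per unit

def sweep_single(left_units, right_units, grab_frame):
    """Light every unit of the first side, then of the second side, one at a
    time, in arrangement order; returns one (unit, frame) pair per unit."""
    frames = []
    for side in (left_units, right_units):   # first side fully, then second
        for unit in side:                    # the previous unit is already off
            unit.on()
            time.sleep(HOLD_S)               # hold for the set duration
            frames.append((unit, grab_frame()))  # one rear-view frame per unit
            unit.off()
    return frames
```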
In one embodiment of the present application, simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps and holding them on for a set duration includes:
S51, lighting the first controllable light-emitting unit of the first-side tail lamp, in the arrangement order of the controllable light-emitting units;
S52, lighting one controllable light-emitting unit of the second-side tail lamp, in the arrangement order of the second-side units, holding it on for the set duration, and then switching to light the next controllable light-emitting unit on the second side;
S53, repeating the previous step until all the controllable light-emitting units on the second side have been lit;
S54, switching to the next controllable light-emitting unit on the first side;
S55, repeating steps S51-S54 until all the controllable light-emitting units on the first side have been lit;
S56, lighting the first controllable light-emitting unit of the second-side tail lamp, in the arrangement order of the controllable light-emitting units;
S57, lighting one controllable light-emitting unit of the first-side tail lamp, in the arrangement order of the first-side units, holding it on for the set duration, and then switching to light the next controllable light-emitting unit on the first side;
S58, repeating the previous step until all the controllable light-emitting units on the first side have been lit;
S59, switching to the next controllable light-emitting unit on the second side;
S510, repeating steps S56-S59 until all the controllable light-emitting units on the second side have been lit.
In the present application, the difference from the previous embodiment is that the controllable light-emitting units on the left and right sides are lit in combination, so the number of detected points changes from one to two; objects that are relatively close can thus be found more quickly, and the two controllable light-emitting units combined also give a better lighting effect.
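A sketch of this paired schedule (S51-S510), under the same hypothetical unit and camera interfaces as above:

```python
import time

def sweep_paired(left_units, right_units, grab_frame, hold_s=0.005):
    """Hold one unit of the fixed side on while stepping through every unit
    of the other side, then swap the roles of the two sides (S51-S510);
    returns one ((fixed, stepped), frame) tuple per combination."""
    frames = []
    for fixed_side, stepped_side in ((left_units, right_units),   # S51-S55
                                     (right_units, left_units)):  # S56-S510
        for fu in fixed_side:
            fu.on()                           # S51/S56: keep this unit on
            for su in stepped_side:           # S52/S57: step the other side
                su.on()
                time.sleep(hold_s)            # hold each pair the set duration
                frames.append(((fu, su), grab_frame()))
                su.off()                      # switch to the next stepped unit
            fu.off()                          # S54/S59: advance the fixed unit
    return frames
```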
In one embodiment of the present application, outputting the shortest distance between the vehicle and an object according to the acquired images includes:
for the images acquired in step S4, determining a first distance value from each frame;
for the images acquired in step S6, determining a second distance value from each frame;
and, for each cycle, outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the obtained second distances.
In the present application, image recognition is performed in parallel with image acquisition, so that the distance between the object and the vehicle is obtained and output.
In one embodiment of the present application, determining a first distance value from each frame of image includes:
detecting, on each frame, the illumination point of the controllable light-emitting unit corresponding to that frame;
determining the projection point of the corresponding controllable light-emitting unit on the image;
determining the deviation between the illumination point and the projection point;
and calculating the first distance value from the arrangement angle of the controllable light-emitting unit and the obtained deviation.
In the present application, the relative position of each controllable light-emitting unit and the camera is fixed; that is, the projection point of each controllable light-emitting unit in the image can be obtained by projecting the unit's position in three-dimensional space along the camera's central shooting direction onto the image.
In the present application, the deviation refers to the relative distance d between the illumination point and the projection point in the image. The relative angle between the irradiation direction of each controllable light-emitting unit and the shooting direction of the camera is fixed; denoting this angle by A, from tan A = d/L it follows that L = d/tan A, where L is the distance between the vehicle and the object at the position illuminated by the controllable light-emitting unit.
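A worked sketch of this relation, assuming the pixel deviation has already been converted to a physical offset d (camera calibration is assumed and not shown); the function name and the example values are illustrative only:

```python
import math

def distance_from_deviation(d_offset_m, angle_a_deg):
    """L = d / tan(A): d is the deviation between illumination point and
    projection point (as a physical offset), A the fixed relative angle
    between the unit's irradiation direction and the camera axis."""
    a = math.radians(angle_a_deg)
    if math.isclose(math.tan(a), 0.0):
        raise ValueError("A must be non-zero: beam parallel to camera axis")
    return d_offset_m / math.tan(a)

# e.g. a unit angled 15 degrees off the camera axis whose spot lands 0.4 m
# from its projection point implies L = 0.4 / tan(15°) ≈ 1.49 m
print(round(distance_from_deviation(0.4, 15.0), 2))
```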
In one embodiment of the present application, detecting on each frame the illumination point of the controllable light-emitting unit corresponding to that frame includes:
selecting one primary color according to the emission color of the controllable light-emitting unit;
extracting the primary color component of each pixel of the image to obtain a primary color map;
setting a distinguishing color value, and increasing the pixel contrast of the primary color map according to the distinguishing color value;
setting a target color value and a tolerance, and determining the largest connected region meeting the conditions according to the target color value and the tolerance;
and taking the geometric center of the connected region as the illumination point.
In the present application, the distinguishing color value may be set according to the ambient brightness. For example, if the highest ambient brightness at which the method is executed is a, and the brightness in complete darkness is b, the brightness range a-b is obtained; the color value range of the corresponding primary color is 0-255, so the distinguishing color value can be derived from the currently detected brightness value (for example, if the detected value is e, distinguishing color value = e/(a-b)×255). The primary color component of each pixel below the distinguishing color value is reduced, and each component above it is increased, making the contrast more pronounced. The magnitude of the reduction or increase may be set to a fixed value such as 20% or 50%; note that a color component should not exceed 255 after the increase.
In the present application, the target color value may be set to about 180-240; because of the color extraction processing, the region illuminated by the controllable light-emitting unit will fall within this interval, so a connected region meeting the condition can be identified. It can be understood that a connected region is an area formed by adjacent pixels whose primary color component reaches the target color value, and the geometric center of the connected region with the largest area is taken as the illumination point. For determining the geometric center of an arbitrary planar area, reference may be made to prior-art centroid methods, which are not described here.
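A sketch of these detection steps using NumPy and SciPy; the default values for e, a, b, the target color value, the tolerance, and the 20% contrast step are illustrative assumptions, not values fixed by the method:

```python
import numpy as np
from scipy import ndimage

def find_illumination_point(img_rgb, channel=0, e=40.0, a=120.0, b=0.0,
                            target=210.0, tol=30.0, step=0.2):
    """img_rgb: HxWx3 uint8 rear-view frame; channel: primary color matching
    the unit's emission (0 = R for a red tail lamp). Returns (x, y) of the
    illumination point, or None if no candidate region is found."""
    prim = img_rgb[..., channel].astype(np.float32)    # primary color map
    cut = e / (a - b) * 255.0                          # distinguishing value
    prim = np.where(prim < cut, prim * (1 - step), prim * (1 + step))
    prim = np.clip(prim, 0.0, 255.0)                   # never exceed 255
    mask = np.abs(prim - target) <= tol                # target value ± tolerance
    labels, n = ndimage.label(mask)                    # connected regions
    if n == 0:
        return None
    largest = 1 + np.argmax(np.bincount(labels.ravel())[1:])  # biggest region
    cy, cx = ndimage.center_of_mass(mask, labels, largest)    # geometric center
    return (float(cx), float(cy))
```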
In one embodiment of the present application, determining a second distance value from each frame of image includes:
detecting, on each frame, the illumination points of the controllable light-emitting units corresponding to that frame, there being two illumination points in each frame;
determining the projection points of the corresponding controllable light-emitting units on the image, there being two projection points in each frame;
determining the deviation between each illumination point and its corresponding projection point;
and calculating the distance value corresponding to each controllable light-emitting unit from the arrangement angle of the controllable light-emitting units and the obtained deviations, the smaller of the two distance values being taken as the second distance value.
In the present application, the difference from the single-unit case is that there are two illumination points, and the smaller of the two distances they determine is output.
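Reusing the hypothetical `distance_from_deviation` sketch above, the second distance value reduces to:

```python
def second_distance(d1, a1, d2, a2):
    """d1/d2: deviations of the two illumination points from their projection
    points; a1/a2: the two units' fixed arrangement angles (assumed inputs)."""
    return min(distance_from_deviation(d1, a1),
               distance_from_deviation(d2, a2))
```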
In one embodiment of the present application, outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the second distances includes:
determining the minimum of all the obtained first distances and second distances, outputting that minimum as the shortest distance between the vehicle and the object, and outputting the projection area corresponding to the minimum distance.
In this method, outputting the minimum of the two kinds of distances effectively reminds the driver of the distance between the rear of the vehicle and the object, ensuring safe reversing.
In one embodiment of the present application, the distance sensing control method for a vehicle tail lamp further includes:
in the process of cyclically executing S3-S6, calculating the corresponding distance each time an image is acquired, and comparing the calculated distance with the shortest vehicle-object distance output by the previous cycle;
and if the calculated distance is smaller than the shortest distance output by the previous cycle, outputting the shortest distance obtained so far in the current cycle.
In the present application, the case where different cycles detect different shortest distances is also considered: because each cycle is short, the shortest distance across two adjacent cycles is selected and output at this point, which protects the vehicle more effectively.
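A short sketch of this cross-cycle rule: each newly calculated distance is checked against the previous cycle's output, and the running minimum of the current cycle is prompted early when it undercuts it (all names here are assumptions):

```python
def prompt_distance(d_new, current_cycle_min, prev_cycle_min):
    """Return (value_to_prompt_now_or_None, updated_current_cycle_min)."""
    current_cycle_min = min(current_cycle_min, d_new)
    if d_new < prev_cycle_min:           # closer than anything last cycle
        return current_cycle_min, current_cycle_min  # prompt immediately
    return None, current_cycle_min       # otherwise wait for cycle end
```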
Referring to fig. 2, a distance sensing control device for a vehicle tail lamp includes:
the reversing detection module, used for detecting a reversing instruction and turning on the tail lamps;
the brightness detection module, used for acquiring the detection value of the brightness sensor, judging whether the detection value reaches a set threshold value, and if it does:
the first lighting module, used for lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
the first acquisition module, used for controlling the camera to acquire one frame of the vehicle's rear view while each controllable light-emitting unit is held on;
the second lighting module, used for simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps and holding them on for a set duration;
the second acquisition module, used for controlling the camera to acquire one frame of the vehicle's rear view while each group of controllable light-emitting units is held on;
and the cyclic output module, used for cyclically executing S3-S6, outputting in each cycle the shortest distance between the vehicle and an object according to the acquired images, and outputting prompt information according to the output shortest distance.
For the process by which each module of the vehicle tail lamp distance sensing control device provided in this embodiment implements its functions, reference may be made to the description of the embodiment shown in fig. 1, which is not repeated here.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance. It will also be understood that, although the terms "first," "second," etc. may be used in this document to describe various elements in some embodiments of the present application, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first table may be named a second table, and similarly, a second table may be named a first table without departing from the scope of the various described embodiments. The first table and the second table are both tables, but they are not the same table.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The distance sensing control method for a vehicle tail lamp provided by the embodiments of the present application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of the present application do not limit the specific type of the terminal device.
For example, the terminal device may be a station (ST) in a WLAN, a cellular telephone, a cordless telephone, a Session Initiation Protocol (SIP) telephone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, an in-vehicle device, an Internet-of-Vehicles terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite radio, a wireless modem card, a television set-top box (STB), customer premises equipment (CPE), and/or other devices for communicating over a wireless system as well as next-generation communication systems, such as a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (PLMN).
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may also be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories; it is not merely a hardware device, but realizes powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, and devices that focus on only a certain kind of application function and must be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for vital-sign monitoring.
Fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device 3 of this embodiment includes: at least one processor 30 (only one is shown in fig. 3), a memory 31, said memory 31 having stored therein a computer program 32 executable on said processor 30. The processor 30 executes the computer program 32 to implement the steps of the method for controlling the distance sensing of the tail light for the vehicle in the above embodiments, such as steps S1 to S7 shown in fig. 1. Alternatively, the processor 30 may perform the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules shown in fig. 2, when executing the computer program 32.
The terminal device 3 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, the processor 30 and the memory 31. It will be appreciated by those skilled in the art that fig. 3 is merely an example of the terminal device 3 and does not constitute a limitation of the terminal device 3; it may comprise more or fewer components than shown, combine certain components, or use different components; for example, the terminal device may further comprise input/output devices, a network access device, a bus, and the like.
The processor 30 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 31 may in some embodiments be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may be an external storage device of the terminal device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the terminal device 3. The memory 31 is used for storing an operating system, application programs, boot loader (BootLoader), data, other programs etc., such as program codes of the computer program etc. The memory 31 may also be used for temporarily storing data that has been transmitted or is to be transmitted.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The embodiment of the application also provides a terminal device, which comprises at least one memory, at least one processor and a computer program stored in the at least one memory and capable of running on the at least one processor, wherein the processor executes the computer program to enable the terminal device to realize the steps in any of the method embodiments.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps that may implement the various method embodiments described above.
The present embodiments provide a computer program product which, when run on a terminal device, causes the terminal device to perform steps that enable the respective method embodiments described above to be implemented.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (4)
1. A distance sensing control method for a vehicle tail lamp, characterized by comprising the following steps:
S1, detecting a reversing instruction and turning on the tail lamps;
S2, acquiring the detection value of the brightness sensor, judging whether the detection value reaches a set threshold value, and if it does:
S3, lighting each controllable light-emitting unit of the left and right tail lamps in turn, holding each on for a set duration;
S4, controlling the camera to acquire one frame of the vehicle's rear view while each controllable light-emitting unit is held on;
S5, simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps and holding them on for a set duration;
S6, controlling the camera to acquire one frame of the vehicle's rear view while each group of controllable light-emitting units is held on;
S7, cyclically executing S3-S6, outputting in each cycle the shortest distance between the vehicle and an object according to the acquired images, and outputting prompt information according to the output shortest distance;
wherein lighting each controllable light-emitting unit of the left and right tail lamps in turn, each held on for a set duration, comprises:
lighting the first controllable light-emitting unit of the first-side tail lamp, in the arrangement order of the controllable light-emitting units, and holding it on for the set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the first side and holding it on for the set duration;
repeating the previous step until all the controllable light-emitting units on the first side have been lit;
lighting the first controllable light-emitting unit of the second-side tail lamp, in the arrangement order of the controllable light-emitting units, and holding it on for the set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the second side and holding it on for the set duration;
repeating the previous step until all the controllable light-emitting units on the second side have been lit;
wherein simultaneously lighting a group of controllable light-emitting units of the left and right tail lamps and holding them on for a set duration comprises:
S51, lighting the first controllable light-emitting unit of the first-side tail lamp, in the arrangement order of the controllable light-emitting units;
S52, lighting one controllable light-emitting unit of the second-side tail lamp, in the arrangement order of the second-side units, holding it on for the set duration, and then switching to light the next controllable light-emitting unit on the second side;
S53, repeating the previous step until all the controllable light-emitting units on the second side have been lit;
S54, switching to the next controllable light-emitting unit on the first side;
S55, repeating steps S51-S54 until all the controllable light-emitting units on the first side have been lit;
S56, lighting the first controllable light-emitting unit of the second-side tail lamp, in the arrangement order of the controllable light-emitting units;
S57, lighting one controllable light-emitting unit of the first-side tail lamp, in the arrangement order of the first-side units, holding it on for the set duration, and then switching to light the next controllable light-emitting unit on the first side;
S58, repeating the previous step until all the controllable light-emitting units on the first side have been lit;
S59, switching to the next controllable light-emitting unit on the second side;
S510, repeating steps S56-S59 until all the controllable light-emitting units on the second side have been lit;
wherein outputting the shortest distance between the vehicle and an object according to the acquired images comprises:
for the images acquired in step S4, determining a first distance value from each frame;
for the images acquired in step S6, determining a second distance value from each frame;
for each cycle, outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the obtained second distances;
wherein determining a first distance value from each frame of image comprises:
detecting, on each frame, the illumination point of the controllable light-emitting unit corresponding to that frame;
determining the projection point of the corresponding controllable light-emitting unit on the image;
determining the deviation between the illumination point and the projection point;
calculating the first distance value from the arrangement angle of the controllable light-emitting unit and the obtained deviation;
wherein detecting on each frame the illumination point of the controllable light-emitting unit corresponding to that frame comprises:
selecting one primary color according to the emission color of the controllable light-emitting unit;
extracting the primary color component of each pixel of the image to obtain a primary color map;
setting a distinguishing color value, and increasing the pixel contrast of the primary color map according to the distinguishing color value;
setting a target color value and a tolerance, and determining the largest connected region meeting the conditions according to the target color value and the tolerance;
taking the geometric center of the connected region as the illumination point;
wherein determining a second distance value from each frame of image comprises:
detecting, on each frame, the illumination points of the controllable light-emitting units corresponding to that frame, there being two illumination points in each frame;
determining the projection points of the corresponding controllable light-emitting units on the image, there being two projection points in each frame;
determining the deviation between each illumination point and its corresponding projection point;
and calculating the distance value corresponding to each controllable light-emitting unit from the arrangement angle of the controllable light-emitting units and the obtained deviations, the smaller of the two distance values being taken as the second distance value.
2. The distance sensing control method for a vehicle tail lamp according to claim 1, characterized in that outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the second distances comprises:
determining the minimum of all the obtained first distances and second distances, outputting that minimum as the shortest distance between the vehicle and the object, and outputting the projection area corresponding to the minimum distance.
3. The distance sensing control method for a vehicle tail lamp according to claim 2, characterized in that the method further comprises:
in the process of cyclically executing S3-S6, calculating the corresponding distance each time an image is acquired, and comparing the calculated distance with the shortest vehicle-object distance output by the previous cycle;
and if the calculated distance is smaller than the shortest distance output by the previous cycle, outputting the shortest distance obtained so far in the current cycle.
4. A vehicle tail lamp distance sensing control device for performing the vehicle tail lamp distance sensing control method according to any one of claims 1 to 3, characterized in that the vehicle tail lamp distance sensing control device comprises (an orchestration sketch follows this list):
a reversing detection module, configured to detect a reversing instruction and turn on the tail lamps;
a brightness detection module, configured to obtain the detection value of the brightness sensor and judge whether it reaches a set threshold, and if it does:
a first lighting module, configured to light each controllable light-emitting unit of the left and right tail lamps in turn, each held for a set duration;
a first acquisition module, configured to control the camera to capture one frame of the rearward view of the vehicle while each controllable light-emitting unit remains lit;
a second lighting module, configured to light a group of controllable light-emitting units of the left and right tail lamps simultaneously, held for a set duration;
a second acquisition module, configured to control the camera to capture one frame of the rearward view of the vehicle while each group of controllable light-emitting units remains lit;
and a cyclic output module, configured to execute S3-S6 cyclically, output the shortest distance between the vehicle and the object from the images acquired in each cycle, and output prompt information according to the output shortest distance.
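One way the claim-4 modules might chain together, sketched with hypothetical callables standing in for the hardware hooks; none of these names come from the patent:

```python
import math
from typing import Callable, Iterable, List

def control_loop(reversing: Callable[[], bool],       # reversing detection module
                 brightness: Callable[[], float],     # brightness detection module
                 threshold: float,
                 light: Callable[[int], None],        # first/second lighting modules
                 capture: Callable[[], object],       # first/second acquisition modules
                 distance_of: Callable[[object], float],
                 prompt: Callable[[float], None],     # cyclic output module
                 unit_ids: Iterable[int],
                 group_ids: Iterable[int]) -> None:
    if not reversing():
        return                      # no reversing instruction: stay idle
    if brightness() < threshold:
        return                      # sensor reading has not reached the set threshold
    shortest = math.inf
    while reversing():              # cyclically execute S3-S6
        frames: List[object] = []
        for uid in unit_ids:        # light each unit in turn, grab one frame each
            light(uid)
            frames.append(capture())
        for gid in group_ids:       # light each group at once, grab one frame each
            light(gid)
            frames.append(capture())
        shortest = min([shortest] + [distance_of(f) for f in frames])
        prompt(shortest)            # prompt according to the output shortest distance
```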
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311168947.9A CN116908828B (en) | 2023-09-12 | 2023-09-12 | Distance induction control method and device for automobile tail lamp |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116908828A CN116908828A (en) | 2023-10-20 |
CN116908828B true CN116908828B (en) | 2023-12-19 |
Family
ID=88355022
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311168947.9A Active CN116908828B (en) | 2023-09-12 | 2023-09-12 | Distance induction control method and device for automobile tail lamp |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116908828B (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006172210A (en) * | 2004-12-16 | 2006-06-29 | Matsushita Electric Works Ltd | Distance image sensor for vehicle, and obstacle monitoring device using the same |
CN104802710A (en) * | 2015-04-17 | 2015-07-29 | 浙江大学 | Intelligent automobile parking assisting system and assisting method |
CN106828304A (en) * | 2015-12-07 | 2017-06-13 | 财团法人金属工业研究发展中心 | Car backing warning method using structure light sensing obstacle |
CN107992810A (en) * | 2017-11-24 | 2018-05-04 | 智车优行科技(北京)有限公司 | Vehicle identification method and device, electronic equipment, computer program and storage medium |
EP3531393A1 (en) * | 2018-02-27 | 2019-08-28 | odelo GmbH | Method and vehicle lamp for recording when the distance between successive vehicles falls below a safety distance |
CN110364024A (en) * | 2019-06-10 | 2019-10-22 | 深圳市锐明技术股份有限公司 | Environment control method, device and the car-mounted terminal of driving vehicle |
CN113022434A (en) * | 2021-05-13 | 2021-06-25 | 周宇 | Automatic door opening anti-collision device and method for vehicle |
CN115214457A (en) * | 2022-04-24 | 2022-10-21 | 广州汽车集团股份有限公司 | Vehicle light control method, vehicle light control device, vehicle, and storage medium |
CN115257527A (en) * | 2022-06-27 | 2022-11-01 | 智己汽车科技有限公司 | Tail lamp display control method and device and vehicle |
CN115534801A (en) * | 2022-08-29 | 2022-12-30 | 深圳市欧冶半导体有限公司 | Vehicle lamp self-adaptive dimming method and device, intelligent terminal and storage medium |
CN116461259A (en) * | 2022-01-19 | 2023-07-21 | 福特全球技术公司 | Auxiliary vehicle operation with improved object detection |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230099674A1 (en) * | 2021-09-29 | 2023-03-30 | Subaru Corporation | Vehicle backup warning systems |
Also Published As
Publication number | Publication date |
---|---|
CN116908828A (en) | 2023-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3089442B1 (en) | Vehicle-mounted image recognition device | |
WO2019200578A1 (en) | Electronic apparatus, and identity recognition method thereof | |
EP2448251B1 (en) | Bundling night vision and other driver assistance systems (DAS) using near infra red (NIR) illumination and a rolling shutter | |
US20050111698A1 (en) | Apparatus for vehicle surroundings monitoring and method thereof | |
TWI727219B (en) | Method for generating representation of image, imaging system, and machine-readable storage devices | |
JP4775123B2 (en) | Vehicle monitoring device | |
CN107110648A (en) | The system and method detected for visual range | |
CN107627969B (en) | Method and device for changing color of vehicle body and computer storage medium | |
CN108764139B (en) | Face detection method, mobile terminal and computer readable storage medium | |
CN111601373B (en) | Backlight brightness control method and device, mobile terminal and storage medium | |
US20170116488A1 (en) | Method for identifying an incoming vehicle and corresponding system | |
CN113686350B (en) | Road information display method and device and intelligent wearable equipment | |
US8724852B2 (en) | Method for sensing motion and device for implementing the same | |
CN116908828B (en) | Distance induction control method and device for automobile tail lamp | |
US20140241580A1 (en) | Object detection apparatus | |
CN107851382B (en) | Light fixture detection device and lamps and lanterns detection method | |
JP5892079B2 (en) | Object detection device | |
CN106331465A (en) | Image acquisition device and auxiliary shooting method thereof | |
EP3226554B1 (en) | Imaging device and vehicle | |
CN115278099A (en) | Light supplement lamp control method, terminal device and storage medium | |
CN113591514B (en) | Fingerprint living body detection method, fingerprint living body detection equipment and storage medium | |
CN114450185B (en) | Image display device, display control method, and recording medium | |
CN116994514B (en) | Image recognition-based vehicle display brightness control method, device and system | |
CN113132617B (en) | Image jitter judgment method and device and image identification triggering method and device | |
CN114841863A (en) | Image color correction method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |