CN116908828A - Distance induction control method and device for automobile tail lamp - Google Patents


Info

Publication number
CN116908828A
CN116908828A (application CN202311168947.9A)
Authority
CN
China
Prior art keywords
controllable light
vehicle
distance
image
tail lamp
Prior art date
Legal status
Granted
Application number
CN202311168947.9A
Other languages
Chinese (zh)
Other versions
CN116908828B (en)
Inventor
林启程
唐勇
Current Assignee
Yonglin Electronics Co Ltd
Original Assignee
Yonglin Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Yonglin Electronics Co Ltd filed Critical Yonglin Electronics Co Ltd
Priority to CN202311168947.9A priority Critical patent/CN116908828B/en
Publication of CN116908828A publication Critical patent/CN116908828A/en
Application granted granted Critical
Publication of CN116908828B publication Critical patent/CN116908828B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12 Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q 1/26 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q 1/30 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating rear of vehicle, e.g. by means of reflecting surfaces
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q 9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q 9/002 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B60Q 9/004 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
    • B60Q 9/005 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a video camera
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/105 Controlling the light source in response to determined parameters
    • H05B 47/11 Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/155 Coordinated control of two or more light sources
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/16 Controlling the light source by timing means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The application belongs to the technical field of intelligent vehicle control and relates to a distance sensing control method and device for a vehicle tail lamp. The method comprises: detecting a reversing instruction and turning on the tail lamps; acquiring the detection value of a brightness sensor and judging whether it reaches a set threshold; if the detection value reaches the set threshold, then: lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration; controlling a camera to acquire one frame of image of the vehicle's rear view while each controllable light-emitting unit is held on; lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration; controlling the camera to acquire one frame of image of the vehicle's rear view while each group of controllable light-emitting units is held on; and outputting the shortest distance between the vehicle and an object according to the acquired images. The application obtains the distance between the vehicle and an object from the tail-lamp illumination and the acquired images, thereby realizing distance sensing.

Description

Distance induction control method and device for automobile tail lamp
Technical Field
The application belongs to the technical field of intelligent vehicle control, and particularly relates to a distance sensing control method and device for a vehicle tail lamp.
Background
Vehicle distance measurement refers to measuring the distance between the vehicle and other vehicles, pedestrians or objects through various sensors; commonly used approaches include ultrasonic, millimeter-wave, laser and camera-based ranging. Apart from camera-based ranging, these approaches depend on dedicated ranging sensors and are costly, so measuring and calculating distance from camera images has long been a research and development hotspot.
When reversing in a garage, the distances between vehicles, and between a vehicle and other objects, are often small, so the driver must monitor these distances at all times to avoid scratches. Dedicated ranging sensors are accurate, but each covers only a narrow working area, so several sensors are needed to cover one face of the vehicle (such as the front, rear or side). A camera, in contrast, covers a wide field of view at low cost, and how to measure distance more accurately with a camera is the problem that needs to be solved.
Disclosure of Invention
In view of the above, the embodiments of the application provide a distance sensing control method and device for a vehicle tail lamp, which address the problem of how to measure distance more accurately with a camera.
A first aspect of the embodiments of the present application provides a distance sensing control method for a vehicle tail lamp, comprising:
s1, detecting a reversing instruction and turning on the tail lamps;
s2, acquiring the detection value of the brightness sensor and judging whether it reaches a set threshold; if the detection value reaches the set threshold, then:
s3, lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
s4, controlling the camera to acquire one frame of image of the vehicle's rear view while each controllable light-emitting unit is held on;
s5, lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration;
s6, controlling the camera to acquire one frame of image of the vehicle's rear view while each group of controllable light-emitting units is held on;
s7, executing S3-S6 in a loop, outputting the shortest distance between the vehicle and an object from the acquired images in each cycle, and outputting prompt information according to the output shortest distance.
A second aspect of the embodiments of the present application provides a distance sensing control device for a vehicle tail lamp, the distance sensing control device comprising:
the reversing detection module is used for detecting a reversing instruction and turning on the tail lamps;
the brightness detection module is used for acquiring the detection value of the brightness sensor and judging whether it reaches a set threshold; if the detection value reaches the set threshold, then:
the first lighting module is used for lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
the first acquisition module is used for controlling the camera to acquire one frame of image of the vehicle's rear view while each controllable light-emitting unit is held on;
the second lighting module is used for lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration;
the second acquisition module is used for controlling the camera to acquire one frame of image of the vehicle's rear view while each group of controllable light-emitting units is held on;
and the loop output module is used for executing S3-S6 in a loop, outputting the shortest distance between the vehicle and an object from the acquired images in each cycle, and outputting prompt information according to the output shortest distance.
Compared with the prior art, the application has the following beneficial effects: when the vehicle is reversing, the controllable light-emitting units of the tail lamps are switched on and off so that their irradiation points appear in the acquired images, and the distance between the vehicle and an object is calculated from the deviation between each unit's irradiation point and its projection point. The method can be used together with existing image-based ranging methods or on its own, and solves the problem of low accuracy in existing image-based ranging.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a method for controlling distance sensing of a tail lamp for a vehicle according to an embodiment of the present application;
fig. 2 is a block diagram of a distance sensing control device for a tail lamp of a vehicle according to an embodiment of the present application;
fig. 3 is a schematic diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
Fig. 1 shows a distance sensing control method for a vehicle tail lamp according to an embodiment of the present application, which is described in detail below. The method comprises:
s1, detecting a reversing instruction and turning on the tail lamps;
s2, acquiring the detection value of the brightness sensor and judging whether it reaches a set threshold; if the detection value reaches the set threshold, then:
s3, lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
s4, controlling the camera to acquire one frame of image of the vehicle's rear view while each controllable light-emitting unit is held on;
s5, lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration;
s6, controlling the camera to acquire one frame of image of the vehicle's rear view while each group of controllable light-emitting units is held on;
s7, executing S3-S6 in a loop, outputting the shortest distance between the vehicle and an object from the acquired images in each cycle, and outputting prompt information according to the output shortest distance.
In the application, the reversing instruction is generated when the vehicle shifts into reverse gear and is among the parameters conventionally detected by the vehicle's on-board unit, so the way the reversing signal is detected is not particularly limited; similarly, turning on the tail lamps in response to the reversing signal is a conventional vehicle arrangement and is not described further here.
In the application, the brightness sensor detects the ambient brightness. When the detected ambient brightness is below a set value, the environment is judged to be dark, and the method provided by the application can then assist the detection of the distance behind the vehicle; when the ambient brightness is high, using the tail lamps to assist distance detection is less effective. The specific brightness value can be set by the user; different values do not affect whether the scheme can be realized, only the difficulty of recognizing the acquired images, and the method can still be used as an auxiliary means. As a reference setting, the brightness threshold can be made consistent with the value used by the vehicle's automatic lights, which reduces the number of control variables on the vehicle.
In the application, unlike the usual tail-lamp control process, each light-emitting unit of the tail lamp on each side of the vehicle is set up to be controlled independently, so that each direction can be probed by lighting the controllable units one at a time when the method provided by the application is used. Optionally, a controllable light-emitting unit may be a lamp group formed by one or several light-emitting elements, the light-emitting elements being bulbs or LED emitting points.
In the application, besides lighting single controllable light-emitting units individually, light-emitting units on the left and right sides are also lit in combination. This mode is aimed mainly at the different illumination positions of the different light-emitting points: the different controllable light-emitting units can serve as references for one another, it addresses the need to probe several points within a short time while the distance between the vehicle and an object keeps changing as the vehicle moves, and it avoids the problem of different controllable light-emitting units on the same side, or units on different sides that illuminate the same position, interfering with one another and with the subsequent image detection.
In the application, the distance between the vehicle and an object behind it can be determined from the acquired images through image processing, and a distance prompt is output to remind the driver to pay attention to the object behind.
Compared with the prior art, the application has the following beneficial effects: when the vehicle is reversing, the controllable light-emitting units of the tail lamps are switched on and off so that their irradiation points appear in the acquired images, and the distance between the vehicle and an object is calculated from the deviation between each unit's irradiation point and its projection point. The method can be used together with existing image-based ranging methods or on its own, and solves the problem of low accuracy in existing image-based ranging.
In one embodiment of the present application, lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration includes:
following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the first-side tail lamp and holding it on for a set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the first side and holding it on for a set duration;
repeating the previous step until all controllable light-emitting units on the first side have been lit;
following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the second-side tail lamp and holding it on for a set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the second side and holding it on for a set duration;
and repeating the previous step until all controllable light-emitting units on the second side have been lit.
In the application, each controllable light-emitting unit is switched on individually for detection, which avoids illumination from the other controllable light-emitting units and yields the best detection result; and because the units point in different directions, each direction can be probed separately, making it easy to find nearby objects quickly. The set duration can be on the millisecond level, so the overall effect is similar to flash photography and all controllable light-emitting units can be lit within a short time. When the interval between two units is made short, the persistence of vision weakens the flashing effect of the tail lamp, reduces the impact of the flashing on human eyes, presents a visually smooth transition between the different controllable light-emitting units, and does not affect the illumination towards the rear.
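For illustration only, the sequential lighting-and-capture loop described above (steps S3-S4) can be sketched as follows in Python; the callbacks set_unit(side, index, on) and capture_frame(), the hold duration and the per-side unit count are hypothetical placeholders for the vehicle's lamp driver and rear camera interfaces, not part of the disclosed embodiment.

```python
import time

def light_units_individually(num_units_per_side, set_unit, capture_frame, hold=0.02):
    """Light each controllable unit of the left and right tail lamps in turn and
    grab one rear-view frame while it is held on (steps S3-S4)."""
    frames = []
    for side in ("first", "second"):           # first-side lamp, then second-side lamp
        for idx in range(num_units_per_side):  # follow the arrangement order
            set_unit(side, idx, True)          # light the current unit
            time.sleep(hold)                   # keep it on for the set duration
            frames.append((side, idx, capture_frame()))
            set_unit(side, idx, False)         # turn it off before moving on
    return frames
```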
In one embodiment of the present application, lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration includes:
s51, following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the first-side tail lamp;
s52, following the arrangement order of the controllable light-emitting units on the second side, lighting one controllable light-emitting unit of the second-side tail lamp, holding it on for a set duration, then switching to light the next controllable light-emitting unit on the second side;
s53, repeating the previous step until all controllable light-emitting units on the second side have been lit;
s54, switching to the next controllable light-emitting unit on the first side;
s55, repeating steps S51-S54 until all controllable light-emitting units on the first side have been lit;
s56, following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the second-side tail lamp;
s57, following the arrangement order of the controllable light-emitting units on the first side, lighting one controllable light-emitting unit of the first-side tail lamp, holding it on for a set duration, then switching to light the next controllable light-emitting unit on the first side;
s58, repeating the previous step until all controllable light-emitting units on the first side have been lit;
s59, switching to the next controllable light-emitting unit on the second side;
s510, repeating steps S56-S59 until all controllable light-emitting units on the second side have been lit.
In the present application, this embodiment differs from the previous one in that the controllable light-emitting units on the left and right sides are lit in combination, so the number of detected points per frame changes from one to two. Nearby objects can therefore be found more quickly, and lighting two controllable light-emitting units together also gives a better lighting effect.
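Again for illustration only, the paired lighting of steps S51-S510 can be sketched with the same hypothetical set_unit and capture_frame callbacks (and the time import) used in the previous sketch: one pass holds a first-side unit while stepping through the second side, and a mirrored pass holds a second-side unit while stepping through the first side.

```python
def light_units_in_pairs(num_units_per_side, set_unit, capture_frame, hold=0.02):
    """Light one unit on each side at the same time and grab one frame per pair
    (steps S51-S510)."""
    frames = []
    for held_side, stepped_side in (("first", "second"), ("second", "first")):
        for i in range(num_units_per_side):        # unit held on the leading side
            set_unit(held_side, i, True)
            for j in range(num_units_per_side):    # step through the other side
                set_unit(stepped_side, j, True)
                time.sleep(hold)                   # both units on for the set duration
                frames.append((held_side, i, stepped_side, j, capture_frame()))
                set_unit(stepped_side, j, False)
            set_unit(held_side, i, False)
    return frames
```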
In one embodiment of the present application, outputting the shortest distance between the vehicle and an object according to the acquired images includes:
for the images acquired in step S4, determining a first distance value from each frame of image;
for the images acquired in step S6, determining a second distance value from each frame of image;
for each cycle, outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the obtained second distances.
In the application, image recognition is carried out in parallel with image acquisition, so that the distance between the object and the vehicle is obtained and output.
In one embodiment of the present application, determining a first distance value from each frame of image includes:
detecting, in each frame of image, the irradiation point of the controllable light-emitting unit corresponding to that frame;
determining the projection point of the corresponding controllable light-emitting unit on the image;
determining the deviation between the irradiation point and the projection point;
and calculating the first distance value from the arrangement angle of the controllable light-emitting unit and the obtained deviation.
In the application, the position of each controllable light-emitting unit relative to the camera is fixed, so the projection point of each controllable light-emitting unit in the image can be obtained by projecting the unit in three-dimensional space along the shooting direction of the camera center.
In the present application, the deviation refers to the relative distance d between the irradiation point and the projection point in the image. The relative angle between the irradiation direction of each controllable light-emitting unit and the shooting direction of the camera is fixed; if this angle is A, then from tan A = d/L we obtain L = d/tan A, where L is the distance between the vehicle and the object at the position illuminated by the controllable light-emitting unit.
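Written out as a small function (a sketch only; converting the in-image offset d into the units expected for L, i.e. the camera calibration, is assumed to be handled elsewhere):

```python
import math

def distance_from_offset(d, angle_a_deg):
    """L = d / tan(A): distance between the vehicle and the object illuminated by a
    controllable light-emitting unit, where d is the offset between the irradiation
    point and the projection point and A is the fixed angle between the unit's
    irradiation direction and the camera's shooting direction."""
    return d / math.tan(math.radians(angle_a_deg))
```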
In one embodiment of the present application, detecting, in each frame of image, the irradiation point of the corresponding controllable light-emitting unit includes:
selecting one primary color according to the emission color of the controllable light-emitting unit;
extracting the primary-color component of each pixel of the image to obtain a primary-color map;
setting a distinguishing color value, and increasing the pixel contrast of the primary-color map according to the distinguishing color value;
setting a target color value and a tolerance, and determining the largest connected region meeting the conditions according to the target color value and the tolerance;
and taking the geometric center of the connected region as the irradiation point.
In the present application, the distinguishing color value may be set according to the ambient brightness. For example, if the highest ambient brightness at which the method is executed is a and the brightness in complete darkness is b, the brightness difference a-b is obtained; since the color value range of the corresponding primary color is 0-255, the distinguishing color value can be derived from the currently detected brightness value (for example, if the detected value is e, distinguishing color value = e/(a-b)×255). The primary-color component of pixels below the distinguishing color value is reduced and that of pixels above it is increased, making the contrast more pronounced; the magnitude of the reduction or increase may be a fixed proportion such as 20% or 50%. Note that a color component should not exceed 255 after the increase and should not fall below 0 after the decrease.
In the present application, the target color value may be set to roughly 180-240; because of the color extraction process, the area illuminated by the controllable light-emitting unit falls in this interval, allowing a connected region meeting the condition to be identified. For determining the geometric center of an arbitrary planar region, reference may be made to prior-art centroid methods, which are not described further here.
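The detection steps of this embodiment can be sketched as follows. The sketch uses NumPy and SciPy for brevity, assumes the contrast adjustment is a fixed multiplicative scaling, and uses illustrative defaults for the target color value, tolerance and adjustment magnitude; the function names and parameters are examples rather than the disclosed implementation.

```python
import numpy as np
from scipy import ndimage

def distinguishing_value(e, a, b):
    """Distinguishing color value from the currently detected brightness e, the
    highest ambient brightness a at which the method runs and the full-dark
    brightness b, scaled onto the 0-255 color range (e / (a - b) * 255)."""
    return e / (a - b) * 255

def find_irradiation_point(image_rgb, channel, distinguish, target=210, tolerance=30, step=0.2):
    """Locate the spot illuminated by one controllable light-emitting unit in a frame."""
    primary = image_rgb[..., channel].astype(np.float32)        # primary-color map
    # raise components above the distinguishing value, lower those below it
    stretched = np.where(primary >= distinguish,
                         np.clip(primary * (1.0 + step), 0, 255),
                         np.clip(primary * (1.0 - step), 0, 255))
    mask = np.abs(stretched - target) <= tolerance               # target value +/- tolerance
    labels, count = ndimage.label(mask)                          # connected regions
    if count == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, count + 1))
    largest = int(np.argmax(sizes)) + 1                          # largest qualifying region
    ys, xs = np.nonzero(labels == largest)
    return float(xs.mean()), float(ys.mean())                    # geometric center
```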
In one embodiment of the present application, the determining a second distance value from each frame of image includes:
detecting, in each frame of image, the irradiation points of the controllable light-emitting units corresponding to that frame, there being two irradiation points per frame;
determining the projection points of the corresponding controllable light-emitting units on the image, there being two projection points per frame;
determining the deviation between each irradiation point and its corresponding projection point;
and calculating a distance value for each controllable light-emitting unit from the arrangement angle of the controllable light-emitting units and the obtained deviation, and taking the smaller of the two distance values as the second distance value.
In the present application, the difference from the single-unit case is that there are two light-emitting points, and the smaller of the two distances determined from them is output.
In one embodiment of the present application, outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the obtained second distances includes:
determining the minimum of all the obtained first distances and all the obtained second distances, outputting this minimum as the shortest distance between the vehicle and the object, and outputting the projection area corresponding to the minimum distance.
In the application, outputting the minimum of the two kinds of distances effectively reminds the driver of the distance between the rear of the vehicle and the object, ensuring safe reversing.
In one embodiment of the present application, the method for controlling the distance sensing of the tail lamp for the vehicle further includes:
while S3-S6 are executed in a loop, calculating a corresponding distance each time an image is acquired, and comparing the calculated distance with the shortest vehicle-to-object distance output by the previous cycle;
and if the calculated distance is smaller than the shortest distance output by the previous cycle, outputting the calculated distance as the shortest distance until the current cycle completes.
In the application, the case where different cycles detect different shortest distances is also considered; since each cycle is short, the smaller of the shortest distances from two adjacent cycles is selected and output, which ensures the safety of the vehicle more effectively.
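As a minimal sketch of the minimum selection described in this and the preceding paragraphs (the function names are illustrative, not part of the disclosed embodiment):

```python
def cycle_shortest(first_distances, second_distances):
    """Per-cycle output: the minimum of all first and second distance values."""
    return min(list(first_distances) + list(second_distances))

def running_shortest(new_distance, previous_cycle_shortest):
    """While the current S3-S6 cycle is still running, report any newly computed
    distance that undercuts the shortest distance output by the previous cycle."""
    return min(new_distance, previous_cycle_shortest)
```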
Referring to fig. 2, a distance sensing control device for a vehicle tail lamp includes:
the reversing detection module is used for detecting a reversing instruction and turning on the tail lamps;
the brightness detection module is used for acquiring the detection value of the brightness sensor and judging whether it reaches a set threshold; if the detection value reaches the set threshold, then:
the first lighting module is used for lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
the first acquisition module is used for controlling the camera to acquire one frame of image of the vehicle's rear view while each controllable light-emitting unit is held on;
the second lighting module is used for lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration;
the second acquisition module is used for controlling the camera to acquire one frame of image of the vehicle's rear view while each group of controllable light-emitting units is held on;
and the loop output module is used for executing S3-S6 in a loop, outputting the shortest distance between the vehicle and an object from the acquired images in each cycle, and outputting prompt information according to the output shortest distance.
For the process by which each module in the vehicle tail lamp distance sensing control device provided by the embodiment of the present application implements its function, reference may be made to the description of the embodiment shown in fig. 1, which is not repeated here.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when", "once", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims are used to distinguish between descriptions and are not to be understood as indicating or implying relative importance. It will also be understood that, although the terms "first," "second," etc. may be used in some embodiments of the application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first table could be named a second table, and similarly, a second table could be named a first table, without departing from the scope of the various described embodiments. The first table and the second table are both tables, but they are not the same table.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The method for controlling the distance induction of the tail lamp for the vehicle, provided by the embodiment of the application, can be applied to terminal equipment such as mobile phones, tablet computers, wearable equipment, vehicle-mounted equipment, augmented reality (augmented reality, AR)/Virtual Reality (VR) equipment, notebook computers, ultra-mobile personal computer (UMPC), netbooks, personal digital assistants (personal digital assistant, PDA) and the like, and the embodiment of the application does not limit the specific types of the terminal equipment.
For example, the terminal device may be a Station (ST) in a WLAN, a cellular telephone, a cordless telephone, a Session initiation protocol (Session InitiationProtocol, SIP) telephone, a wireless local loop (Wireless Local Loop, WLL) station, a personal digital assistant (Personal Digital Assistant, PDA) device, a handheld device with wireless communication capabilities, a computing device or other processing device connected to a wireless modem, an in-vehicle device, a car networking terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite radio, a wireless modem card, a television Set Top Box (STB), a customer premise equipment (customer premise equipment, CPE) and/or other devices for communicating over a wireless system as well as next generation communication systems, such as a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (Public Land Mobile Network, PLMN) network, etc.
By way of example and not limitation, when the terminal device is a wearable device, the wearable device may also be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. Wearable devices are not merely hardware; they realize powerful functions through software support, data interaction and cloud interaction. In a broad sense, wearable smart devices include devices that are full-featured, large in size and able to realize all or part of their functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used together with other devices such as a smartphone, for example various smart bracelets and smart jewelry for vital-sign monitoring.
Fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device 3 of this embodiment includes: at least one processor 30 (only one is shown in fig. 3), a memory 31, said memory 31 having stored therein a computer program 32 executable on said processor 30. The processor 30 executes the computer program 32 to implement the steps of the method for controlling the distance sensing of the tail light for the vehicle in the above embodiments, such as steps S1 to S7 shown in fig. 1. Alternatively, the processor 30 may perform the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules shown in fig. 2, when executing the computer program 32.
The terminal device 3 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, a processor 30, a memory 31. It will be appreciated by those skilled in the art that fig. 3 is merely an example of the terminal device 3 and does not constitute a limitation of the terminal device 3, and may comprise more or less components than shown, or may combine certain components, or different components, e.g. the terminal device may further comprise an input transmitting device, a network access device, a bus, etc.
The processor 30 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 31 may in some embodiments be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may be an external storage device of the terminal device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the terminal device 3. The memory 31 is used for storing an operating system, application programs, boot loader (BootLoader), data, other programs etc., such as program codes of the computer program etc. The memory 31 may also be used for temporarily storing data that has been transmitted or is to be transmitted.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The embodiment of the application also provides a terminal device, which comprises at least one memory, at least one processor and a computer program stored in the at least one memory and capable of running on the at least one processor, wherein the processor executes the computer program to enable the terminal device to realize the steps in any of the method embodiments.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements steps for implementing the various method embodiments described above.
Embodiments of the present application provide a computer program product enabling a terminal device to carry out the steps of the method embodiments described above when the computer program product is run on the terminal device.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not detailed or described in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A distance sensing control method for a vehicle tail lamp, characterized by comprising the following steps:
s1, detecting a reversing instruction and turning on the tail lamps;
s2, acquiring the detection value of the brightness sensor and judging whether it reaches a set threshold; if the detection value reaches the set threshold, then:
s3, lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration;
s4, controlling the camera to acquire one frame of image of the vehicle's rear view while each controllable light-emitting unit is held on;
s5, lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration;
s6, controlling the camera to acquire one frame of image of the vehicle's rear view while each group of controllable light-emitting units is held on;
s7, executing S3-S6 in a loop, outputting the shortest distance between the vehicle and an object from the acquired images in each cycle, and outputting prompt information according to the output shortest distance.
2. The distance sensing control method for a vehicle tail lamp according to claim 1, wherein lighting each controllable light-emitting unit of the left and right tail lamps in turn and holding each on for a set duration comprises:
following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the first-side tail lamp and holding it on for a set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the first side and holding it on for a set duration;
repeating the previous step until all controllable light-emitting units on the first side have been lit;
following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the second-side tail lamp and holding it on for a set duration;
turning off the previous controllable light-emitting unit, then lighting the next controllable light-emitting unit on the second side and holding it on for a set duration;
and repeating the previous step until all controllable light-emitting units on the second side have been lit.
3. The distance sensing control method for a vehicle tail lamp according to claim 1, wherein lighting a group of controllable light-emitting units of the left and right tail lamps simultaneously and holding them on for a set duration comprises:
s51, following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the first-side tail lamp;
s52, following the arrangement order of the controllable light-emitting units on the second side, lighting one controllable light-emitting unit of the second-side tail lamp, holding it on for a set duration, then switching to light the next controllable light-emitting unit on the second side;
s53, repeating the previous step until all controllable light-emitting units on the second side have been lit;
s54, switching to the next controllable light-emitting unit on the first side;
s55, repeating steps S51-S54 until all controllable light-emitting units on the first side have been lit;
s56, following the arrangement order of the controllable light-emitting units, lighting the first controllable light-emitting unit of the second-side tail lamp;
s57, following the arrangement order of the controllable light-emitting units on the first side, lighting one controllable light-emitting unit of the first-side tail lamp, holding it on for a set duration, then switching to light the next controllable light-emitting unit on the first side;
s58, repeating the previous step until all controllable light-emitting units on the first side have been lit;
s59, switching to the next controllable light-emitting unit on the second side;
s510, repeating steps S56-S59 until all controllable light-emitting units on the second side have been lit.
4. The distance sensing control method for a vehicle tail lamp according to claim 1, wherein outputting the shortest distance between the vehicle and an object according to the acquired images comprises:
for the images acquired in step S4, determining a first distance value from each frame of image;
for the images acquired in step S6, determining a second distance value from each frame of image;
for each cycle, outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the obtained second distances.
5. The distance sensing control method for a vehicle tail lamp according to claim 4, wherein determining a first distance value from each frame of image comprises:
detecting, in each frame of image, the irradiation point of the controllable light-emitting unit corresponding to that frame;
determining the projection point of the corresponding controllable light-emitting unit on the image;
determining the deviation between the irradiation point and the projection point;
and calculating the first distance value from the arrangement angle of the controllable light-emitting unit and the obtained deviation.
6. The distance sensing control method for a vehicle tail lamp according to claim 5, wherein detecting, in each frame of image, the irradiation point of the corresponding controllable light-emitting unit comprises:
selecting one primary color according to the emission color of the controllable light-emitting unit;
extracting the primary-color component of each pixel of the image to obtain a primary-color map;
setting a distinguishing color value, and increasing the pixel contrast of the primary-color map according to the distinguishing color value;
setting a target color value and a tolerance, and determining the largest connected region meeting the conditions according to the target color value and the tolerance;
and taking the geometric center of the connected region as the irradiation point.
7. The distance sensing control method for a vehicle tail lamp according to claim 4, wherein determining a second distance value from each frame of image comprises:
detecting, in each frame of image, the irradiation points of the controllable light-emitting units corresponding to that frame, there being two irradiation points per frame;
determining the projection points of the corresponding controllable light-emitting units on the image, there being two projection points per frame;
determining the deviation between each irradiation point and its corresponding projection point;
and calculating a distance value for each controllable light-emitting unit from the arrangement angle of the controllable light-emitting units and the obtained deviation, and taking the smaller of the two distance values as the second distance value.
8. The distance sensing control method for a vehicle tail lamp according to claim 7, wherein outputting the shortest distance between the vehicle and the object from all the obtained first distances and all the obtained second distances comprises:
determining the minimum of all the obtained first distances and all the obtained second distances, outputting this minimum as the shortest distance between the vehicle and the object, and outputting the projection area corresponding to the minimum distance.
9. The distance sensing control method for a vehicle tail lamp according to claim 8, further comprising:
while S3-S6 are executed in a loop, calculating a corresponding distance each time an image is acquired, and comparing the calculated distance with the shortest vehicle-to-object distance output by the previous cycle;
and if the calculated distance is smaller than the shortest distance output by the previous cycle, outputting the calculated distance as the shortest distance until the current cycle completes.
10. A distance sensing control device for a vehicle tail lamp, characterized by comprising:
a reversing detection module, configured to detect a reversing instruction and turn on the tail lamps;
a brightness detection module, configured to obtain the reading of a brightness sensor and judge whether the reading reaches a set threshold, and if it does:
a first lighting module, configured to light each controllable light-emitting unit of the left tail lamp and the right tail lamp in turn, each for a set duration;
a first acquisition module, configured to control the camera to acquire one frame of the rear view of the vehicle while each controllable light-emitting unit is kept on;
a second lighting module, configured to simultaneously light a group of controllable light-emitting units of the left tail lamp and the right tail lamp for a set duration;
a second acquisition module, configured to control the camera to acquire one frame of the rear view of the vehicle while each group of controllable light-emitting units is kept on;
and a cyclic output module, configured to cyclically execute S3-S6, output the shortest distance between the vehicle and an object from the images acquired in each cycle, and output prompt information according to the output shortest distance.
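For orientation only, one way the modules recited in claim 10 might be organized in software is sketched below; every hardware-facing name (the lamp driver, brightness sensor, camera, and their methods) is a placeholder assumption and not part of the patent:

```python
import time

class TailLampDistanceController:
    """Illustrative arrangement of the modules recited in claim 10."""

    def __init__(self, lamp, brightness_sensor, camera, threshold, hold_s):
        self.lamp = lamp                  # placeholder driver for both tail lamps
        self.sensor = brightness_sensor   # placeholder ambient-brightness sensor
        self.camera = camera              # placeholder rear-view camera
        self.threshold = threshold
        self.hold_s = hold_s

    def reversing_detected(self, gear):                  # reversing detection module
        if gear == "R":
            self.lamp.turn_on()
            return True
        return False

    def brightness_reached(self):                        # brightness detection module
        return self.sensor.read() >= self.threshold

    def light_units_individually(self):                  # first lighting + first acquisition
        frames = []
        for unit in self.lamp.left_units + self.lamp.right_units:
            unit.light()
            time.sleep(self.hold_s)                      # keep the unit on for the set duration
            frames.append(self.camera.capture_rear_frame())
            unit.off()
        return frames

    def light_unit_groups(self):                         # second lighting + second acquisition
        frames = []
        for group in self.lamp.groups:                   # one group spans both tail lamps
            group.light()
            time.sleep(self.hold_s)
            frames.append(self.camera.capture_rear_frame())
            group.off()
        return frames

    def run(self, gear_source, compute_distances, warn): # cyclic output module (S3-S6 loop)
        if not self.reversing_detected(gear_source()) or not self.brightness_reached():
            return
        shortest = None
        while gear_source() == "R":
            frames = self.light_units_individually() + self.light_unit_groups()
            cycle_min = min(compute_distances(frames))
            if shortest is None or cycle_min < shortest:
                shortest = cycle_min
            warn(shortest)                               # prompt information to the driver
```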
CN202311168947.9A 2023-09-12 2023-09-12 Distance induction control method and device for automobile tail lamp Active CN116908828B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311168947.9A CN116908828B (en) 2023-09-12 2023-09-12 Distance induction control method and device for automobile tail lamp

Publications (2)

Publication Number Publication Date
CN116908828A true CN116908828A (en) 2023-10-20
CN116908828B CN116908828B (en) 2023-12-19

Family

ID=88355022

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311168947.9A Active CN116908828B (en) 2023-09-12 2023-09-12 Distance induction control method and device for automobile tail lamp

Country Status (1)

Country Link
CN (1) CN116908828B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006172210A (en) * 2004-12-16 2006-06-29 Matsushita Electric Works Ltd Distance image sensor for vehicle, and obstacle monitoring device using the same
CN104802710A (en) * 2015-04-17 2015-07-29 浙江大学 Intelligent automobile parking assisting system and assisting method
CN106828304A (en) * 2015-12-07 2017-06-13 财团法人金属工业研究发展中心 Car backing warning method using structure light sensing obstacle
CN107992810A (en) * 2017-11-24 2018-05-04 智车优行科技(北京)有限公司 Vehicle identification method and device, electronic equipment, computer program and storage medium
EP3531393A1 (en) * 2018-02-27 2019-08-28 odelo GmbH Method and vehicle lamp for recording when the distance between successive vehicles falls below a safety distance
CN110364024A (en) * 2019-06-10 2019-10-22 深圳市锐明技术股份有限公司 Environment control method, device and the car-mounted terminal of driving vehicle
CN113022434A (en) * 2021-05-13 2021-06-25 周宇 Automatic door opening anti-collision device and method for vehicle
CN115214457A (en) * 2022-04-24 2022-10-21 广州汽车集团股份有限公司 Vehicle light control method, vehicle light control device, vehicle, and storage medium
CN115257527A (en) * 2022-06-27 2022-11-01 智己汽车科技有限公司 Tail lamp display control method and device and vehicle
CN115534801A (en) * 2022-08-29 2022-12-30 深圳市欧冶半导体有限公司 Vehicle lamp self-adaptive dimming method and device, intelligent terminal and storage medium
US20230099674A1 (en) * 2021-09-29 2023-03-30 Subaru Corporation Vehicle backup warning systems
CN116461259A (en) * 2022-01-19 2023-07-21 福特全球技术公司 Auxiliary vehicle operation with improved object detection

Also Published As

Publication number Publication date
CN116908828B (en) 2023-12-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant