CN217363146U - Object recognition system and device - Google Patents


Info

Publication number
CN217363146U
CN217363146U (Application CN202123109287.6U)
Authority
CN
China
Prior art keywords
camera
control unit
laser radar
recognition system
object recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202123109287.6U
Other languages
Chinese (zh)
Inventor
疏达
艾伯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Benewake Beijing Co Ltd
Original Assignee
Benewake Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Benewake Beijing Co Ltd filed Critical Benewake Beijing Co Ltd
Priority to CN202123109287.6U priority Critical patent/CN217363146U/en
Application granted granted Critical
Publication of CN217363146U publication Critical patent/CN217363146U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The utility model relates to an object recognition system and device. The object recognition system comprises a laser radar, a camera and a control unit; the laser radar and the camera are arranged in sequence along the conveying direction of a conveying device, and the control unit is in communication connection with the laser radar and with the camera. The laser radar is used for detecting an object conveyed by the conveying device and feeding detection data back to the control unit; the camera is used for shooting the object and feeding the captured video frames back to the control unit; and the control unit is used for triggering the camera to shoot the object when it determines from the detection data that the laser radar has detected the object, and for identifying the object from the video frames. With this technical scheme, the classification information and size information of the object are identified, while the loss of power consumption resources and the extra hard disk space caused by continuous camera operation and the processing of large amounts of visual data are avoided.

Description

Object recognition system and device
Technical Field
The present disclosure relates to the field of object recognition technologies, and in particular, to an object recognition system and an object recognition device.
Background
At present, an image pickup apparatus such as a video camera can recognize an object by acquiring image information of the object, but keeping the camera running continuously and processing a large amount of visual data wastes power consumption resources and requires considerable hard disk space.
SUMMARY OF THE UTILITY MODEL
In order to solve the above technical problems, or at least partially solve them, the present disclosure provides an object recognition system, method, apparatus, electronic device, and medium that identify the classification information and size information of an object while avoiding the loss of power consumption resources and the extra hard disk space that continuous camera operation and processing of large amounts of visual data would cause.
In a first aspect, the present disclosure provides an object recognition system, including a laser radar, a camera and a control unit, where the laser radar and the camera are sequentially arranged along a transmission direction of a transmission device, and the control unit is respectively in communication connection with the laser radar and the camera;
the laser radar is used for detecting the object transmitted by the transmission equipment and feeding back detection data to the control unit;
the camera is used for shooting the object and feeding back a shot video frame to the control unit;
and the control unit is used for triggering the camera to shoot the object when it is determined from the detection data that the laser radar has detected the object, and for identifying the object according to the video frame.
Optionally, the conveying apparatus comprises a conveyor belt; the laser radar is fixed directly above the conveyor belt with its detection direction vertically downward; and the camera is fixed obliquely above the conveyor belt.
Optionally, the object recognition system further includes a display screen, the control unit is in communication connection with the display screen, and the control unit is further configured to send the recognized object information to the display screen for display.
Optionally, the object recognition system further includes a light supplement lamp, and the light supplement lamp is used for providing a lighting function for the camera.
Optionally, the control unit is electrically connected to the light supplement lamp, and the control unit is further configured to control the light supplement lamp to be turned on when the camera is triggered to shoot.
Optionally, the control unit is specifically configured to determine, according to the detection data, that the laser radar has detected the object when the variation of the distance detected by the laser radar is greater than a set threshold.
In a second aspect, an embodiment of the present disclosure further provides an identification apparatus, which is applied to the object identification system in the first aspect, where the apparatus includes:
the object judgment module is used for judging whether the laser radar detects the object transmitted by the transmission equipment or not according to the detection data fed back by the laser radar;
the camera triggering module is used for triggering a camera to shoot the object when the laser radar detects the object;
and the object identification module is used for identifying the object according to the video frame fed back by the camera.
The object recognition system provided by the embodiment of the present disclosure comprises a laser radar, a camera and a control unit, wherein the laser radar and the camera are arranged in sequence along the conveying direction of the conveying equipment, and the control unit is in communication connection with the laser radar and with the camera; the laser radar is used for detecting the object conveyed by the conveying equipment and feeding detection data back to the control unit; the camera is used for shooting the object and feeding the captured video frames back to the control unit; and the control unit is used for triggering the camera to shoot the object when it determines from the detection data that the laser radar has detected the object, and for identifying the object according to the video frames. In this way, the camera is triggered only when the laser radar detects an object on the conveyor belt: the object is photographed, the captured video frame data is transmitted to the control unit, and the classification information and size information of the object are identified. This avoids the loss of power consumption resources and the extra hard disk space that continuous camera operation and processing of large amounts of visual data would cause.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
In order to describe the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below; it will be apparent to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of an object identification system according to an embodiment of the present disclosure.
Fig. 2 is a schematic structural diagram of another object identification system according to an embodiment of the present disclosure.
Fig. 3 is a schematic flow chart of an object identification method according to an embodiment of the present disclosure.
Fig. 4 is a schematic flowchart of an object identification process according to an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of an object identification device according to an embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
Fig. 1 is a schematic structural diagram of an object identification system according to an embodiment of the present disclosure. As shown in fig. 1, the object recognition system includes a laser radar 10, a camera 11 and a control unit 12. The laser radar 10 and the camera 11 are arranged in sequence along the conveying direction of a conveying device 13, that is, along the XX' direction in fig. 1, and the control unit 12 is in communication connection with the laser radar 10 and with the camera 11. The laser radar 10 is used for detecting an object 14 conveyed by the conveying device 13 and feeding the detection data back to the control unit 12; the camera 11 is used for shooting the object 14 and feeding the captured video frames back to the control unit 12; and the control unit 12 is configured to trigger the camera 11 to shoot the object 14 when it determines from the detection data that the laser radar 10 has detected the object 14, and to identify the object 14 from the video frames.
Specifically, the control unit 12 determines whether the laser radar 10 detects the object 14 transmitted by the transmission device according to the detection data fed back by the laser radar 10, and when the control unit 12 determines that the laser radar 10 detects the object 14, the control unit triggers the camera 11 to shoot the object 14, and then identifies the object 14 according to the video frame fed back by the camera 11.
Specifically, as shown in fig. 1, the laser radar 10 and the camera 11 are arranged in order along the conveying direction of the conveying device 13, that is, along the XX' direction in fig. 1. The control unit 12 is in communication connection with the laser radar 10 and the camera 11; the control unit 12 may be a central control computer, the laser radar 10 may be a single-point laser radar, and the communication connection may be wired or wireless data transmission. When an object 14 is conveyed on the conveying device 13 in the conveying direction XX', the laser radar 10 detects the object 14 and transmits the detection data to the control unit 12; when the control unit 12 determines that an object 14 is passing on the conveying device 13, it triggers the camera 11 to shoot the object 14. In other words, the laser radar 10 continuously monitors the conveying device 13, and whenever an object 14 arrives it is detected by the laser radar 10 and the camera 11 is triggered. The camera 11 then transmits the captured video frame data to the control unit 12, and the control unit 12 identifies the object 14 from the video frame data; for example, the identification result may include the classification information and size information of the object 14.
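Illustratively, the trigger-and-identify flow described above can be outlined in the following Python sketch. It is a minimal illustration only: the read_lidar_distance, lidar_detects_object, capture_frame and identify helpers are hypothetical placeholders standing in for the drivers of the particular laser radar 10, camera 11 and recognition pipeline actually deployed, and the polling interval is an assumption.

```python
import time

def read_lidar_distance():
    """Hypothetical driver call returning the current lidar reading in cm."""
    raise NotImplementedError

def lidar_detects_object(distance_cm):
    """Hypothetical decision rule; the threshold-based version is sketched
    further below, in the discussion of the set threshold."""
    raise NotImplementedError

def capture_frame():
    """Hypothetical driver call returning one video frame from the camera."""
    raise NotImplementedError

def identify(frame):
    """Hypothetical recognition routine returning (classification, size)."""
    raise NotImplementedError

def control_loop():
    # The lidar monitors the belt continuously; the camera stays idle until an
    # object is detected, which is what saves power and hard disk space.
    while True:
        distance = read_lidar_distance()
        if lidar_detects_object(distance):
            frame = capture_frame()                  # camera is triggered only now
            classification, size = identify(frame)
            print(classification, size)
        time.sleep(0.01)                             # polling interval is an assumption
```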
In some embodiments, an ultrasonic sensor could be used to detect the object, but its performance is not ideal: it is easily disturbed by surrounding objects, which causes false triggering; its low frame rate makes it unsuitable for objects moving at high speed; its small detection range means it must be installed at a low height; and it has no IO interface. Moreover, because the ultrasonic sensor outputs its distance reading via pulse width modulation (PWM), an additional single-chip microcomputer is needed to read the distance information and generate a 3.3 V or 5 V voltage signal to trigger the camera, which complicates the triggering process.
In some embodiments, objects may be identified by continuously processing camera frames, but this requires more hard disk space and consumes more power.
The embodiments of the present disclosure solve the above technical problems by providing a laser radar. A laser radar usually has an IO interface, so the camera can be triggered by the laser radar very quickly, and the complexity is reduced by eliminating the need for an additional micro control unit. In addition, the laser radar has the advantages of low power consumption, small size, high precision, fast response, and easy installation and integration; integrating a laser radar into the object recognition system therefore automates the system and extends its service life. For these reasons, the laser radar is the best choice for the embodiments of the present disclosure.
Thus, the embodiment of the present disclosure provides an object recognition system with a laser radar, a camera and a control unit, in which the laser radar and the camera are arranged in sequence along the conveying direction of the conveying device. To use power consumption resources efficiently, the camera is triggered only when the laser radar detects an object on the conveyor belt: the object is photographed, the captured video frame data is transmitted to the control unit, and the classification information and size information of the object are identified. This avoids the loss of power consumption resources and the extra hard disk space that continuous camera operation and processing of large amounts of visual data would cause.
Optionally, as shown in fig. 1, the conveying device 13 comprises a conveyor belt; the laser radar 10 is fixed directly above the conveyor belt with its detection direction vertically downward; and the camera 11 is fixed obliquely above the conveyor belt.
In particular, a mounting bracket may be provided across the conveyor belt, and both the laser radar 10 and the camera 11 are fixed on it. When the object 14 passes below the laser radar 10, whose detection direction is vertically downward, the object 14 is detected, the detection data is transmitted to the control unit 12, and the camera 11 obliquely above the conveyor belt is then triggered to shoot the object 14. In addition, because the laser radar 10 is fixed directly above the conveyor belt and points vertically downward, the height of the object 14 can be determined from the detection data, which is not available from the visual data alone given how the camera is mounted.
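Illustratively, because the laser radar 10 points straight down at the belt, the height of the object 14 follows from a single subtraction. A minimal sketch, assuming the mounting height of the laser radar above the empty belt is known (the 120 cm value is purely illustrative):

```python
LIDAR_MOUNT_HEIGHT_CM = 120.0   # assumed lidar-to-empty-belt distance, calibrated on site

def object_height_cm(measured_distance_cm: float) -> float:
    """Object height = mounting height minus the distance the lidar reports
    while the object passes underneath it."""
    return LIDAR_MOUNT_HEIGHT_CM - measured_distance_cm

print(object_height_cm(95.0))   # a reading of 95 cm implies an object about 25 cm tall
```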
Optionally, as shown in fig. 1, the control unit 12 is configured to determine, according to the detection data, that the laser radar 10 has detected the object 14 when the amount of change in the distance detected by the laser radar 10 is greater than a set threshold.
Specifically, the control unit 12 is configured to determine whether the amount of change in the distance detected by the laser radar 10 is greater than a set threshold according to the detection data, determine that the laser radar 10 detects the object 14 if the amount of change in the distance is greater than the set threshold, and determine that the laser radar 10 does not detect the object 14 if the amount of change in the distance is less than or equal to the set threshold.
Specifically, as shown in fig. 1, a change in the distance detected by the laser radar 10 may be caused by an object 14 on the conveyor belt, or it may simply be the typical measurement error of the laser radar 10 itself. The laser radar 10 may report small fluctuations of around 2 cm on its own, so to avoid false triggering of the camera 11 a threshold is set, for example at 2 cm; when the amount of change in the distance detected by the laser radar 10 exceeds this threshold, i.e. the distance changes by more than 2 cm, it is determined that the laser radar 10 has detected an object 14.
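Illustratively, this decision rule can be written as a small predicate that refines the lidar_detects_object placeholder used in the earlier control-loop sketch. The empty-belt baseline is an assumed, site-calibrated value; only the 2 cm threshold comes from the description above.

```python
BASELINE_CM = 120.0    # assumed distance to the empty belt, calibrated at installation
THRESHOLD_CM = 2.0     # set threshold; the lidar's own jitter is around 2 cm

def lidar_detects_object(measured_distance_cm: float) -> bool:
    """An object is deemed present when the reading deviates from the empty-belt
    baseline by more than the set threshold, filtering out the lidar's own noise."""
    return abs(BASELINE_CM - measured_distance_cm) > THRESHOLD_CM
```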
Illustratively, if the control unit 12 determines that the laser radar 10 detects the object 14, the camera 11 is further triggered to shoot, and the object 14 is identified according to the video frame fed back by the camera 11. The control unit 12 converts the video frames into images by using an image stream pipeline, then preprocesses the images by using open source computer vision, and finally recognizes the preprocessed images by using a deep learning network to determine the classification information and the size information of the object.
Fig. 2 is a schematic structural diagram of another object identification system according to an embodiment of the present disclosure. As shown in fig. 2, the object recognition system further includes a display screen 15, the control unit 12 is in communication connection with the display screen 15, and the control unit 12 is further configured to send the recognized object information to the display screen 15 for displaying.
Specifically, after the control unit 12 identifies the object 14, the control unit 12 stores the classification information and the size information in the database; and/or send the classification information and the size information to the display screen 15 for display.
Optionally, as shown in fig. 2, the object recognition system further includes a fill-in light 16, where the fill-in light 16 is used to provide an illumination function for the camera 11.
Specifically, two light supplement lamps 16 may be respectively disposed on two sides of the camera 11 for enhancing the illumination effect, so that the video captured by the camera 11 is clearer. Illustratively, the control unit 12 is electrically connected to the fill-in light 16, and when the camera 11 is triggered to shoot, the control unit 12 can control the fill-in light 16 to be turned on.
The embodiment of the disclosure also provides an object identification method, which can be realized based on the object identification system of the embodiment. Fig. 3 is a schematic flowchart of an object identification method according to an embodiment of the present disclosure, and as shown in fig. 3, the object identification method includes:
s301, judging whether the laser radar detects the object transmitted by the transmission equipment or not according to the detection data fed back by the laser radar.
Specifically, as shown in fig. 1 or fig. 2, the control unit 12 determines whether the amount of change in the distance detected by the laser radar 10 is greater than a set threshold value, based on the detection data; if the variation of the distance is larger than the set threshold, determining that the laser radar 10 detects the object 14; if the amount of change in the distance is less than or equal to the set threshold value, it is determined that the laser radar 10 does not detect the object 14.
And S302, when the laser radar is determined to detect the object, triggering the camera to shoot the object.
Specifically, as shown in fig. 1 or fig. 2, if it is determined that the laser radar 10 detects the object 14 based on the above steps, the camera 11 is further triggered to shoot the object 14.
And S303, identifying the object according to the video frame fed back by the camera.
Fig. 4 is a schematic flowchart of an object identification process provided in an embodiment of the present disclosure, and as shown in fig. 4, the object identification process includes: converting the video frames shot by the camera 11 into images by adopting an image stream pipeline 17; preprocessing the image using open source computer vision 18; the preprocessed image is identified using a deep learning network 19 to determine classification information and size information of the object.
Specifically, with reference to fig. 1 and 4, or fig. 2 and 4: the camera 11 shoots, i.e. captures, a video frame; the captured video frame is converted into an image by the image stream pipeline 17; the image is then read and resized, i.e. preprocessed, using the open-source computer vision library 18; the preprocessed image is passed to the deep learning network 19; and the deep learning network 19 detects the object 14 in the image to identify the classification information and size information of the object 14. In addition, deep learning has matured to the point that pre-trained models are available which can classify the object 14 and extract accurate information from the visual data.
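Illustratively, the frame-to-result pipeline can be sketched as follows, using OpenCV for the preprocessing step. The run_inference placeholder stands in for whatever pre-trained deep learning network 19 is deployed, and the 640×640 input size is an assumption rather than a value given by the disclosure.

```python
import cv2  # the open-source computer vision library used for preprocessing

def run_inference(image):
    """Placeholder for the deep learning network 19; assumed to return a list of
    (classification, bounding_box) detections from a pre-trained model."""
    raise NotImplementedError

def identify(frame):
    """`frame` is a video frame already converted to an image by the image
    stream pipeline 17, e.g. a BGR array."""
    image = cv2.resize(frame, (640, 640))   # preprocessing: adjust the image size
    detections = run_inference(image)       # classification of the detected object 14
    # Size information could then be derived from the bounding boxes together with
    # the lidar height measurement and a camera calibration (not detailed here).
    return detections
```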
For example, the classification information and size information of the object 14 may be stored in the database 20, sent to the display screen 15 for display, or both stored in the database and sent to the display screen 15 for display; this is not limited in this embodiment.
Thus, the embodiment of the present disclosure provides an object recognition system with a laser radar, a camera and a control unit, in which the laser radar and the camera are arranged in sequence along the conveying direction of the conveying device. To use power consumption resources efficiently, the camera is triggered only when the laser radar detects an object on the conveyor belt: the object is photographed, the captured video frame data is transmitted to the control unit, and the classification information and size information of the object are identified. This avoids the loss of power consumption resources and the extra hard disk space that continuous camera operation and processing of large amounts of visual data would cause.
The embodiment of the disclosure also provides an object identification device, which is applied to the object identification system of the embodiment. Fig. 5 is a schematic structural diagram of an object recognition device according to an embodiment of the present disclosure, and as shown in fig. 5, the object recognition device includes: an object determining module 501, configured to determine whether the laser radar detects an object transmitted by the transmitting device according to detection data fed back by the laser radar; the camera triggering module 502 is used for triggering the camera to shoot an object when the laser radar is determined to detect the object; and the object identification module 503 is configured to identify an object according to the video frame fed back by the camera.
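Illustratively, the three modules map onto a small class; the sketch below mirrors that decomposition, with hypothetical callables injected for the laser radar, camera and recognizer, none of which are specified by the disclosure.

```python
class ObjectRecognitionDevice:
    """Mirrors the object judgment, camera trigger and object identification modules."""

    def __init__(self, read_lidar, capture_frame, recognize,
                 baseline_cm=120.0, threshold_cm=2.0):
        self.read_lidar = read_lidar          # callable -> current distance in cm
        self.capture_frame = capture_frame    # callable -> one video frame
        self.recognize = recognize            # callable(frame) -> (classification, size)
        self.baseline_cm = baseline_cm        # assumed empty-belt distance
        self.threshold_cm = threshold_cm      # set threshold for the distance change

    def object_detected(self) -> bool:
        # Object judgment module 501: compare the distance change with the threshold.
        return abs(self.baseline_cm - self.read_lidar()) > self.threshold_cm

    def step(self):
        # Camera trigger module 502 and object identification module 503.
        if self.object_detected():
            frame = self.capture_frame()
            return self.recognize(frame)
        return None
```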
An embodiment of the present disclosure further provides an electronic device, including: a memory and one or more processors; wherein the memory is in communication with the one or more processors, and the memory stores instructions executable by the one or more processors, and when the instructions are executed by the one or more processors, the electronic device is configured to implement the object identification method described in any of the embodiments of the present disclosure.
FIG. 6 is a schematic block diagram of an electronic device suitable for implementing embodiments of the present disclosure. As shown in fig. 6, the electronic device 600 includes a Central Processing Unit (CPU) 601, which can execute the various processes of the foregoing embodiments according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. The RAM 603 also stores the various programs and data necessary for the operation of the electronic device 600. The CPU 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD) and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, according to embodiments of the present disclosure, the methods described above may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a computer-readable medium, the computer program containing program code for performing the aforementioned object recognition method. In such embodiments, the computer program may be downloaded and installed from a network through the communication section 609 and/or installed from the removable medium 611.
The disclosed embodiments also provide a computer-readable storage medium storing a program or instructions for causing a computer to execute an object recognition method, the method including:
judging whether the laser radar detects an object transmitted by the transmission equipment or not according to detection data fed back by the laser radar;
when the laser radar is determined to detect the object, triggering a camera to shoot the object;
and identifying the object according to the video frame fed back by the camera.
Optionally, the computer-executable instructions, when executed by a computer processor, may also be used to implement the object recognition method provided by any embodiment of the present disclosure.
From the above description of the embodiments, it will be clear to those skilled in the art that the present application can be implemented by software together with necessary general-purpose hardware, and of course also purely by hardware, although the former is preferable in many cases. Based on this understanding, the technical solutions of the present application may be embodied in the form of a software product stored in a computer-readable storage medium, such as a floppy disk, a read-only memory (ROM), a random access memory (RAM), a flash memory (FLASH), a hard disk, or an optical disk, and include several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods of the embodiments of the present disclosure.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present disclosure may be implemented by software or hardware. The units or modules described may also be provided in a processor, and the names of the units or modules do not in some cases constitute a limitation on the units or modules themselves.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (5)

1. An object recognition system is characterized by comprising a laser radar, a camera and a control unit, wherein the laser radar and the camera are sequentially arranged along the conveying direction of conveying equipment, and the control unit is respectively in communication connection with the laser radar and the camera;
the laser radar is used for detecting the object conveyed by the conveying equipment and feeding back detection data to the control unit;
the camera is used for shooting the object and feeding back a shot video frame to the control unit;
and the control unit is used for triggering the camera to shoot the object when it is determined from the detection data that the laser radar has detected the object, and for identifying the object according to the video frame.
2. The object recognition system of claim 1, wherein the conveying device comprises a conveyor belt; the laser radar is fixed directly above the conveyor belt with its detection direction vertically downward; and the camera is fixed obliquely above the conveyor belt.
3. The object recognition system of claim 1, further comprising a display screen, wherein the control unit is communicatively coupled to the display screen, and wherein the control unit is further configured to send the recognized object information to the display screen for display.
4. The object recognition system of claim 1, further comprising a fill light for providing illumination to the camera.
5. The object recognition system of claim 4, wherein the control unit is electrically connected to the light supplement lamp, and the control unit is further configured to control the light supplement lamp to be turned on when the camera is triggered to shoot.
CN202123109287.6U 2021-12-10 2021-12-10 Object recognition system and device Active CN217363146U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202123109287.6U CN217363146U (en) 2021-12-10 2021-12-10 Object recognition system and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202123109287.6U CN217363146U (en) 2021-12-10 2021-12-10 Object recognition system and device

Publications (1)

Publication Number Publication Date
CN217363146U (en) 2022-09-02

Family

ID=83040157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202123109287.6U Active CN217363146U (en) 2021-12-10 2021-12-10 Object recognition system and device

Country Status (1)

Country Link
CN (1) CN217363146U (en)

Similar Documents

Publication Publication Date Title
US7936926B2 (en) Apparatus, method, and program for face feature point detection
CN100579174C (en) Motion detection method and device
EP3486837A1 (en) Method of identifying vehicle, apparatus, inspection system and electronic device using the same, and storage of the same
US20090231440A1 (en) Brightness automatically adjusting system and method for adjusting brightness thereof
CN111539265B (en) Method for detecting abnormal behavior in elevator car
CN109508667B (en) Elevator video anti-pinch method and elevator video monitoring device
JPWO2013047088A1 (en) Biological recognition device
CN217363146U (en) Object recognition system and device
CN109697422B (en) Optical motion capture method and optical motion capture camera
CN113902740A (en) Construction method of image blurring degree evaluation model
EP3842996A1 (en) Method of and system for determining traffic signal state
CN114222040A (en) Object recognition system, method, device, electronic equipment and medium
CN112188109B (en) Camera debugging method and device and electronic equipment
US11812131B2 (en) Determination of appropriate image suitable for feature extraction of object from among captured images in which object is detected
CN110620879A (en) Dynamic light supplementing device and method for face recognition
JP2013171319A (en) Vehicle state detection device, vehicle behavior detection device and vehicle state detection method
CN115424324A (en) Street lamp energy-saving control method and device based on edge calculation and storage medium
CN116091749A (en) Infrared visual object recognition system
CN111626078A (en) Method and device for identifying lane line
CN111079682A (en) Cabin supervision method and system based on artificial intelligence
CN110177222A (en) A kind of the camera exposure parameter method of adjustment and device of the unused resource of combination vehicle device
KR20100053719A (en) Method of driving wiper using charge coupled device camera and wiper driving apparatus
Bialik et al. Fast-moving object counting with an event camera
US20230360405A1 (en) Information processing device, information processing system, and information processing method
CN114463771A (en) LED state code identification system and method based on color gradation distribution fitting

Legal Events

Date Code Title Description
GR01 Patent grant