WO2020071267A1 - Sensor module, electronic device, vision sensor calibration method, subject detection method, and program - Google Patents

Sensor module, electronic device, vision sensor calibration method, subject detection method, and program

Info

Publication number
WO2020071267A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
event signal
shutter
event
sensor array
Application number
PCT/JP2019/038115
Other languages
English (en)
Japanese (ja)
Inventor
宏昌 長沼
Original Assignee
株式会社ソニー・インタラクティブエンタテインメント
Application filed by 株式会社ソニー・インタラクティブエンタテインメント filed Critical 株式会社ソニー・インタラクティブエンタテインメント
Priority to US17/277,490 priority Critical patent/US11659286B2/en
Priority to CN201980063544.3A priority patent/CN112771841B/zh
Publication of WO2020071267A1 publication Critical patent/WO2020071267A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 1/00 Photometry, e.g. photographic exposure meter
    • G01J 1/02 Details
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 1/00 Photometry, e.g. photographic exposure meter
    • G01J 1/02 Details
    • G01J 1/04 Optical or mechanical part supplementary adjustable parts
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B 7/26 Power supplies; Circuitry or arrangement to switch on the power source; Circuitry to check the power source voltage
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/40 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N 25/44 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
    • H04N 25/443 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/68 Noise processing, e.g. detecting, correcting, reducing or removing noise applied to defects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/68 Noise processing, e.g. detecting, correcting, reducing or removing noise applied to defects
    • H04N 25/683 Noise processing, e.g. detecting, correcting, reducing or removing noise applied to defects by defect estimation performed on the scene signal, e.g. real time or on the fly detection

Definitions

  • The present invention relates to a sensor module, an electronic device, a method for calibrating a vision sensor, a method for detecting a subject, and a program.
  • There is known an event-driven vision sensor in which each pixel generates a signal asynchronously upon detecting a change in the intensity of incident light.
  • The event-driven vision sensor is advantageous in that it can operate at low power and at high speed compared to a frame-type vision sensor that scans all pixels at predetermined intervals, specifically an image sensor such as a CCD or CMOS. Techniques related to such event-driven vision sensors are described in Patent Literature 1 and Patent Literature 2, for example.
  • An object of the present invention is to provide a sensor module, an electronic device, a vision sensor calibration method, a subject detection method, and a program that improve convenience by providing a means for blocking and opening the angle of view in an event-driven vision sensor.
  • According to one aspect of the present invention, there are provided a sensor module including an event-driven vision sensor, which includes a sensor array composed of sensors that generate an event signal upon detecting a change in the intensity of incident light, and a shutter capable of blocking and opening the angle of view of the sensor array, as well as an electronic device including such a sensor module.
  • According to another aspect of the present invention, there is provided a method of calibrating an event-driven vision sensor including a sensor array composed of sensors that generate an event signal upon detecting a change in the intensity of incident light, the method comprising the steps of: driving a shutter to block the angle of view of the sensor array; and calibrating the vision sensor based on an event signal received while the shutter blocks the angle of view of the sensor array.
  • According to yet another aspect of the present invention, there is provided a method for detecting a subject using an event-driven vision sensor including a sensor array composed of sensors that generate an event signal upon detecting a change in the intensity of incident light, the method comprising the steps of: driving a shutter to repeatedly block and open the angle of view of the sensor array; and detecting the subject from event signals received between the opening and the blocking.
  • According to still another aspect of the present invention, there is provided a program for causing a processing circuit connected to an event-driven vision sensor, which includes a sensor array composed of sensors that generate an event signal upon detecting a change in the intensity of incident light, to execute the steps of: driving a shutter to block the angle of view of the sensor array; and calibrating the vision sensor based on an event signal received while the shutter blocks the angle of view of the sensor array.
  • According to a further aspect of the present invention, there is provided a program for causing a processing circuit connected to an event-driven vision sensor, which includes a sensor array composed of sensors that generate an event signal upon detecting a change in the intensity of incident light, to execute the steps of: driving a shutter to repeatedly block and open the angle of view of the sensor array; and detecting a subject from event signals received between the opening and the blocking.
  • With the above configurations, the shutter blocks or opens the angle of view of the sensor array, thereby improving convenience in an event-driven vision sensor.
  • FIG. 1 is a block diagram illustrating a schematic configuration of an electronic device including the sensor module according to the first embodiment of the present invention.
  • FIG. 2 is a sequence diagram illustrating a first example of the operation of the sensor module according to the first embodiment of the present invention.
  • FIG. 3 is a sequence diagram illustrating a second example of the operation of the sensor module according to the first embodiment of the present invention.
  • FIG. 4 is a sequence diagram illustrating a third example of the operation of the sensor module according to the first embodiment of the present invention.
  • FIG. 5 is a block diagram illustrating a schematic configuration of an electronic device including the sensor module according to the second embodiment of the present invention.
  • FIG. 6 is a sequence diagram illustrating a first example of the operation of the sensor module according to the second embodiment of the present invention.
  • FIG. 7 is a sequence diagram illustrating a second example of the operation of the sensor module according to the second embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating a configuration example of a processing circuit of a control unit when motion prediction is performed in the second embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating a schematic configuration of an electronic device including the sensor module according to the first embodiment of the present invention.
  • The electronic device 10 includes a sensor module 100 and a control unit 200.
  • The sensor module 100 includes an event-driven vision sensor 110, an actuator 120, and a shutter 130.
  • The vision sensor 110 includes a sensor array 111 composed of sensors 111A, 111B, ... corresponding to the pixels of an image, and a processing circuit 112 connected to the sensor array 111.
  • Each of the sensors 111A, 111B, ... includes a light-receiving element and generates an event signal upon detecting a change in the intensity of incident light, more specifically a change in luminance.
  • The event signal is output from the processing circuit 112 as information indicating, for example, a time stamp, sensor identification information (for example, the pixel position), and the polarity (increase or decrease) of the luminance change, as illustrated in the sketch below.
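As a rough illustration (not part of the patent text), an event signal of this kind could be represented as follows; all field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """One event signal: time stamp, sensor identification, and polarity."""
    timestamp_us: int  # time stamp in microseconds
    x: int             # pixel position of the sensor that fired
    y: int
    polarity: int      # +1 = luminance increase, -1 = luminance decrease
```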
  • When the subject moves within the angle of view of the sensor array 111, the intensity of the light reflected or scattered by the subject changes, so the movement of the subject can be detected in time series from the event signals generated by the sensors 111A, 111B, ... corresponding to, for example, the edges of the subject.
  • The event-driven vision sensor 110 is advantageous in that it can operate at low power and at high speed compared with a frame-type vision sensor. This is because, among the sensors 111A, 111B, ... constituting the sensor array 111, only those that detect a change in luminance generate event signals. A sensor that has not detected a change in luminance does not generate an event signal, so the processing circuit 112 can process and transmit only the event signals of the sensors that detected a change in luminance, at high speed. In addition, when there is no change in luminance, no processing or transmission occurs, so low-power operation becomes possible.
  • On the other hand, luminance does not change unless the subject moves, so a subject that does not move is difficult to capture from the event signals generated by the sensors 111A, 111B, .... That is, it is difficult to obtain information on the surrounding environment, including stationary subjects, with the vision sensor 110 alone.
  • Therefore, the sensor module 100 includes the actuator 120 connected to the vision sensor 110.
  • The actuator 120 is driven according to a control signal transmitted from the control unit 200 and is configured to displace the sensor array 111 in a direction perpendicular to the optical axis direction of the sensors 111A, 111B, ....
  • When the actuator 120 displaces the sensor array 111, the positional relationship between all the sensors 111A, 111B, ... and the subject changes. That is, at this time, the same change occurs as if all subjects had moved within the angle of view of the sensor array 111. Therefore, regardless of whether the subject is actually moving, the subject can be detected from the event signals generated by, for example, the sensors 111A, 111B, ... corresponding to the edges of the subject. Since the amount of displacement of the sensor array 111 required to produce this change is not large, the actuator 120 may be a vibrator that slightly displaces or vibrates the sensor array 111.
  • In the present embodiment, the direction in which the actuator 120 displaces the sensor array 111 is perpendicular to the optical axis direction of the sensors 111A, 111B, ..., but in other embodiments the displacement direction need not be perpendicular to the optical axis, and the actuator 120 may displace the sensor array 111 in an arbitrary direction.
  • Displacing the sensor array 111 perpendicular to the optical axis is advantageous in that the amount of displacement required to produce the above-described change is minimized and the positional relationship between the sensors 111A, 111B, ... and the subject is maintained.
  • In addition, the sensor module 100 includes the shutter 130.
  • The shutter 130 is arranged so as to be able to block and open the entire angle of view of the sensor array 111 of the vision sensor 110.
  • The shutter 130 may be a mechanical shutter such as a focal-plane shutter or a lens shutter, or an electronic shutter such as a liquid crystal shutter.
  • When the open shutter 130 is closed, the entire angle of view of the sensor array 111 is blocked, so the intensity of light incident on all the sensors 111A, 111B, ... becomes constant at a minimum.
  • When the closed shutter 130 is opened, the entire angle of view of the sensor array 111 is opened, and in principle a change occurs in which the luminance increases at all the sensors 111A, 111B, ... at once.
  • In the present embodiment, calibration of the sensor array 111 and detection of a self-luminous subject are performed using these operations, as described later.
  • The control unit 200 includes a communication interface 210, a processing circuit 220, and a memory 230.
  • The communication interface 210 receives event signals transmitted from the processing circuit 112 of the vision sensor 110 and outputs them to the processing circuit 220. It also transmits control signals generated by the processing circuit 220 to the actuator 120.
  • The processing circuit 220 operates, for example, according to a program stored in the memory 230 and processes the received event signals. For example, based on the event signals, the processing circuit 220 generates an image that maps the positions where luminance changes occurred in time series, stores it temporarily or continuously in the memory 230, or transmits it to another device via the communication interface 210. The processing circuit 220 also generates the control signals that drive the actuator 120 and the shutter 130, respectively.
  • FIG. 2 is a sequence diagram showing a first example of the operation of the sensor module according to the first embodiment of the present invention.
  • First, a control signal generated by the processing circuit 220 of the control unit 200 is transmitted to the actuator 120 (S101).
  • When the actuator 120 that has received the control signal is driven (S102), the sensor array 111 is displaced in a predetermined direction, and the event signals generated by the sensors 111A, 111B, ... corresponding to, for example, the edges of the subject are transmitted from the vision sensor 110 to the control unit 200 (S103).
  • The processing circuit 220 then detects the subject from the received event signals (S104). As described above, at this time the subject can be detected regardless of whether it is actually moving.
  • The processing circuit 220 may execute, as a series of steps, the transmission of the control signal to the actuator 120 (S101) through the reception of the event signals (S103) and the capture of environmental information based on them (S104). For example, the processing circuit 220 may treat event signals received within a predetermined time after the transmission of the control signal to the actuator 120 (S101) as event signals indicating environmental information, separately from event signals received at other times, as sketched below.
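A minimal sketch of this time-window separation, reusing the hypothetical `Event` record above; the window length is an arbitrary illustrative value:

```python
def split_environment_events(events, actuator_cmd_time_us, window_us=10_000):
    """Treat events arriving within `window_us` after the control signal to
    the actuator (S101) as environmental information (S104); events received
    at other times are handled separately."""
    environment, other = [], []
    for ev in events:
        if 0 <= ev.timestamp_us - actuator_cmd_time_us <= window_us:
            environment.append(ev)
        else:
            other.append(ev)
    return environment, other
```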
  • FIG. 3 is a sequence diagram showing a second example of the operation of the sensor module according to the first embodiment of the present invention.
  • First, a control signal generated by the processing circuit 220 of the control unit 200 is transmitted to the shutter 130 (S111).
  • By closing the shutter 130 that has received the control signal (S112), the entire angle of view of the sensor array 111 is blocked, and the intensity of light incident on all the sensors 111A, 111B, ... becomes constant at a minimum. Therefore, after an event signal indicating that the luminance has decreased due to the blocking of the light is transmitted from the vision sensor 110 to the control unit 200 (S113), in principle no further event signal should be received.
  • In practice, however, an event signal can be generated even while the shutter 130 blocks the angle of view of the sensor array 111, for example due to a defective pixel or noise. Therefore, in the control unit 200, the processing circuit 220 keeps the shutter 130 closed for a predetermined time and monitors event signals received while the shutter 130 blocks the angle of view of the sensor array 111. If an event signal is received during this time (S114), the processing circuit 220 executes calibration of the vision sensor 110 based on the received event signal (S115), as sketched below. Specifically, the processing circuit 220 identifies the sensor that generated the event signal as a defective pixel (bright spot), or adjusts the threshold of luminance change at which that sensor generates an event signal.
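The calibration step could look like the following sketch, under the assumption of a hypothetical per-pixel threshold interface; a real implementation would use the sensor's own register or configuration API:

```python
def calibrate_while_shutter_closed(blackout_events, raise_threshold):
    """S114-S115: any sensor that fires while the shutter blocks the angle of
    view is flagged as a defective (bright-spot) pixel, and its luminance-change
    threshold is raised so that it stops firing spuriously."""
    defective = set()
    for ev in blackout_events:
        pos = (ev.x, ev.y)
        defective.add(pos)
        raise_threshold(pos)  # hypothetical per-pixel threshold adjustment
    return defective
```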
  • FIG. 4 is a sequence diagram showing a third example of the operation of the sensor module according to the first embodiment of the present invention.
  • First, with the shutter 130 closed, a control signal generated by the processing circuit 220 of the control unit 200 is transmitted to the shutter 130 (S121).
  • When the shutter 130 that has received the control signal is opened (S122), the entire angle of view of the sensor array 111 is opened, and in principle event signals indicating that the luminance has increased at all the sensors 111A, 111B, ... are transmitted from the vision sensor 110 to the control unit 200 (S123).
  • After that, a control signal generated by the processing circuit 220 of the control unit 200 is transmitted to the shutter 130 again (S125), and when the shutter 130 is closed (S126) and the entire angle of view of the sensor array 111 is blocked, event signals indicating that the luminance has decreased at all the sensors 111A, 111B, ... are similarly transmitted from the vision sensor 110 to the control unit 200 (S127).
  • By repeating the above steps, the control unit 200 transmits to the shutter 130 control signals that repeatedly block and open the angle of view of the sensor array 111, and receives the event signals generated in the meantime.
  • Here, by setting the time t1, that is, the cycle at which the blocking and opening of the angle of view is repeated, to be longer than the blinking cycle of the light source included in a self-luminous subject (while keeping the time short, as described above), the self-luminous subject can be identified based on the received event signals (S128), as sketched below.
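A sketch of the identification in S128, assuming the scene is otherwise static and that t1 exceeds the light source's blinking cycle: pixels that fire during most open intervals are attributed to a self-luminous subject (the ratio is an arbitrary illustrative value):

```python
from collections import Counter

def find_self_luminous_pixels(open_interval_events, min_ratio=0.8):
    """`open_interval_events` holds one list of events per open interval.
    Pixels firing in at least `min_ratio` of the intervals are treated as
    belonging to a blinking light source such as a lamp or display."""
    if not open_interval_events:
        return set()
    num_cycles = len(open_interval_events)
    hits = Counter()
    for events in open_interval_events:
        for pos in {(ev.x, ev.y) for ev in events}:  # count each pixel once per interval
            hits[pos] += 1
    return {pos for pos, n in hits.items() if n / num_cycles >= min_ratio}
```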
  • In the first embodiment of the present invention described above, events are forcibly generated in the vision sensor 110 by the actuator 120 displacing the sensor array 111, so that information on the surrounding environment, including, for example, stationary subjects, can be obtained.
  • In addition, the sensor array 111 can be calibrated by having the shutter 130 block its entire angle of view.
  • Furthermore, by repeatedly opening and closing the shutter 130 at a predetermined cycle, a self-luminous subject such as a light or a display can be detected.
  • In the above example, the sensor module 100 includes both the actuator 120 and the shutter 130. However, since their functions are independent of each other, only one of the actuator 120 and the shutter 130 may be included in the sensor module 100. Further, in the above example, the control unit 200 is illustrated and described separately from the sensor module 100, but the control unit 200 may be included in the sensor module 100. In this case, the processing circuit 112 of the sensor module 100 and the processing circuit 220 of the control unit 200 may be configured separately or may be shared.
  • FIG. 5 is a block diagram illustrating a schematic configuration of an electronic device including the sensor module according to the second embodiment of the present invention.
  • The electronic device 20 includes a sensor module 300, a control unit 200, and a movable support mechanism 400.
  • The sensor module 300 includes the event-driven vision sensor 110 and the shutter 130, similar to those of the first embodiment.
  • The sensor module 300 is supported by the movable support mechanism 400, which includes frames 410A, 410B, and 410C and actuators 420A and 420B.
  • The actuators 420A and 420B are rotary actuators driven according to control signals transmitted from the control unit 200.
  • The actuator 420A causes a rotational displacement of a predetermined angle between the frames 410A and 410B according to the control signal, and the actuator 420B similarly causes a rotational displacement of a predetermined angle between the frames 410B and 410C. In this way, the actuators 420A and 420B apply displacement to the sensor module 300 including the vision sensor 110.
  • Thereby, events are forcibly generated in the vision sensor 110, and information on the surrounding environment including, for example, stationary subjects can be obtained, similarly to the first embodiment.
  • In this case, the actuator 420B may be understood to be included in the sensor module 300.
  • Further, in the present embodiment, the control unit 200 can reflect a correction value in the control signals of the actuators 420A and 420B based on the event signals generated by the vision sensor 110 when the actuators 420A and 420B apply displacement to the sensor module 300, as described below.
  • FIG. 6 is a sequence diagram showing a first example of the operation of the sensor module according to the second embodiment of the present invention.
  • First, a control signal generated by the processing circuit 220 of the control unit 200 is transmitted to one or both of the actuators 420A and 420B (S131).
  • When the actuators 420A and 420B are driven according to the control signal (S132), displacement occurs in the sensor module 300, and the positional relationship between the sensors 111A, 111B, ... and the subject changes.
  • At this time, the event signals generated by the sensors 111A, 111B, ... are transmitted from the vision sensor 110 to the control unit 200 (S133).
  • The processing circuit 220 measures the delay time d1 from the transmission of the control signal to the actuators 420A and 420B (S131) to the reception of the event signal (S133), and calibrates the actuators 420A and 420B based on the delay time d1 (S134). Specifically, the processing circuit 220 determines a correction value for the control signal according to the delay time d1, and the determined correction value is reflected in control signals generated by the processing circuit thereafter, as sketched below.
  • By transmitting a control signal to only one of the actuators 420A and 420B, each actuator can be calibrated independently. Further, by transmitting control signals to both of the actuators 420A and 420B, the composite system including the actuators 420A and 420B can be calibrated.
  • The correction value of the control signal determined according to the delay time d1 is used, for example, when the control unit 200 corrects the parameters of the PID control executed to make the actuators 420A and 420B realize a displacement following a specific pattern.
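A sketch of the delay measurement in S131-S134; `send_control_signal` and `wait_for_first_event` are hypothetical callables standing in for the actuator and vision-sensor interfaces:

```python
import time

def measure_actuator_delay(send_control_signal, wait_for_first_event):
    """Measure the delay d1 from the control signal (S131) to the first
    resulting event signal (S133); d1 then yields a correction value (S134),
    e.g. a time offset applied to future control signals or PID parameters."""
    t_sent = time.monotonic()
    send_control_signal()    # S131
    wait_for_first_event()   # blocks until the first event arrives (S133)
    d1 = time.monotonic() - t_sent
    return d1
```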
  • FIG. 7 is a sequence diagram showing a second example of the operation of the sensor module according to the second embodiment of the present invention.
  • As in the first example, a control signal is transmitted (S131), and the actuators 420A and 420B that have received the control signal are driven (S132), whereby rotational displacement occurs in the vision sensor 110.
  • When the actuators 420A and 420B are worn, the rotational displacement of the vision sensor 110 does not stabilize instantaneously; for example, vibration occurs.
  • In this case, the processing circuit 220 measures the delay times d1 and d2 from the transmission of the control signal to the actuators 420A and 420B (S131) to the reception of event signals at a plurality of timings (S133-1 and S133-2). As a result, the processing circuit 220 measures the elapsed time d2-d1 from the start of reception of the event signals (S133-1) to the end of reception (S133-2).
  • The processing circuit 220 determines a correction value according to the elapsed time d2-d1, and the determined correction value is reflected in control signals generated by the processing circuit thereafter, as sketched below. Specifically, when the elapsed time d2-d1 exceeds a threshold value, the processing circuit 220 sets a flag indicating that wear has occurred in the actuators 420A and 420B. In this case, the processing circuit 220 may set, for the worn actuator, a value such as an operating torque different from that of the other actuators.
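The wear check reduces to comparing the duration of the event burst against a threshold, as in this sketch (the threshold is an arbitrary illustrative value):

```python
def detect_actuator_wear(d1, d2, settle_threshold_s=0.05):
    """d1 and d2 are the delays to the first (S133-1) and last (S133-2) event.
    If the burst lasts longer than the threshold, the displacement did not
    stabilize instantaneously, so the actuator is flagged as worn."""
    elapsed = d2 - d1
    return elapsed > settle_threshold_s, elapsed
```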
  • FIG. 8 is a block diagram illustrating a configuration example of a processing circuit of a control unit when performing motion prediction in the second embodiment of the present invention.
  • The processing circuit 220 of the control unit 200 includes, as functions implemented by operating according to a program stored in the memory 230, for example, a drive pattern generation unit 221, a control signal generation unit 222, an event signal analysis unit 223, an error calculation unit 224, and a motion prediction unit 225.
  • The drive pattern generation unit 221 generates a drive pattern for the actuators 420A and 420B.
  • The drive pattern may be, for example, a pattern defined in advance by a program stored in the memory 230, or may be determined based on the measurement values of another sensor, such as an acceleration sensor, included in the electronic device 20.
  • The control signal generation unit 222 generates control signals for the actuators 420A and 420B according to the drive pattern generated by the drive pattern generation unit 221.
  • When the actuators 420A and 420B are driven according to the control signals and displacement occurs in the sensor module 300, event signals are transmitted from the vision sensor 110 to the control unit 200.
  • The event signal analysis unit 223 back-calculates the displacement of the sensor module 300 from the received event signals. Specifically, for example, the event signal analysis unit 223 back-calculates the motion vector of the vision sensor 110 from the motion vector of the subject obtained by analyzing the event signals. The event signal analysis unit 223 provides information including the back-calculated displacement of the sensor module 300 to the error calculation unit 224.
  • The error calculation unit 224 calculates the error characteristics of the actuators 420A and 420B from the difference between the back-calculated displacement of the sensor module 300 and the drive pattern generated by the drive pattern generation unit 221, taking into account, for example, the operation delay time d1 of the actuators 420A and 420B identified in the example described above with reference to FIG. 6.
  • The error characteristics may be normalized for each type of motion of the actuators 420A and 420B (specifically, translation and rotation in each axial direction) and stored in the memory 230.
  • On the other hand, the control signal generation unit 222 inputs the generated control signal to the motion prediction unit 225 before outputting it.
  • The motion prediction unit 225 predicts the motion of the actuators 420A and 420B for the input control signal, based on the error characteristics of the actuators 420A and 420B calculated by the error calculation unit 224.
  • The control signal generation unit 222 corrects the control signal so that the difference between the motion predicted by the motion prediction unit 225 and the drive pattern generated by the drive pattern generation unit 221 is reduced.
  • The control signal generation unit 222 may input the corrected control signal to the motion prediction unit 225 again, have the motion prediction unit 225 re-predict the motion of the actuators 420A and 420B for the corrected control signal based on the error characteristics, and re-correct the control signal so that the difference between the re-predicted motion and the drive pattern is reduced, as sketched below.
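A schematic version of this predict-and-correct loop, assuming one-dimensional displacement values and treating the motion prediction unit as a given callable (all names are illustrative):

```python
def correct_control_signal(signal, drive_pattern, predict_motion,
                           max_iters=3, tolerance=1e-3):
    """Iteratively adjust the control signal until the motion predicted from
    the error characteristics is close enough to the intended drive pattern."""
    for _ in range(max_iters):
        predicted = predict_motion(signal)   # motion prediction unit 225
        error = drive_pattern - predicted    # residual deviation from intent
        if abs(error) < tolerance:
            break
        signal += error                      # control signal generation unit 222 re-corrects
    return signal
```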
  • In the second embodiment of the present invention described above, the processing circuit 220 of the control unit 200 measures the time from the transmission of a control signal to the actuators 420A and 420B until the reception of the resulting event signals, whereby the delay amounts of the actuators 420A and 420B can be calibrated, and vibration due to wear of internal parts of the actuators 420A and 420B can be detected.
  • In addition, by implementing the functions of the error calculation unit 224 and the motion prediction unit 225, the processing circuit 220 corrects the control signal in consideration of the errors occurring in the movement of the actuators 420A and 420B, so that the actuators 420A and 420B can be operated more accurately with respect to the intended drive pattern.
  • In the above description, the calibration of the delay amounts of the actuators 420A and 420B, the detection of vibration, and the correction of the control signal have been described within the same embodiment, but these operations can be executed independently of each other; some of them may be implemented in the electronic device 20 or the sensor module 300 while others are not. Further, in the above example, it has been described that events can be forcibly generated in the vision sensor 110 as in the first embodiment, but this function is not essential. The shutter 130 is likewise not essential, and the sensor module 300 may not include the shutter 130 in the present embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Studio Devices (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

The invention relates to a sensor module comprising: an event-driven vision sensor that includes a sensor array composed of sensors that generate an event signal when a change in the intensity of incident light is detected; and a shutter that can block or open the angle of view of the sensor array. The invention also relates to an electronic device equipped with said sensor module.
PCT/JP2019/038115 2018-10-04 2019-09-27 Sensor module, electronic device, vision sensor calibration method, object detection method, and program WO2020071267A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/277,490 US11659286B2 (en) 2018-10-04 2019-09-27 Sensor module, electronic device, vision sensor calibration method, subject detection method, and program
CN201980063544.3A CN112771841B (zh) 2018-10-04 2019-09-27 传感器模块、电子设备、视觉传感器校准方法、对象检测方法和程序

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018188935A JP7369517B2 (ja) 2018-10-04 2018-10-04 センサモジュール、電子機器、被写体の検出方法およびプログラム
JP2018-188935 2018-10-04

Publications (1)

Publication Number Publication Date
WO2020071267A1 (fr) 2020-04-09

Family

ID=70054985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/038115 WO2020071267A1 (fr) 2018-10-04 2019-09-27 Module de capteur, dispositif électronique, procédé d'étalonnage de capteur de vision, procédé de détection d'objet et programme

Country Status (4)

Country Link
US (1) US11659286B2 (fr)
JP (1) JP7369517B2 (fr)
CN (1) CN112771841B (fr)
WO (1) WO2020071267A1 (fr)


Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06309485A (ja) 1993-02-25 1994-11-04 Nippondenso Co Ltd 光学的情報読取装置
JP3014895B2 (ja) * 1993-06-02 2000-02-28 株式会社日立製作所 ビデオカメラ
JPH07226873A (ja) 1994-02-16 1995-08-22 Hitachi Ltd 自動追尾撮像装置
JP4337505B2 (ja) 2003-10-31 2009-09-30 ソニー株式会社 撮像装置および撮像方法、画像処理装置および画像処理方法、画像表示システム、記録媒体、並びにプログラム
JP4487640B2 (ja) * 2004-06-01 2010-06-23 ソニー株式会社 撮像装置
JP2007104158A (ja) * 2005-10-03 2007-04-19 National Univ Corp Shizuoka Univ シャッタの開閉時間補正装置を備える電子カメラ
US8237824B1 (en) * 2007-03-28 2012-08-07 Ambarella, Inc. Fixed pattern noise and bad pixel calibration
JP2009130725A (ja) 2007-11-26 2009-06-11 Nippon Telegr & Teleph Corp <Ntt> 可視光通信システムとその光受信装置
JP2009139724A (ja) 2007-12-07 2009-06-25 Sony Corp 撮像装置
JP2009210784A (ja) 2008-03-04 2009-09-17 Sony Corp 撮像装置
JP5038448B2 (ja) 2010-02-22 2012-10-03 オリンパス株式会社 カメラ
KR101880998B1 (ko) 2011-10-14 2018-07-24 삼성전자주식회사 이벤트 기반 비전 센서를 이용한 동작 인식 장치 및 방법
JP2013172445A (ja) * 2012-02-23 2013-09-02 Nikon Corp 撮像装置、撮像制御方法およびプログラム
JP2013183282A (ja) * 2012-03-01 2013-09-12 Sony Corp 欠陥画素補正装置、および、その制御方法ならびに当該方法をコンピュータに実行させるためのプログラム
KR102022970B1 (ko) 2013-04-30 2019-11-04 삼성전자주식회사 시각 센서에 기반하여 공간 정보를 감지하는 장치 및 방법
JP5952975B2 (ja) * 2013-10-02 2016-07-13 オリンパス株式会社 撮像装置および撮像方法
US9924116B2 (en) * 2014-08-05 2018-03-20 Seek Thermal, Inc. Time based offset correction for imaging systems and adaptive calibration control
WO2016117034A1 (fr) * 2015-01-20 2016-07-28 オリンパス株式会社 Dispositif de traitement d'image, procédé de traitement d'image et programme
KR102402678B1 (ko) * 2015-03-18 2022-05-26 삼성전자주식회사 이벤트 기반 센서 및 프로세서의 동작 방법
KR102155895B1 (ko) 2015-11-26 2020-09-14 삼성전자주식회사 객체를 추적하여 영상을 수신하는 방법 및 장치
EP3424211B1 (fr) * 2016-03-03 2020-07-08 Insightness AG Capteur de vision fondé sur des évènements
US10057469B2 (en) * 2016-05-23 2018-08-21 Veoneer Us, Inc. Camera shutter arrangements and camera arrangements including camera shutter arrangements
US10306148B2 (en) 2016-08-30 2019-05-28 Microsoft Technology Licensing, Llc Motion triggered gated imaging
US20180146149A1 (en) 2016-11-21 2018-05-24 Samsung Electronics Co., Ltd. Event-based sensor, user device including the same, and operation method of the same
CN108574793B (zh) * 2017-03-08 2022-05-10 三星电子株式会社 被配置为重新生成时间戳的图像处理设备及包括其在内的电子设备
US10237481B2 (en) * 2017-04-18 2019-03-19 Facebook Technologies, Llc Event camera for generation of event-based images
US10466779B1 (en) * 2017-07-24 2019-11-05 Facebook Technologies, Llc Event camera for eye tracking
US11244464B2 (en) * 2018-03-09 2022-02-08 Samsung Electronics Co., Ltd Method and apparatus for performing depth estimation of object
KR20190133465A (ko) 2018-05-23 2019-12-03 삼성전자주식회사 다이나믹 비전 센서의 데이터 처리 방법, 이를 수행하는 다이나믹 비전 센서 및 이를 포함하는 전자 장치

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11220661A (ja) * 1998-02-02 1999-08-10 Olympus Optical Co Ltd 撮像装置
JP2011176793A (ja) * 2010-01-27 2011-09-08 Sony Corp 撮像装置
JP2013079937A (ja) * 2011-09-30 2013-05-02 Honda Research Inst Europe Gmbh 路面分析
JP2017533497A (ja) * 2014-09-16 2017-11-09 クゥアルコム・インコーポレイテッドQualcomm Incorporated イベントベースダウンサンプリング
JP2018501675A (ja) * 2014-09-30 2018-01-18 クアルコム,インコーポレイテッド センサ素子アレイにおける特徴計算

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022054057A (ja) * 2020-09-25 2022-04-06 ソニーセミコンダクタソリューションズ株式会社 情報処理装置および情報処理システム
JP2023080101A (ja) * 2020-09-25 2023-06-08 ソニーセミコンダクタソリューションズ株式会社 情報処理装置および情報処理システム
JP7318150B2 (ja) 2020-09-25 2023-07-31 ソニーセミコンダクタソリューションズ株式会社 情報処理装置および情報処理システム
JP7317783B2 (ja) 2020-09-25 2023-07-31 ソニーセミコンダクタソリューションズ株式会社 情報処理装置
WO2022136510A1 (fr) * 2020-12-22 2022-06-30 Sony Semiconductor Solutions Corporation Appareil pour capteur de vision dynamique et procédé d'ajustement d'un capteur de vision dynamique
WO2024089968A1 (fr) * 2022-10-25 2024-05-02 キヤノン株式会社 Dispositif de traitement d'image, dispositif d'imagerie, procédé de traitement d'image et programme

Also Published As

Publication number Publication date
JP2020057989A (ja) 2020-04-09
US20210274081A1 (en) 2021-09-02
CN112771841A (zh) 2021-05-07
CN112771841B (zh) 2023-06-30
JP7369517B2 (ja) 2023-10-26
US11659286B2 (en) 2023-05-23

Similar Documents

Publication Publication Date Title
WO2020071268A1 (fr) Dispositif électronique, procédé de commande d&#39;actionneur et programme
WO2020071267A1 (fr) Module de capteur, dispositif électronique, procédé d&#39;étalonnage de capteur de vision, procédé de détection d&#39;objet et programme
WO2020071266A1 (fr) Module de capteur, dispositif électronique, procédé de détection de sujet et programme
US9635236B2 (en) Camera body, camera system, and method of controlling camera-body blur correction
JP6145782B1 (ja) 監視カメラ
US9426366B2 (en) Digital photographing apparatus and method of controlling the same
US20210327090A1 (en) Sensor calibration system, display control apparatus, program, and sensor calibration method
US10785402B2 (en) Imaging device and focusing control method of imaging device
WO2016031359A1 (fr) Dispositif de commande, procédé de commande, et programme
EP2383608A1 (fr) Appareil d&#39;imagerie numérique
US20210400252A1 (en) Imaging method, imaging system, manufacturing system, and method for manufacturing a product
US20240015377A1 (en) Imaging control device, imaging apparatus, imaging control method, and program
JP6611525B2 (ja) 撮像装置及び撮像システム
JP6070375B2 (ja) 焦点調整装置、ならびに、投射装置および投射装置の制御方法
WO2022181095A1 (fr) Dispositif de commande, dispositif d&#39;imagerie, procédé de commande et programme
WO2019181125A1 (fr) Appareil de traitement d&#39;image et procédé de traitement d&#39;image
JP2023103814A (ja) 赤外線カメラシステム
JP2023103813A (ja) 赤外線カメラシステムおよび較正方法
CN102111534A (zh) 建构高动态范围图像的系统及方法
US20190086767A1 (en) Monitoring camera
JP2013157891A (ja) 電子カメラ
JPH02103689A (ja) 視覚センサ装置
JP2013172342A (ja) カメラ
KR20140128794A (ko) 폐쇄회로 텔레비전의 제어방법 및 그 제어장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19868449

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19868449

Country of ref document: EP

Kind code of ref document: A1