WO2022190622A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2022190622A1 (PCT/JP2022/001100)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vibration
- frequency
- information
- information processing
- event
- Prior art date
Classifications

- H — ELECTRICITY
  - H04 — ELECTRIC COMMUNICATION TECHNIQUE
    - H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
        - H04N25/47 — Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data
        - H04N25/40 — Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
      - H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
        - H04N23/60 — Control of cameras or camera modules
          - H04N23/63 — Control of cameras or camera modules by using electronic viewfinders
            - H04N23/633 — Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- G — PHYSICS
  - G01 — MEASURING; TESTING
    - G01H — MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
      - G01H9/00 — Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves by using radiation-sensitive means, e.g. optical means
    - G01M — TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
      - G01M7/00 — Vibration-testing of structures; Shock-testing of structures
Definitions
- The present disclosure relates to an information processing device, an information processing method, and a program, and particularly to an information processing device, an information processing method, and a program that enable more suitable vibration monitoring.
- Patent Document 1 discloses a vibration analysis system that calculates a phase difference representing the vibration state of an object to be inspected based on time-series images.
- The present disclosure has been made in view of such circumstances, and aims to realize more suitable vibration monitoring.
- The information processing device of the present disclosure includes a vibration detection unit that generates vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- In the information processing method of the present disclosure, an information processing device generates vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- The program of the present disclosure causes a computer to execute a process of generating vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- That is, in the present disclosure, vibration information representing the vibration state of a subject is generated based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- FIG. 1 is a diagram showing a configuration example of a vibration monitoring system according to an embodiment of the present disclosure.
- FIG. 2 is a flowchart explaining the flow of the vibration detection process.
- FIG. 3 is a diagram explaining the generation of two-dimensional frequency data.
- FIG. 4 is a diagram explaining a specific example of frequency calculation.
- FIG. 5 is a diagram explaining an example of amplitude calculation.
- FIG. 6 is a flowchart explaining the flow of displaying frequency images.
- FIGS. 7 and 8 are diagrams explaining the calculation of the amplitude of vibration in real space.
- FIG. 9 is a diagram explaining the restoration of a luminance change.
- FIG. 10 is a diagram showing an example of frequency separation when the EVS camera is vibrating.
- FIG. 11 is a flowchart explaining the flow of frequency separation.
- FIG. 12 is a diagram showing a configuration example of an EVS camera according to an embodiment of the present disclosure.
- FIG. 13 is a diagram showing an example of a companion chip.
- FIGS. 14 to 16 are diagrams explaining application examples of the technology according to the present disclosure.
- FIG. 17 is a block diagram showing a configuration example of a computer.
- A vibration sensor usually measures the displacement of one point on the surface of the object to be measured. Therefore, grasping the vibration state of the entire object requires a plurality of vibration sensors, which lengthens the measurement time.
- Also, contact-type vibration sensors, which measure vibration by contacting the object, may malfunction or be damaged by the vibration of the object.
- Meanwhile, a non-contact vibration sensor that uses the reflection of laser light or the like requires a light source and is therefore expensive.
- In contrast to image sensors that output images in a frame-based manner, an event-based vision sensor (hereinafter referred to as an EVS), which outputs pixel data asynchronously in an event-driven manner, is known.
- The EVS asynchronously detects the brightness change of each pixel as an event and outputs data only for the pixels where an event is detected, so it can output data efficiently, at high speed, and with low delay.
- Consequently, the EVS features high time resolution and low power consumption compared to conventional image sensors.
- FIG. 1 is a diagram showing a configuration example of a vibration monitoring system that is one embodiment of the present disclosure.
- The vibration monitoring system in FIG. 1 consists of an EVS camera 10 and an information processing device 20.
- The EVS camera 10 has an EVS (event-based vision sensor) 11, and outputs event data to the information processing device 20 by photographing a subject whose vibration is to be monitored.
- The EVS 11 is configured to include, for example, a plurality of pixels arranged in a matrix.
- The EVS 11 detects a luminance change of a pixel as an event, and asynchronously outputs event data for each pixel in which an event has occurred. Note that the event data for each pixel does not necessarily have to be output asynchronously.
- The event data includes the pixel position (coordinates) (x, y) of the pixel where the event occurred, the time t, and the polarity p.
- The polarity p is binary information representing whether the luminance value of the pixel increased or decreased compared to before the event occurred.
- Event data is output only when the luminance value changes by a predetermined threshold or more, and is not output otherwise. The event data is therefore extremely sparse compared to image data output in a frame-based manner.
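- As a concrete illustration of this event format, the following is a minimal sketch in Python; the `Event` class and its field names are hypothetical (chosen to match the (x, y, t, p) notation above) and are not part of any EVS vendor API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One EVS event in the (x, y, t, p) form described above."""
    x: int    # pixel column where the luminance change occurred
    y: int    # pixel row
    t: float  # timestamp in seconds (EVS timestamps have microsecond resolution)
    p: int    # polarity: +1 if luminance rose by the threshold or more, -1 if it fell

# An event is emitted only when |luminance change| >= threshold, so a static
# scene produces no data at all; this is why the stream is extremely sparse.
```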
- The information processing device 20 is configured as a computer such as a personal computer (PC), for example.
- The information processing device 20 includes an input unit 21, a processing unit 22, and a display unit 23.
- The input unit 21 is configured by a connection interface that connects the EVS camera 10 and the information processing device 20.
- The input unit 21 inputs the event data for each pixel output from the EVS 11 to the processing unit 22.
- The processing unit 22 is composed of a processor such as a CPU (Central Processing Unit).
- The processing unit 22 executes predetermined processing based on the event data from the input unit 21 and supplies the processing result to the display unit 23.
- The display unit 23 is configured by a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
- The display unit 23 displays information according to the processing result from the processing unit 22.
- The display unit 23 may be provided outside the information processing device 20.
- The processing unit 22 implements a vibration detection unit 30 and a display control unit 40 by executing a predetermined program.
- The vibration detection unit 30 generates vibration information representing the vibration state of the subject whose vibration is to be monitored, based on the event data for each pixel output from the EVS 11, and supplies the vibration information to the display control unit 40.
- The vibration information includes the frequency and amplitude of the vibration of the subject.
- The vibration detection unit 30 is composed of a frequency calculation unit 31 and an amplitude calculation unit 32.
- The frequency calculation unit 31 calculates the frequency of the vibration of the subject as the vibration information described above. Specifically, the frequency calculation unit 31 generates two-dimensional frequency data having frequency information representing the frequency for each pixel position of the pixels of the EVS 11, based on the event data output during a predetermined period.
- The two-dimensional frequency data is two-dimensional array data that has the frequency of the vibration of the subject as a pixel value only at pixel positions corresponding to the vibrating subject within the imaging range of the EVS camera 10.
- Since the luminance change is large at the edges (outline) of a vibrating subject, frequency information is retained mainly at the pixel positions corresponding to the edge portions of the subject.
- The amplitude calculation unit 32 calculates the amplitude of the vibration of the subject as the vibration information described above, based on the two-dimensional frequency data generated by the frequency calculation unit 31. Specifically, the amplitude calculation unit 32 calculates the amplitude of the vibration of a subject taken as a pixel region having the same frequency information (pixel value) in the two-dimensional frequency data. In the two-dimensional frequency data, the continuous pixel length of the pixel region corresponding to the edge portion of the subject corresponds to the amplitude of the vibration of the subject.
- In this way, the frequency and amplitude of the vibration of the subject are calculated as the vibration information.
- The display control unit 40 controls the display unit 23 to display a display image that visualizes the vibration state of the subject, based on the vibration information generated by the vibration detection unit 30. Specifically, the display control unit 40 controls the display of a display image having display information corresponding to the frequency information for each pixel position, based on the two-dimensional frequency data generated by the frequency calculation unit 31.
- In step S11, the vibration detection unit 30 acquires the event data (x, y, t, p) input from the input unit 21.
- Note that the EVS 11 outputs event data in units of microseconds for each pixel in which an event (a change in pixel luminance) has occurred.
- In step S12, the frequency calculation unit 31 calculates the frequency of the vibration of the subject based on the event data output from the EVS 11 during a predetermined period.
- In the example described here, event data (x, y, t, p) output during the predetermined period is acquired.
- The time series of the polarities p of the events occurring at the respective pixel positions are indicated by upward and downward arrows.
- An upward arrow indicates that the luminance value of the pixel has increased by a predetermined threshold or more, and a downward arrow indicates that the luminance value of the pixel has decreased by the predetermined threshold or more.
- Frequency information (the frequency of the subject corresponding to the pixel) is calculated based on the interval between the times t at which events of the same polarity occur at the same pixel position.
- At pixel position (x0, y0), the events indicated by the upward arrows occur at 100 ms intervals, and the events indicated by the downward arrows also occur at 100 ms intervals.
- Therefore, the frequency of the vibration of the subject corresponding to pixel position (x0, y0) is calculated to be 10 Hz.
- Similarly, the frequency of the vibration of the subject corresponding to pixel position (x1, y1) is calculated to be 20 Hz, the frequency corresponding to pixel position (x2, y2) to be 9 Hz, and the frequency corresponding to pixel position (x3, y3) to be 21 Hz.
- In this way, two-dimensional frequency data 100 having frequency information for each pixel position (x, y) is generated.
- The two-dimensional frequency data 100 is generated for each predetermined period.
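- The per-pixel frequency calculation described above can be sketched as follows. This is an illustrative reading of the method rather than the patented implementation; the function name, the tuple-based event format, and the mean-interval estimate are our assumptions.

```python
import numpy as np
from collections import defaultdict

def frequency_map(events, height, width):
    """Build two-dimensional frequency data: for each pixel, the vibration
    frequency is the reciprocal of the interval between successive events
    of the same polarity, as in the 100 ms -> 10 Hz example above."""
    times = defaultdict(list)            # (x, y, p) -> event times
    for x, y, t, p in events:            # events collected over one period
        times[(x, y, p)].append(t)

    freq = np.zeros((height, width))     # stays 0 Hz where nothing vibrates
    for (x, y, _p), ts in times.items():
        if len(ts) < 2:
            continue
        intervals = np.diff(sorted(ts))
        freq[y, x] = 1.0 / np.mean(intervals)  # up and down events at the same
                                               # pixel should agree, so either
                                               # polarity may write the value
    return freq
```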
- In practice, as shown in FIG. 5, events Ep indicating an increase in the luminance value (indicated by circles) may occur consecutively, and events En indicating a decrease in the luminance value (indicated by triangles) may also occur consecutively.
- In such a case, among consecutive events of the same polarity, the temporally first event is set as the frequency calculation target event, and the frequency information is calculated based on the interval between the times at which the frequency calculation target events occur.
- In step S13, the amplitude calculation unit 32 calculates the amplitude of the vibration of the subject based on the two-dimensional frequency data.
- Specifically, pixels having the same frequency information in the two-dimensional frequency data are first grouped into one pixel region.
- In the example of FIG. 5, three pixels are grouped into one pixel region PG. Since the portions of the subject corresponding to the pixel region PG grouped in this manner vibrate at the same frequency, they are presumed to be one subject.
- The portions corresponding to the pixel region PG are regarded as one subject if their events are temporally continuous. Note that, as shown in FIG. 5, the intervals at which events of different polarities occur differ depending on the pixel position within the edge portion of the vibrating subject. For example, the closer a pixel is to the apex of the vibration amplitude of the edge portion, the shorter the interval at which events of different polarities occur.
- Then, the amplitude of the vibration of the subject is calculated with the pixel region PG taken as one subject.
- Specifically, the amplitude of the vibration is calculated based on the continuous pixel length of the pixel region PG (three pixels in the example of FIG. 5). That is, in the image captured by the EVS 11, the subject corresponding to the pixel region PG vibrates with an amplitude of three pixels. Therefore, the amplitude calculated here is not the amplitude of the vibration of the subject in real space.
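- A minimal sketch of this grouping-and-run-length idea is shown below, under two assumptions of ours: the continuous pixel length is measured along image rows (the disclosure does not fix a scan direction), and frequencies within `tol` Hz of each other count as "the same frequency information".

```python
import numpy as np

def amplitude_pixels(freq_map, target_freq, tol=0.5):
    """Group pixels whose frequency information matches target_freq and
    return the longest horizontal run as the vibration amplitude in
    pixels (three pixels in the example of FIG. 5)."""
    mask = np.abs(freq_map - target_freq) < tol
    best = 0
    for row in mask:                 # scan each image row
        run = 0
        for hit in row:
            run = run + 1 if hit else 0
            best = max(best, run)    # keep the longest continuous run
    return best
```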
- In step S14, the vibration detection unit 30 outputs the frequency and amplitude of the vibration of the subject calculated as described above as the vibration information.
- According to the above process, the vibration of the subject can be measured based on the event data output from the EVS.
- Moreover, since the event data output from the EVS is used, sampling can be performed at a sufficient sampling frequency with lower power consumption than in methods that use time-series images output in a frame-based manner.
- That is, vibration detection based on event data from the EVS makes it possible to realize more suitable vibration monitoring.
- FIG. 6 is a flowchart explaining the flow of displaying frequency images.
- The processes of steps S31 to S34 are executed for each pixel position (x, y) each time an event occurs in the corresponding pixel.
- The frequency image is displayed at a frame rate of, for example, 30 fps.
- In step S31, the vibration detection unit 30 acquires the event data (x, y, t, p) input from the input unit 21.
- Note that the EVS 11 outputs event data in units of microseconds for each pixel in which an event (a change in pixel luminance) has occurred.
- In step S32, the vibration detection unit 30 determines whether or not the polarity p has changed in the event data output each time an event occurs. If it is determined that the polarity p has not changed, the process returns to step S31; if it is determined that the polarity p has changed, the process proceeds to step S33.
- In other words, in step S32, when events of the same polarity occur consecutively, it is determined whether the current event is the temporally first event (the frequency calculation target event) among the consecutive events of the same polarity.
- In step S33, the frequency calculation unit 31 calculates the frequency based on the interval between the time t of the current event and the time t' of the event at the previous polarity change.
- In step S34, the frequency calculation unit 31 updates the frequency information of the two-dimensional frequency data described with reference to FIG. 3 with the calculated frequency for each pixel position (x, y), and stores the time t of the current event as the time t' of the previous event.
- In step S35, the frequency calculation unit 31 determines whether or not the time for one frame, 1/30 second in this example, has elapsed. If it is determined that the time for one frame has not elapsed, the calculation of the frequency for each pixel position (x, y) and the update of the frequency information are repeated.
- When it is determined that the time for one frame has elapsed, the frequency calculation unit 31 supplies the two-dimensional frequency data containing the most recently updated frequency information for each pixel position (x, y) to the display control unit 40.
- Note that the average value of the frequencies calculated for each pixel position (x, y) during one frame may be used as the frequency information.
- In step S36, the display control unit 40 causes the display unit 23 to display a frame image of the frequency image having display information corresponding to the frequency information for each pixel position (x, y), based on the two-dimensional frequency data from the frequency calculation unit 31.
- The display information may be color information or luminance information.
- When the display information is color information, a frame image in which the pixel regions are color-coded by frequency is displayed.
- When the display information is luminance information, for example, a frame image in which pixel regions with higher frequencies appear brighter is displayed.
- Thereafter, the process returns to step S31, and the subsequent processes are repeated as the process for displaying the next frame image.
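- Steps S31 to S36 can be condensed into the event-driven sketch below. Treating each polarity flip as half a vibration period (the factor of 2 in the formula) is our assumption; the disclosure only states that the frequency is calculated from the interval between the time t of the current event and the time t' of the previous polarity change.

```python
import numpy as np

FRAME_PERIOD = 1.0 / 30.0  # the 30 fps display rate used in the example

def frequency_image_frames(events, height, width):
    """Yield one frame of the frequency map every 1/30 s, updating each
    pixel whenever its event polarity flips (steps S31 to S36)."""
    last_p = np.zeros((height, width), dtype=np.int8)  # last polarity seen
    last_t = np.full((height, width), np.nan)          # time t' of last flip
    freq = np.zeros((height, width))
    next_frame = None
    for x, y, t, p in events:                  # events assumed sorted by time
        if next_frame is None:
            next_frame = t + FRAME_PERIOD
        if p != last_p[y, x]:                  # polarity changed (step S32)
            if not np.isnan(last_t[y, x]):     # a flip spans half a period
                freq[y, x] = 1.0 / (2.0 * (t - last_t[y, x]))  # step S33
            last_t[y, x] = t                   # store t as t' (step S34)
            last_p[y, x] = p
        if t >= next_frame:                    # one frame elapsed (step S35)
            yield freq.copy()                  # hand off for display (step S36)
            next_frame += FRAME_PERIOD
```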
- As described above, the frequency and amplitude of the vibration of the subject can be calculated as the vibration information based on the event data output from the EVS.
- However, the amplitude calculated in the vibration detection process is not the amplitude of the vibration of the subject in real space.
- The apparent vibration of the subject in the image captured by the EVS 11 differs depending on the direction in which the subject vibrates and the direction from which the subject is photographed (the shooting direction).
- Therefore, the amplitude calculation unit 32 calculates the amplitude of the vibration of the subject in real space based on the relationship between the vibration direction of the subject and the shooting direction of the EVS camera 10.
- Specifically, as shown in the figure, the amplitude calculation unit 32 obtains the amplitudes of the vibration projected onto the XY plane, the YZ plane, and the XZ plane. The amplitude calculation unit 32 then obtains the amplitude of the vibration in three-dimensional space based on the amplitudes of the vibrations projected onto these three planes.
- However, the amplitude obtained in this way is the pixel-by-pixel amplitude on the image captured by the EVS 11, not the real-scale amplitude in real space.
- Therefore, the amplitude calculation unit 32 calculates the real-scale amplitude of the vibration of the subject in real space based on the relationship between the distance to the subject and the lens focal length of the EVS camera 10.
- Specifically, the amplitude calculation unit 32 converts the coordinates on the image coordinate system (x, y) projected onto the image plane 150I into coordinates on the camera coordinate system (X, Y) based on the lens focal length. Then, based on the distance from the EVS camera 10 to the subject SB, the amplitude calculation unit 32 transforms the coordinates on the camera coordinate system (X, Y) into coordinates on the real-space plane 150W in the world coordinate system. As a result, the pixel-by-pixel amplitude on the image captured by the EVS 11 is converted into a real-scale amplitude.
- In this way, the amplitude of the vibration of the subject in real space can be calculated.
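- Under a simple pinhole-camera assumption, this conversion reduces to one similar-triangles factor. The pinhole model and the parameter names below are our assumptions; the disclosure only states that the lens focal length and the distance to the subject are used.

```python
def real_scale_amplitude(amplitude_px, pixel_pitch_m, focal_length_m, distance_m):
    """Convert a pixel-by-pixel amplitude on the image into a real-scale
    amplitude: one pixel on the sensor corresponds to
    pixel_pitch * distance / focal_length metres on the subject plane."""
    return amplitude_px * pixel_pitch_m * distance_m / focal_length_m

# For example, a 3-pixel amplitude seen through a 25 mm lens on a sensor with
# 5 um pixels, with the subject 2 m away:
# real_scale_amplitude(3, 5e-6, 25e-3, 2.0) -> 0.0012 m, i.e. 1.2 mm
```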
- The left diagram of FIG. 9 shows an example of a luminance change at a predetermined pixel position (x, y).
- As described above, event data is output when the luminance value changes by a predetermined threshold or more.
- In the example of FIG. 9, the luminance value of the pixel increases by the threshold Lth or more at times t1, t2, and t3, and decreases by the threshold Lth or more at times t4 and t5.
- Here, the occurrence of an event of the polarity indicated by the upward arrow is treated as adding the threshold Lth to the luminance value, and the occurrence of an event of the polarity indicated by the downward arrow is treated as subtracting the threshold Lth from the luminance value.
- Based on this, the frequency calculation unit 31 restores the luminance change for each pixel, as shown in FIG. 9, from the time series of the polarities included in the event data. The frequency calculation unit 31 can then calculate the frequency information by performing frequency analysis on the restored luminance change using an FFT (Fast Fourier Transform).
- Note that if the threshold Lth is decreased, noise also increases, so noise removal processing may be performed on the restored luminance change.
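- The restoration and FFT step can be sketched as follows. The +/-Lth staircase follows FIG. 9 directly; the uniform resampling rate `fs` is our addition, needed because the FFT expects evenly spaced samples while events arrive asynchronously.

```python
import numpy as np

def dominant_frequency(times, polarities, lth=1.0, fs=1000.0):
    """Restore the luminance staircase (+lth per positive event, -lth per
    negative event), resample it on a uniform grid, and return the
    strongest non-DC peak of its FFT magnitude spectrum."""
    times = np.asarray(times, dtype=float)           # sorted event times (s)
    steps = np.cumsum(np.asarray(polarities) * lth)  # luminance after each event
    grid = np.arange(times[0], times[-1], 1.0 / fs)  # uniform sample times
    idx = np.searchsorted(times, grid, side="right") - 1  # last event <= sample
    signal = steps[np.clip(idx, 0, None)]
    signal = np.where(idx >= 0, signal, 0.0)         # zero before first event
    signal = signal - signal.mean()                  # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
```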
- When the EVS camera 10 that photographs the subject whose vibration is to be monitored is itself vibrating, the vibration detected by the vibration detection unit 30 includes both the vibration of the subject and the vibration of the EVS camera 10. In this case, in order to detect only the vibration of the subject, frequency separation must be performed on the vibration detected by the vibration detection unit 30.
- To this end, the EVS camera 10 can be calibrated.
- For example, the EVS camera 10 is calibrated to a shooting direction in which the vibration of the subject projected onto the camera plane and the vibration of the camera are orthogonal to each other.
- FIG. 11 is a flowchart explaining the flow of frequency separation using the luminance change for each pixel restored based on the event data.
- In step S51, the vibration detection unit 30 acquires the event data (x, y, t, p) input from the input unit 21.
- Note that the EVS 11 outputs event data in units of microseconds for each pixel in which an event (a change in pixel luminance) has occurred.
- Here, event data for a certain period of time is acquired.
- In step S52, the frequency calculation unit 31 restores the luminance change for each pixel based on the time series of the polarities p included in the acquired event data, as described with reference to FIG. 9.
- In step S53, the frequency calculation unit 31 performs frequency analysis on the restored luminance change.
- In step S54, the frequency calculation unit 31 determines whether a plurality of frequency components (specifically, two frequency components) are mixed in the result of the frequency analysis. If it is determined that a plurality of frequency components are mixed, the process proceeds to step S55.
- In step S55, the frequency calculation unit 31 removes the frequency component of the vibration of the EVS camera 10 from the plurality of frequency components.
- The frequency component of the vibration of the EVS camera 10 can be obtained in advance by detecting the vibration of the EVS camera 10 from event data obtained by photographing a stationary subject while the EVS camera 10 is vibrating.
- On the other hand, if it is determined that a plurality of frequency components are not mixed, step S55 is skipped.
- In this case, the result of the frequency analysis is either the vibration frequency of the subject or the vibration frequency of the EVS camera 10.
- In this way, frequency separation can be performed using the luminance change for each pixel restored based on the event data.
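- The subtraction in step S55 can be sketched as follows: given the magnitude spectrum from the frequency analysis and the camera's own frequency measured in the calibration shot described above, the camera peak is notched out and the strongest remaining peak is taken as the subject's frequency. The notch width is an assumed parameter.

```python
import numpy as np

def separate_subject_frequency(freqs, spectrum, camera_freq, notch_hz=0.5):
    """Remove the EVS camera's vibration component (measured beforehand by
    filming a stationary subject) and return the surviving dominant peak."""
    keep = np.abs(freqs - camera_freq) > notch_hz  # True outside the notch
    cleaned = np.where(keep, spectrum, 0.0)
    return freqs[np.argmax(cleaned[1:]) + 1]       # strongest non-DC survivor
```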
- Alternatively, combinations of the vibration of the subject and the vibration of the EVS camera 10, together with the luminance changes caused by those vibrations, may be learned by simulating various patterns. This makes it possible to easily detect the vibration of the subject and the vibration of the EVS camera 10 from the luminance change restored based on the event data.
- In the above, the vibration detection unit 30 is implemented by the processing unit 22 of the information processing device 20 configured as a computer, but it may also be implemented within the EVS camera.
- FIG. 12 is a diagram showing a configuration example of an EVS camera that is one embodiment of the present disclosure.
- The EVS camera 210 in FIG. 12 includes an EVS (event-based vision sensor) 211 and a processing unit 212.
- The EVS 211, like the EVS 11 in FIG. 1, detects changes in the luminance of pixels as events and outputs event data for each pixel where an event has occurred.
- The processing unit 212 is composed of a processor such as a CPU.
- The processing unit 212 implements the vibration detection unit 30 by executing a predetermined program. The vibration information representing the vibration state of the subject generated by the vibration detection unit 30 is output to an external device or the like connected to the EVS camera 210.
- The processor constituting the processing unit 212 may be, for example, integrated into one chip together with the EVS 211, or may be mounted on a companion chip or the like electrically connected to the EVS 211.
- When the processing unit 212 is mounted on a companion chip separate from the EVS 211, the EVS 211 and the companion chip 231 can be stacked and electrically connected to each other, as shown in FIG. 13.
- FIG. 14 is a diagram illustrating a configuration example of a failure prediction system.
- The failure prediction system in FIG. 14 predicts the presence or absence of a failure in a vibration monitoring target object 300 by monitoring the vibration of the vibration monitoring target object 300.
- The vibration monitoring target object 300 is, for example, a plurality of facilities and pieces of equipment installed in a factory or the like.
- The failure prediction system in FIG. 14 is composed of an EVS camera 310, a vibration analysis device 320, and an information presentation device 330.
- The EVS camera 310 corresponds to the EVS camera 10 in FIG. 1, and the vibration analysis device 320 and the information presentation device 330 correspond to the information processing device 20 in FIG. 1.
- The vibration analysis device 320 analyzes the vibration of the vibration monitoring target object 300 based on the event data from the EVS camera 310.
- For example, the vibration analysis device 320 analyzes the vibration of the vibration monitoring target object 300 using techniques such as time-series analysis and anomaly detection algorithms.
- The information presentation device 330 presents information based on the results of the vibration analysis by the vibration analysis device 320. For example, when abnormal vibration is detected in a part of the vibration monitoring target object 300, the information presentation device 330 highlights, or surrounds with a frame, the part where the abnormal vibration was detected in the frequency image that visualizes the two-dimensional frequency data generated by the vibration analysis device 320. In this case, the information presentation device 330 may also output a predetermined alarm by display or sound.
- Further, the vibration analysis device 320 may feed back control parameters for controlling the vibration monitoring target object 300 to the vibration monitoring target object 300.
- A control parameter is, for example, a current value or a voltage value for driving the vibration monitoring target object 300.
- Furthermore, the vibration analysis device 320 may feed back an ROI (Region of Interest) and a threshold for event data output to the EVS camera 310.
- By feeding back the ROI, the EVS camera 310 can perform shooting centered on the pixel region where events occur, and can output the event data more efficiently.
- By feeding back the threshold, the EVS camera 310 can output event data more appropriately in response to changes in luminance, preventing erroneous detection and missed detection of vibration.
- FIG. 16 is a diagram showing a configuration example of a microvibration detection system.
- The minute vibration detection system in FIG. 16 monitors minute vibrations of a vibration monitoring target object 400.
- The minute vibration detection system in FIG. 16 consists of an EVS camera 410, a vibration detection device 420, and an active light source 430.
- The EVS camera 410 corresponds to the EVS camera 10 in FIG. 1, and the vibration detection device 420 corresponds to the information processing device 20 in FIG. 1.
- The vibration detection device 420 detects the vibration of the vibration monitoring target object 400 based on the event data from the EVS camera 410.
- The active light source 430 irradiates the surface of the vibration monitoring target object 400, which vibrates with a minute amplitude, with light.
- The EVS camera 410 captures, as the subject, the reflected light reflected by the surface of the vibration monitoring target object 400.
- When the amplitude w of the vibration of the vibration monitoring target object 400 falls within one pixel of the imaging range of the EVS camera 410, the vibration is not detected even if the vibration monitoring target object 400 itself is captured as the subject.
- In contrast, the reflected light from the surface of the vibration monitoring target object 400 enlarges the region in which the luminance changes in response to the vibration of amplitude w, making it possible to detect the vibration of the vibration monitoring target object 400.
- Note that even when the vibration direction of the vibration monitoring target object 400 is not specified, it is still possible to detect the vibration in this way.
- The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in a computer.
- Here, the computer includes, for example, a computer incorporated in dedicated hardware, and a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 17 is a block diagram showing an example of the hardware configuration of a computer that executes the series of processes described above by a program.
- In the computer, a CPU 601, a ROM (Read Only Memory) 602, and a RAM (Random Access Memory) 603 are interconnected by a bus 604.
- An input/output interface 605 is further connected to the bus 604.
- An input unit 606, an output unit 607, a storage unit 608, a communication unit 609, and a drive 610 are connected to the input/output interface 605.
- The input unit 606 consists of a keyboard, a mouse, a microphone, and the like.
- The output unit 607 includes a display, a speaker, and the like.
- The storage unit 608 includes a hard disk, a nonvolatile memory, or the like.
- The communication unit 609 includes a network interface and the like.
- The drive 610 drives a removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer configured as described above, the CPU 601 loads, for example, a program stored in the storage unit 608 into the RAM 603 via the input/output interface 605 and the bus 604 and executes it, whereby the series of processes described above is performed.
- The program executed by the computer (CPU 601) can be provided by being recorded on the removable medium 611 as packaged media or the like. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- In the computer, the program can be installed in the storage unit 608 via the input/output interface 605 by loading the removable medium 611 into the drive 610. The program can also be received by the communication unit 609 via a wired or wireless transmission medium and installed in the storage unit 608. In addition, the program can be installed in the ROM 602 or the storage unit 608 in advance.
- The program executed by the computer may be a program in which processing is performed in chronological order according to the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- The present disclosure can also be configured as follows.
- (1) An information processing device including a vibration detection unit that generates vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- (2) The information processing device according to (1), in which the vibration detection unit includes a frequency calculation unit that calculates the frequency of the vibration of the subject as the vibration information based on the event data.
- (3) The information processing device according to (2), in which the frequency calculation unit generates two-dimensional frequency data having frequency information for each pixel position, based on the event data output during a predetermined period.
- (4) The information processing device according to (3), in which the frequency calculation unit calculates the frequency information based on the interval between the times at which events of the same polarity occurred at the same pixel position.
- (5) The information processing device according to (4), in which the frequency calculation unit calculates the frequency information using, as the frequency calculation target event, the temporally first event among consecutive events of the same polarity.
- (6) The information processing device according to any one of (3) to (5), further including a display control unit that controls, based on the two-dimensional frequency data, the display of a display image having display information corresponding to the frequency information for each pixel position.
- (7) The information processing device according to (6), in which the display information is color information.
- (8) The information processing device according to (6), in which the display information is luminance information.
- (9) The information processing device according to any one of (3) to (8), in which the vibration detection unit further includes an amplitude calculation unit that calculates the amplitude of the vibration of the subject as the vibration information based on the two-dimensional frequency data.
- (10) The information processing device according to (9), in which the amplitude calculation unit calculates the amplitude of the vibration of the subject, with a pixel region having the same frequency information in the two-dimensional frequency data taken as one subject.
- (11) The information processing device according to (10), in which the amplitude calculation unit calculates the amplitude of the vibration of the subject based on the continuous pixel length of the pixel region.
- (12) The information processing device according to (11), in which the amplitude calculation unit calculates the amplitude of the vibration of the subject in real space, based on the relationship between the vibration direction of the subject and the shooting direction of the EVS, and the relationship between the distance to the subject and the lens focal length.
- (13) The information processing device according to (3), in which the frequency calculation unit calculates the frequency information by restoring the luminance change for each pixel based on the time series of the polarities included in the event data and performing frequency analysis.
- (14) The information processing device according to (13), in which, when the EVS is vibrating, the frequency calculation unit calculates the frequency information by removing the frequency component of the vibration of the EVS from the result of the frequency analysis.
- (15) The information processing device according to any one of (1) to (14), configured as a computer.
- (16) The information processing device according to any one of (1) to (14), configured as a companion chip electrically connected to the EVS.
- (17) An information processing method in which an information processing device generates vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- (18) A program for causing a computer to execute a process of generating vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- 10 EVS camera, 11 EVS, 20 information processing device, 21 input unit, 22 processing unit, 23 display unit, 30 vibration detection unit, 31 frequency calculation unit, 32 amplitude calculation unit, 40 display control unit, 210 EVS camera, 211 EVS, 212 processing unit, 231 companion chip
Abstract
Description
2. Configuration example of the vibration monitoring system
3. Flow of the vibration detection process
4. Display of frequency images
5. Issues and countermeasures in calculating the amplitude
6. Calculation of the frequency by restoring the luminance change
7. Frequency separation when the EVS camera is vibrating
8. Configuration example of the EVS camera
9. Application examples of the technology according to the present disclosure
10. Configuration example of a computer
Various vibration sensors used for measuring vibration are known.
FIG. 1 is a diagram showing a configuration example of a vibration monitoring system according to an embodiment of the present disclosure.
Next, the flow of the vibration detection process performed by the vibration detection unit 30 will be described with reference to the flowchart in FIG. 2.
Next, the display of a frequency image, which is a display image visualizing the two-dimensional frequency data described above, will be described.
According to the vibration detection process described above, the frequency and amplitude of the vibration of the subject can be calculated as the vibration information based on the event data output from the EVS. However, the amplitude calculated in the vibration detection process is not the amplitude of the vibration of the subject in real space.
In the above, the frequency information is calculated based on the interval between the times t at which events of the same polarity occurred at the same pixel position.
When the EVS camera 10 that photographs the subject whose vibration is to be monitored is vibrating, the vibration detected by the vibration detection unit 30 includes both the vibration of the subject and the vibration of the EVS camera 10. In this case, in order to detect only the vibration of the subject, frequency separation must be performed on the vibration detected by the vibration detection unit 30.
In the above, the vibration detection unit 30 is implemented by the processing unit 22 of the information processing device 20 configured as a computer, but it may also be implemented within the EVS camera.
In the following, application examples of the technology according to the present disclosure will be described.
FIG. 14 is a diagram showing a configuration example of a failure prediction system.
FIG. 16 is a diagram showing a configuration example of a minute vibration detection system.
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed in a computer. Here, the computer includes a computer incorporated in dedicated hardware, and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
Claims (18)
- An information processing device comprising a vibration detection unit that generates vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- The information processing device according to claim 1, wherein the vibration detection unit includes a frequency calculation unit that calculates the frequency of the vibration of the subject as the vibration information based on the event data.
- The information processing device according to claim 2, wherein the frequency calculation unit generates two-dimensional frequency data having frequency information for each pixel position, based on the event data output during a predetermined period.
- The information processing device according to claim 3, wherein the frequency calculation unit calculates the frequency information based on the interval between the times at which the events of the same polarity occurred at the same pixel position.
- The information processing device according to claim 4, wherein the frequency calculation unit calculates the frequency information using, as the frequency calculation target event, the temporally first event among the consecutive events of the same polarity.
- The information processing device according to claim 3, further comprising a display control unit that controls, based on the two-dimensional frequency data, the display of a display image having display information corresponding to the frequency information for each pixel position.
- The information processing device according to claim 6, wherein the display information is color information.
- The information processing device according to claim 6, wherein the display information is luminance information.
- The information processing device according to claim 3, wherein the vibration detection unit further includes an amplitude calculation unit that calculates the amplitude of the vibration of the subject as the vibration information based on the two-dimensional frequency data.
- The information processing device according to claim 9, wherein the amplitude calculation unit calculates the amplitude of the vibration of the subject, with a pixel region having the same frequency information in the two-dimensional frequency data taken as one subject.
- The information processing device according to claim 10, wherein the amplitude calculation unit calculates the amplitude of the vibration of the subject based on the continuous pixel length of the pixel region.
- The information processing device according to claim 11, wherein the amplitude calculation unit calculates the amplitude of the vibration of the subject in real space, based on the relationship between the vibration direction of the subject and the shooting direction of the EVS, and the relationship between the distance to the subject and the lens focal length.
- The information processing device according to claim 3, wherein the frequency calculation unit calculates the frequency information by restoring the luminance change for each pixel based on the time series of the polarities included in the event data and performing frequency analysis.
- The information processing device according to claim 13, wherein, when the EVS is vibrating, the frequency calculation unit calculates the frequency information by removing the frequency component of the vibration of the EVS from the result of the frequency analysis.
- The information processing device according to claim 1, configured as a computer.
- The information processing device according to claim 1, configured as a companion chip electrically connected to the EVS.
- An information processing method in which an information processing device generates vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
- A program for causing a computer to execute a process of generating vibration information representing the vibration state of a subject, based on event data output from an EVS (event-based vision sensor), the event data including the pixel position, time, and polarity at which an event, which is a brightness change for each pixel, occurred.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US 18/263,657 (US20240073552A1) | 2021-03-08 | 2022-01-14 | Information processing apparatus, information processing method, and program |
| JP 2023505155 (JPWO2022190622A1) | | 2022-01-14 | |
| EP 22766590.8 (EP4306918A1) | | 2022-01-14 | |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021036012 | 2021-03-08 | | |
| JP2021-036012 | 2021-03-08 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022190622A1 | 2022-09-15 |
Family
ID=83227267
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/001100 (WO2022190622A1) | Information processing device, information processing method, and program | 2021-03-08 | 2022-01-14 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20240073552A1 |
| EP (1) | EP4306918A1 |
| JP (1) | JPWO2022190622A1 |
| TW (1) | TW202300875A |
| WO (1) | WO2022190622A1 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020120782A1 | 2018-12-13 | 2020-06-18 | Prophesee | Method of tracking objects in a scene |
| WO2020157157A1 | 2019-01-30 | 2020-08-06 | Prophesee | Method of processing information from an event-based sensor |
| WO2020216953A1 | 2019-04-25 | 2020-10-29 | Prophesee SA | Systems and methods for imaging and sensing vibrations |
| JP2020186957A | 2019-05-10 | 2020-11-19 | Hiroshima University | Vibration analysis system, vibration analysis method, and program |
2022
- 2022-01-14: EP application 22766590.8 filed (published as EP4306918A1, pending)
- 2022-01-14: JP application 2023505155 filed (published as JPWO2022190622A1, pending)
- 2022-01-14: PCT application PCT/JP2022/001100 filed (published as WO2022190622A1)
- 2022-01-14: US application 18/263,657 filed (published as US20240073552A1, pending)
- 2022-03-01: TW application 111107247 filed (published as TW202300875A)
Non-Patent Citations (1)
- Charles Dorn, Sudeep Dasari, Yongchao Yang, Charles Farrar, Garrett Kenyon, Paul Welch, David Mascareñas, "Efficient Full-Field Vibration Measurements and Operational Modal Analysis Using Neuromorphic Event-Based Imaging", Journal of Engineering Mechanics, vol. 144, no. 7, July 2018, 04018054, ISSN 0733-9399, DOI: 10.1061/(ASCE)EM.1943-7889.0001449
Also Published As
Publication number | Publication date |
---|---|
JPWO2022190622A1 (ja) | 2022-09-15 |
TW202300875A (zh) | 2023-01-01 |
EP4306918A1 (en) | 2024-01-17 |
US20240073552A1 (en) | 2024-02-29 |
Similar Documents
| Publication | Title |
|---|---|
| CN101888535B | Moving object detection device, moving object detection method, and computer program |
| JP6364845B2 | Vibration measuring device |
| WO2018043251A1 | Defect detection device, defect detection method, and computer-readable recording medium |
| US11521312B2 | Image processing apparatus, image processing method, and storage medium |
| CN110620887B | Image generation device and image generation method |
| JP6703279B2 | Abnormality detection device, abnormality detection method, and abnormality detection program |
| JP2006293820A | Appearance inspection device, appearance inspection method, and program for causing a computer to function as an appearance inspection device |
| WO2019117247A1 | Pupil detection device and detection system |
| WO2022190622A1 | Information processing device, information processing method, and program |
| WO2020255728A1 | Vibration measurement device, vibration measurement method, and computer-readable recording medium |
| JP7359607B2 | Vibration test analysis device and vibration test analysis method |
| WO2020162426A1 | Analysis device, analysis method, program, and sensor structure |
| KR20090053670A | Image tracking chip development apparatus for NTSC/PAL cameras |
| WO2019198534A1 | Vibration analysis device, control method for vibration analysis device, vibration analysis program, and recording medium |
| JP2023163356A | Damage location identification device, damage location identification system, damage location identification method, and program |
| US10789705B2 | Quality monitoring system |
| US20230003664A1 | Abnormality determination device, abnormality determination method, and program storage medium |
| JP2022089017A | Information processing device, information processing method, and program |
| WO2018207528A1 | Structure abnormality diagnosis device |
| JP5022981B2 | Charged particle beam device |
| US20220283020A1 | Vibration detection system |
| RU2546714C2 | Method for contactless optical measurement of vibration parameters of mechanisms, structures, and biological objects |
| Muralidharan et al. | Comparative Study of Vision Camera-based Vibration Analysis with the Laser Vibrometer Method |
| US20120288144A1 | Image processing apparatus, image processing method, and motion detection system |
| JP2008175550A | Defect detection device and defect detection method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22766590; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | WIPO information: entry into national phase | Ref document number: 2023505155; Country of ref document: JP. Ref document number: 18263657; Country of ref document: US |
| | WWE | WIPO information: entry into national phase | Ref document number: 2022766590; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2022766590; Country of ref document: EP; Effective date: 20231009 |