WO2015068301A1 - Speed detection device, speed detection method, speed detection program, and recording medium - Google Patents


Info

Publication number
WO2015068301A1
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
unit
imaging
captured image
relative speed
Prior art date
Application number
PCT/JP2013/080425
Other languages
English (en)
Japanese (ja)
Inventor
小林 正規
和俊 北野
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社
Priority to PCT/JP2013/080425
Publication of WO2015068301A1

Links

Images

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64: Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/80: Devices characterised by the determination of the time taken to traverse a fixed distance using auto-correlation or cross-correlation detection means
    • G01P3/806: Devices characterised by the determination of the time taken to traverse a fixed distance using auto-correlation or cross-correlation detection means in devices of the type to be classified in G01P3/68
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/36: Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G01P3/38: Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light, using photographic means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00: Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64: Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/68: Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light

Definitions

  • The present invention relates to a speed detection device, a speed detection method, a speed detection program, and a recording medium on which the speed detection program is recorded.
  • In one known technique, when at least a part of the image area of a road marking exists in the captured image, the speed of the host vehicle is detected based on the dimension of the road marking in the captured image and the dimension of the road marking on the road.
  • That is, the vehicle body speed is detected only when a road marking is present in the image, and only a running/stop determination is performed when no road marking is present in the image.
  • In other words, the vehicle body speed is not detected when there is no road marking in the captured image. For this reason, even if the vehicle speed changes while no road marking is present in the captured image, the vehicle speed detected at the time when a road marking was last present in the captured image continues to be used as the estimated speed. It is therefore difficult to accurately detect the vehicle speed at each point in time.
  • In another known technique, the road surface is periodically imaged while the imaging target range is illuminated, and the vehicle body speed is detected based on the temporal change of the imaging results.
  • In order to detect the vehicle speed with high accuracy using such a method, it is necessary to consider that the road surface condition around the vehicle changes from moment to moment, because the illumination suitable for imaging is considered to differ according to the road surface state.
  • However, no proposal has yet been made for a technique that detects the vehicle speed while taking into account what kind of illumination is preferable for the ever-changing road surface condition around the vehicle.
  • The present invention has been made in view of the above, and an object of the present invention is to provide a speed detection device and a speed detection method that can accurately detect the speed information of a moving body even if the road surface state changes from moment to moment.
  • The invention according to claim 1 is a speed detection device comprising: an acquisition unit that acquires a captured image obtained by an imaging unit, mounted on a moving body, imaging an imaging target; a plurality of illumination units that illuminate the imaging target at mutually different angles; an illumination switching unit that sequentially switches, at a predetermined time interval, the one illumination unit that illuminates the imaging target; and a relative speed information output unit that outputs information on the relative speed between the imaging target and the imaging unit based on the captured image acquired by the acquisition unit.
  • The invention according to claim 10 is a speed detection method used by a speed detection device including a plurality of illumination units that illuminate an imaging target at mutually different angles, the method comprising: an illumination switching step of sequentially switching, at a predetermined time interval, the illumination unit that illuminates the imaging target; an acquisition step of acquiring a captured image obtained by an imaging unit, mounted on a moving body, imaging the imaging target; and a relative speed information output step of outputting information on the relative speed between the imaging target and the imaging unit based on the captured image acquired in the acquisition step.
  • The invention according to claim 11 is a speed detection program that causes a computer included in the speed detection device to execute the speed detection method according to claim 10.
  • The invention according to claim 12 is a recording medium on which the speed detection program according to claim 11 is recorded so as to be readable by a computer included in the speed detection device.
  • FIG. 2 is a timing chart for explaining the timing of signal issuance by the relative speed output unit in FIG. 1 and of output of captured image data from the imaging unit, and a further figure explains the detection process of the relative speed information by the relative speed output unit of FIG. 1.
  • FIG. 7 is a diagram showing the configuration of the speed detection device according to the second embodiment of the present invention. FIG. 8 is a diagram for explaining the configuration of the two illumination units of FIG. 7 and the reflected light when the illumination light is emitted.
  • FIG. 1 is a block diagram showing the configuration of the speed detection device 100A according to the first embodiment.
  • The speed detection device 100A is connected to the imaging unit 210A and the control device 220.
  • the speed detection device 100A, the imaging unit 210A, and the control device 220 are mounted on the moving body MV.
  • The imaging unit 210A is mounted at a fixed position of the moving body MV, and images the road surface immediately below the fixed position.
  • When the imaging unit 210A receives an imaging command IMC sent from the speed detection device 100A, the imaging unit 210A performs imaging. Then, the imaging unit 210A sends the captured image data IMD to the speed detection device 100A.
  • The optical magnification m at the time of imaging by the imaging unit 210A is known.
  • The control device 220 acquires the relative speed information VLI output from the speed detection device 100A. Then, the control device 220 uses the acquired relative speed information VLI for traveling control of the moving body MV and the like.
  • The control device 220 corresponds to an ECU (Electronic Control Unit).
  • Such an ECU acquires, in addition to the relative speed information VLI, sensor detection information detected by various sensors such as an acceleration sensor and an angular velocity sensor.
  • The ECU controls the traveling of the moving body MV and provides traveling information to the user.
  • The speed detection device 100A includes a control processing unit 110A as shown in FIG. 1. The speed detection device 100A also includes a first illumination unit 120A1 and a second illumination unit 120A2.
  • The control processing unit 110A controls the switching between emission of illumination light by the first illumination unit 120A1 and emission of illumination light by the second illumination unit 120A2, and controls the imaging by the imaging unit 210A. In addition, the control processing unit 110A generates the relative speed information VLI between the road surface and the moving body MV based on the captured image data IMD sent from the imaging unit 210A, and outputs the generated relative speed information VLI to the control device 220.
  • The control processing unit 110A having such functions includes an acquisition unit 111A, an illumination switching unit 112, and a relative speed output unit 113A.
  • The acquisition unit 111A receives the captured image data IMD sent from the imaging unit 210A. Then, the acquisition unit 111A sends the captured image data IMD to the relative speed output unit 113A.
  • The illumination switching unit 112 receives the illumination switching control LSC sent from the relative speed output unit 113A.
  • The illumination switching control LSC includes an illumination switching control LSC1 that designates illumination by the first illumination unit 120A1, and an illumination switching control LSC2 that designates illumination by the second illumination unit 120A2.
  • Upon receiving the illumination switching control LSC1, the illumination switching unit 112 sets the illumination designation LC1 to "ON" and sends it to the first illumination unit 120A1, and sets the illumination designation LC2 to "OFF" and sends it to the second illumination unit 120A2.
  • Upon receiving the illumination switching control LSC2, the illumination switching unit 112 sets the illumination designation LC1 to "OFF" and sends it to the first illumination unit 120A1, and sets the illumination designation LC2 to "ON" and sends it to the second illumination unit 120A2.
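The complementary switching behavior described above can be sketched as follows. This is an illustrative model only, not code from the patent; the function name and the dictionary return format are assumptions.

```python
def switch_illumination(lsc: int) -> dict:
    """Model of the illumination switching unit 112: for switching control
    LSC1 (lsc=1) or LSC2 (lsc=2), return the "ON"/"OFF" illumination
    designations LC1/LC2 sent to the two illumination units."""
    if lsc == 1:
        return {"LC1": "ON", "LC2": "OFF"}
    if lsc == 2:
        return {"LC1": "OFF", "LC2": "ON"}
    raise ValueError("unknown illumination switching control")
```

Exactly one designation is "ON" at any time, so only one illumination unit emits light per captured frame.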
  • The relative speed output unit 113A generates the illumination switching control LSC for alternately switching, every time the time TPA elapses, the illumination unit that emits illumination light to the road surface, and sends the generated illumination switching control LSC to the illumination switching unit 112. That is, when the time TPA has elapsed after sending the illumination switching control LSC1 to the illumination switching unit 112, the relative speed output unit 113A generates the illumination switching control LSC2 and sends the generated illumination switching control LSC2 to the illumination switching unit 112. Thereafter, when the time TPA elapses again, the relative speed output unit 113A generates the illumination switching control LSC1 and sends the generated illumination switching control LSC1 to the illumination switching unit 112.
  • The time TPA is predetermined based on experiments, simulations, and the like, from the viewpoint that a common road surface area must be included in two images captured by the imaging unit 210A a time (2 × TPA) apart, even when the moving body MV is traveling at high speed.
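As an illustration of this constraint (the numbers below are assumptions, not values from the patent): images taken under the same illumination unit are separated by 2 × TPA, so the distance traveled in that time must leave part of the imaged road area common to both frames.

```python
def max_tpa(field_length_m: float, v_max_mps: float, overlap: float = 0.5) -> float:
    """Largest TPA (in seconds) such that at least `overlap` of the imaged
    road-surface length stays common between two frames taken 2*TPA apart
    at the maximum vehicle speed v_max_mps."""
    # The vehicle may travel at most (1 - overlap) * field_length_m in 2*TPA.
    return (1.0 - overlap) * field_length_m / (2.0 * v_max_mps)

# e.g. a 0.1 m imaged strip, 40 m/s (144 km/h) top speed, 50% required overlap
print(max_tpa(0.1, 40.0))
```

With these assumed values, TPA would have to be well under a millisecond, which is consistent with the patent's emphasis on determining TPA experimentally.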
  • The relative speed output unit 113A sends the imaging command IMC to the imaging unit 210A after issuing a new illumination switching control LSC1 and after issuing a new illumination switching control LSC2.
  • As a result, the imaging unit 210A images the road surface every time the time TPA elapses, and sends the captured image data IMD to the acquisition unit 111A.
  • The relative speed output unit 113A temporarily sets the signal of the imaging command IMC to "ON" when instructing imaging.
  • The relative speed output unit 113A receives the captured image data IMD sent from the imaging unit 210A via the acquisition unit 111A. Then, the relative speed output unit 113A calculates the moving body speed based on the captured image data IMD. The moving body speed thus calculated is output to the control device 220 as the relative speed information VLI.
  • The first illumination unit 120A1 receives the illumination designation LC1 sent from the illumination switching unit 112. The first illumination unit 120A1 emits illumination light during the period in which the illumination designation LC1 is "ON". Note that the first illumination unit 120A1 does not emit illumination light during the period in which the illumination designation LC1 is "OFF".
  • The second illumination unit 120A2 receives the illumination designation LC2 sent from the illumination switching unit 112. The second illumination unit 120A2 emits illumination light during the period in which the illumination designation LC2 is "ON". Note that the second illumination unit 120A2 does not emit illumination light during the period in which the illumination designation LC2 is "OFF".
  • The imaging unit 210A includes an imaging device PRD, a lens LZ1, a half mirror HMR, and a lens LZ2. As described above, the imaging unit 210A images an area on the road surface RDP directly below it.
  • The imaging optical system is formed by the lens LZ1 and the lens LZ2.
  • The first illumination unit 120A1 includes a light source device PED1, a lens LZ3, the half mirror HMR, and the lens LZ2. That is, the first illumination unit 120A1 shares the half mirror HMR and the lens LZ2 with the imaging unit 210A.
  • The first illumination unit 120A1 emits illumination light that realizes coaxial epi-illumination for the imaging unit 210A during the period in which the illumination designation LC1 is "ON", and illuminates the imaging target region of the road surface RDP imaged by the imaging unit 210A.
  • The light source device PED1 emits light having a horizontal optical axis toward the half mirror HMR during the period in which the illumination designation LC1 is "ON". A part of the light emitted from the light source device PED1 is reflected vertically downward by the half mirror HMR, and then passes through the lens LZ2, whereby it is collimated so as to be coaxial with the optical axis of the imaging optical system of the imaging unit 210A. The light thus collimated is emitted as illumination light from the first illumination unit 120A1 and illuminates the imaging target region of the road surface RDP imaged by the imaging unit 210A.
  • The second illumination unit 120A2 includes a light source device PED2, a lens LZ4, and a lens LZ5.
  • The second illumination unit 120A2 emits illumination light that realizes oblique illumination for the imaging unit 210A during the period in which the illumination designation LC2 is "ON", and illuminates the imaging target region of the road surface RDP imaged by the imaging unit 210A.
  • A collimating optical system is formed by the lens LZ4 and the lens LZ5.
  • The light source device PED2 emits light having an optical axis oblique to the optical axis of the imaging unit 210A during the period in which the illumination designation LC2 is "ON".
  • The light emitted from the light source device PED2 is collimated by the lens LZ4 and the lens LZ5. The collimated light is emitted as illumination light from the second illumination unit 120A2 and illuminates the imaging target region of the road surface RDP imaged by the imaging unit 210A.
  • Here, the reflected light on the road surface RDP during illumination by each of the first illumination unit 120A1 and the second illumination unit 120A2 will be described.
  • FIG. 3(A) shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is a diffuse reflection surface and is illuminated by the first illumination unit 120A1.
  • The intensity of the reflected light is indicated by the length of the thick arrows (the same applies to FIGS. 3(B), 4(A), and 4(B) described later).
  • When the road surface RDP is dry and is thus a diffuse reflection surface, and is illuminated by the first illumination unit 120A1, the illumination light is reflected in every direction on the road surface RDP. For this reason, a part of the reflected light is incident on the imaging unit 210A. The image of the imaging target region of the road surface RDP carried by the incident light is then formed by the imaging optical system of the imaging unit 210A, formed by the lens LZ1 and the lens LZ2 (see FIG. 2), and the imaging result is captured by the imaging device PRD (see FIG. 2) of the imaging unit 210A.
  • FIG. 3(B) shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is a diffuse reflection surface, as in the case shown in FIG. 3(A), and is illuminated by the second illumination unit 120A2.
  • As shown in FIG. 3(B), when the road surface RDP is a diffuse reflection surface and is illuminated by the second illumination unit 120A2, the illumination light is reflected in every direction on the road surface RDP, as in the case of FIG. 3(A) described above. For this reason, a part of the reflected light is incident on the imaging unit 210A.
  • The image of the imaging target region of the road surface RDP carried by the incident light is formed by the imaging optical system of the imaging unit 210A, and the imaging result is captured by the imaging device PRD of the imaging unit 210A described above.
  • Accordingly, when the road surface RDP is a diffuse reflection surface, the imaging target region on the road surface RDP can be imaged under illumination by either the first illumination unit 120A1 or the second illumination unit 120A2.
  • However, the sharpness of the captured image of the unevenness varies depending on the degree of unevenness of the road surface RDP. For example, when the degree of unevenness of the road surface RDP is low and the flatness is high, as with a concrete surface, oblique illumination by the second illumination unit 120A2, rather than illumination by the first illumination unit 120A1, highlights the unevenness of the road surface, so that a captured image with high sharpness is obtained.
  • FIG. 4(A) shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is flooded and is thus a specular reflection surface, and is illuminated by the first illumination unit 120A1.
  • As shown in FIG. 4(A), when the road surface RDP is a specular reflection surface and is illuminated by the first illumination unit 120A1, the illumination light is reflected on the road surface RDP in the direction opposite to its direction of incidence. For this reason, most of the reflected light is incident on the imaging unit 210A. The image of the imaging target region of the road surface RDP carried by the incident light is then formed by the imaging optical system of the imaging unit 210A, and the imaging result is captured by the imaging device PRD of the imaging unit 210A.
  • FIG. 4(B) shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is a specular reflection surface, as in the case of FIG. 4(A), and is illuminated by the second illumination unit 120A2.
  • As shown in FIG. 4(B), when the road surface RDP is a specular reflection surface and is illuminated by the second illumination unit 120A2, the light emitted from the second illumination unit 120A2 is specularly reflected by the road surface RDP, and most of it does not enter the imaging unit 210A. Therefore, in the case of illumination by the second illumination unit 120A2, it is considered difficult to extract the image feature regions used for matching when detecting the moving body speed relative to the road surface RDP by a so-called displacement search method.
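The displacement search mentioned here can be illustrated with a minimal one-dimensional sketch. This is an illustrative stand-in, not the patent's actual matching algorithm: a feature template from the first image is slid across the second image, and the offset with the smallest sum of absolute differences (SAD) is taken as the match position.

```python
def displacement_search(template, signal):
    """Return the offset in `signal` whose window best matches `template`,
    using the sum of absolute differences (SAD) as the matching cost."""
    best_offset, best_sad = 0, float("inf")
    for offset in range(len(signal) - len(template) + 1):
        sad = sum(abs(signal[offset + i] - t) for i, t in enumerate(template))
        if sad < best_sad:
            best_offset, best_sad = offset, sad
    return best_offset
```

If the same search fails to find a distinctive minimum, as on a featureless specular image, the matching is unreliable, which is the difficulty the passage above describes.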
  • Note that when the road surface RDP is wet but not flooded, the road surface RDP remains a diffuse reflection surface.
  • In this case, the intensity of the reflected light on the road surface RDP is higher than when the road surface RDP is in a dry state.
  • As a result, the reflected light on the road surface RDP is in an intermediate state between those of FIG. 3(A) and FIG. 4(A) described above.
  • At the start of the speed detection process, the illumination switching unit 112 is assumed to have set both the illumination designation LC1 and the illumination designation LC2 to "OFF".
  • When starting the speed detection process, as shown in FIG. 5, the relative speed output unit 113A generates the illumination switching control LSC1 and sends the generated illumination switching control LSC1 to the illumination switching unit 112.
  • Upon receiving it, the illumination switching unit 112 sets the illumination designation LC1 to "ON" and sends it to the first illumination unit 120A1, and sets the illumination designation LC2 to "OFF" and sends it to the second illumination unit 120A2.
  • As a result, illumination light is emitted only from the first illumination unit 120A1, and the road surface is illuminated by the emitted illumination light.
  • Subsequently, the relative speed output unit 113A sends an imaging command IMC to the imaging unit 210A.
  • Thereby, the road surface is imaged by the imaging unit 210A, and the imaging result is sent to the speed detection device 100A as the captured image data IMD1,1.
  • The relative speed output unit 113A receives the captured image data IMD1,1 via the acquisition unit 111A. Then, the relative speed output unit 113A extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD1,1.
  • In addition, the relative speed output unit 113A obtains the sharpness CT1,1 of the captured image.
  • The relative speed output unit 113A obtains the standard deviation of the brightness of the pixels of the captured image as the sharpness of the captured image; a larger value indicates higher sharpness.
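A minimal sketch of this sharpness measure, assuming the captured image is given as a 2D list of pixel brightness values (the function name is illustrative):

```python
import math

def sharpness(image):
    """Sharpness of a captured image as the standard deviation of the
    brightness of its pixels; a larger value means a sharper image."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    return math.sqrt(sum((p - mean) ** 2 for p in pixels) / len(pixels))
```

A perfectly uniform image has sharpness 0, while an image with strong contrast, such as highlighted road-surface unevenness, yields a large value.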
  • Next, the relative speed output unit 113A generates the illumination switching control LSC2 and sends the generated illumination switching control LSC2 to the illumination switching unit 112.
  • Upon receiving it, the illumination switching unit 112 sets the illumination designation LC1 to "OFF" and sends it to the first illumination unit 120A1, and sets the illumination designation LC2 to "ON" and sends it to the second illumination unit 120A2.
  • As a result, illumination light is emitted only from the second illumination unit 120A2, and the road surface is illuminated by the emitted illumination light.
  • Subsequently, the relative speed output unit 113A sends an imaging command IMC to the imaging unit 210A.
  • Thereby, the road surface is imaged by the imaging unit 210A, and the imaging result is sent to the speed detection device 100A as the captured image data IMD2,1.
  • The relative speed output unit 113A receives the captured image data IMD2,1 via the acquisition unit 111A. Then, the relative speed output unit 113A extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD2,1, and obtains the sharpness CT2,1 of the captured image.
  • Next, the relative speed output unit 113A generates the illumination switching control LSC1 and sends the generated illumination switching control LSC1 to the illumination switching unit 112.
  • As a result, illumination light is emitted only from the first illumination unit 120A1, and the road surface is illuminated by the emitted illumination light.
  • Subsequently, the relative speed output unit 113A sends an imaging command IMC to the imaging unit 210A.
  • Thereby, the road surface is imaged by the imaging unit 210A, and the imaging result is sent to the speed detection device 100A as the captured image data IMD1,2.
  • The relative speed output unit 113A receives the captured image data IMD1,2 via the acquisition unit 111A. Then, the relative speed output unit 113A extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD1,2, and obtains the sharpness CT1,2 of the captured image.
  • Next, the relative speed output unit 113A generates the illumination switching control LSC2 and sends the generated illumination switching control LSC2 to the illumination switching unit 112.
  • Upon receiving it, the illumination switching unit 112 sets the illumination designation LC1 to "OFF" and sends it to the first illumination unit 120A1, and sets the illumination designation LC2 to "ON" and sends it to the second illumination unit 120A2.
  • As a result, illumination light is emitted only from the second illumination unit 120A2, and the road surface is illuminated by the emitted illumination light.
  • Subsequently, the relative speed output unit 113A sends an imaging command IMC to the imaging unit 210A.
  • Thereby, the road surface is imaged by the imaging unit 210A, and the imaging result is sent to the speed detection device 100A as the captured image data IMD2,2.
  • The relative speed output unit 113A receives the captured image data IMD2,2 via the acquisition unit 111A. Then, the relative speed output unit 113A extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD2,2, and obtains the sharpness CT2,2 of the captured image.
  • Subsequently, the relative speed output unit 113A compares the number NC1 and the number NC2 of common feature regions described above. When the number NC1 and the number NC2 differ, the relative speed output unit 113A uses, as the speed calculation images, the two latest captured images taken under illumination by the illumination unit corresponding to the larger number.
  • When the number NC1 and the number NC2 are equal, the relative speed output unit 113A compares the sharpness CT1 and the sharpness CT2 described above, and uses, as the speed calculation images, the two latest captured images taken under illumination by the illumination unit corresponding to the higher sharpness.
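This two-stage selection rule (feature-region count first, sharpness as the tie-breaker) can be sketched as below; the function name and the convention of returning the illumination unit index 1 or 2 are assumptions for illustration.

```python
def select_illumination(nc1: int, nc2: int, ct1: float, ct2: float) -> int:
    """Choose which illumination unit's latest image pair to use for speed
    calculation: the one with more common feature regions (NC), or, when
    the counts are equal, the one with the higher sharpness (CT)."""
    if nc1 != nc2:
        return 1 if nc1 > nc2 else 2
    return 1 if ct1 >= ct2 else 2
```

The count is checked first because matching needs feature regions at all; sharpness only discriminates when both illuminations yield equally many.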
  • Next, the relative speed output unit 113A calculates, as the pixel movement amount ΔD, the in-image movement amount of the common feature region between the two captured images thus determined.
  • When there are a plurality of common feature regions, the relative speed output unit 113A calculates the individual pixel movement amount for each of the feature regions, and then calculates the average of the calculated individual pixel movement amounts as the pixel movement amount ΔD.
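The averaging step can be written directly (a trivial sketch; the function name is assumed):

```python
def pixel_movement(individual_movements):
    """Pixel movement amount ΔD as the average of the individual
    per-feature-region pixel movement amounts."""
    return sum(individual_movements) / len(individual_movements)
```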
  • Then, the relative speed output unit 113A calculates the moving body speed V of the moving body MV by the following equation (1), where m is the optical magnification and (2 × TPA) is the time between the two speed calculation images.
  • V = ΔD / (m × 2 × TPA) … (1)
  • The relative speed output unit 113A outputs the calculated moving body speed V to the control device 220 as the relative speed information VLI.
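A sketch of equation (1), under the reading that the two speed calculation images, taken under the same illumination unit, are separated by the time 2 × TPA; the pixel movement ΔD is converted to a road-surface displacement through the optical magnification m. The example values are assumptions.

```python
def moving_body_speed(delta_d: float, m: float, tpa: float) -> float:
    """Moving body speed V = ΔD / (m * 2 * TPA): pixel movement ΔD scaled
    to road-surface displacement by the magnification m, divided by the
    elapsed time 2 * TPA between the two speed calculation images."""
    return delta_d / (m * 2.0 * tpa)

# assumed values: ΔD = 2.0, m = 1.0, TPA = 0.5 s
print(moving_body_speed(2.0, 1.0, 0.5))  # 2.0
```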
  • Thereafter, the relative speed output unit 113A repeats the process from the issuance of the illumination switching control LSC1 for acquiring the above-described captured image data IMD1,2 up to the calculation of the moving body speed V.
  • As a result, the moving body speed V is calculated every time (2 × TPA) and output to the control device 220 as the relative speed information VLI.
  • FIG. 6 shows an example of the moving body speed calculation process by the relative speed output unit 113A.
  • In FIG. 6, the common feature regions are two regions, region A and region B, and an example is shown in which the moving body speed V is calculated based on the individual pixel movement amounts ΔDA and ΔDB of these two regions.
  • As described above, in the first embodiment, the relative speed output unit 113A performs, via the illumination switching unit 112, the control for alternately switching the illumination units every time TPA, while controlling the imaging unit 210A to image the road surface.
  • The relative speed output unit 113A then determines, based on the images captured by the imaging unit 210A obtained via the acquisition unit 111A, the captured images, taken under illumination by one of the illumination units, that are used for calculating the relative speed information between the moving body MV and the road surface.
  • The relative speed output unit 113A calculates the relative speed information at each time point based on the captured images thus determined, and outputs the calculated relative speed information to the control device 220.
  • Therefore, the relative speed information between the moving body MV and the road surface can be detected with high accuracy.
  • In the first embodiment, the first illumination unit 120A1 performs coaxial epi-illumination, and the second illumination unit 120A2 performs oblique illumination. For this reason, a road surface image can be obtained on either a diffuse reflection road surface or a specular reflection road surface, and an image with appropriate sharpness can be acquired corresponding to the degree of unevenness on a diffuse reflection road surface.
  • Therefore, the relative speed information between the moving body MV and the road surface can be detected with high accuracy.
  • Further, in the first embodiment, the relative speed information between the moving body MV and the road surface is detected based on the captured images taken under illumination by the illumination unit for which the number of common feature regions is larger. When the numbers of common feature regions are the same, the speed information of the moving body is detected based on the captured images taken under illumination by the illumination unit for which the sharpness of the captured image is higher. For this reason, the relative speed information between the moving body MV and the road surface can be detected reasonably accurately.
  • FIG. 7 is a block diagram showing the configuration of the speed detection device 100B according to the second embodiment. As shown in FIG. 7, the speed detection device 100B is connected to the imaging unit 210B and the control device 220. The speed detection device 100B, the imaging unit 210B, and the control device 220 are mounted on the moving body MV.
  • The imaging unit 210B includes a first imaging unit 210B1 and a second imaging unit 210B2.
  • The first imaging unit 210B1 is mounted at a fixed first position of the moving body MV, and images the road surface immediately below the first position.
  • When the first imaging unit 210B1 receives the imaging command IMC1 sent from the speed detection device 100B, the first imaging unit 210B1 performs imaging.
  • Then, the first imaging unit 210B1 sends the captured image data IMD1 to the speed detection device 100B.
  • The second imaging unit 210B2 is mounted at a fixed second position of the moving body MV, and images the road surface immediately below the second position.
  • Upon receiving an imaging command sent from the speed detection device 100B, the second imaging unit 210B2 performs imaging. Then, the second imaging unit 210B2 sends the captured image data IMD2 to the speed detection device 100B.
  • The optical magnification at the time of imaging by the first imaging unit 210B1 and by the second imaging unit 210B2 is "m" in both cases and is known.
  • The speed detection device 100B includes a control processing unit 110B, as shown in FIG. 7.
  • The speed detection device 100B also includes a first illumination unit 120B1 and a second illumination unit 120B2.
  • The control processing unit 110B controls the switching between emission of illumination light by the first illumination unit 120B1 and emission of illumination light by the second illumination unit 120B2, and controls the imaging by the first imaging unit 210B1 and the imaging by the second imaging unit 210B2.
  • The control processing unit 110B also generates relative speed information VLI between the road surface and the moving body MV based on the captured image data IMD1 and IMD2 sent from the first imaging unit 210B1 and the second imaging unit 210B2, and outputs the generated relative speed information VLI to the control device 220.
  • The control processing unit 110B having such functions includes an acquisition unit 111B, an illumination switching unit 112, and a relative speed output unit 113B.
  • The acquisition unit 111B receives the captured image data IMD1 sent from the first imaging unit 210B1 and the captured image data IMD2 sent from the second imaging unit 210B2, and sends the captured image data IMD1 and IMD2 to the relative speed output unit 113B.
  • The illumination switching unit 112 performs the same operation as in the first embodiment described above in response to the illumination switching control LSC sent from the relative speed output unit 113B.
  • The relative speed output unit 113B generates illumination switching control LSC for alternately switching, every time the time TPB elapses, the illumination unit that emits illumination light toward the road surface, and sends the generated illumination switching control LSC to the illumination switching unit 112. That is, when the time TPB has elapsed after sending the illumination switching control LSC1 to the illumination switching unit 112, the relative speed output unit 113B generates the illumination switching control LSC2 and sends the generated illumination switching control LSC2 to the illumination switching unit 112. When the time TPB has elapsed again, the relative speed output unit 113B generates the illumination switching control LSC1 and sends the generated illumination switching control LSC1 to the illumination switching unit 112.
  • The time TPB is determined in advance based on experiments, simulations, and the like, from the viewpoint that two images captured by the first imaging unit 210B1 or the second imaging unit 210B2 at an interval of (TPB/2) include a common road surface area even when the moving body MV is traveling at high speed.
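The overlap constraint on TPB described above can be sketched numerically. The field length, top speed, and required overlap ratio below are illustrative assumptions, not values from this document, which determines TPB by experiment or simulation:

```python
# Sketch: an upper bound on the frame interval (TPB/2) so that two frames
# of the road surface still share a common area at the worst-case speed.
# All numeric values are illustrative assumptions.

def max_interval_for_overlap(field_length_m, v_max_mps, min_overlap_ratio):
    """Largest frame interval (seconds) that keeps at least
    min_overlap_ratio of the imaged road patch common to both frames
    when the moving body travels at v_max_mps."""
    allowed_shift = field_length_m * (1.0 - min_overlap_ratio)
    return allowed_shift / v_max_mps

# e.g. a 0.10 m imaging field, 40 m/s top speed, 50% required overlap
half_tpb = max_interval_for_overlap(0.10, 40.0, 0.5)
tpb = 2.0 * half_tpb
```

With these assumed numbers the two imagings must be at most 1.25 ms apart, giving a TPB of 2.5 ms; the actual patent leaves the concrete value to experiments and simulations.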
  • The relative speed output unit 113B sends the imaging command IMC1 to the first imaging unit 210B1 after issuing a new illumination switching control LSC1, and sends the imaging command IMC1 to the first imaging unit 210B1 again when the time (TPB/2) has elapsed. As a result, the first imaging unit 210B1 images the road surface twice, separated by the time (TPB/2), and sends the captured image data IMD1 to the acquisition unit 111B each time it performs imaging.
  • Likewise, the relative speed output unit 113B sends the imaging command IMC2 to the second imaging unit 210B2 after issuing a new illumination switching control LSC2, and sends the imaging command IMC2 to the second imaging unit 210B2 again when the time (TPB/2) has elapsed. As a result, the second imaging unit 210B2 images the road surface twice, separated by the time (TPB/2), and sends the captured image data IMD2 to the acquisition unit 111B each time it performs imaging.
  • The relative speed output unit 113B receives the captured image data IMD1 and IMD2 sent from the imaging unit 210B via the acquisition unit 111B, calculates the moving body speed based on the captured image data IMD1 and IMD2, and outputs the calculated moving body speed to the control device 220 as the relative speed information VLI.
  • The first illumination unit 120B1 receives the illumination designation LC1 sent from the illumination switching unit 112, and emits illumination light over the period in which the illumination designation LC1 is "ON". The first illumination unit 120B1 does not emit illumination light during the period in which the illumination designation LC1 is "OFF".
  • The second illumination unit 120B2 receives the illumination designation LC2 sent from the illumination switching unit 112, and emits illumination light over the period in which the illumination designation LC2 is "ON". The second illumination unit 120B2 does not emit illumination light during the period in which the illumination designation LC2 is "OFF".
  • The configuration of the first illumination unit 120B1 will now be described in more detail with reference to FIG. 8A, paying attention to its relationship with the first imaging unit 210B1.
  • FIG. 8A shows the configuration of the first illumination unit 120B1 along with the configuration of the first imaging unit 210B1.
  • The first imaging unit 210B1 includes an imaging device PRD1, a lens LZ11, and a lens LZ12. In the first imaging unit 210B1, an imaging optical system is formed by the lens LZ11 and the lens LZ12.
  • The first illumination unit 120B1 includes a light source device PED1, a lens LZ13, a polarizing filter (polarizer) PL1, and a lens LZ14.
  • The first illumination unit 120B1 emits illumination light for realizing oblique illumination for imaging by the first imaging unit 210B1 over the period in which the illumination designation LC1 is "ON", and illuminates the imaging target area of the road surface RDP imaged by the first imaging unit 210B1.
  • The light source device PED1 emits light toward the lens LZ13 over the period in which the illumination designation LC1 is "ON".
  • The light emitted from the light source device PED1 and passing through the lens LZ13 has the linear component of the polarization direction shown in FIG. 8A selected by the polarizing filter PL1, and is then collimated by passing through the lens LZ14.
  • The collimated light is emitted as illumination light from the first illumination unit 120B1 and illuminates the imaging target area of the road surface RDP imaged by the first imaging unit 210B1.
  • FIG. 8B shows the configuration of the second illumination unit 120B2 along with the configuration of the second imaging unit 210B2.
  • The second imaging unit 210B2 includes an imaging device PRD2, a lens LZ21, and a lens LZ22. In the second imaging unit 210B2, an imaging optical system is formed by the lens LZ21 and the lens LZ22.
  • The second illumination unit 120B2 includes a light source device PED2, a lens LZ23, a polarizing filter PL2, and a lens LZ24.
  • The second illumination unit 120B2 emits illumination light for realizing oblique illumination for imaging by the second imaging unit 210B2 over the period in which the illumination designation LC2 is "ON", and illuminates the imaging target area of the road surface RDP imaged by the second imaging unit 210B2.
  • In the second illumination unit 120B2, a collimating optical system is formed by the lens LZ23 and the lens LZ24.
  • The light source device PED2 emits light toward the lens LZ23 over the period in which the illumination designation LC2 is "ON".
  • The light emitted from the light source device PED2 and passing through the lens LZ23 has the linear component of the polarization direction shown in FIG. 8B selected by the polarizing filter PL2, and is then collimated by passing through the lens LZ24.
  • The collimated light is emitted as illumination light from the second illumination unit 120B2 and illuminates the imaging target area of the road surface RDP imaged by the second imaging unit 210B2.
  • Here, the reflected light on the road surface RDP during illumination by each of the first illumination unit 120B1 and the second illumination unit 120B2 will be described.
  • (When the road surface is a diffuse reflection surface) FIG. 9A shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is a diffuse reflection surface and is illuminated by the first illumination unit 120B1.
  • The intensity of the reflected light is indicated by the length of the thick arrows (the same applies to FIGS. 9B, 10A, and 10B described later).
  • When the road surface RDP is dry and is a diffuse reflection surface, the illumination light is reflected in various directions on the road surface RDP when illuminated by the first illumination unit 120B1. Therefore, part of the reflected light is incident on the first imaging unit 210B1. An image of the imaging target area of the road surface RDP carried by the incident light is then formed by the imaging optical system of the first imaging unit 210B1, that is, by the lens LZ11 and the lens LZ12 described above (see FIG. 8A), and the formed image is captured by the imaging device PRD1 (see FIG. 8A) of the first imaging unit 210B1.
  • FIG. 9B shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is a diffuse reflection surface, as in FIG. 9A, and is illuminated by the second illumination unit 120B2.
  • In this case, as in the case of FIG. 9A described above, the illumination light is reflected in various directions on the road surface RDP when illuminated by the second illumination unit 120B2. Therefore, part of the reflected light is incident on the second imaging unit 210B2. An image of the imaging target area of the road surface RDP carried by the incident light is formed by the imaging optical system of the second imaging unit 210B2, that is, by the lens LZ21 and the lens LZ22 (see FIG. 8B), and the formed image is captured by the imaging device PRD2 (see FIG. 8B) of the second imaging unit 210B2.
  • (When the road surface is a specular reflection surface) FIG. 10A shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is flooded and is a specular reflection surface, and is illuminated by the first illumination unit 120B1.
  • As shown in FIG. 10A, when the road surface RDP is a specular reflection surface, the light emitted from the first illumination unit 120B1 upon illumination is specularly reflected on the road surface RDP, and most of it is incident on the first imaging unit 210B1.
  • An image of the imaging target area of the road surface RDP carried by the incident light is formed by the imaging optical system of the first imaging unit 210B1, and the formed image is captured by the imaging device PRD1 of the first imaging unit 210B1.
  • FIG. 10B shows, by thick arrows, an example of the state of reflection on the road surface RDP when the road surface RDP is a specular reflection surface, as in the case of FIG. 10A, and is illuminated by the second illumination unit 120B2.
  • When the road surface RDP is a specular reflection surface, the light emitted from the second illumination unit 120B2 upon illumination is specularly reflected on the road surface RDP, and most of it is incident on the second imaging unit 210B2.
  • An image of the imaging target area of the road surface RDP carried by the incident light is formed by the imaging optical system of the second imaging unit 210B2, and the formed image is captured by the imaging device PRD2 of the second imaging unit 210B2.
  • Thus, whether the road surface RDP is a diffuse reflection surface or a specular reflection surface, imaging of the imaging target area on the road surface RDP is possible.
  • When the road surface RDP is a specular reflection surface, the brightness of the pixels of the captured image may differ depending on the polarization direction of the illumination light. Therefore, depending on whether the road surface is illuminated by the first illumination unit 120B1 or by the second illumination unit 120B2, the sharpness of the captured image differs, and so does the number of regions that can be extracted as feature regions in the images used for matching when detecting the speed of the moving body relative to the road surface RDP by the so-called displacement amount search method.
  • Note that there are also wet states in which the road surface RDP remains a diffuse reflection surface; in such states, the intensity of the reflected light on the road surface RDP is higher than when the road surface RDP is in a dry state.
  • Before the speed detection processing is started, the illumination switching unit 112 sets both the illumination designation LC1 and the illumination designation LC2 to "OFF".
  • When starting the speed detection processing, as shown in FIG. 11, the relative speed output unit 113B first generates the illumination switching control LSC1 and sends the generated illumination switching control LSC1 to the illumination switching unit 112. Upon receiving this illumination switching control LSC1, the illumination switching unit 112 sets the illumination designation LC1 to "ON" and sends it to the first illumination unit 120B1, and sets the illumination designation LC2 to "OFF" and sends it to the second illumination unit 120B2. As a result, illumination light is emitted only from the first illumination unit 120B1, and the road surface is illuminated by the emitted illumination light.
  • Next, the relative speed output unit 113B sends the imaging command IMC1 to the first imaging unit 210B1. As a result, the first imaging unit 210B1 images the road surface, and the imaging result is sent to the speed detection device 100B as the captured image data IMD1,1.
  • The relative speed output unit 113B receives the captured image data IMD1,1 via the acquisition unit 111B. The relative speed output unit 113B then extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD1,1.
  • The relative speed output unit 113B also obtains the sharpness CT1,1 of the captured image.
  • In the present embodiment, the relative speed output unit 113B obtains the standard deviation of the brightness of the pixels of the captured image as the sharpness of the captured image. The larger the value of the sharpness obtained in this way, the sharper the image.
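As a minimal sketch of this sharpness measure, the standard deviation of pixel brightness can be computed as follows; the function name and the list-of-lists image representation are illustrative simplifications of a real camera frame:

```python
def sharpness(image):
    """Sharpness as the standard deviation of pixel brightness
    over the whole image: the larger the value, the sharper the image."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    variance = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return variance ** 0.5

# a uniform (featureless) image scores zero; a high-contrast one scores high
flat = [[128, 128], [128, 128]]
textured = [[0, 255], [255, 0]]
```

A uniform patch gives a sharpness of 0, while the maximally contrasting 2x2 patch gives 127.5, matching the stated property that larger values indicate sharper images.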
  • When the time (TPB/2) has elapsed, the relative speed output unit 113B again sends the imaging command IMC1 to the first imaging unit 210B1. As a result, the first imaging unit 210B1 images the road surface, and the imaging result is sent to the speed detection device 100B as the captured image data IMD1,2.
  • The relative speed output unit 113B receives the captured image data IMD1,2 via the acquisition unit 111B. The relative speed output unit 113B then extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD1,2, and obtains the sharpness CT1,2 of the captured image.
  • Subsequently, the relative speed output unit 113B identifies the feature regions common to the captured image corresponding to the captured image data IMD1,2 and the captured image corresponding to the captured image data IMD1,1, and counts the number NC1 of the identified common feature regions. The relative speed output unit 113B then calculates the average of the sharpness CT1,1 and the sharpness CT1,2 described above as the sharpness CT1 of the captured images at the time of illumination by the first illumination unit 120B1.
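How the common feature regions and their count (NC1 here, NC2 later) might be obtained can be sketched as follows. The (label, position) representation of a feature region is a hypothetical simplification of whatever descriptor the actual extractor produces:

```python
# Sketch: pairing "common feature regions" between two frames of the same
# illumination period. A feature region is a (label, (x, y)) pair, where
# the label stands in for a real feature descriptor (an assumption).

def common_regions(features_a, features_b):
    """Return (label, position_in_a, position_in_b) for every feature
    region present in both frames."""
    positions_b = {label: pos for label, pos in features_b}
    matches = []
    for label, pos_a in features_a:
        if label in positions_b:
            matches.append((label, pos_a, positions_b[label]))
    return matches

frame1 = [("A", (10, 40)), ("B", (55, 12)), ("C", (70, 70))]
frame2 = [("A", (18, 40)), ("B", (63, 12)), ("D", (5, 5))]
matches = common_regions(frame1, frame2)
nc = len(matches)  # the count compared between illumination units
```

The per-region position differences in `matches` are also what the pixel movement amounts of FIG. 12 would be computed from.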
  • Next, when the time TPB has elapsed since the issuance of the illumination switching control LSC1, the relative speed output unit 113B generates the illumination switching control LSC2 and sends the generated illumination switching control LSC2 to the illumination switching unit 112. Upon receiving this illumination switching control LSC2, the illumination switching unit 112 sets the illumination designation LC1 to "OFF" and sends it to the first illumination unit 120B1, and sets the illumination designation LC2 to "ON" and sends it to the second illumination unit 120B2. As a result, illumination light is emitted only from the second illumination unit 120B2, and the road surface is illuminated by the emitted illumination light.
  • Next, the relative speed output unit 113B sends the imaging command IMC2 to the second imaging unit 210B2. As a result, the second imaging unit 210B2 images the road surface, and the imaging result is sent to the speed detection device 100B as the captured image data IMD2,1.
  • The relative speed output unit 113B receives the captured image data IMD2,1 via the acquisition unit 111B. The relative speed output unit 113B then extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD2,1, and obtains the sharpness CT2,1 of the captured image.
  • When the time (TPB/2) has elapsed, the relative speed output unit 113B again sends the imaging command IMC2 to the second imaging unit 210B2. As a result, the second imaging unit 210B2 images the road surface, and the imaging result is sent to the speed detection device 100B as the captured image data IMD2,2.
  • The relative speed output unit 113B receives the captured image data IMD2,2 via the acquisition unit 111B. The relative speed output unit 113B then extracts the characteristics and in-image positions of the feature regions in the captured image corresponding to the captured image data IMD2,2, and obtains the sharpness CT2,2 of the captured image.
  • Subsequently, the relative speed output unit 113B identifies the feature regions common to the captured image corresponding to the captured image data IMD2,2 and the captured image corresponding to the captured image data IMD2,1, and counts the number NC2 of the identified common feature regions. The relative speed output unit 113B then calculates the average of the sharpness CT2,1 and the sharpness CT2,2 described above as the sharpness CT2 of the captured images at the time of illumination by the second illumination unit 120B2.
  • Subsequently, the relative speed output unit 113B performs a calculation process of the moving body speed V similar to that of the relative speed output unit 113A in the first embodiment described above, except that the moving body speed V is calculated according to the following equation (2) instead of the above equation (1).
  • The relative speed output unit 113B then outputs the calculated moving body speed V to the control device 220 as the relative speed information VLI.
  • V (2 ⁇ ⁇ D) / (m ⁇ TPB) (2)
  • Thereafter, the relative speed output unit 113B repeats the processing from the issuance of the illumination switching control LSC1 to the calculation of the moving body speed V. As a result, the moving body speed V is calculated every time (2 × TPB) and is output to the control device 220 as the relative speed information VLI.
  • FIG. 12 illustrates an example of the moving body speed calculation processing by the relative speed output unit 113B. In this example, the common feature regions are two regions, region A and region B, and the moving body speed V is calculated based on the individual pixel movement amounts ΔDA and ΔDB of these two regions.
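A worked sketch of this calculation, combining equation (2) with the averaging of the per-region movement amounts of regions A and B; the numeric values and the assumption that ΔD is the plain average of the two regions' image-plane displacements are illustrative:

```python
def moving_body_speed(displacements, magnification, tpb):
    """Equation (2): V = (2 * dD) / (m * TPB), where dD is taken here as
    the average image-plane movement amount of the common feature regions
    (regions A and B in the FIG. 12 example)."""
    d_avg = sum(displacements) / len(displacements)
    return (2.0 * d_avg) / (magnification * tpb)

# regions A and B moved 0.9 mm and 1.1 mm on the image plane between the
# two frames taken TPB/2 apart; assumed magnification m = 0.05, TPB = 2 ms
v = moving_body_speed([0.0009, 0.0011], 0.05, 0.002)  # -> 20.0 m/s
```

The factor 2 arises because the two frames are only TPB/2 apart, so V = ΔD / (m · TPB/2) = 2ΔD / (m · TPB), term by term the same as equation (2).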
  • As described above, in the second embodiment, the relative speed output unit 113B controls the imaging unit 210B to image the road surface while performing control, via the illumination switching unit 112, to alternately switch the illumination units every time TPB.
  • The relative speed output unit 113B determines, based on the images captured by the imaging unit 210B and obtained via the acquisition unit 111B, the illumination unit corresponding to the captured images used for calculating the relative speed information between the moving body MV and the road surface.
  • The relative speed output unit 113B then calculates the relative speed information at each time point based on the captured images at the time of illumination by the determined illumination unit, and outputs the calculated relative speed information to the control device 220.
  • Therefore, the relative speed information between the moving body MV and the road surface can be detected with high accuracy.
  • In the second embodiment, the first illumination unit 120B1 performs illumination with P-polarized light, and the second illumination unit 120B2 performs illumination with S-polarized light. For this reason, a road surface image can be obtained regardless of whether the road surface is a diffuse reflection road surface or a specular reflection road surface, and an image with appropriate sharpness can be acquired for road surfaces whose reflection characteristics differ depending on the polarization direction of the illumination light. Therefore, the relative speed information between the moving body MV and the road surface can be detected with high accuracy.
  • In the second embodiment, the number of feature regions common to the two captured images obtained during successive illuminations by the same illumination unit is compared between the case of illumination by the first illumination unit 120B1 and the case of illumination by the second illumination unit 120B2, and the relative speed information between the moving body MV and the road surface is detected based on the captured images of the illumination unit yielding the larger number.
  • Further, when the numbers of common feature regions are the same, the relative speed information between the moving body MV and the road surface is detected based on the captured images at the time of illumination by the illumination unit yielding the higher sharpness of the captured image. Therefore, the relative speed information between the moving body MV and the road surface can be detected with high accuracy.
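The selection rule just described (compare NC1 with NC2, and break ties on the average sharpness CT1 versus CT2) can be sketched as below; the behavior when both the counts and the sharpness values are exactly equal is an assumption, since the text does not specify it:

```python
def select_illumination(nc1, ct1, nc2, ct2):
    """Return 1 or 2: the illumination unit whose frame pair should be
    used for speed detection. Prefer the larger common-feature count
    (NC1 vs NC2); on a tie, prefer the higher average sharpness (CT1 vs
    CT2). A full tie defaults to unit 1 (assumed, not specified)."""
    if nc1 != nc2:
        return 1 if nc1 > nc2 else 2
    return 1 if ct1 >= ct2 else 2

choice_by_count = select_illumination(nc1=8, ct1=10.0, nc2=5, ct2=30.0)
choice_by_sharpness = select_illumination(nc1=6, ct1=10.0, nc2=6, ct2=30.0)
```

In the first call the count decides (unit 1 despite its lower sharpness); in the second, the counts tie and the sharper unit 2 wins.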
  • In the second embodiment described above, the number of illumination units having different illumination characteristics is two, but it may be three or more. In that case, the illumination units that emit illumination light in a time-division manner are switched sequentially, and the illumination unit from which the captured images most suitable for calculating the moving body speed are obtained may be determined as appropriate at each time point.
  • In the second embodiment, imaging is performed twice in each continuous illumination period by one illumination unit, but imaging may instead be performed a different number of times, for example once, in each continuous illumination period by one illumination unit, and the moving body speed may be calculated accordingly.
  • Further, a discontinuity determination may be performed as to whether the moving body speed detected from the captured images at the time of illumination by the illumination unit estimated as a result of the prior determination has a large discontinuity with respect to the detection results of the moving body speed so far. In this case, when the result of the discontinuity determination is affirmative, the detection of the moving body speed may be redone based on the captured images at the time of illumination by an illumination unit other than the illumination unit adopted based on the result of the prior determination.
  • In the second embodiment, the pixel movement amount is detected by the so-called displacement amount search method, but the pixel movement amount may also be detected by an image correlation method or a spatial filter method.
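As a hedged one-dimensional sketch of the image correlation alternative mentioned here (a real implementation would correlate two-dimensional image patches), the shift maximizing the mean product of overlapping samples is taken as the pixel movement amount:

```python
def best_shift(ref, cur, max_shift):
    """Pixel movement estimate by correlation: try every integer shift in
    [-max_shift, max_shift] and return the one whose overlapping samples
    of ref and cur have the largest mean product."""
    best, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(ref[i], cur[i + s]) for i in range(len(ref))
                 if 0 <= i + s < len(cur)]
        score = sum(a * b for a, b in pairs) / len(pairs)
        if score > best_score:
            best, best_score = s, score
    return best

profile = [0, 0, 5, 9, 5, 0, 0, 0]
shifted = [0, 0, 0, 0, 5, 9, 5, 0]   # same brightness pattern, 2 pixels right
shift = best_shift(profile, shifted, max_shift=3)
```

With the movement amount in pixels, the road-surface displacement follows by dividing by the known magnification m, as in equation (2).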
  • A telecentric optical system may also be employed as the optical system of the imaging unit.
  • In the second embodiment, the imaging unit is prepared outside the speed detection device, but the speed detection device may instead include the imaging unit.
  • In the second embodiment, the entire speed detection device is mounted on the moving body, but the control processing unit in the speed detection device may instead be installed outside the moving body and connected to the imaging unit mounted on the vehicle, the control device, and the plurality of illumination units via a wireless communication network.
  • The control processing unit in the speed detection device of the first and second embodiments described above may be configured as a computer serving as a calculation means including a central processing unit (CPU), and part or all of the functions of the control processing unit of the first and second embodiments may be realized by executing a program prepared in advance on this computer.
  • This program is recorded on a computer-readable recording medium such as a hard disk, CD-ROM, or DVD, and is loaded from the recording medium and executed by the computer. The program may be acquired in a form recorded on a portable recording medium such as a CD-ROM or DVD, or in a form distributed via a network such as the Internet.

Abstract

 An imaging unit (210A) is controlled so as to image the road surface while a relative speed output unit (113A) performs control to alternately switch, via an illumination switching unit (112) at each predetermined time, the illumination produced by a plurality of illumination units (120A1, 120A2) that illuminate the road surface at mutually different angles. The relative speed output unit (113A) determines the illumination unit corresponding to the captured images to be used for calculating the speed information of a moving body (MV), based on the images captured by the imaging unit (210A) and obtained via an acquisition unit (111A). The relative speed output unit (113A) then calculates relative speed information (VLI) at each time point based on the images captured at the time of illumination by the determined illumination unit, and outputs the calculated relative speed information (VLI). As a result, the relative speed information can be calculated with satisfactory accuracy and output even when the condition of the road surface varies from moment to moment.
PCT/JP2013/080425 2013-11-11 2013-11-11 Dispositif de détection de vitesse, procédé de détection de vitesse, programmes de détection de vitesse et support d'enregistrement WO2015068301A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/080425 WO2015068301A1 (fr) 2013-11-11 2013-11-11 Dispositif de détection de vitesse, procédé de détection de vitesse, programmes de détection de vitesse et support d'enregistrement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/080425 WO2015068301A1 (fr) 2013-11-11 2013-11-11 Dispositif de détection de vitesse, procédé de détection de vitesse, programmes de détection de vitesse et support d'enregistrement

Publications (1)

Publication Number Publication Date
WO2015068301A1 true WO2015068301A1 (fr) 2015-05-14

Family

ID=53041102

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/080425 WO2015068301A1 (fr) 2013-11-11 2013-11-11 Dispositif de détection de vitesse, procédé de détection de vitesse, programmes de détection de vitesse et support d'enregistrement

Country Status (1)

Country Link
WO (1) WO2015068301A1 (fr)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5630662A (en) * 1979-08-22 1981-03-27 Komatsu Ltd Slip ratio detector
JPH06331638A (ja) * 1993-05-27 1994-12-02 Mitsubishi Electric Corp 速度計測装置
JPH08160129A (ja) * 1994-12-05 1996-06-21 Uniden Corp 速度検出装置
JPH1020027A (ja) * 1996-07-03 1998-01-23 Toyota Central Res & Dev Lab Inc 車両移動方向検出装置
JP2000275011A (ja) * 1999-03-26 2000-10-06 Inst Of Physical & Chemical Res 移動体の測定装置および測定方法
JP2005302036A (ja) * 2004-04-15 2005-10-27 Agilent Technol Inc デバイスと表面との間の距離を測定する光学デバイス
US20070051884A1 (en) * 2005-09-07 2007-03-08 Romanov Nikolai L Positional sensing system and method
JP2010513897A (ja) * 2006-12-21 2010-04-30 レニショウ パブリック リミテッド カンパニー 物体検出器装置および方法
JP2013031054A (ja) * 2011-07-29 2013-02-07 Ricoh Co Ltd 撮像装置及びこれを備えた物体検出装置、並びに、光学フィルタ及びその製造方法


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018042628A1 (fr) 2016-09-02 2018-03-08 パイオニア株式会社 Dispositif de calcul de vitesse, procédé de commande, programme, et support d'informations
US11203349B2 (en) 2016-09-02 2021-12-21 Pioneer Corporation Velocity calculation apparatus, control method, program and storage medium
WO2019239536A1 (fr) * 2018-06-13 2019-12-19 株式会社日立製作所 Dispositif de mesure de vitesse de corps mobile et ascenseur
CN112219122A (zh) * 2018-06-13 2021-01-12 株式会社日立制作所 移动体的速度计测装置以及电梯
JPWO2019239536A1 (ja) * 2018-06-13 2021-06-17 株式会社日立製作所 移動体の速度計測装置およびエレベータ
EP3809141A4 (fr) * 2018-06-13 2022-03-23 Hitachi, Ltd. Dispositif de mesure de vitesse de corps mobile et ascenseur
CN112219122B (zh) * 2018-06-13 2022-04-29 株式会社日立制作所 移动体的速度计测装置以及电梯
WO2022102515A1 (fr) * 2020-11-10 2022-05-19 ソニーグループ株式会社 Appareil d'imagerie et dispositif de lentille
WO2023228494A1 (fr) * 2022-05-27 2023-11-30 アズビル株式会社 Dispositif de mesure de vitesse et procédé de mesure de vitesse
WO2023228493A1 (fr) * 2022-05-27 2023-11-30 アズビル株式会社 Dispositif de mesure de vitesse et procédé de mesure de vitesse

Similar Documents

Publication Publication Date Title
WO2015068301A1 (fr) Dispositif de détection de vitesse, procédé de détection de vitesse, programmes de détection de vitesse et support d'enregistrement
KR102100051B1 (ko) 색 선별 거울을 이용하는 자율 주행 차량용 3d lidar 시스템
US7359041B2 (en) Method and system for optically tracking a target using a triangulation technique
US20160291154A1 (en) Apparatus for acquiring 3-dimensional maps of a scene
CN110691983A (zh) 具有结构光及集成照明和检测的基于lidar的3-d成像
US6483536B2 (en) Distance measuring apparatus and method employing two image taking devices having different measurement accuracy
RU2621826C1 (ru) Устройство вычисления собственного положения и способ вычисления собственного положения
US10043080B2 (en) Self-position calculating apparatus and self-position calculating method
JP4962742B2 (ja) 移動体システム
US8155483B2 (en) Apparatus for and method of measuring image
RU2628420C1 (ru) Устройство вычисления собственного местоположения и способ вычисления собственного местоположения
US11468691B2 (en) Traveling lane recognition apparatus and traveling lane recognition method
JP2002139304A (ja) 距離測定装置、及び距離測定方法
US10275663B2 (en) Indoor navigation method and system
US10151575B2 (en) Position sensing device and position sensing method
US20130148098A1 (en) Manual Distance Measuring Apparatus
US11796653B2 (en) Detecting and tracking Lidar cross-talk
KR20210029684A (ko) 복수의 이종 센서의 융합을 위한 데이터 처리 시스템 및 방법
BR102015005652B1 (pt) Sistema para reconhecer um número de veículo de uma placa de licença de um veículo
JP7261588B2 (ja) 信号機認識方法及び信号機認識装置
JPS6215479A (ja) オ−トトラツキング測距装置
JP2008096112A (ja) レーダ装置
KR102298652B1 (ko) 시차 결정 방법 및 장치
US7859683B2 (en) Fast three-dimensional shape measuring apparatus and method
JP6811661B2 (ja) 移動体撮像装置および移動体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13896943

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13896943

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP