WO2017169752A1 - Vehicle attitude detection device, image processing system, vehicle, and vehicle attitude detection method


Info

Publication number
WO2017169752A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
attitude
captured image
wheel
image
Prior art date
Application number
PCT/JP2017/010266
Other languages
English (en)
Japanese (ja)
Inventor
大輔 貴島
贇 張
Original Assignee
京セラ株式会社
Priority date
Filing date
Publication date
Application filed by 京セラ株式会社
Priority to US16/088,493 (published as US20200111227A1)
Priority to JP2018508978A (published as JPWO2017169752A1)
Priority to EP17774296.2A (published as EP3437948A4)
Publication of WO2017169752A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/11 - Pitch movement
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/112 - Roll movement
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/97 - Determining parameters from multiple pictures
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 - Constructional details
    • H04N23/54 - Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 - Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 - Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 - Image sensing, e.g. optical camera
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle

Definitions

  • Embodiments of the present disclosure relate to a vehicle attitude detection device, an image processing system, a vehicle, and a vehicle attitude detection method.
  • Patent Document 1 discloses a configuration in which, for driving support, a region to be extracted from a surrounding image is determined, or the position of an index to be superimposed on the surrounding image is changed, in accordance with a change in vehicle inclination.
  • a vehicle attitude detection device includes an input / output interface and a processor.
  • the input / output interface acquires a captured image obtained by capturing the wheels of the vehicle.
  • the processor detects the posture of the vehicle based on the position of the wheel in the captured image.
  • An image processing system includes an imaging device, a posture detection device, and an image processing device.
  • the imaging device generates a captured image obtained by imaging the wheels of the vehicle.
  • the posture detection device detects the posture of the vehicle based on the position of the wheel in the captured image.
  • the image processing device determines at least one of a position where the image is superimposed on the captured image and an image processing range in the captured image based on the detected posture.
  • a vehicle includes a wheel, an imaging device, a posture detection device, and an image processing device.
  • the imaging device generates a captured image obtained by imaging the wheel.
  • the attitude detection device detects the attitude of the vehicle based on the position of the wheel in the captured image.
  • the image processing device determines at least one of a position where the image is superimposed on the captured image and an image processing range in the captured image based on the detected posture.
  • a vehicle attitude detection method is a vehicle attitude detection method executed by an attitude detection device, which acquires a captured image obtained by imaging a vehicle wheel and sets the position of the wheel in the captured image. Based on this, the attitude of the vehicle is detected.
  • the image processing system 10 includes one or more imaging devices 12, a posture detection device 13, and an image processing device 14.
  • the imaging device 12, the posture detection device 13, and the image processing device 14 are connected to be communicable with each other.
  • the vehicle 11 is, for example, an automobile having four wheels, but may be any vehicle provided with one or more wheels.
  • the imaging device 12 is, for example, a vehicle-mounted camera capable of wide-angle shooting.
  • the imaging device 12 is installed in the vehicle 11, for example.
  • three imaging devices 12 are installed in the vehicle 11.
  • the three imaging devices 12 include, for example, a left side camera 12a, a right side camera 12b, and a rear camera 12c.
  • the left side camera 12a is installed on the left side of the vehicle 11 so as to face vertically downward.
  • the left side camera 12a may be installed on a left side mirror, for example.
  • the left side camera 12a is installed so as to be able to simultaneously image the external space on the left side of the vehicle 11, the left front wheel of the vehicle 11, the left rear wheel, and a part of the vehicle 11.
  • the part of the vehicle 11 may include at least a part of the left side surface of the vehicle body, for example.
  • Two left side cameras 12a may be installed so as to be able to image the left front wheel and the left rear wheel of the vehicle 11, respectively.
  • the right side camera 12b is installed on the right side of the vehicle 11 so as to face vertically downward.
  • the right side camera 12b may be installed on a right side mirror, for example.
  • the right side camera 12b is installed so that the external space on the right side of the vehicle 11, the right front wheel of the vehicle 11, the right rear wheel, and a part of the vehicle 11 can be imaged simultaneously.
  • the part of the vehicle 11 may include at least a part of the right side surface of the vehicle body, for example.
  • Two right side cameras 12b may be installed so as to be able to image the right front wheel and the right rear wheel of the vehicle 11, respectively.
  • the left side mirror and the right side mirror may include an optical mirror.
  • the left side mirror and the right side mirror may include an electronic mirror using a display, for example.
  • the rear camera 12c is installed behind the vehicle 11 so as to face the rear of the vehicle 11.
  • the rear camera 12c may be installed, for example, below the bumper behind the vehicle 11.
  • the rear camera 12c is installed so as to be able to image the external space behind the vehicle 11.
  • the number and arrangement of the imaging devices 12 installed in the vehicle 11 are not limited to the configuration described above.
  • the additional imaging device 12 may be installed in the vehicle 11.
  • a front camera may be installed so as to be able to image an external space in front of the vehicle 11.
  • the attitude detection device 13 is installed at an arbitrary position of the vehicle 11.
  • the attitude detection device 13 detects the attitude of the vehicle 11 based on the captured images respectively generated by the left side camera 12a and the right side camera 12b.
  • the posture of the vehicle 11 includes at least one of an inclination θ of the vehicle 11 in the front-rear direction, an inclination φ of the vehicle 11 in the left-right direction, and a height h of the vehicle 11.
  • the front-rear direction of the vehicle 11 is also referred to as a first direction.
  • the left-right direction of the vehicle 11 is also referred to as a second direction.
  • the first direction and the second direction are not limited to the front-rear direction and the left-right direction of the vehicle 11, and may be different directions. A specific operation in which the attitude detection device 13 detects the attitude of the vehicle 11 will be described later.
  • the posture detection device 13 outputs information indicating the detected posture of the vehicle 11 to the image processing device 14.
  • information indicating the attitude of the vehicle 11 is also referred to as attitude information.
  • the detected attitude of the vehicle 11 is used in image processing executed by the image processing device 14 as will be described later.
  • the image processing device 14 is installed at an arbitrary position of the vehicle 11.
  • the image processing device 14 performs predetermined image processing on the captured image generated by the imaging device 12.
  • the predetermined image processing differs depending on a desired function using the captured image.
  • the image processing device 14 performs predetermined image processing on the captured image of the rear camera 12c.
  • the image processing device 14 outputs the captured image on which image processing has been performed to the vehicle 11 side, and displays the captured image on a display device provided in the vehicle 11, for example.
  • the predetermined image processing includes, for example, processing for determining an image processing range in the captured image, trimming processing for cutting out the image processing range, and processing for superimposing a support image for supporting the driver on the cut-out image.
  • the support image may include a guide line that virtually indicates the course of the vehicle 11, for example.
  • the image processing device 14 performs predetermined image processing on an image captured by the right side camera 12b.
  • the image processing device 14 detects other vehicles in the peripheral region from the right side to the right rear side of the vehicle 11. When another vehicle is detected, the image processing device 14 notifies the vehicle 11 of the presence of the other vehicle, and causes a speaker provided in the vehicle 11 to output a warning sound, for example.
  • the predetermined image processing includes processing for determining an image processing range in the captured image, object recognition processing for detecting another vehicle within the image processing range, and the like.
  • the image processing device 14 determines at least one of the position where the support image is superimposed on the captured image and the image processing range in the captured image based on the posture information acquired from the posture detection device 13. decide.
  • the position on the captured image on which the support image is superimposed is also referred to as a superimposed position.
  • the image processing apparatus 14 can determine the superposition position and the image processing range according to the attitude of the vehicle 11 as described below.
  • when the posture of the vehicle 11 changes, the imaging range of the imaging device 12 changes and the subject to be imaged changes.
  • the imaging range of the imaging device 12 and the subject to be imaged change according to the inclination and height of the vehicle 11. Therefore, in a configuration in which the superimposed position or the image processing range in the captured image is fixed, a change in the posture of the vehicle 11 causes a positional shift, relative to the subject, of the image processing range on which the trimming process is performed in the captured image of the rear camera 12c. As a result, the support image superimposed on the subject may be displaced, and the convenience of the driving support function may be reduced.
  • the image processing apparatus 14 moves, deforms, or rotates the image processing range on the captured image in accordance with the posture of the vehicle 11, for example. For this reason, the image processing device 14 can reduce the positional deviation of the image processing range with respect to the subject due to a change in the posture of the vehicle 11.
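As an illustration of how the image processing range can follow the subject, the following sketch shifts a crop rectangle vertically according to the pitch of the vehicle. The scale factor k_px (pixels per unit of tan θ) is a hypothetical calibration constant, not a value given in this document.

```python
import math

def adjust_crop(crop, pitch_rad, k_px=400.0):
    """crop is (x, y, w, h); return the crop shifted vertically to
    compensate a pitch of pitch_rad radians (hypothetical mapping)."""
    x, y, w, h = crop
    dy = int(round(k_px * math.tan(pitch_rad)))  # vertical shift in pixels
    return (x, y + dy, w, h)

base = (100, 200, 640, 360)
print(adjust_crop(base, 0.0))   # reference posture: range unchanged
print(adjust_crop(base, 0.05))  # pitched: range shifted to stay on the subject
```

In practice the mapping from posture to pixel offset would follow from the camera model; a linear or tan-based mapping is only a first approximation.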
  • the image processing device 14 is not limited to the functions described above, and may have various functions using captured images. For example, the image processing device 14 cuts out an image processing range determined based on the posture of the vehicle 11 from a plurality of captured images. The image processing device 14 performs viewpoint conversion processing on the plurality of captured images that have been cut out. The image processing device 14 may have a function of combining the plurality of captured images and generating, for example, an overhead image of the external space over the entire circumference of the vehicle 11.
  • each component of the image processing system 10 will be specifically described.
  • the imaging device 12 includes an imaging optical system 15, an imaging element 16, an input / output interface 17, a memory 18, and a processor 19.
  • the imaging optical system 15 includes an optical member.
  • the optical member may include, for example, one or more lenses and a diaphragm.
  • the imaging optical system 15 forms a subject image on the light receiving surface of the imaging element 16.
  • the imaging optical system 15 functions as, for example, a fisheye lens and has a relatively wide angle of view.
  • the image sensor 16 includes, for example, a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor. A plurality of pixels are arranged on the light receiving surface of the image sensor 16. The image sensor 16 captures a subject image formed on the light receiving surface and generates a captured image.
  • the input / output interface 17 is an interface for inputting / outputting information to / from an external device via the network 20.
  • the network 20 may include, for example, a wired or wireless network, or a CAN (Controller Area Network) mounted on the vehicle 11.
  • the external devices include a posture detection device 13 and an image processing device 14.
  • the external device may include, for example, an ECU (Electronic Control Unit) provided in the vehicle 11, a display, a navigation device, and the like.
  • the memory 18 includes, for example, a primary storage device and a secondary storage device.
  • the memory 18 stores various information and programs necessary for the operation of the imaging device 12.
  • the processor 19 includes a dedicated processor such as a DSP (Digital Signal Processor) and a general-purpose processor such as a CPU (Central Processing Unit).
  • the processor 19 controls the overall operation of the imaging device 12.
  • the processor 19 controls the operation of the image sensor 16 and periodically generates a captured image at 30 fps, for example.
  • the processor 19 performs predetermined image processing on the captured image generated by the image sensor 16.
  • the predetermined image processing may include white balance adjustment processing, exposure adjustment processing, gamma correction processing, and the like.
  • the predetermined image processing may be performed on the captured image of the current frame, or may be performed on the captured images of the next and subsequent frames.
  • the processor 19 outputs the captured image on which the predetermined image processing has been performed to the attitude detection device 13 and the image processing device 14 via the input / output interface 17.
  • the processor 19 may be capable of transmitting / receiving a synchronization signal to / from another imaging device 12 via the input / output interface 17.
  • the synchronization signal can synchronize the imaging timing between one imaging device 12 and one or more other imaging devices 12.
  • the imaging timings of the left side camera 12a, the right side camera 12b, and the rear camera 12c are synchronized.
  • (Configuration of attitude detection device) The attitude detection device 13 will be described.
  • the attitude detection device 13 includes an input / output interface 21, a memory 22, and a processor 23.
  • the input / output interface 21 is an interface for inputting / outputting information to / from an external device via the network 20.
  • the external device includes the imaging device 12 and the image processing device 14.
  • the external device may include, for example, an ECU, a display, and a navigation device provided in the vehicle 11.
  • the memory 22 includes, for example, a primary storage device and a secondary storage device.
  • the memory 22 stores various information and programs necessary for the operation of the attitude detection device 13.
  • the memory 22 stores first correspondence information described later. Details of the information stored in the memory 22 will be described later.
  • the processor 23 includes a dedicated processor such as a DSP and a general-purpose processor such as a CPU.
  • the processor 23 controls the overall operation of the attitude detection device 13.
  • the processor 23 acquires a captured image from at least one of the left side camera 12a and the right side camera 12b via the input / output interface 21.
  • the processor 23 acquires a set of captured images from the left side camera 12a and the right side camera 12b.
  • a set of captured images consists of images captured at substantially the same timing by the left side camera 12a and the right side camera 12b, whose imaging timings are synchronized.
  • the processor 23 detects the attitude of the vehicle 11 based on the acquired captured image. This will be specifically described below.
  • the distance L1 indicates the distance between the left side camera 12a and the left front wheel 24.
  • a distance H1 indicates a component of the distance L1 in the optical axis direction of the left side camera 12a.
  • a distance W1 indicates a component of the distance L1 in a direction perpendicular to the optical axis of the left side camera 12a. Therefore, the distances L1, H1, and W1 satisfy the following expression (1).
  • L1^2 = H1^2 + W1^2 (1)
  • the distance L2 indicates the distance between the left side camera 12a and the left rear wheel 25.
  • a distance H2 indicates a component of the distance L2 in the optical axis direction of the left side camera 12a.
  • a distance W2 indicates a component of the distance L2 in a direction perpendicular to the optical axis of the left side camera 12a. Therefore, the distances L2, H2, and W2 satisfy the following expression (2).
  • L2^2 = H2^2 + W2^2 (2)
  • In the reference state, H1 = H2.
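The relations (1) and (2) above are right-triangle decompositions of each camera-to-wheel distance. A minimal sketch, recovering the optical-axis component H from L and W (the numeric values are illustrative, not from the document):

```python
import math

# Expressions (1) and (2) decompose each camera-to-wheel distance L into an
# optical-axis component H and a perpendicular component W: L^2 = H^2 + W^2.
def axis_component(L, W):
    """Recover H given the hypotenuse L and perpendicular component W."""
    return math.sqrt(L * L - W * W)

# Illustrative 3-4-5 and 5-12-13 triangles.
print(axis_component(5.0, 4.0))    # H1 for L1 = 5, W1 = 4
print(axis_component(13.0, 12.0))  # H2 for L2 = 13, W2 = 12
```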
  • FIG. 4 shows a state where the vehicle 11 is decelerated or stopped when the vehicle 11 is traveling forward.
  • the front of the vehicle body sinks and the rear of the vehicle body rises.
  • each wheel shall be in contact with the road surface. While each wheel is in contact with the road surface, the suspension between each wheel and the vehicle body is deformed, so that the vehicle body is inclined relative to the road surface and each wheel.
  • the captured image of the left side camera 12a will be specifically described.
  • a part 26 of the vehicle 11, the left front wheel 24, and the left rear wheel 25 are captured in the captured image.
  • the part 26 of the vehicle 11 may include at least a part of the left side surface of the vehicle body, for example.
  • the position of the part 26 of the vehicle 11 in the captured image does not change, while the positions in the image of the left front wheel 24 and the left rear wheel 25, which move relative to the vehicle body, change.
  • the positions of the left front wheel 24 and the left rear wheel 25 in the captured image change according to the inclination θ.
  • the distance W1 and the distance W2 can be regarded as constants because the change due to the inclination ⁇ is very small.
  • the memory 22 stores first correspondence information indicating a correspondence relationship between the position of each wheel in the captured image and a parameter used to detect the attitude of the vehicle 11.
  • the position of each wheel in the captured image may be the absolute position of the wheel in the captured image, or may be the relative position of the wheel with respect to the part 26 of the vehicle 11 in the captured image.
  • the first correspondence information includes, for example, first correspondence information for the left side camera 12a and first correspondence information for the right side camera 12b.
  • the first correspondence information for the left side camera 12a indicates a correspondence relationship between the positions, in the captured image, of the left front wheel 24 and the left rear wheel 25 and the distances L1 and L2. Since the first correspondence information for the right side camera 12b is similar to the first correspondence information for the left side camera 12a, description thereof is omitted.
  • the first correspondence information can be determined in advance by, for example, experiments or simulations.
  • the processor 23 of the posture detection device 13 detects the position of each wheel in the captured image acquired from the left side camera 12a. For example, the processor 23 detects the positions of the left front wheel 24 and the left rear wheel 25 in the captured image.
  • the processor 23 detects the positions of the left front wheel 24 and the left rear wheel 25 in the captured image.
  • an algorithm that performs edge detection on the captured image and detects an elliptical shape on the captured image as a wheel is employed, but any algorithm can be employed.
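The document leaves the wheel detector open ("any algorithm can be employed"). As a toy stand-in, the following sketch locates a dark, tire-like blob by thresholding and taking a centroid; a production detector would use edge detection and ellipse fitting as described above.

```python
import numpy as np

def find_dark_centroid(img, threshold=60):
    """Return the (x, y) centroid of pixels darker than threshold, or None."""
    ys, xs = np.nonzero(img < threshold)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

# Synthetic frame: bright road (value 200) with one dark square "wheel".
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[40:50, 70:80] = 20
print(find_dark_centroid(frame))  # centroid of the dark patch
```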
  • the processor 23 extracts the distance L1 and the distance L2 corresponding to the detected positions of the left front wheel 24 and the left rear wheel 25.
  • the processor 23 uses the extracted distance L1 and distance L2 and the distances W1 and W2 stored in the memory 22 to calculate the inclination θ from tan θ, which is calculated by the above-described equations (1) to (3).
  • the first correspondence information may be information indicating a correspondence relationship between the positions on the images of the left front wheel 24 and the left rear wheel 25 in the captured image and the distances W1, W2, H1, and H2.
  • the processor 23 extracts the parameters corresponding to the position of each wheel of the vehicle 11, and calculates the inclination θ from the extracted parameters by the calculation described above.
  • the first correspondence information may be information indicating a correspondence relationship between the position on the image of each of the left front wheel 24 and the left rear wheel 25 in the captured image and the inclination θ. In such a case, the processor 23 extracts the inclination θ corresponding to the position of each wheel of the vehicle 11.
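The first correspondence information can be represented as a lookup table from a wheel's position in the image to the parameter of interest, with interpolation between calibrated entries. The table values below are hypothetical placeholders for data obtained by experiment or simulation.

```python
import numpy as np

# Hypothetical calibration table: wheel's vertical pixel position -> distance H.
wheel_y_px = np.array([100.0, 120.0, 140.0, 160.0])
distance_H = np.array([0.90, 0.95, 1.00, 1.05])  # metres

def lookup_H(y_px):
    """Linearly interpolate H from the calibrated correspondence table."""
    return float(np.interp(y_px, wheel_y_px, distance_H))

print(lookup_H(100.0))  # exact table entry
print(lookup_H(130.0))  # between entries: linear interpolation
```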
  • the processor 23 detects the inclination θ of the vehicle 11 in the front-rear direction based on the captured image of the left side camera 12a.
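Equation (3) is not reproduced in this excerpt. A form consistent with the geometry above (front and rear wheels at horizontal offsets W1 and W2 on opposite sides of the camera, with optical-axis offsets H1 and H2) would be tan θ = (H2 - H1) / (W1 + W2); the sign convention here is an assumption.

```python
import math

def pitch_from_wheels(H1, H2, W1, W2):
    """Assumed form of equation (3): tan(theta) = (H2 - H1) / (W1 + W2)."""
    return math.atan2(H2 - H1, W1 + W2)

print(pitch_from_wheels(1.00, 1.00, 0.8, 1.6))  # H1 == H2: level, theta = 0
print(pitch_from_wheels(0.95, 1.05, 0.8, 1.6))  # unequal H: body inclined
```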
  • to detect the inclination φ described later, the processor 23 calculates the distances L10, L20, H10, H20, W10, and W20 based on the captured image of the right side camera 12b, in the same manner as when using the captured image of the left side camera 12a.
  • the distance L10 indicates the distance between the right side camera 12b and the right front wheel.
  • a distance H10 indicates a component of the distance L10 in the optical axis direction of the right side camera 12b.
  • a distance W10 indicates a component in a direction perpendicular to the optical axis of the right side camera 12b of the distance L10.
  • a distance L20 indicates a distance between the right side camera 12b and the right rear wheel.
  • a distance H20 indicates a component of the distance L20 in the optical axis direction of the right side camera 12b.
  • a distance W20 indicates a component of the distance L20 in a direction perpendicular to the optical axis of the right side camera 12b.
  • the distance Q indicates the distance between the left side camera 12a and the right side camera 12b in the left-right direction of the vehicle 11.
  • the memory 22 of the posture detection device 13 stores the distance Q in advance.
  • the processor 23 of the posture detection device 13 calculates the slope φ from sin φ, which is calculated by the above-described equation (4), using the distance H1 and the distance H10 calculated as described above and the distance Q stored in the memory 22.
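Equation (4) is likewise not reproduced in this excerpt. Given that the two side cameras are separated by the distance Q in the left-right direction, an assumed form is sin φ = (H1 - H10) / Q:

```python
import math

def roll_from_sides(H1, H10, Q):
    """Assumed form of equation (4): sin(phi) = (H1 - H10) / Q."""
    return math.asin((H1 - H10) / Q)

print(roll_from_sides(1.00, 1.00, 1.8))  # equal heights on both sides: phi = 0
print(roll_from_sides(1.05, 0.95, 1.8))  # unequal heights: nonzero roll
```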
  • the processor 23 may calculate the height h of the vehicle 11 based on the plurality of parameters calculated as described above.
  • the plurality of parameters include the slope θ, the slope φ, and the distances L1, L2, H1, H2, W1, W2, L10, L20, H10, H20, W10, and W20.
  • the height h may be, for example, the height from the road surface to an arbitrary position of the vehicle 11. The height h may also be calculated based further on an arbitrary vehicle constant.
  • the vehicle constant may include, for example, the dimensions of the vehicle 11.
  • the processor 23 detects the inclination θ, the inclination φ, and the height h calculated as described above as the posture of the vehicle 11.
  • the processor 23 may detect, as the posture of the vehicle 11, the average values of the inclination θ, the inclination φ, and the height h calculated over the current frame and a predetermined number of past frames.
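Averaging over the current frame and a fixed number of past frames can be sketched as a sliding-window filter; the window length here is arbitrary.

```python
from collections import deque

class PostureFilter:
    """Average (theta, phi, h) over the last n_frames detections."""
    def __init__(self, n_frames=5):
        self.history = deque(maxlen=n_frames)

    def update(self, theta, phi, h):
        self.history.append((theta, phi, h))
        n = len(self.history)
        return tuple(sum(v[i] for v in self.history) / n for i in range(3))

f = PostureFilter(n_frames=3)
f.update(0.00, 0.0, 1.0)
f.update(0.03, 0.0, 1.0)
print(f.update(0.03, 0.0, 1.0))  # mean over the three most recent frames
```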
  • the processor 23 outputs attitude information indicating the attitude of the vehicle 11 to the image processing device 14.
  • the posture information may include information indicating the inclination θ, the inclination φ, and the height h.
  • (Configuration of image processing apparatus) The image processing apparatus 14 will be described with reference to FIG.
  • the image processing device 14 includes an input / output interface 28, a memory 29, and a processor 30.
  • the input / output interface 28 is an interface for inputting / outputting information to / from an external device via the network 20.
  • the external device includes an imaging device 12 and a posture detection device 13.
  • the external device may include, for example, an ECU, a display, and a navigation device provided in the vehicle 11.
  • the memory 29 includes, for example, a primary storage device and a secondary storage device.
  • the memory 29 stores various information and programs necessary for the operation of the image processing apparatus 14.
  • the processor 30 includes a dedicated processor such as a DSP and a general-purpose processor such as a CPU.
  • the processor 30 controls the operation of the entire image processing apparatus 14.
  • the processor 30 acquires captured images from one or more imaging devices 12 via the input / output interface 28 and acquires posture information from the posture detection device 13.
  • the processor 30 determines at least one of the superimposed position where the support image is superimposed on the captured image and the image processing range in the captured image, according to the attitude of the vehicle 11. When the attitude of the vehicle 11 changes, the processor 30 moves the superimposed position. When the posture of the vehicle 11 changes, the processor 30 moves, deforms, or rotates the image processing range.
  • the processor 30 determines the superimposed position and the image processing range according to the attitude of the vehicle 11 so as to maintain the relative positional relationship of the image processing range in the reference state with respect to the subject.
  • the processor 30 determines the superimposed position and the image processing range in the captured image so as to reduce the positional deviation of the subject in the captured image due to the change in the posture of the vehicle 11. With this configuration, it is possible to reduce the displacement of the superimposed position and the image processing range with respect to the subject.
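The superimposition of a guide line at a posture-corrected position can be sketched by translating the guide line's vertices before drawing. The mapping from posture to a pixel offset (k_px per radian) is a hypothetical calibration constant, not a value from the document.

```python
def shift_guide_line(points, pitch_rad, k_px=300.0):
    """points: list of (x, y) guide-line vertices in image coordinates.
    Returns the vertices shifted vertically to track the posture change."""
    dy = k_px * pitch_rad
    return [(x, y + dy) for (x, y) in points]

guide = [(100, 400), (150, 300), (200, 250)]
print(shift_guide_line(guide, 0.0))   # reference posture: unchanged
print(shift_guide_line(guide, 0.02))  # pitched: guide line shifted down
```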
  • the processor 30 cuts out an image processing range from the captured image of the rear camera 12c and superimposes the support image on the superimposition position, for example, when executing the driving support function when the vehicle 11 moves backward.
  • the processor 30 outputs the captured image on which the support image is superimposed to the vehicle 11 side.
  • the output captured image is displayed on a display device provided in the vehicle 11, for example.
  • the processor 30 may perform object recognition processing for detecting other vehicles within the image processing range of the captured image of the right side camera 12b, for example, when executing a driving support function at the time of joining the expressway.
  • the processor 30 notifies the vehicle 11 side of the presence of the other vehicle.
  • a warning sound is output from a speaker provided in the vehicle 11.
  • This operation may be performed, for example, for each frame in which a captured image is generated, or periodically at a predetermined interval.
  • Step S100: The processor 23 of the attitude detection device 13 acquires a set of captured images from the left side camera 12a and the right side camera 12b.
  • Step S101: The processor 23 detects the attitude of the vehicle 11 based on the set of captured images acquired in step S100 and the first correspondence information stored in the memory 22. For example, the processor 23 detects at least one of the front-rear inclination θ, the left-right inclination φ, and the height h as the attitude of the vehicle 11. The processor 23 may detect, as the attitude of the vehicle 11, the respective average values of the inclination θ, the inclination φ, and the height h calculated over the current frame and a predetermined number of past frames.
  • Step S102: The processor 23 outputs attitude information indicating the attitude of the vehicle 11 detected in step S101 to the image processing device 14.
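A minimal sketch of step S101, assuming the wheel contact points have already been located in a side-camera image. The function and class names, the metres-per-pixel scale, and the frame count are hypothetical; the averaging over the current and past frames mirrors the behaviour described above.

```python
import math
from collections import deque

def inclination_from_wheels(front_xy, rear_xy, wheelbase_m, m_per_px):
    """Estimate the front-rear inclination (degrees) from the vertical
    offset between the front and rear wheel contact points in the image."""
    dy_px = front_xy[1] - rear_xy[1]        # image-vertical offset in pixels
    dy_m = dy_px * m_per_px                 # convert to metres (assumed scale)
    return math.degrees(math.atan2(dy_m, wheelbase_m))

class SmoothedAttitude:
    """Average an attitude value over the current and N-1 past frames."""
    def __init__(self, n_frames=5):
        self.history = deque(maxlen=n_frames)

    def update(self, value):
        self.history.append(value)
        return sum(self.history) / len(self.history)
```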
  • This operation may be performed, for example, for each frame in which a captured image is generated, or periodically at a predetermined interval.
  • Step S200 The processor 30 of the image processing apparatus 14 acquires a captured image from the rear camera 12c.
  • Step S201 The processor 30 acquires attitude information indicating the attitude of the vehicle 11 from the attitude detection device 13.
  • Step S202: The processor 30 determines at least one of the superimposition position and the image processing range according to the attitude of the vehicle 11 indicated by the attitude information acquired in step S201.
  • the processor 30 may store the superimposition position or image processing range determined for each captured image in the memory 29, in association with the imaging device 12 that generated the captured image.
  • In the following, it is assumed that both the superimposition position and the image processing range are determined.
  • Step S203: The processor 30 cuts out the image processing range from the captured image.
  • Step S204: The processor 30 superimposes the support image at the superimposition position in the cut-out captured image.
  • Step S205: The processor 30 outputs the captured image on which the support image was superimposed in step S204 to the vehicle 11 side.
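Steps S203 and S204 can be sketched with plain pixel arrays. This is an illustrative implementation only; the function names are hypothetical, and `image` is modelled as a row-major list of pixel rows.

```python
def crop_range(image, top, left, height, width):
    """Cut the image processing range out of the captured frame (step S203)."""
    return [row[left:left + width] for row in image[top:top + height]]

def superimpose(image, support, top, left):
    """Overwrite pixels at the superimposition position (step S204).

    Returns a new image; the input frame is left unmodified."""
    out = [row[:] for row in image]
    for i, srow in enumerate(support):
        for j, pix in enumerate(srow):
            out[top + i][left + j] = pix
    return out
```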
  • Conventionally, a tilt sensor is provided in the imaging device in order to detect the tilt of the vehicle.
  • However, providing an additional component such as a tilt sensor in the imaging device can increase its size and cost. There is therefore room for improvement in configurations for detecting the attitude of the vehicle, such as its inclination.
  • In contrast, the attitude detection device 13 acquires a captured image obtained by imaging a wheel provided on the vehicle 11, and detects the attitude of the vehicle 11 based on the position of the wheel in the captured image.
  • For example, the attitude detection device 13 acquires a captured image of the left side camera 12a and detects the inclination θ as the attitude of the vehicle 11 based on the position of the wheel in that image. The attitude of the vehicle 11 can therefore be detected without adding a component such as a tilt sensor to the imaging device 12.
  • the attitude detection device 13 may acquire a plurality of captured images and calculate the attitude of the vehicle 11 based on the positions of a plurality of wheels in those images.
  • For example, the inclination θ, the inclination φ, and the height h can be detected as the attitude of the vehicle 11 based on the positions of the four wheels in the captured images of the left side camera 12a and the right side camera 12b. The detection accuracy of the attitude of the vehicle 11 is thereby improved.
  • the attitude detection device 13 may store first correspondence information indicating a correspondence relationship between the position of a wheel in the captured image and a parameter used for detecting the attitude of the vehicle 11.
  • the parameter may include, for example, the distance L1 and the distance L2.
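One plausible form of the first correspondence information is a calibration table mapping a wheel's image position to a distance parameter. The sketch below is an assumption, not the patent's method: the table values and the function name are hypothetical, and a real table would be derived from the camera's mounting geometry.

```python
import bisect

# Hypothetical calibration table: wheel y-coordinate in the image (px)
# mapped to a camera-to-wheel distance parameter such as L1 or L2 (m).
_Y_PX = [200, 250, 300, 350, 400]
_DIST_M = [3.0, 2.4, 2.0, 1.7, 1.5]

def wheel_distance(y_px):
    """Linearly interpolate the distance parameter from the wheel's
    vertical position in the captured image."""
    i = bisect.bisect_left(_Y_PX, y_px)
    i = max(1, min(i, len(_Y_PX) - 1))        # clamp to a valid segment
    y0, y1 = _Y_PX[i - 1], _Y_PX[i]
    d0, d1 = _DIST_M[i - 1], _DIST_M[i]
    return d0 + (d1 - d0) * (y_px - y0) / (y1 - y0)
```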
  • the functions of the processor 23 of the attitude detection device 13 may be provided in another device that can communicate with the attitude detection device 13, such as the imaging device 12.
  • the attitude detection device 13 itself may be included in one imaging device 12.
  • some or all of the functions of the processor 30 of the image processing device 14 may be provided in another device that can communicate with the image processing device 14, for example the imaging device 12.
  • the image processing device 14 itself may be included in one imaging device 12.
  • the configuration in which the attitude detection device 13 detects the attitude of the vehicle 11 based on the first correspondence information has been described, but the configuration for detecting the attitude of the vehicle 11 is not limited thereto.
  • For example, the memory 22 of the attitude detection device 13 may store second correspondence information instead of the first correspondence information.
  • the second correspondence information is information indicating a correspondence relationship between the positions of a plurality of wheels in one or more captured images and the attitude of the vehicle 11.
  • For example, the second correspondence information indicates a correspondence relationship between the positions of the left front wheel, the left rear wheel, the right front wheel, and the right rear wheel in the captured images of the left side camera 12a and the right side camera 12b, and the inclination θ, the inclination φ, and the height h.
  • Alternatively, the second correspondence information may be information indicating a correspondence relationship between the positions of the left or right front and rear wheels in the captured image of the left side camera 12a or the right side camera 12b and the inclination θ.
  • Alternatively, the second correspondence information may be information indicating a correspondence relationship between the positions of three wheels in the captured images of the left side camera 12a and the right side camera 12b and the inclination θ, the inclination φ, and the height h.
  • the three wheels may include, for example, the left front wheel 24, the left rear wheel 25, and the right front wheel 27.
  • With this configuration, once the positions of the plurality of wheels in one or more captured images have been detected, the attitude detection device 13 can extract the attitude of the vehicle 11 directly from the second correspondence information. The processing load can therefore be reduced and the processing speed improved.
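A direct table lookup of this kind might look like the following sketch. The table contents, the quantisation step, and the function name are hypothetical; the point is that no trigonometry is needed at run time once the table has been built offline.

```python
# Hypothetical second-correspondence table: quantised wheel y-positions
# (front, rear) in a side-camera image, mapped directly to the
# front-rear inclination in degrees.
SECOND_CORRESPONDENCE = {
    (300, 300): 0.0,
    (290, 310): 1.5,
    (310, 290): -1.5,
}

def lookup_attitude(front_y, rear_y, step=10):
    """Quantise the detected wheel positions and read the attitude
    straight from the table; returns None for positions not tabulated."""
    key = (round(front_y / step) * step, round(rear_y / step) * step)
    return SECOND_CORRESPONDENCE.get(key)
```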
  • the attitude detection device 13 may stop detecting the attitude of the vehicle 11 when the vehicle 11 is in a specific state. For example, the attitude detection device 13 stops detection when the detection accuracy of the attitude of the vehicle 11 may be reduced. For example, when the vehicle 11 is traveling on an uneven road surface such as off-road, there may be moments at which one or more wheels are not in contact with the road surface; in such cases, the detection accuracy of the attitude of the vehicle 11 may be reduced. Specifically, the attitude detection device 13 acquires, via the network 20, vehicle information indicating the state of the vehicle 11 from the vehicle 11 side. The vehicle information may include, for example, information indicating the vibration of the vehicle 11.
  • In this case, the attitude detection device 13 stops detecting the attitude of the vehicle 11 when it determines that the vibration amplitude of the vehicle 11 is equal to or greater than a predetermined threshold.
  • the vehicle information may also include information indicating the state of the road surface on which the vehicle 11 is traveling. In that case, the attitude detection device 13 stops detecting the attitude of the vehicle 11 when it determines that the road surface is not flat or that the road surface is off-road.
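The gating decision described above can be sketched as a simple predicate. The field names, units, and threshold are assumptions for illustration; the real vehicle information arrives over the network 20 in an unspecified format.

```python
def should_detect(vehicle_info, vibration_threshold=0.5):
    """Return True when attitude detection should run.

    vehicle_info is a dict with hypothetical keys:
      "vibration_amplitude" (assumed units, e.g. g) and "road_surface".
    """
    if vehicle_info.get("vibration_amplitude", 0.0) >= vibration_threshold:
        return False                       # wheels may be off the road
    if vehicle_info.get("road_surface") in ("off_road", "uneven"):
        return False                       # surface unsuitable for detection
    return True
```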
  • the operation performed by the image processing system 10 using the left side camera 12a may be performed using the right side camera 12b instead of the left side camera 12a.
  • words such as “left side camera 12a” and “left side” may be read as “right side camera 12b” and “right side”, respectively.
  • the configuration in which the front-rear inclination θ of the vehicle 11 is detected as the attitude of the vehicle 11 based on the positions of the left front wheel 24 and the left rear wheel 25 in the captured image of the left side camera 12a has been described.
  • However, the method for detecting the attitude of the vehicle 11 is not limited to this.
  • For example, the inclination in the first direction can be calculated based on the positions of any two of the four wheels of the vehicle 11 in one or more captured images.
  • Likewise, the inclination in the first direction, the inclination in the second direction, and the height of the vehicle 11 can be calculated based on the positions of any three of the four wheels of the vehicle 11 in one or more captured images.
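Geometrically, the three-wheel case amounts to fitting a plane through three contact points. The sketch below assumes the wheel positions have already been converted from image coordinates into a vehicle-fixed metric frame; the function name and sign conventions are hypothetical.

```python
import math

def attitude_from_three_wheels(p1, p2, p3):
    """Derive two inclinations and a height from three wheel contact
    points (x, y, z) in metres, by taking the normal of their plane."""
    # Two in-plane vectors and their cross product (the plane normal).
    ux, uy, uz = (p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2])
    vx, vy, vz = (p3[0] - p1[0], p3[1] - p1[1], p3[2] - p1[2])
    nx = uy * vz - uz * vy
    ny = uz * vx - ux * vz
    nz = ux * vy - uy * vx
    pitch = math.degrees(math.atan2(nx, nz))   # first-direction inclination
    roll = math.degrees(math.atan2(ny, nz))    # second-direction inclination
    height = (p1[2] + p2[2] + p3[2]) / 3.0     # mean contact height
    return pitch, roll, height
```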
  • Each component of the image processing system 10 may be realized as an information processing apparatus such as a mobile phone or a smartphone, and may be connected to the vehicle 11 by wire or wirelessly.


Abstract

The invention relates to an orientation detection device (13) equipped with an input/output interface (21) and a processor (23). The input/output interface (21) acquires a captured image obtained by imaging the wheels of a vehicle (11). The processor (23) detects the orientation of the vehicle (11) based on the positions of the wheels in the captured image.
PCT/JP2017/010266 2016-03-29 2017-03-14 Dispositif de détection de l'orientation d'un véhicule, système de traitement d'image, véhicule et procédé de détection de l'orientation d'un véhicule WO2017169752A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/088,493 US20200111227A1 (en) 2016-03-29 2017-03-14 Orientation detection apparatus for vehicle, image processing system, vehicle, and orientation detection method for vehicle
JP2018508978A JPWO2017169752A1 (ja) 2016-03-29 2017-03-14 車両の姿勢検出装置、画像処理システム、車両、および車両の姿勢検出方法
EP17774296.2A EP3437948A4 (fr) 2016-03-29 2017-03-14 Dispositif de détection de l'orientation d'un véhicule, système de traitement d'image, véhicule et procédé de détection de l'orientation d'un véhicule

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016065962 2016-03-29
JP2016-065962 2016-03-29

Publications (1)

Publication Number Publication Date
WO2017169752A1 true WO2017169752A1 (fr) 2017-10-05

Family

ID=59965227

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010266 WO2017169752A1 (fr) 2016-03-29 2017-03-14 Dispositif de détection de l'orientation d'un véhicule, système de traitement d'image, véhicule et procédé de détection de l'orientation d'un véhicule

Country Status (4)

Country Link
US (1) US20200111227A1 (fr)
EP (1) EP3437948A4 (fr)
JP (1) JPWO2017169752A1 (fr)
WO (1) WO2017169752A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200059475A (ko) * 2018-11-21 2020-05-29 현대모비스 주식회사 초음파 센서의 높이 보상장치 및 그 방법

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003309844A (ja) * 2002-04-18 2003-10-31 Nissan Motor Co Ltd 車両用表示装置
WO2013145015A1 (fr) * 2012-03-29 2013-10-03 トヨタ自動車株式会社 Appareil d'estimation de l'état de la surface d'une route
JP2013203186A (ja) * 2012-03-28 2013-10-07 Honda Motor Co Ltd 先行車姿勢検出装置
JP2016021653A (ja) * 2014-07-14 2016-02-04 アイシン精機株式会社 周辺監視装置、及びプログラム

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE1014606A3 (nl) * 2002-02-05 2004-01-13 Krypton Electronic Eng Nv Werkwijze voor het dynamisch meten van de positie en orientatie van een wiel.
JP4469860B2 (ja) * 2004-11-26 2010-06-02 本田技研工業株式会社 車両用灯体検査方法
JP6361382B2 (ja) * 2014-08-29 2018-07-25 アイシン精機株式会社 車両の制御装置


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3437948A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200059475A (ko) * 2018-11-21 2020-05-29 현대모비스 주식회사 초음파 센서의 높이 보상장치 및 그 방법
KR102601353B1 (ko) * 2018-11-21 2023-11-14 현대모비스 주식회사 초음파 센서의 높이 보상장치 및 그 방법

Also Published As

Publication number Publication date
EP3437948A4 (fr) 2019-11-27
EP3437948A1 (fr) 2019-02-06
JPWO2017169752A1 (ja) 2019-01-10
US20200111227A1 (en) 2020-04-09


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 2018508978; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 2017774296; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2017774296; Country of ref document: EP; Effective date: 20181029)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17774296; Country of ref document: EP; Kind code of ref document: A1)